xtuner fine-tuning of InternLM2.5 fails #952
Comments
Has this been solved?
It looks like the project is no longer maintained, and there are no plans to fix this?
Same problem here. How did you solve it?
Solved it, guys: rolling PyTorch back to 2.4.1 fixes it.
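Before or after the downgrade (e.g. `pip install "torch==2.4.1"` in the same environment), it can help to confirm which PyTorch is actually active. The sketch below checks whether `torch.optim` already ships an `Adafactor` class, which PyTorch 2.5 introduced and which is what collides with mmengine's own registration. The function name is mine, not from xtuner or mmengine, and the check degrades gracefully when torch is absent.

```python
# Check whether the active environment has the PyTorch that triggers the
# "Adafactor is already registered in optimizer" error (torch >= 2.5 ships
# its own torch.optim.Adafactor, which collides with mmengine's registry).
# The helper name is illustrative, not an xtuner/mmengine API.
import importlib.util


def has_builtin_adafactor() -> bool:
    """Return True if the installed torch exposes torch.optim.Adafactor."""
    if importlib.util.find_spec("torch") is None:
        return False  # torch is not installed in this environment
    import torch
    return hasattr(torch.optim, "Adafactor")


if __name__ == "__main__":
    if has_builtin_adafactor():
        print("torch.optim.Adafactor present: expect the mmengine registry clash")
    else:
        print("no torch.optim.Adafactor: this torch should work with mmengine")
```

If this still reports the clash after downgrading, the shell is likely picking up a different environment than the one pip modified.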
What? The project is no longer maintained? Not even answering questions anymore?
I have the same problem; I'll give that a try.
mmengine does not support PyTorch 2.5.x; this looks like a naming conflict. Either patch the mmengine source or downgrade PyTorch.
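The naming conflict described here can be reproduced with a minimal registry sketch. The `Registry` class below is a simplified stand-in for mmengine's, not its actual source: registering a second class under a name that is already taken raises exactly this kind of "already registered" error, which is what happens when mmengine tries to register `Adafactor` after PyTorch 2.5 has already placed one in `torch.optim`.

```python
# Simplified stand-in for an mmengine-style registry (not mmengine's real
# code), showing how a duplicate name produces an "already registered" error.
class Registry:
    def __init__(self, name):
        self.name = name
        self._modules = {}

    def register_module(self, cls, force=False):
        key = cls.__name__
        if key in self._modules and not force:
            raise KeyError(f"{key} is already registered in {self.name}")
        self._modules[key] = cls
        return cls


OPTIMIZERS = Registry("optimizer")


class Adafactor:  # placeholder standing in for the optimizer class
    pass


OPTIMIZERS.register_module(Adafactor)      # first registration succeeds
try:
    OPTIMIZERS.register_module(Adafactor)  # duplicate name: the clash
except KeyError as exc:
    print(exc)  # prints the "already registered" message
```

mmengine's real registry supports a `force=True` escape hatch for exactly this situation, which is why patching the registration call in the mmengine source is the alternative to downgrading PyTorch.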
Why do I still get the same error after changing to 2.4.1?
Guys, you can try pip-installing from this. It is the requirements file I exported after getting the training to run.
Command: (xtuner-env) root@autodl-container-d293479255-f53de588:~/autodl-tmp/data# xtuner train sh/internlm2_5_chat_7b_qlora_oasst1_e3_copy.py --deepspeed deepspeed_zero2
Error message: 10/18 16:45:32 - mmengine - WARNING - WARNING: command error: ''Adafactor is already registered in optimizer at torch.optim''!
10/18 16:45:32 - mmengine - WARNING -
Arguments received: ['xtuner', 'train', 'sh/internlm2_5_chat_7b_qlora_oasst1_e3_copy.py', '--deepspeed', 'deepspeed_zero2']. xtuner commands use the following syntax:
Could someone advise how to solve this?