torch compilation error (torch编译错误) #956

Open
tcxia opened this issue Oct 24, 2024 · 0 comments
Comments

@tcxia

tcxia commented Oct 24, 2024

10/24 18:04:14 - mmengine - WARNING - WARNING: command error: 'module 'torch.compiler' has no attribute 'is_compiling''!
10/24 18:04:14 - mmengine - WARNING -
Arguments received: ['xtuner', 'train', '/mnt/pfs/jinfeng_team/LA/xiatianci/xtuner/xtuner/configs/llama/llama3_8b/llama3_8b_full_alpaca_e3.py']. xtuner commands use the following syntax:

    xtuner MODE MODE_ARGS ARGS

    Where   MODE (required) is one of ('list-cfg', 'copy-cfg', 'log-dataset', 'check-custom-dataset', 'train', 'test', 'chat', 'convert', 'preprocess', 'mmbench', 'eval_refcoco')
            MODE_ARG (optional) is the argument for specific mode
            ARGS (optional) are the arguments for specific command
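
The AttributeError suggests a torch/mmengine version mismatch: torch.compiler.is_compiling() only exists in newer PyTorch releases, while older releases expose the private torch._dynamo.is_compiling() instead. As a quick diagnostic (a minimal sketch, assuming the mismatch is the cause; the fallback helper is a private API and may differ by version), the following prints which helper the local install actually provides:

    # Probe which "is_compiling" helper the installed torch exposes.
    # Assumption: the mmengine warning comes from calling
    # torch.compiler.is_compiling() on a torch build that predates it.
    import torch

    print("torch version:", torch.__version__)

    compiler_mod = getattr(torch, "compiler", None)
    if compiler_mod is not None and hasattr(compiler_mod, "is_compiling"):
        # Newer torch: the public helper exists.
        print("torch.compiler.is_compiling:", torch.compiler.is_compiling())
    else:
        # Older torch: only the private dynamo helper may be available.
        try:
            from torch._dynamo import is_compiling
            print("torch._dynamo.is_compiling:", is_compiling())
        except ImportError:
            print("neither helper is available in this torch build")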