
Fixed - RuntimeError: FlashAttention only supports fp16 and bf16 data type #101

Closed · wants to merge 1 commit

Conversation

@weicheng113 (Contributor) commented Oct 27, 2023

The fix is copied from here.

Below are the issues and solutions I went through (a sketch of the fix follows the links):

huggingface/peft#790
huggingface/transformers#26066
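
For context, here is a minimal sketch of the kind of dtype fix those threads discuss. The checkpoint name, LoRA settings, and the exact cast point are illustrative assumptions on my part, not necessarily the exact patch in this PR:

```python
# Sketch of the dtype fix discussed in peft#790 / transformers#26066.
# The checkpoint name and LoRA hyperparameters below are illustrative only.
import torch
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Load the base model directly in bf16 so FlashAttention sees a supported dtype.
model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf",
    torch_dtype=torch.bfloat16,
)

lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)

# peft can leave trainable adapter weights in fp32, which makes FlashAttention
# raise "RuntimeError: FlashAttention only supports fp16 and bf16 data type".
# Casting the trainable parameters back to bf16 keeps all dtypes consistent.
for param in model.parameters():
    if param.requires_grad:
        param.data = param.data.to(torch.bfloat16)
```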

@yukang2017 (Member)

Hi @weicheng113,

Many thanks for your contribution. Could you please tell me which versions of peft, flash-attention, transformers, and deepspeed you used? Are there any version requirements for them?

Regards,
Yukang Chen

@weicheng113 (Contributor, Author) commented Oct 27, 2023

Hello Yukang,

You are right. I realized the error could be because I used different library versions from the ones in your official repo. Sorry about that. My library versions are below; most of them are the latest versions, I think. I will close the pull request. Thanks.

https://github.com/weicheng113113/LongLoRA/blob/experiment/pyproject.toml

By the way, I noticed that some library versions are not pinned in requirements.txt. It might be worth pinning them just in case, along the lines of the snippet below.
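
Something like this, for example. The version numbers here are purely illustrative; the real pins should match whatever the official repo was tested against:

```
# Illustrative pins only -- replace with the versions the repo was actually tested with.
transformers==4.34.1
peft==0.5.0
flash-attn==2.3.2
deepspeed==0.11.1
```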

@yukang2017 (Member)

Hi @weicheng113,

I think you could open a PR to pin the library versions in requirements.txt, and I will merge it into the main branch. That way your name will be included among the contributors, as thanks for your contribution.

Regards,
Yukang Chen

@weicheng113 (Contributor, Author)

Thanks @yukang2017 for the opportunity. I will give it a try.
