
feat(nlp): init basic bert code #55

Merged
merged 1 commit into from
Jan 22, 2022
Conversation

@mmmwhy (Owner) commented Jan 19, 2022

At first I went straight to huggingface's transformers. After reading it, I felt it was too large and I was left confused. I then referenced the code of bert4pytorch and Read_Bert_Code and, based on my own understanding, added comments to arrive at the code in this PR.

I used self.load_state_dict(state_dict, strict=True) to make sure the parameters load without errors.
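A minimal sketch of why strict=True matters when porting BERT weights: with strict loading, any missing or unexpected key in the state dict raises immediately instead of being silently skipped. The TinyBertLayer module below is a hypothetical stand-in for the PR's actual model classes.

```python
import torch
from torch import nn


class TinyBertLayer(nn.Module):
    """Hypothetical stand-in for a BERT sub-module; not the PR's real class."""

    def __init__(self, hidden: int = 8):
        super().__init__()
        self.dense = nn.Linear(hidden, hidden)


model = TinyBertLayer()

# An exactly-matching state dict loads cleanly under strict=True.
state_dict = {k: v.clone() for k, v in model.state_dict().items()}
model.load_state_dict(state_dict, strict=True)

# A state dict with an unexpected key raises RuntimeError under strict=True,
# which is how strict loading catches naming mismatches between checkpoints.
bad = dict(state_dict)
bad["extra.weight"] = torch.zeros(1)
try:
    model.load_state_dict(bad, strict=True)
    raised = False
except RuntimeError:
    raised = True
```

With strict=False the extra key would be silently ignored, so strict=True is the safer default when verifying a reimplementation against released weights.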

@mmmwhy mmmwhy changed the title feat(nlp): write attention code in pure_attention repo wip: feat(nlp): write attention code in pure_attention repo Jan 19, 2022
@mmmwhy mmmwhy force-pushed the wip-fy branch 29 times, most recently from 2f02cbb to 7eb76c8 on January 22, 2022 at 10:11
@mmmwhy mmmwhy force-pushed the wip-fy branch 4 times, most recently from c491dbd to a834514 on January 22, 2022 at 10:14
@mmmwhy (Owner, Author) commented Jan 22, 2022

[image attachment]

@mmmwhy mmmwhy merged commit 6a446fb into master Jan 22, 2022
@mmmwhy mmmwhy changed the title wip: feat(nlp): write attention code in pure_attention repo feat(nlp): init basic bert code Jan 22, 2022