
Conversation

@ArjunParthasarathy

Description

@AlanAboudib this adds support for a BERT encoder and iterator, specifically for the Masked LM use case.
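For reference, here is a minimal sketch of the kind of masked-LM input preparation such an encoder/iterator performs, written against HuggingFace Transformers. The helper name `mask_tokens` and the simplified masking (every selected token becomes `[MASK]`, without BERT's 10% random / 10% keep split) are illustrative assumptions, not the actual classes introduced in this PR:

```python
# Illustrative sketch only; the PR's real encoder/iterator API may differ.
import torch
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

def mask_tokens(text, mask_prob=0.15):
    """Tokenize `text` and randomly replace ~15% of tokens with [MASK]."""
    enc = tokenizer(text, return_tensors="pt", truncation=True, max_length=128)
    input_ids = enc["input_ids"].clone()
    labels = input_ids.clone()

    # Never mask special tokens ([CLS], [SEP]).
    special = torch.tensor(
        tokenizer.get_special_tokens_mask(
            input_ids[0].tolist(), already_has_special_tokens=True
        ),
        dtype=torch.bool,
    ).unsqueeze(0)
    prob = torch.full(input_ids.shape, mask_prob)
    prob.masked_fill_(special, 0.0)
    masked = torch.bernoulli(prob).bool()

    labels[~masked] = -100                       # loss is computed only on masked positions
    input_ids[masked] = tokenizer.mask_token_id  # simplified: all selected tokens become [MASK]
    return input_ids, enc["attention_mask"], labels
```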

Affected Dependencies

Now requires the HuggingFace Transformers library to be installed.
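For illustration only, one common way to surface an optional dependency like this is a guarded import with an install hint; whether the PR handles it this way is an assumption:

```python
# Hypothetical dependency guard; the PR's actual handling may differ.
try:
    from transformers import BertForMaskedLM, BertTokenizer
except ImportError as exc:
    raise ImportError(
        "The BERT encoder/iterator requires HuggingFace Transformers: "
        "pip install transformers"
    ) from exc
```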

How has this been tested?

I used a structure very similar to the BPTT Example Notebook to verify that my encoder and iterator work in training a BERT model. I was able to get good test and validation scores for my trained model on the Wikitext-2 dataset.
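For context, a rough sketch of that kind of verification loop, reusing the `mask_tokens` helper sketched above and a tiny in-memory corpus as a stand-in for WikiText-2; the actual notebook follows the BPTT example's structure and is not reproduced here:

```python
# Rough verification sketch, not the submitted notebook.
import torch
from transformers import BertForMaskedLM

model = BertForMaskedLM.from_pretrained("bert-base-uncased")
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

corpus = [
    "Valkyria Chronicles is a tactical role playing game .",
    "The game was released for the PlayStation 3 .",
]  # stand-in lines; the real test used WikiText-2

model.train()
for epoch in range(2):
    for line in corpus:
        input_ids, attention_mask, labels = mask_tokens(line)
        out = model(input_ids=input_ids, attention_mask=attention_mask, labels=labels)
        out.loss.backward()
        optimizer.step()
        optimizer.zero_grad()
    print(f"epoch {epoch}: last-batch loss {out.loss.item():.3f}")
```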

Contributor
@AlanAboudib left a comment

Just a question?

Contributor
@AlanAboudib left a comment

Great work @Dat-Boi-Arjun. Would you also please submit your Jupyter notebook for training in examples/local?

