After using my own corpus for domain-adaptive pretraining, the resulting vocab.txt is the same size as the initialized model's (BERT-base). In short, domain-adaptive pretraining does not extend the vocabulary to the new domain? Therefore the domain-specific vocabulary still does not appear in the vocab.txt produced by domain-adaptive pretraining. Is that right?
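That observation matches how standard BERT continued pretraining works: the WordPiece vocabulary is fixed at initialization and only the model weights are updated, so vocab.txt keeps the original entries. Adding domain terms requires explicitly extending the vocabulary and growing the token-embedding matrix to match. Below is a minimal sketch of that resizing step using NumPy; the vocabulary, tokens, and sizes are made up for illustration, not taken from any real model:

```python
import numpy as np

# Illustrative starting point: a tiny vocab of 5 tokens, hidden size 8.
vocab = ["[PAD]", "[UNK]", "the", "cell", "##s"]
hidden_size = 8
rng = np.random.default_rng(0)
embeddings = rng.normal(scale=0.02, size=(len(vocab), hidden_size))

def extend_vocab(vocab, embeddings, new_tokens):
    """Append new domain tokens and grow the embedding matrix to match.

    New rows are initialized to the mean of the existing embeddings,
    a common heuristic so new tokens start near the embedding centroid.
    """
    fresh = [t for t in new_tokens if t not in vocab]  # skip duplicates
    mean_row = embeddings.mean(axis=0, keepdims=True)
    new_rows = np.repeat(mean_row, len(fresh), axis=0)
    return vocab + fresh, np.vstack([embeddings, new_rows])

# Add two hypothetical domain tokens; vocab grows from 5 to 7 entries.
vocab, embeddings = extend_vocab(vocab, embeddings, ["genomics", "rna"])
print(len(vocab), embeddings.shape)  # 7 (7, 8)
```

With HuggingFace transformers the equivalent is `tokenizer.add_tokens([...])` followed by `model.resize_token_embeddings(len(tokenizer))`; after that, continued pretraining on the domain corpus can train the newly added rows.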