The download speed is so slow... #5

Open
David-19940718 opened this issue Apr 7, 2023 · 4 comments

Comments

@David-19940718

[screenshot of the download progress]

Is there any solution to address this?

@rentainhe
Collaborator

> [screenshot of the download progress]
> Is there any solution to address this?

You can check your network; this may be a network issue.

@David-19940718
Author

> [screenshot of the download progress]
> Is there any solution to address this?
>
> You can check your network; this may be a network issue.

Hello, can you provide the BERT weights for me? I can't fix the network issue.
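
If the bottleneck is the automatic bert-base-uncased download from the Hugging Face Hub, one common workaround is to fetch the weights once (for example on a machine or network where the Hub is reachable) and then load them from a local directory. A minimal sketch using the standard huggingface_hub and transformers APIs; the local-directory approach is an assumption here, not something prescribed by this repo:

```python
# One-time download of bert-base-uncased into the local Hugging Face cache.
from huggingface_hub import snapshot_download
from transformers import BertModel, BertTokenizer

local_dir = snapshot_download(repo_id="bert-base-uncased")  # returns the local snapshot path

# Afterwards, load the tokenizer and text encoder from disk; no network access needed.
tokenizer = BertTokenizer.from_pretrained(local_dir)
text_encoder = BertModel.from_pretrained(local_dir)
```

If the detector's config exposes the text encoder path (e.g. a `text_encoder_type` field), pointing it at the downloaded folder should have the same effect; check the repo's config files to confirm the exact field name.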

@David-19940718
Author

Actually, I have downloaded the pretrained weights from Hugging Face; unfortunately, it raises some errors:
Some weights of the model checkpoint at bert-base-uncased were not used when initializing BertModel: ['cls.predictions.transform.dense.weight', 'cls.predictions.transform.dense.bias', 'cls.seq_relationship.bias', 'cls.seq_relationship.weight', 'cls.predictions.bias', 'cls.predictions.transform.LayerNorm.weight', 'cls.predictions.decoder.weight', 'cls.predictions.transform.LayerNorm.bias']

  • This IS expected if you are initializing BertModel from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
  • This IS NOT expected if you are initializing BertModel from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
    _IncompatibleKeys(missing_keys=[], unexpected_keys=['label_enc.weight'])

@SlongLiu
Contributor

> Actually, I have downloaded the pretrained weights from Hugging Face; unfortunately, it raises some errors: Some weights of the model checkpoint at bert-base-uncased were not used when initializing BertModel: ['cls.predictions.transform.dense.weight', 'cls.predictions.transform.dense.bias', 'cls.seq_relationship.bias', 'cls.seq_relationship.weight', 'cls.predictions.bias', 'cls.predictions.transform.LayerNorm.weight', 'cls.predictions.decoder.weight', 'cls.predictions.transform.LayerNorm.bias']
>
> • This IS expected if you are initializing BertModel from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
> • This IS NOT expected if you are initializing BertModel from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
> _IncompatibleKeys(missing_keys=[], unexpected_keys=['label_enc.weight'])

You may load the model in non-strict mode; just ignoring the incompatible keys is fine. See our example for reference: https://github.com/IDEA-Research/Grounded-Segment-Anything/blob/main/grounded_sam.ipynb
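
For reference, a sketch of non-strict loading along the lines of the linked notebook; the module and function names are taken from the GroundingDINO package and may differ across versions:

```python
import torch
from groundingdino.models import build_model
from groundingdino.util.slconfig import SLConfig
from groundingdino.util.utils import clean_state_dict

def load_model(config_path, checkpoint_path):
    # Build the model from its config file.
    args = SLConfig.fromfile(config_path)
    model = build_model(args)

    # Load the checkpoint in non-strict mode: keys that do not match the model
    # (such as 'label_enc.weight') are reported and skipped instead of raising.
    checkpoint = torch.load(checkpoint_path, map_location="cpu")
    load_result = model.load_state_dict(clean_state_dict(checkpoint["model"]), strict=False)
    print(load_result)  # e.g. _IncompatibleKeys(missing_keys=[], unexpected_keys=['label_enc.weight'])

    model.eval()
    return model
```

The warning about unused `cls.*` weights from bert-base-uncased is likewise harmless: those heads belong to BERT's pre-training objectives and are not needed when the checkpoint is used only as a text encoder.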
