Hello, first of all thank you for the good work.
I have one question: is the reported performance of the MixViT_L model on the GOT-10k benchmark (AO: 75.7%) obtained with a model trained on the full training data?
Hi, the MixViT-L model, which obtains an AO of 75.7% on the GOT-10k test set, is trained only on the GOT-10k training set.
Thank you for the answer. Is the backbone a ConvMAE-Large model?
This model employs ViT-L as the backbone. (The MixViT_L(ConvMAE) variant uses the ConvMAE backbone.)
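If you are unsure which backbone a downloaded checkpoint actually contains, one quick sanity check is to inspect the parameter-name prefixes in its state dict. The sketch below uses a plain dict as a stand-in for a loaded checkpoint, and the key patterns shown are illustrative assumptions, not the repository's actual parameter names.

```python
# Illustrative sketch: guess a checkpoint's backbone from state_dict key
# prefixes. The key names below are hypothetical examples, not the exact
# names used by MixViT; adapt the patterns to the checkpoint you have.

def guess_backbone(state_dict):
    """Return a rough backbone guess based on parameter-name patterns."""
    keys = list(state_dict)
    if any(k.startswith("backbone.blocks.") for k in keys):
        return "vit"       # plain ViT-style transformer blocks
    if any("conv" in k for k in keys):
        return "convmae"   # ConvMAE-style hybrid with convolutional stages
    return "unknown"

# In practice you would load the real checkpoint first, e.g. with
# torch.load(path, map_location="cpu"), then pass its state dict here.
ckpt = {"backbone.blocks.0.attn.qkv.weight": None}
backbone = guess_backbone(ckpt)  # -> "vit" for this illustrative dict
```

Printing a few `state_dict` keys from the checkpoint is usually enough to tell a plain-ViT model from a ConvMAE one before running a full evaluation.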
Regarding MixViT_L(ConvMAE): I trained the model on the full GOT-10k training set, but the AO comes out at 57% with lower IoU. What could be the reason? It differs from the paper.