
GraphCast: Fine-tuning (Impact: moderate, effort: high) #560

Open
Tracked by #499
mnabian opened this issue Jun 18, 2024 · 2 comments
mnabian commented Jun 18, 2024

No description provided.


oublalkhalid commented Jan 4, 2025

Hello @mnabian,

Thank you for your excellent implementation in Modulus! I was wondering whether it is possible to reuse the pretrained checkpoint from the main JAX model. I'm exploring ways to leverage the pre-trained model for fine-tuning in this context.

Have there been any attempts to convert the JAX model weights for use in the PyTorch Modulus implementation? Thanks in advance!

Best,
Koublal
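
As a starting point, such a conversion typically means flattening the JAX/Haiku-style nested parameter dict into a flat, dot-separated state dict and transposing dense-layer weight matrices. The sketch below is hypothetical: the module and parameter names are illustrative, not the actual GraphCast or Modulus parameter names, and a real conversion would require mapping each key to the Modulus model's naming scheme by hand.

```python
# Hypothetical sketch: flatten a JAX/Haiku-style nested params dict into a
# flat PyTorch-style state dict. Layer names here are illustrative only.
import numpy as np

def flatten_jax_params(params, prefix=""):
    """Recursively flatten {module: {param: array}} into {'module.param': array}."""
    flat = {}
    for name, value in params.items():
        key = f"{prefix}.{name}" if prefix else name
        if isinstance(value, dict):
            flat.update(flatten_jax_params(value, key))
        else:
            arr = np.asarray(value)
            # JAX dense layers commonly store weights as (in_features, out_features),
            # while PyTorch nn.Linear expects (out_features, in_features),
            # so 2-D weight matrices are transposed here.
            if name == "w" and arr.ndim == 2:
                arr = arr.T
            flat[key] = arr
    return flat

# Toy nested params resembling a Haiku checkpoint (illustrative only).
jax_params = {
    "encoder": {"linear": {"w": np.zeros((8, 16)), "b": np.zeros(16)}},
}
state = flatten_jax_params(jax_params)
print(sorted(state))                    # ['encoder.linear.b', 'encoder.linear.w']
print(state["encoder.linear.w"].shape)  # (16, 8)
```

Each resulting array could then be wrapped with `torch.from_numpy(...)` and loaded via `model.load_state_dict(...)` once the keys are renamed to match the Modulus model; whether the architectures line up closely enough for this to work is exactly the open question in this issue.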


Flionay commented Jan 9, 2025

Looking forward to further progress on the GraphCast model, or to a demonstration comparing training results from the Modulus implementation against DeepMind's JAX implementation.
