Make the model config override the pretrained config #170

@jlamypoirier

🎯 Goal (What & Why)

Currently, a pretrained config overrides an arbitrary part of the user-specified config, which causes a number of problems.

I suggest flipping things around so the specified model config overrides the pretrained config. This should give us the behaviour we want in most cases:

  • Pretrained config, no base model config: All architecture parameters are imported, and so are relevant non-architecture parameters (ex. window_size). Other non-architecture parameters take the Fast-LLM default.
  • Pretrained config, base model config with non-architecture parameters: Parameters explicitly specified in the base model config are taken, others are as above.
  • Pretrained config, base model config with architecture parameters: We probably want to enforce matching values, and raise an error for any mismatch. (This would be an improvement because right now wrong values are silently ignored.)
  • No pretrained config: Same as before.
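The precedence rules above can be sketched as follows. This is a minimal, hypothetical illustration, not Fast-LLM's actual API: the function name, the flat-dict representation, and the set of architecture keys are all assumptions for the sake of the example.

```python
# Illustrative architecture parameters; the real set comes from the config schema.
ARCHITECTURE_KEYS = {"hidden_size", "num_layers"}

def merge_configs(pretrained: dict, user: dict) -> dict:
    """Merge so that explicitly user-specified values win over pretrained ones,
    and mismatched architecture parameters raise instead of being ignored."""
    merged = dict(pretrained)
    for key, value in user.items():
        if key in ARCHITECTURE_KEYS and key in pretrained and pretrained[key] != value:
            # Previously a wrong value was silently ignored; now it's an error.
            raise ValueError(
                f"Architecture mismatch for {key!r}: "
                f"pretrained={pretrained[key]!r}, specified={value!r}"
            )
        merged[key] = value  # user-specified value overrides the pretrained one
    return merged

pretrained = {"hidden_size": 4096, "window_size": 1024}
user = {"window_size": 2048}
print(merge_configs(pretrained, user))
# {'hidden_size': 4096, 'window_size': 2048}
```

Unspecified parameters (here `hidden_size`) are still imported from the pretrained config, matching the first two cases above.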

🚀 Execution Plan

We can use Fast-LLM's override mechanism as in #168.
However, we'll also need to adapt the update mechanism to get the behaviour we want for nested configs.
It could also be difficult to achieve backward compatibility.
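The nested-config adaptation mentioned above could look something like this. Again a hypothetical sketch using plain dicts, not the actual update mechanism: the point is that the update must recurse into sub-configs so nested user-specified fields override pretrained ones field by field, rather than replacing a whole sub-config.

```python
def deep_update(pretrained: dict, user: dict) -> dict:
    """Recursively merge `user` into `pretrained`, field by field."""
    merged = dict(pretrained)
    for key, value in user.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            # Recurse so sibling fields of the nested config are preserved.
            merged[key] = deep_update(merged[key], value)
        else:
            merged[key] = value
    return merged

pretrained = {"transformer": {"hidden_size": 4096, "window_size": 1024}}
user = {"transformer": {"window_size": 2048}}
print(deep_update(pretrained, user))
# {'transformer': {'hidden_size': 4096, 'window_size': 2048}}
```

A naive shallow update would replace the entire `transformer` sub-config and drop `hidden_size`, which is the behaviour the plan needs to avoid.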

📌 Acceptance Criteria (Must-Haves for Completion)

  • The override behaviour works as described in the Goal section above, including the error on mismatched architecture parameters.

🛠️ Project Management

  • Assign the project to the Fast-LLM project.
  • Set the Estimate field (in days) in the GitHub project.
  • Use the Size field to categorize the PR size (Small/Medium/Large).
  • Assign an owner when opening the issue.
