Pull requests: axolotl-ai-cloud/axolotl
#3415  fix(pydantic): set allowed values for adapter config (opened Feb 16, 2026 by NanoCode012)
#3414  Compatibility: torch 2.10 baseline and transformers>=5 interoperability (opened Feb 16, 2026 by cccat6)
#3409  Fix: load gemma3 text only via dynamic weights [Draft] (opened Feb 13, 2026 by NanoCode012)
#3403  Fix FSDP2 sharding and validate AO version for LR groups (opened Feb 11, 2026 by bekk02)
#3401  Fix: excess_length_strategy truncation method [ready to merge] (opened Feb 10, 2026 by rlronan)
#3389  feat: add auto-chunking support for streaming pretraining datasets (opened Feb 3, 2026 by madScientist10)
#3388  fix: pass revision parameter to tokenizer and processor loaders [ready to merge] (opened Feb 3, 2026 by madScientist10)
#3265  New "muonclip" Muon implementation (FSDP/DSZ3 compatible, faster, slow Muon clip support) [waiting on upstream] (opened Nov 14, 2025 by lhl)
#3260  MoE Grouped MM support (5X+ MoE training perf gains) [waiting on upstream] (opened Nov 12, 2025 by lhl)
#3258  Allow muon optimizer with DeepSpeed Zero 1-2 [waiting on upstream] (opened Nov 11, 2025 by lhl)
#3228  include tool in default message_property_mappings [hold, don't merge this yet] (opened Oct 23, 2025 by winglian)