Abulikemu Abuduweili, Yuyang Pang, Feihan Li, Changliu Liu, Scaling Law of Neural Koopman Operators, arXiv preprint, 2026
Koopman operator theory lifts nonlinear dynamics into a higher-dimensional space of observables in which the evolution becomes linear.
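As a concrete illustration (a standard textbook example, not from this paper): for the map $x' = \lambda x$, $y' = \mu y + (\lambda^2 - \mu) x^2$, lifting the state with the single extra observable $x^2$ makes the dynamics exactly linear:

```python
import numpy as np

# Classic Koopman lifting example. After lifting with the observables
# (x, y, x^2), the nonlinear map below evolves exactly linearly.
lam, mu = 0.9, 0.5

def step(x, y):
    """Nonlinear dynamics: x' = lam*x, y' = mu*y + (lam**2 - mu)*x**2."""
    return lam * x, mu * y + (lam**2 - mu) * x**2

# Koopman matrix acting on the lifted state (x, y, x^2)
K = np.array([
    [lam, 0.0, 0.0],
    [0.0, mu,  lam**2 - mu],
    [0.0, 0.0, lam**2],
])

x, y = 1.3, -0.7
z = np.array([x, y, x**2])
for _ in range(5):
    x, y = step(x, y)  # evolve the nonlinear system
    z = K @ z          # evolve the lifted linear system
assert np.allclose(z, [x, y, x**2])  # the two trajectories agree exactly
```

Neural Koopman models replace the hand-picked observables with a learned encoder, so the lifting is only approximate and its quality depends on the latent dimension.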
We prove that the prediction error of a neural Koopman model with latent dimension $n$, trained on $m$ samples, decomposes into two terms:
- **Statistical error**: decreases as the number of training samples $m$ grows; more data yields better generalization.
- **Approximation error**: decreases as the latent dimension $n$ increases; a larger latent space captures more of the true Koopman spectrum.
The two terms reveal a data-model tradeoff: increasing $n$ shrinks the approximation error but inflates the statistical error, so the error-minimizing latent dimension grows with the amount of available data.
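The tradeoff can be seen numerically with a toy error model that has the two-term shape described above. The constants `c1`, `c2`, the spectral decay rate `rho`, and the functional forms are assumptions for this sketch, not values or bounds from the paper:

```python
import numpy as np

# Hypothetical two-term error model (illustrative constants, not the paper's).
c1, c2, rho = 1.0, 1.0, 0.8

def pred_error(m, n):
    statistical = c1 * np.sqrt(n / m)  # shrinks as the sample count m grows
    approximation = c2 * rho ** n      # shrinks as the latent dimension n grows
    return statistical + approximation

ns = np.arange(1, 64)
# Error-minimizing latent dimension for two dataset sizes
best = {m: int(ns[np.argmin(pred_error(m, ns))]) for m in (1_000, 100_000)}
print(best)  # the optimal latent dimension grows with the dataset size
```

With more data the statistical penalty for a large latent space is smaller, so the minimizer shifts toward larger $n$.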
The training objective combines three losses:
- **Multi-step prediction loss**: reconstruction error over $K$ future steps with discounted weighting.
- **Inverse control loss**: reconstructs $u_t$ from consecutive latent states via the pseudo-inverse of $B$, preserving controllability.
- **Covariance regularization**: penalizes off-diagonal latent covariance entries, encouraging decorrelated dimensions.
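A minimal sketch of how the three losses could be implemented, assuming a latent model $z_{t+1} = A z_t + B u_t$. The function and tensor names and the discount factor are illustrative, not the repository's exact code:

```python
import torch

def multistep_loss(z_seq, A, B, u_seq, gamma=0.9):
    """Discounted multi-step prediction error in latent space.
    z_seq: (T, n) encoded states, u_seq: (T-1, p) controls."""
    loss, z_pred = 0.0, z_seq[0]
    for k in range(len(u_seq)):
        z_pred = z_pred @ A.T + u_seq[k] @ B.T  # roll the linear model forward
        loss = loss + gamma ** k * torch.mean((z_pred - z_seq[k + 1]) ** 2)
    return loss

def inverse_control_loss(z_seq, A, B, u_seq):
    """Reconstruct u_t from consecutive latent states via pinv(B)."""
    B_pinv = torch.linalg.pinv(B)                        # (p, n)
    u_hat = (z_seq[1:] - z_seq[:-1] @ A.T) @ B_pinv.T    # (T-1, p)
    return torch.mean((u_hat - u_seq) ** 2)

def covariance_loss(z_batch):
    """Penalize off-diagonal entries of the latent covariance."""
    z = z_batch - z_batch.mean(dim=0)
    cov = z.T @ z / (len(z_batch) - 1)
    off_diag = cov - torch.diag(torch.diag(cov))
    return (off_diag ** 2).sum()
```

In training, the three terms would be summed with scalar weights; the flags `--use_control_loss` and `--use_covariance_loss` below toggle the last two.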
```bibtex
@article{abuduweili2026scaling,
  title={Scaling Law of Neural Koopman Operators},
  author={Abuduweili, Abulikemu and Pang, Yuyang and Li, Feihan and Liu, Changliu},
  journal={arXiv preprint arXiv:2602.19943},
  year={2026}
}
```

Requirements: Python >= 3.8, PyTorch >= 1.12, NumPy, SciPy, pandas, tqdm. Optional: PyBullet (Franka env), wandb (tracking), Isaac Lab (MPC evaluation).
```shell
pip install torch numpy scipy pandas pybullet tqdm
```

Train a Koopman model:
```shell
python scripts/train_model.py \
    --env_name Franka \
    --sample_size 60000 \
    --encode_dim 4 \
    --layer_depth 3 \
    --hidden_dim 256 \
    --use_residual \
    --use_control_loss \
    --use_covariance_loss
```

Models are saved to log/<project>/best_models/. Run scripts/run_experiments.sh for full hyperparameter sweeps, or scripts/run_corr_experiments.sh for the corrected scaling-law sweep. The notebooks in evaluation/ reproduce all paper figures.
```
scripts/
  train_model.py              # Main training entrypoint
  run_experiments.sh          # Full hyperparameter sweep
  run_corr_experiments.sh     # Corrected scaling-law sweep for G1/Go2
utility/
  dataset.py                  # Dataset collectors for all 8 environments
  network.py                  # KoopmanNet architecture (encoder + linear dynamics)
  lqr.py                      # LQR utilities
  rbf.py                      # RBF basis functions
control/
  mpc_tracking.py             # Isaac Lab MPC tracking evaluation for G1/Go2
evaluation/
  evaluate_prediction.ipynb   # Prediction metrics and plots
  evaluate_tracking.ipynb     # Tracking metrics from Isaac MPC results
  evaluate_correlation.ipynb  # Scaling law and correlation analysis
  evaluate_covariance.ipynb   # Covariance-related analysis
figs/                         # Paper figures
```

