# paramflow

`paramflow` is a flexible and user-friendly parameter and configuration management library designed for machine learning workflows and for applications that require profiles and layered parameters. It enables seamless merging of parameters from multiple sources, auto-generates a command-line argument parser, and allows for easy parameter overrides.
## Features

- Layered configuration: Merge parameters from files, environment variables, and command-line arguments.
- Immutable dictionary: Provides a read-only dictionary with attribute-style access.
- Profile support: Manage multiple sets of parameters with profile-based layering.
- Layered meta-parameters: `paramflow` configures itself using a layered approach.
- Automatic type conversion: Converts types during merging based on target parameter types.
- Command-line argument parsing: Automatically generates an `argparse` parser from parameter definitions.
- Nested configuration: Allows for nested configuration and merging.
## Installation

```bash
pip install paramflow
```

Install with `.env` support:

```bash
pip install "paramflow[dotenv]"
```
## Quick start

`params.toml`:

```toml
[default]
learning_rate = 0.001
batch_size = 64
```

```python
import paramflow as pf

params = pf.load('params.toml')
print(params.learning_rate)  # 0.001
```
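The returned object is read-only. As a minimal sketch continuing the snippet above (assuming mutation raises an error, per the immutable-dictionary feature; the exact exception type is not specified in this README):

```python
try:
    params.learning_rate = 0.1  # attribute assignment on a read-only mapping
except Exception as err:        # exact exception type depends on paramflow
    print('parameters are immutable:', err)
```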
Running the script with `--help` displays both meta-parameters and parameters:

```bash
python app.py --help
```
## Meta-parameters

Meta-parameters control how `paramflow.load` reads its own configuration. Layering order:

- `paramflow.load` arguments
- Environment variables (default prefix: `P_`)
- Command-line arguments (`argparse`)
Set the active profile via the command line:

```bash
python print_params.py --profile dqn-adam
```

Or via an environment variable:

```bash
P_PROFILE=dqn-adam python print_params.py
```
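Or via `paramflow.load` arguments, the first layer above. A hedged sketch, assuming `load` accepts meta-parameters such as the profile as keyword arguments (the exact signature is not shown in this README):

```python
import paramflow as pf

# Hypothetical keyword argument: select the 'dqn-adam' profile in code.
params = pf.load('params.toml', profile='dqn-adam')
```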
## Parameter layering

Parameters are merged from multiple sources in the following order:

- Configuration files (`.toml`, `.yaml`, `.ini`, `.json`, `.env`)
- Environment variables (default prefix: `P_`)
- Command-line arguments (`argparse`)
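Because environment variables sit above configuration files, a prefixed variable overrides a file value. A minimal sketch, assuming the default `P_` prefix maps `P_BATCH_SIZE` to `batch_size` and that automatic type conversion casts the string to the file value's type:

```python
import os
import paramflow as pf

# Hypothetical override: set before load() so the environment layer applies.
os.environ['P_BATCH_SIZE'] = '128'

params = pf.load('params.toml')
print(params.batch_size)  # 128 (int), converted from the string '128'
```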
You can specify the order explicitly (`env` and `args` are reserved names):

```python
params = pf.load('params.toml', 'env', '.env', 'args')
```
Override parameters via command-line arguments:

```bash
python print_params.py --profile dqn-adam --learning_rate 0.0002
```
## Profiles

`params.toml`:

```toml
[default]
learning_rate = 0.00025
batch_size = 32
optimizer_class = 'torch.optim.RMSprop'
optimizer_kwargs = { momentum = 0.95 }
random_seed = 13

[adam]
learning_rate = 1e-4
optimizer_class = 'torch.optim.Adam'
optimizer_kwargs = {}
```

```bash
python app.py --profile adam
```

This overrides:

- `learning_rate` → `1e-4`
- `optimizer_class` → `torch.optim.Adam`
- `optimizer_kwargs` → `{}`
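Nested tables such as `optimizer_kwargs` are part of the merged result. A hedged sketch, assuming nested values are also exposed with attribute-style access (this README only demonstrates attribute access for top-level keys):

```python
import paramflow as pf

params = pf.load('params.toml')  # default profile: RMSprop settings
print(params.optimizer_kwargs.momentum)  # 0.95, if nesting preserves attribute access
```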
Profiles can also be used to manage configurations for different environments:

```toml
[default]
debug = true
database_url = "mysql://localhost:3306/myapp"

[dev]
database_url = "mysql://dev:3306/myapp"

[prod]
debug = false
database_url = "mysql://prod:3306/myapp"
```

```bash
export P_PROFILE=dev
python app.py
```
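A minimal sketch of the result, assuming `app.py` loads the file above and `P_PROFILE=dev` is set: the `dev` profile is layered on top of `default`, so keys unset in `[dev]` fall through to `[default]`.

```python
import paramflow as pf

params = pf.load('params.toml')
print(params.debug)         # True, inherited from [default]
print(params.database_url)  # mysql://dev:3306/myapp, overridden by [dev]
```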