We are in the process of uploading the full code base. In the meantime, feel free to reach out if there are any questions!
This repo contains the Python implementations of ET-GP-UCB using GPyTorch [1] and BoTorch [2]. This README
will guide you through reproducing the results from the paper "Event-Triggered Time-Varying Bayesian Optimization", published in Transactions on Machine Learning Research (TMLR).
If you find our code or paper useful, please consider citing:

```bibtex
@article{brunzema2025time,
    title={Event-Triggered Time-Varying {Bayesian} Optimization},
    author={Brunzema, Paul and Von Rohr, Alexander and Solowjow, Friedrich and Trimpe, Sebastian},
    journal={Transactions on Machine Learning Research (TMLR)},
    year={2025},
}
```
The structure of this folder is the following:

```
.
├── objective_functions              # objective functions
├── results                          # result files (empty)
├── src                              # source files
├── utils                            # utilities
├── examples                         # example notebooks
├── run_{...}.ipynb                  # Jupyter notebooks as run scripts
├── explore_temperature_data.ipynb   # visual exploration and filtering of temperature data
├── ...
├── requirements.txt                 # .txt file with package specifications
├── LICENCE                          # MIT licence
└── README.md
```
Create an environment called `et-bo` with Python 3.9.12 (e.g. via `python -m venv .venv`), pull the content of this repo into the environment, and install all required packages with:

```bash
cd et-bo
source .venv/bin/activate
pip install -r requirements.txt
```
To reproduce the experiments with synthetic data, we provide Jupyter notebooks:

- with a constant rate of change: `run_2D_synthetic_data.ipynb`
- with a misspecified rate of change: `run_2D_misspecified_eps.ipynb`
Note that due to the upper bound on the upload size, the objective functions have to be recreated for each random seed. This is done automatically, but it increases the runtime significantly: expect a runtime of up to 10 h for each of the experiments in the paper. (You can, of course, reduce the number of runs in the respective Jupyter notebooks.)
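The key point is that each objective is a deterministic function of its seed, so it can be rebuilt on the fly instead of being shipped as data. As a purely illustrative sketch (the function below is hypothetical and not the repo's actual API, which samples GP objectives via GPyTorch), recreating an objective from a seed looks like this:

```python
import math
import random

def make_objective(seed, n_features=32):
    """Rebuild a random synthetic objective deterministically from a seed.

    Hypothetical stand-in: a random sum of sinusoids plays the role of
    the sampled GP objectives used in the actual experiments.
    """
    rng = random.Random(seed)
    weights = [rng.gauss(0.0, 1.0) for _ in range(n_features)]
    freqs = [rng.gauss(0.0, 1.0) for _ in range(n_features)]
    phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n_features)]

    def f(x):
        return sum(w * math.sin(fr * x + ph)
                   for w, fr, ph in zip(weights, freqs, phases)) / math.sqrt(n_features)

    return f

# The same seed always yields the same objective, so results are
# reproducible without shipping large precomputed function files.
f1 = make_objective(seed=0)
f2 = make_objective(seed=0)
assert f1(0.3) == f2(0.3)
```

The price of this design is runtime: the reconstruction runs once per seed, which is where the long runtimes above come from.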
To reproduce the application experiments, some additional steps are necessary.
The dataset with the temperature data was too large for the upload. Therefore, the following steps guide you through downloading and saving the dataset in order to reproduce the results in Fig. 3 (b) and (c).

1. Go to the website of the Intel Berkeley Research lab dataset.
2. Download the temperature data (150 MB uncompressed).
3. Uncompress the data (if still compressed).
4. Rename the file to `temperature_data.txt`.
5. Download the sensor placement data.
6. Copy the data and save it in a file called `coordinates_sensors.txt`.
7. Save both files in `.../et-bo/objective_functions/applications/`.
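Before running the notebooks, a quick check that both files ended up in the right place can save a long wait. A minimal sketch (the helper below is not part of the repo; the directory path follows the folder structure above):

```python
from pathlib import Path

REQUIRED_FILES = ["temperature_data.txt", "coordinates_sensors.txt"]

def check_data_files(data_dir):
    """Return the list of required data files missing from data_dir."""
    data_dir = Path(data_dir)
    return [name for name in REQUIRED_FILES
            if not (data_dir / name).is_file()]

missing = check_data_files("objective_functions/applications")
if missing:
    print("Missing data files:", ", ".join(missing))
else:
    print("All data files in place.")
```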
To check that everything worked, open the Jupyter notebook `explore_temperature_data.ipynb`. It contains a visual exploration of the temperature data as well as the data preprocessing, including subsampling the data into the required 10 min intervals.
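The 10 min subsampling can be illustrated with a small stand-in (pure Python with a hypothetical `(timestamp, temperature)` record layout; the actual notebook operates on the Intel lab log format, per sensor):

```python
from datetime import datetime, timedelta

def subsample_10min(readings):
    """Keep one reading per 10-minute bin (the first seen per bin).

    `readings` is a list of (timestamp, temperature) tuples,
    assumed sorted by time.
    """
    seen_bins = set()
    kept = []
    for ts, temp in readings:
        # Bin key: truncate minutes to the enclosing 10-minute interval.
        bin_key = (ts.year, ts.month, ts.day, ts.hour, ts.minute // 10)
        if bin_key not in seen_bins:
            seen_bins.add(bin_key)
            kept.append((ts, temp))
    return kept

start = datetime(2004, 3, 1, 0, 0)
# One reading every 31 seconds for about an hour -> 116 raw readings.
raw = [(start + timedelta(seconds=31 * i), 20.0 + 0.01 * i) for i in range(116)]
sampled = subsample_10min(raw)
print(len(raw), "->", len(sampled))  # 116 -> 6 (one per 10-minute bin)
```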
To reproduce the results in

- Fig. 3 (b), open the Jupyter notebook `run_application_temperature_data_days7and8.ipynb`
- Fig. 3 (c), open the Jupyter notebook `run_application_temperature_data_days5and6.ipynb`
You can reproduce the results of the policy search example in Fig. 3 (d) using the Jupyter notebook `run_application_policy_search.ipynb`.