PyTorch implementation of RFD (Random Function Descent, see the citation below).
Provides an implementation of the `SquaredExponential` covariance model with an `auto_fit` function, which requires only:
- a `model_factory`, which returns the same but randomly initialized model every time it is called (see the sketch after this list)
- a `loss` function, e.g. `torch.nn.functional.nll_loss`, which accepts a prediction and a true value
- data, which can be passed to `torch.utils.data.DataLoader` with different batch size parameters such that it returns `(x, y)` tuples when iterated over
- a `csv` filename, which acts as the cache for the covariance model of this unique (model, data, loss) combination
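
For illustration: a `model_factory` is just a zero-argument callable that builds a freshly initialized model on every call; in the example below the `CNN3` class itself plays this role. The tiny classifier here is a made-up stand-in, not part of the package:

```python
import torch


def model_factory():
    """Return a freshly initialized model on every call."""
    return torch.nn.Sequential(  # hypothetical stand-in model
        torch.nn.Flatten(),
        torch.nn.Linear(28 * 28, 10),
        torch.nn.LogSoftmax(dim=-1),  # nll_loss expects log-probabilities
    )
```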
Such a covariance model can then be passed to `RFD`, which implements the PyTorch optimizer interface. The end result can be used like `torch.optim.Adam`:

```python
import torch
import torchvision as tv

from pyrfd import RFD, SquaredExponential

# CNN3 is an example model from the benchmarking code, not part of the pyrfd package itself
from benchmarking.classification.mnist.models.cnn3 import CNN3

# Fit (or load from the csv cache) the covariance model of this (model, data, loss) combination
cov_model = SquaredExponential()
cov_model.auto_fit(
    model_factory=CNN3,
    loss=torch.nn.functional.nll_loss,
    data=tv.datasets.MNIST(
        root="mnistSimpleCNN/data",
        train=True,
        transform=tv.transforms.ToTensor(),
    ),
    cache="cache/CNN3_mnist.csv",  # should be unique for (model, data, loss)
)

# RFD follows the torch.optim.Optimizer interface
rfd = RFD(
    CNN3().parameters(),
    covariance_model=cov_model,
)
```
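
Because `RFD` follows the standard PyTorch optimizer interface, it drops into an ordinary training loop. Below is a minimal sketch, assuming the `cov_model`, `CNN3` and MNIST settings from the example above; batch size and epoch count are arbitrary illustrative choices. The closure form of `Optimizer.step` is shown because it works with plain optimizers such as Adam as well as with optimizers that consume the loss value during the step.

```python
import torch
import torchvision as tv
from torch.utils.data import DataLoader

from pyrfd import RFD

# Fresh model; its parameters are handed to the RFD optimizer
model = CNN3()
optimizer = RFD(model.parameters(), covariance_model=cov_model)

train_loader = DataLoader(
    tv.datasets.MNIST(
        root="mnistSimpleCNN/data",
        train=True,
        transform=tv.transforms.ToTensor(),
    ),
    batch_size=128,  # illustrative choice
    shuffle=True,
)

for epoch in range(2):  # illustrative choice
    for x, y in train_loader:

        def closure():
            optimizer.zero_grad()
            loss = torch.nn.functional.nll_loss(model(x), y)
            loss.backward()
            return loss

        optimizer.step(closure)
```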

If you use this package, please cite the corresponding paper:

```bibtex
@inproceedings{benningRandomFunctionDescent2024,
  title        = {Random {{Function Descent}}},
  booktitle    = {Advances in {{Neural Information Processing Systems}}},
  author       = {Benning, Felix and D{\"o}ring, Leif},
  year         = {2024},
  month        = dec,
  volume       = {37},
  primaryclass = {cs, math, stat},
  publisher    = {Curran Associates, Inc.},
  address      = {Vancouver, Canada},
}
```