This project hosts a collection of custom neural network architectures that I have implemented in PyTorch. I am especially interested in Graph Neural Networks and Transformer architectures; see below for a list of the implementations so far.
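To give a flavor of the kind of building block implemented here, below is a minimal sketch of a graph convolution layer in the spirit of Kipf & Welling (2016), cited in the footnotes. The class name `GraphConvolution` and the dense-adjacency interface are illustrative assumptions, not the repository's actual code.

```python
import torch
import torch.nn as nn


class GraphConvolution(nn.Module):
    """One GCN layer: H' = D^{-1/2} (A + I) D^{-1/2} H W.

    Illustrative sketch only; the activation is left to the caller.
    """

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features, bias=False)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # Add self-loops so each node keeps its own features.
        a_hat = adj + torch.eye(adj.size(0), device=adj.device)
        # Symmetric normalization: D^{-1/2} (A + I) D^{-1/2}.
        deg_inv_sqrt = a_hat.sum(dim=1).pow(-0.5)
        a_norm = deg_inv_sqrt.unsqueeze(1) * a_hat * deg_inv_sqrt.unsqueeze(0)
        # Aggregate neighbor features, then project.
        return a_norm @ self.linear(x)


# Usage: 5 nodes with 16 features each and a random symmetric adjacency.
layer = GraphConvolution(in_features=16, out_features=8)
x = torch.randn(5, 16)
adj = (torch.rand(5, 5) > 0.5).float()
adj = ((adj + adj.T) > 0).float()  # symmetrize
out = layer(x, adj)  # shape: (5, 8)
```

A dense adjacency matrix keeps the sketch short; production implementations typically work with sparse edge indices instead for scalability.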
Footnotes
- Veličković, P., Cucurull, G., Casanova, A., Romero, A., Liò, P., & Bengio, Y. (2017). Graph Attention Networks. doi:10.48550/ARXIV.1710.10903 [arxiv]
- Kipf, T. N., & Welling, M. (2016). Semi-Supervised Classification with Graph Convolutional Networks. doi:10.48550/ARXIV.1609.02907 [arxiv]
- Defferrard, M., Bresson, X., & Vandergheynst, P. (2016). Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering. doi:10.48550/ARXIV.1606.09375 [arxiv]
- Dauphin, Y. N., Fan, A., Auli, M., & Grangier, D. (2016). Language Modeling with Gated Convolutional Networks. doi:10.48550/ARXIV.1612.08083 [arxiv]
- Lim, B., Arik, S. O., Loeff, N., & Pfister, T. (2019). Temporal Fusion Transformers for Interpretable Multi-horizon Time Series Forecasting. doi:10.48550/ARXIV.1912.09363 [arxiv]
- Ramsauer, H., Schäfl, B., Lehner, J., Seidl, P., Widrich, M., Adler, T., … Hochreiter, S. (2020). Hopfield Networks is All You Need. doi:10.48550/ARXIV.2008.02217 [arxiv]
- Klambauer, G., Unterthiner, T., Mayr, A., & Hochreiter, S. (2017). Self-Normalizing Neural Networks. doi:10.48550/ARXIV.1706.02515 [arxiv]