This repository contains laboratory materials for the "Stochastic Methods in Machine Learning" course at AGH University of Science and Technology in Krakow.
The course explores problems at the intersection of optimization and machine learning.
| Lab | Title | Description |
|---|---|---|
| 1 | Gradient Descent | Implements the gradient descent algorithm from scratch to train a linear regression model (see the sketch after the table). |
| 2 | Gradient Descent Extensions | Covers gradient descent extensions including Momentum, AdaGrad, and Adam. Students test these optimizers on standard benchmark functions: Sphere, Rosenbrock, and Rastrigin (an Adam sketch follows the table). |
| 3 | Adversarial Examples | Investigates the vulnerability of neural networks to adversarial attacks, implementing the Fast Gradient Sign Method (FGSM) to generate perturbations that cause misclassification (FGSM sketch after the table). |
| 4 | Model-Based Offline Optimization | Explores optimization of black-box functions using pre-collected datasets, with no additional function evaluations. Students train neural network surrogate models to approximate benchmark functions and run gradient-based optimization on those surrogates to find candidate optima (sketch after the table). |
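The sketches below illustrate the core technique behind each lab. They are minimal illustrations under assumed interfaces, not the labs' reference solutions. First, for Lab 1, a NumPy sketch of batch gradient descent for linear regression with a mean-squared-error loss; the function name and signature are illustrative, and `X` is assumed to include a column of ones if an intercept is wanted.

```python
import numpy as np

def gradient_descent(X, y, lr=0.01, n_steps=1000):
    """Fit linear-regression weights w by minimizing mean squared error."""
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    for _ in range(n_steps):
        residual = X @ w - y                         # prediction error
        grad = (2.0 / n_samples) * (X.T @ residual)  # gradient of the MSE loss
        w -= lr * grad                               # descent step
    return w
```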
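For Lab 2, a sketch of the Adam update rule, one of the covered extensions, run on the Sphere function. Hyperparameter names follow the original Adam paper; the driver at the bottom is an illustrative usage, not the lab's benchmark harness.

```python
import numpy as np

def adam(grad_fn, x0, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8, n_steps=500):
    """Minimize a function given its gradient, using the Adam update rule."""
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)  # first-moment (mean) estimate of the gradient
    v = np.zeros_like(x)  # second-moment (uncentered variance) estimate
    for t in range(1, n_steps + 1):
        g = grad_fn(x)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g ** 2
        m_hat = m / (1 - beta1 ** t)  # bias-corrected moment estimates
        v_hat = v / (1 - beta2 ** t)
        x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x

# Sphere function: f(x) = sum(x_i^2), so grad f(x) = 2x; the minimum is at the origin.
x_min = adam(grad_fn=lambda x: 2 * x, x0=[3.0, -2.0])
```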
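For Lab 3, a PyTorch sketch of FGSM. The `model`, `image`, and `label` objects are assumed to come from the lab's own setup (a classifier and a batched image tensor with pixel values in [0, 1]), and `epsilon=0.03` is just an illustrative perturbation budget.

```python
import torch
import torch.nn.functional as F

def fgsm_attack(model, image, label, epsilon=0.03):
    """Return a copy of `image` perturbed to increase the classification loss."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()                            # gradient of the loss w.r.t. the pixels
    adv = image + epsilon * image.grad.sign()  # one step along the gradient sign
    return adv.clamp(0, 1).detach()            # keep pixels in the valid range
```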
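For Lab 4, a PyTorch sketch of gradient-based optimization over a trained surrogate; `surrogate` is assumed to be a differentiable regressor (e.g., an MLP) already fit to the offline dataset, and `x0` a starting-point tensor. Because the optimizer only holds the input `x`, the surrogate's weights stay fixed, so no new evaluations of the true function are needed.

```python
import torch

def optimize_on_surrogate(surrogate, x0, lr=0.05, n_steps=200):
    """Minimize the surrogate's prediction starting from x0 (negate it to maximize)."""
    for p in surrogate.parameters():
        p.requires_grad_(False)        # only the input is optimized, not the model
    x = x0.clone().detach().requires_grad_(True)
    opt = torch.optim.Adam([x], lr=lr)
    for _ in range(n_steps):
        opt.zero_grad()
        loss = surrogate(x).sum()      # .sum() yields a scalar for backward()
        loss.backward()                # gradient w.r.t. the input x
        opt.step()
    return x.detach()
```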
If you find any mistakes or have suggestions for improvements:
- Create an Issue: Open a new issue in the repository describing the problem or suggestion in detail.
- Submit a Pull Request: If you have a fix or improvement, feel free to fork the repository and submit a pull request with your changes.
Your contributions help improve the quality of these materials for all students.