
aman2626786/Gradient-Descent-Full-Review


Gradient-Descent-Full-Review


Gradient Descent Step-by-Step (Part 1)

Description:
This notebook provides a step-by-step implementation of the Gradient Descent algorithm, focusing on using loops to iteratively update model parameters. It is designed to help beginners understand how gradient descent works at a fundamental level before moving on to more optimized approaches.

Key highlights of this notebook:

  • A basic introduction to Gradient Descent and its importance in optimization.
  • Implementation of gradient descent using loops for better clarity and understanding.
  • Visualizations and mathematical explanations to show how parameters update over iterations.
  • A hands-on approach for learning through Python and NumPy.

This notebook is ideal for those who are new to machine learning and want to build a solid foundation in optimization techniques before implementing more advanced models.
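The loop-based update the notebook walks through can be sketched as follows. This is a minimal illustration of gradient descent for simple linear regression with a mean-squared-error loss, not the notebook's exact code; the function and variable names here are illustrative.

```python
import numpy as np

def gradient_descent(x, y, lr=0.01, epochs=1000):
    """Fit y = m*x + b by iteratively stepping down the MSE gradient."""
    m, b = 0.0, 0.0
    n = len(x)
    for _ in range(epochs):
        y_pred = m * x + b
        # Partial derivatives of the mean squared error w.r.t. m and b
        dm = (-2 / n) * np.sum(x * (y - y_pred))
        db = (-2 / n) * np.sum(y - y_pred)
        # Move each parameter a small step against its gradient
        m -= lr * dm
        b -= lr * db
    return m, b

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 2.0 * x + 1.0  # noiseless line with slope 2 and intercept 1
m, b = gradient_descent(x, y)
print(m, b)  # should approach 2 and 1
```

Running the loop for more epochs, or tuning the learning rate `lr`, moves the estimates closer to the true slope and intercept, which is exactly the behavior the notebook visualizes over iterations.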


Gradient Descent Step-by-Step (Part 2)

Description:
This second part of the Gradient Descent Step-by-Step series takes a deeper dive into the optimization process by introducing a self-designed algorithm. Instead of using a conventional gradient descent implementation, this notebook explores a custom approach to updating weights, potentially improving efficiency or addressing specific limitations of standard gradient descent.

Key highlights of this notebook:

  • A customized weight-update rule, referred to in the notebook as SELF ALGORITHM.
  • A comparison with the standard gradient descent approach to highlight improvements or differences.
  • Further insights into learning rate adjustments, convergence speed, and optimization techniques.
  • Python implementation with practical examples and visualizations for better understanding.

This notebook is great for those who already understand the basics of gradient descent and want to explore different variations or custom optimizations to improve machine learning model performance.
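The custom SELF ALGORITHM itself is defined in the notebook, but the learning-rate and convergence-speed trade-off it examines can be illustrated with a small sketch. This example (not taken from the notebook) runs plain gradient descent on the quadratic f(w) = w², whose gradient is 2w, at several learning rates:

```python
def descend(lr, steps=50, w0=10.0):
    """Run gradient descent on f(w) = w**2 and return the final w."""
    w = w0
    for _ in range(steps):
        w -= lr * 2 * w  # gradient of w**2 is 2*w
    return w

# Too small a rate converges slowly; a moderate rate converges fast;
# too large a rate makes the iterates grow instead of shrink.
for lr in (0.01, 0.1, 1.1):
    print(f"lr={lr}: final w = {descend(lr):.6f}")
```

With lr=0.01 the iterate shrinks slowly, with lr=0.1 it reaches nearly zero within 50 steps, and with lr=1.1 each step overshoots the minimum and the iterates diverge, which is the kind of comparison a convergence analysis makes concrete.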

