A professional educational project for understanding and implementing Gradient Descent and Stochastic Gradient Descent (SGD) from scratch in Python.
This repository demonstrates the mathematical foundation and practical implementation of:
- Gradient Descent (GD)
- Stochastic Gradient Descent (SGD)
- Mini-Batch Gradient Descent
- Convergence analysis
- Optimization visualization
**Keywords:** gradient descent, stochastic gradient descent, SGD solver Python, optimization algorithm, machine learning optimization, gradient descent from scratch, mini-batch gradient descent, convex optimization, loss function minimization, Python optimization implementation
For parameters $$\theta$$, the update rule is:

$$\theta := \theta - \eta \nabla J(\theta)$$

Where:

- $$\eta$$ — learning rate
- $$J(\theta)$$ — loss function
- $$\nabla J(\theta)$$ — gradient of the loss with respect to the parameters
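As a quick sanity check of the update rule, here is a single step on the one-dimensional quadratic loss $$J(\theta) = \theta^2$$, whose gradient is $$2\theta$$ (the values are illustrative):

```python
# One gradient descent step on J(theta) = theta**2, whose gradient is 2*theta.
theta = 1.0  # initial parameter
eta = 0.1    # learning rate (eta)

gradient = 2 * theta            # gradient of J at theta
theta = theta - eta * gradient  # update rule: theta := theta - eta * gradient
print(theta)  # 0.8, one step closer to the minimum at theta = 0
```

Repeating this step shrinks $$\theta$$ geometrically toward the minimizer at 0.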
Instead of the gradient over the full dataset, SGD uses the gradient of a single randomly selected sample:

$$\theta := \theta - \eta \nabla J(\theta; x^{(i)}, y^{(i)})$$

Where:

- The gradient is computed on a single sample $$(x^{(i)}, y^{(i)})$$
- Updates are faster but noisier
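A minimal SGD loop for linear least squares might look like the following. This is a sketch; the function name and signature are illustrative and may differ from what `src/sgd.py` actually provides:

```python
import numpy as np

def sgd(X, y, lr=0.01, epochs=100, seed=0):
    """Stochastic gradient descent for linear least squares (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(epochs):
        for i in rng.permutation(m):      # visit samples in random order each epoch
            error = X[i] @ theta - y[i]   # scalar residual for one sample
            theta -= lr * error * X[i]    # single-sample gradient step
    return theta
```

Shuffling the sample order each epoch keeps the updates unbiased while avoiding pathological orderings.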
Mini-batch gradient descent computes the gradient on a small subset of samples per update, balancing:
- Stability
- Speed
- Computational efficiency
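A mini-batch variant can be sketched as follows (again illustrative; names and defaults are assumptions, not necessarily those used in `src/`):

```python
import numpy as np

def minibatch_gd(X, y, lr=0.01, epochs=100, batch_size=32, seed=0):
    """Mini-batch gradient descent for linear least squares (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(epochs):
        idx = rng.permutation(m)                      # shuffle once per epoch
        for start in range(0, m, batch_size):
            batch = idx[start:start + batch_size]
            error = X[batch] @ theta - y[batch]
            gradient = X[batch].T @ error / len(batch)  # averaged over the batch
            theta -= lr * gradient
    return theta
```

With `batch_size=1` this reduces to SGD; with `batch_size=m` it reduces to full-batch gradient descent.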
Gradient-based optimization is used in:
- Neural networks
- Linear regression
- Logistic regression
- Deep learning
- Large-scale optimization
This project explains the algorithm at a mathematical and implementation level.
```
gradient-descent-sgd-solver/
│
├── README.md
├── LICENSE
├── requirements.txt
│
├── src/
│   ├── gradient_descent.py
│   ├── sgd.py
│   ├── loss_functions.py
│   └── optimizer.py
│
├── examples/
│   └── demo.py
│
├── docs/
│   └── theory.md
│
├── images/
│   └── convergence_plot.png
│
└── index.html
```
A clean structure improves:

- Discoverability
- Professional appearance
- Portfolio quality
```python
import numpy as np

def gradient_descent(X, y, lr=0.01, epochs=1000):
    """Batch gradient descent for linear least squares."""
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(epochs):
        predictions = X @ theta
        error = predictions - y
        gradient = (1 / m) * X.T @ error
        theta -= lr * gradient
    return theta
```

Install dependencies:

```bash
pip install -r requirements.txt
```

Run the example:

```bash
python examples/demo.py
```

Add:
- Loss curve plot
- Parameter convergence graph
- 3D loss surface
Example:

```python
import matplotlib.pyplot as plt

# loss_history: list of per-epoch loss values recorded during training
plt.plot(loss_history)
plt.xlabel("Epoch")
plt.ylabel("Loss")
plt.title("Convergence")
plt.show()
```
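The `loss_history` passed to the plot must be recorded during training. One way to obtain it is a variant of the solver that logs the loss each epoch; the function name here is illustrative and not necessarily what `src/` provides:

```python
import numpy as np

def gradient_descent_with_history(X, y, lr=0.01, epochs=1000):
    """Gradient descent that also records the MSE loss per epoch (illustrative)."""
    m, n = X.shape
    theta = np.zeros(n)
    loss_history = []
    for _ in range(epochs):
        error = X @ theta - y
        loss_history.append((error @ error) / (2 * m))  # loss before this update
        theta -= lr * (X.T @ error) / m
    return theta, loss_history
```

The returned `loss_history` can be fed directly to the plotting snippet above to produce the convergence curve.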