A simple implementation of univariate linear regression using gradient descent, built with NumPy and Matplotlib.
- Data Generation: creates synthetic data with a linear relationship, $y = a + bx + \text{noise}$
- Cost Function: computes the mean squared error (MSE) between predictions and true values
- Gradient Descent: iteratively finds the optimal parameters $\theta_0$ and $\theta_1$
- Visualization: shows the entire learning process through a series of plots
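The three computational pieces above can be sketched as follows. This is a minimal illustration, not the repository's exact code: the true parameters ($a = 4$, $b = 3$), noise level, learning rate, and iteration count are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Data generation: y = a + b*x + Gaussian noise (a=4, b=3 chosen for illustration)
a_true, b_true = 4.0, 3.0
x = rng.uniform(0, 2, size=100)
y = a_true + b_true * x + rng.normal(0, 0.5, size=100)

def cost(theta0, theta1, x, y):
    """Mean squared error between the line's predictions and the true values."""
    return np.mean((theta0 + theta1 * x - y) ** 2)

def gradient_descent(x, y, alpha=0.1, iterations=1000):
    """Batch gradient descent on the MSE cost over (theta0, theta1)."""
    theta0, theta1 = 0.0, 0.0
    history = []
    for _ in range(iterations):
        error = theta0 + theta1 * x - y
        # Partial derivatives of the MSE with respect to theta0 and theta1
        grad0 = 2 * np.mean(error)
        grad1 = 2 * np.mean(error * x)
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
        history.append(cost(theta0, theta1, x, y))
    return theta0, theta1, history

theta0, theta1, history = gradient_descent(x, y)
print(f"theta0 ≈ {theta0:.3f}, theta1 ≈ {theta1:.3f}")
```

After enough iterations the recovered parameters should land close to the true $(a, b)$, with the residual cost bounded below by the noise variance.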
- Raw data: shows the data points and an initial line estimate.
- Cost surface: visualizes the cost function as a 3D surface and a contour plot, revealing the "valley" where the minimum lies.
- Training progress: left, cost reduction over iterations; right, the final line fit to the data.
- Learning rates: compares different learning rates and shows their impact on convergence speed.
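The learning-rate comparison can be sketched like this: run the same descent loop with several values of $\alpha$ on identical data and compare the final cost after a fixed number of iterations. The specific rates (0.01, 0.1, 0.4) are illustrative, not the repository's values; feeding the returned cost curves to Matplotlib reproduces the comparison plot.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 2, size=100)
y = 4.0 + 3.0 * x + rng.normal(0, 0.5, size=100)

def run(alpha, iterations=200):
    """Gradient descent on the MSE cost; returns the cost at each iteration."""
    theta0 = theta1 = 0.0
    costs = []
    for _ in range(iterations):
        error = theta0 + theta1 * x - y
        theta0 -= alpha * 2 * np.mean(error)
        theta1 -= alpha * 2 * np.mean(error * x)
        costs.append(np.mean((theta0 + theta1 * x - y) ** 2))
    return costs

# A small alpha converges slowly; a larger (but still stable) alpha
# reaches the noise floor in far fewer iterations.
for alpha in (0.01, 0.1, 0.4):
    print(f"alpha={alpha}: final cost {run(alpha)[-1]:.4f}")
```

Too large a rate would overshoot the valley and diverge; the stability threshold depends on the curvature of the cost surface, which is why the comparison plot is instructive.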