ML Quest

Forget sklearn. Forget autograd. Today you build the optimization engine that powers nearly every machine-learning model — gradient descent — with nothing but numpy. You'll generate synthetic data, define a cost function, compute gradients by hand, and watch your weights converge step by step. By the end, you won't just use gradient descent. You'll understand it.

~25 min · scenario
Goals: 4 tests
weights array should exist
final cost should be less than initial cost
cost_history should have 100+ entries
learned slope should be approximately 3.0 (within +/-0.5)
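To make the goals concrete, here is a minimal sketch of what a passing solution might look like. The names `weights` and `cost_history` mirror the goal tests above; the synthetic data, learning rate, and iteration count are assumptions chosen so that the learned slope converges near 3.0.

```python
import numpy as np

# Hypothetical setup: synthetic data y = 3x + 2 + noise,
# so the learned slope should approach 3.0 (the test's target).
rng = np.random.default_rng(42)
X = rng.uniform(0, 10, size=100)
y = 3.0 * X + 2.0 + rng.normal(0, 1, size=100)

# Design matrix with a bias column; weights = [intercept, slope].
A = np.column_stack([np.ones_like(X), X])
weights = np.zeros(2)

def cost(w):
    # Mean squared error over the whole dataset.
    residuals = A @ w - y
    return (residuals ** 2).mean()

learning_rate = 0.01  # assumed; must be small enough for these feature scales
cost_history = [cost(weights)]

for _ in range(200):  # 100+ steps, as the goals require
    # Hand-computed gradient of the MSE with respect to the weights.
    gradient = 2.0 * A.T @ (A @ weights - y) / len(y)
    weights -= learning_rate * gradient
    cost_history.append(cost(weights))

print(f"learned slope: {weights[1]:.3f}, final cost: {cost_history[-1]:.3f}")
```

With these settings the final cost drops well below the initial cost and `weights[1]` lands within the +/-0.5 tolerance; the exact numbers depend on the random seed and learning rate.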