My background
It started randomly, like most detours in my life. Back in uni, I was knee-deep in a degree that sounded good on paper but left me staring at screens, hating every equation. Math? I’d forgotten about it after high school teachers turned it into a nightmare of stressful pop quizzes and zero explanations. My first stop back was Math Academy: heavy math, plenty of exercises, but no programming exercises. I stopped learning theory (for a few weeks) and dove into programming instead. Self-taught, trial-by-fire. Deep down, those neural net tutorials kept nagging: “What’s really happening under the hood with backprop?”
I could hack together models with PyTorch, but I felt like a tourist. “Autograd just works,” I’d say. But I wanted to own it. To feel the click of understanding, not just copy-paste. One late night scrolling X, I saw folks sharing their from-scratch frameworks, and envy hit. I wanted that win too. A good example is chibigrad by @sumitdotml.
Now
Fast-forward to the beginning of September. Enough. I cracked open a notebook and sketched a Tensor class on a whim. No grand plan, just “make addition backprop.” With a little help from others’ work and a chat with an LLM, a simple chain worked. Gradients flowed. I told the LLM not to output code for me; I wanted to learn. And since I was so focused on learning, I also turned off Cursor Tab because it was freaking annoying.
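Roughly the shape of that first notebook sketch: a hypothetical minimal Tensor that only knows addition (illustrative names, not my exact code).

```python
class Tensor:
    def __init__(self, data, _parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = _parents
        self._backward = lambda: None  # filled in by whichever op created this tensor

    def __add__(self, other):
        out = Tensor(self.data + other.data, _parents=(self, other))

        def _backward():
            # d(a+b)/da = 1 and d(a+b)/db = 1, so the upstream grad passes straight through
            self.grad += out.grad
            other.grad += out.grad

        out._backward = _backward
        return out

    def backward(self):
        # seed the output grad and walk back through the parents (fine for a simple chain)
        self.grad = 1.0
        stack = [self]
        while stack:
            node = stack.pop()
            node._backward()
            stack.extend(node._parents)


a, b = Tensor(2.0), Tensor(3.0)
c = a + b
c.backward()
print(a.grad, b.grad)  # 1.0 1.0
```

Each op installs its own `_backward`, and `backward()` just walks the graph calling them. That one idea is what every later operation builds on.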
I’ve been posting snippets on X—mostly for accountability. Seeing replies, “Tried this, failed that,” keeps me going. No more fading interest. After ~20 hours (and one fried brain), here’s my raw take. Learning math is worth it. Learning backprop was worth it. Kinda obvious, but I need to repeat that to myself from time to time.
Review
Building autograd from scratch? Brain-bender at first, but the gradient flow “aha” moments make it gold. Worth every hour if you crave owning the math behind DL.
Bullet-pointing the chaos, the clarity, and the math nuggets that stuck:
- Chain rule revival: Seeing the output in a console. That recursive $\frac{dy}{dx} = \frac{dy}{du} \cdot \frac{du}{dx}$ in action? I had NO derivatives at school. Quick test by hand: nailed the expected 56. Felt like reclaiming lost territory.
- Op-by-op wins: No overwhelm. Add first ($\partial(a+b)/\partial a = 1$, $\partial(a+b)/\partial b = 1$), then ReLU (pass the gradient where $x > 0$, zero it elsewhere). Clear staircase: forward graphs it, backward multiplies Jacobians. Pow tripped me ($\frac{d}{dx}x^n = n x^{n-1}$), but deriving it fresh? Therapeutic. See the first sketch after this list.
- X crowd boost: Posted my first backward fail; replies poured in—“Visited set, dummy!” Community’s zero-judgment, all “what’s your next op?” vibes. Motivates like seeing MathAcademy streaks.
- Hooks you deep: Set a “one deriv per day” goal. Gamified with tests: the XOR MLP hit 98% accuracy. Missed a day debugging matmul shapes? Back stronger. Plus, rediscovered broadcasting: NumPy’s silent hero for keeping everything vectorized.
- Shape shame: Matmul broadcasting? $\partial(AB)/\partial A$ needs reshapes I kept forgetting. Hours lost to “dimension mismatch” errors. And time? If you’re not solo-coding marathons, it drags; feels like those old math quizzes, but self-inflicted.
- TDD helped: I’m a big fan of this approach. I kept to the same rules: write the test first, understand the math on paper, try to connect the dots. No implementation code yet. Once I felt I was slowly getting it, I started adding code: first asking the LLM for pseudocode and explanations, then looking at other autograds. Within a few days I was confident that the next operation was within my reach. The second sketch after this list shows the flavor.
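The op-by-op staircase from the second bullet, sketched as standalone forward/backward pairs (plain functions, not my actual Tensor methods). The chain rule is literally just composing the backward functions in reverse:

```python
# Each op returns its output plus a function that turns the upstream gradient
# into gradients for its inputs.

def add(a, b):
    out = a + b
    def backward(upstream):
        return upstream * 1.0, upstream * 1.0   # d(a+b)/da = 1, d(a+b)/db = 1
    return out, backward

def relu(x):
    out = max(0.0, x)
    def backward(upstream):
        return (upstream if x > 0 else 0.0,)    # gradient passes only where x > 0
    return out, backward

def powr(x, n):
    out = x ** n
    def backward(upstream):
        return (upstream * n * x ** (n - 1),)   # d(x^n)/dx = n * x^(n-1)
    return out, backward

# Chaining them by hand: y = relu(x^2 + 3)
x = 2.0
s, s_back = powr(x, 2)        # s = x^2
t, t_back = add(s, 3.0)       # t = s + 3
y, y_back = relu(t)           # y = relu(t)

(dt,) = y_back(1.0)           # dy/dt
ds, _ = t_back(dt)            # dy/ds
(dx,) = s_back(ds)            # dy/dx = 2x = 4.0
print(y, dx)                  # 7.0 4.0
```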
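And the TDD loop from the last bullet, in sketch form: derive the gradient rule on paper, then let a finite-difference check keep you honest. The helper names here are made up, and this skips the broadcasting/reshape handling that actually ate my hours.

```python
import numpy as np

def matmul_backward(upstream, A, B):
    # For C = A @ B:  dL/dA = upstream @ B.T,  dL/dB = A.T @ upstream.
    # The shapes are where I kept losing time.
    return upstream @ B.T, A.T @ upstream

def numerical_grad(f, X, eps=1e-6):
    # Central finite differences, one entry at a time.
    grad = np.zeros_like(X)
    for idx in np.ndindex(X.shape):
        X[idx] += eps
        plus = f()
        X[idx] -= 2 * eps
        minus = f()
        X[idx] += eps
        grad[idx] = (plus - minus) / (2 * eps)
    return grad

def test_matmul_grad():
    rng = np.random.default_rng(0)
    A, B = rng.normal(size=(3, 4)), rng.normal(size=(4, 2))
    loss = lambda: (A @ B).sum()          # upstream gradient of sum() is all ones
    upstream = np.ones((3, 2))
    dA, dB = matmul_backward(upstream, A, B)
    assert np.allclose(dA, numerical_grad(loss, A), atol=1e-4)
    assert np.allclose(dB, numerical_grad(loss, B), atol=1e-4)

test_matmul_grad()
print("matmul gradients match finite differences")
```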
From chain-rule chains to ReLU’s $\max(0, x)$, it’s less “hate the subject” now, more “build with it.”
Useful Resources
- Math Academy - The platform that taught me derivatives and built my math foundation
- Building a Neural Network from Scratch - Great video walkthrough for understanding the fundamentals
- chibigrad by @sumitdotml - Clean, educational autograd implementation
- My autograd implementation - The code I built while learning. It’s already a bit ahead of this blog post’s content; I’ll probably write about CNNs in the coming weeks
- The Chain Rule - LibreTexts - Pure math foundation for understanding derivatives and the chain rule
Closing words
Random urges pull you back sometimes. That old love for tinkering, buried under “too late” excuses. But nah—too late for what? Code a tensor today; tomorrow, it powers your net.