# Deep Learning from Scratch — by Pierre Chambet


From first principles to real images — one neuron, one layer, one insight at a time.
Part of WIL — Wide-Range Ideas Laboratory
LinkedIn · GitHub


> “Don't just run `.fit()`. Build the thing, understand it, and then trust it.”


## About

I'm Pierre Chambet, a data and deep learning engineer-in-the-making who decided to rebuild deep learning from scratch — not by copying frameworks, but by understanding every equation, line, and gradient.

This repo is a learning-in-public lab. It documents the full path from a hand-coded neuron in NumPy to a convolutional network on MNIST — explained, derived, and visualized with care. It's both a portfolio of understanding and a teaching resource: math → code → intuition → result.


## Where to begin

This repository holds two things: a course (a clear path from neuron to CNN) and a lab (a space to explore). Both live under the same roof — pick the door that fits your mood.


## Two ways in

|            | Course                                 | Lab                                    |
|------------|----------------------------------------|----------------------------------------|
| For        | Learning, following a clear path       | Exploring, experimenting, going deeper |
| Format     | PDF episodes + notebooks, step by step | Case studies, scripts, open-ended play |
| Start here | Ep. I or `birth_of_a_neuron`           | `lab/mnist` or `lab/cnn`               |

## Course — The main path

A guided journey: theory → gradients → code. One episode at a time. No rush. No fluff.

### LinkedIn Series (7 episodes)

| Episode | Title | What you get | Link |
|---|---|---|---|
| I | Theory of a Neuron | 10-page PDF — linear function, sigmoid, log-loss | PDF |
| II | The Art of Descent | 12-page PDF — chain rule, ∂ℓ/∂w, ∂ℓ/∂b | PDF · Notebook |
| III | Birth of a Neuron | 18-page PDF + Colab — neuron coded by hand | PDF · Colab |
| IV | All Eyes on You | 9-page PDF — training loop on real images (cats vs dogs) | PDF |
| V | The Rise of Intelligence | 26-page PDF — full neural network theory, forward & backprop | PDF |
| VI | Alive | 20-page PDF + Colab — 2-layer network coded from scratch | PDF · Colab |
| VII | Horizon of Depth | 18-page PDF + Colab — generalized L-layer network | PDF · Colab |

Reply with NEURON (Ep. I), GRADIENT (Ep. II), BIRTH (Ep. III), or RISE (Ep. V) on the LinkedIn posts to receive the PDF via DM. #DeepLearningJourney
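The heart of Episodes I–III fits in a few lines of NumPy. Here is a minimal, illustrative sketch of a single neuron's forward pass and log-loss — the variable names and toy data are assumptions for illustration, not the repo's exact code:

```python
import numpy as np

# Toy data: 4 samples, 2 features (illustrative, not from the repo's dataset)
X = np.array([[0.5, 1.2], [1.0, 0.3], [0.2, 0.8], [1.5, 1.1]])
y = np.array([[1], [0], [1], [0]])

rng = np.random.default_rng(0)
W = rng.normal(size=(2, 1))   # weights
b = np.zeros((1, 1))          # bias

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Forward pass: linear function, then sigmoid activation
Z = X @ W + b
A = sigmoid(Z)                # probabilities in (0, 1)

# Log-loss (binary cross-entropy), averaged over the batch
eps = 1e-8                    # guard against log(0)
loss = -np.mean(y * np.log(A + eps) + (1 - y) * np.log(1 - A + eps))
```

From here, Episode II asks the natural question: how should `W` and `b` change to push `loss` down?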

### Course notebooks

| # | Notebook | Focus | Tied to |
|---|---|---|---|
|  | birth_of_a_neuron | Neuron coded by hand (toxic plants) | Ep. III · Colab |
| 01 | Single Neuron | Linear model, sigmoid | Ep. I theme |
| 02 | Gradients Single Neuron | ∂L/∂w, ∂L/∂b, chain rule | Ep. II |
| 04 | Training Loop | Forward → loss → backward → update (cats vs dogs) |  |
| 05 | From One Neuron to a Brain | First 2-layer ANN from scratch (nonlinear boundary) | Ep. V |
| 06 | Alive | 2-layer network from scratch (circles + cats vs dogs) | Ep. VI · Colab |
| 07 | Horizon of Depth | L-layer network (circles, moons, spirals, cats vs dogs) | Ep. VII · Colab |
| 08 | Two-Layer Network | 2-layer network on images |  |
| 11 | MNIST MLP Baseline | Dense network on MNIST |  |
| 12 | MNIST CNN Baseline | CNN, feature maps |  |
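The forward → loss → backward → update cycle that notebooks 02 and 04 build up can be sketched as follows. This is a hedged, self-contained illustration (toy AND-style data, illustrative names — not the repo's code); it uses the classic simplification that for sigmoid + log-loss, ∂L/∂z reduces to `A - y`:

```python
import numpy as np

# Toy linearly separable data (illustrative only)
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([[0], [0], [0], [1]])   # AND-like target

rng = np.random.default_rng(42)
W = rng.normal(scale=0.1, size=(2, 1))
b = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
m = X.shape[0]
for _ in range(2000):
    A = sigmoid(X @ W + b)               # forward
    dZ = (A - y) / m                     # sigmoid + log-loss: ∂L/∂z = A - y
    dW = X.T @ dZ                        # ∂L/∂W via the chain rule
    db = dZ.sum(axis=0, keepdims=True)   # ∂L/∂b
    W -= lr * dW                         # gradient descent update
    b -= lr * db

# After training, the neuron should separate all four points
preds = (sigmoid(X @ W + b) > 0.5).astype(int)
```

The whole point of the course is that these four lines inside the loop — forward, gradient, update — are all that "training" ever is; everything later (layers, CNNs) refines the forward pass and the chain rule, not the loop.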

### Extended guides (PDF)

| File | Theme |
|---|---|
| main.pdf | Full picture — neurons to the training loop |
| mnist.pdf | Dense networks on MNIST |
| CNN.pdf | Understanding convolutions |

## Lab — Go further

Where the course leaves off, the Lab begins. Case studies, scripts, experiments — room to breathe, break things, and learn by doing.

👉 Enter the Lab

| Project | What's inside |
|---|---|
| MNIST Case Study | Full MLP pipeline — normalization, training curves, evaluation |
| CNN Case Study | Convolutions on MNIST — filters, pooling, architecture |
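To give a flavor of the MLP case study, here is a minimal forward pass for a dense network on MNIST-shaped inputs. The layer sizes (784 → 64 → 10) and names are assumptions for illustration, not the case study's exact configuration:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 784))   # 8 fake flattened 28x28 "images" (illustrative)

# Two-layer MLP: 784 -> 64 hidden units -> 10 classes (assumed sizes)
W1 = rng.normal(scale=0.01, size=(784, 64)); b1 = np.zeros(64)
W2 = rng.normal(scale=0.01, size=(64, 10));  b2 = np.zeros(10)

def relu(z):
    return np.maximum(0.0, z)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))  # shift for numerical stability
    return e / e.sum(axis=1, keepdims=True)

H = relu(X @ W1 + b1)        # hidden representation
P = softmax(H @ W2 + b2)     # class probabilities, one row per image
```

Each row of `P` sums to 1 — a probability distribution over the ten digits, which is exactly what the case study's training curves measure against the labels.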

## Quickstart

```bash
git clone https://github.com/Pchambet/Deep-Learning-from-Scratch.git
cd Deep-Learning-from-Scratch
python -m venv .venv && source .venv/bin/activate  # Windows: .venv\Scripts\activate
pip install -r requirements.txt
jupyter lab notebooks/birth_of_a_neuron.ipynb
```

### Python compatibility

- Core notebooks and scripts: Python 3.10+
- TensorFlow notebooks/scripts (MNIST/CNN): Python 3.10–3.12 recommended
- On macOS Apple Silicon, install from `requirements.txt` (it includes `tensorflow-macos` + `tensorflow-metal` markers)

## Quality checks

```bash
make quality          # compile + pytest + smoke test
make test             # run pytest (utilities, two_layer, birth_of_a_neuron)
make precommit        # run formatting/lint hooks
make episode5-demo    # run Episode V demo (make_circles)
```

Optional one-time setup:

```bash
pip install pre-commit
pre-commit install
```

No install needed: Colab — birth_of_a_neuron


## Repository structure

```text
Deep-Learning-from-Scratch/
├── notebooks/           # Course — birth_of_a_neuron, 01, 02, 04–08, 11, 12
├── tests/               # pytest — utilities, two_layer_network, birth_of_a_neuron
├── pdf/                 # Built guides (Ep. I–VII, main, mnist, CNN)
├── latex/               # LaTeX sources — episode_04–07, main, mnist, cnn
├── lab/                 # Lab — case studies
│   ├── mnist/           # MNIST MLP (notebook + train_mlp.py)
│   └── cnn/             # CNN (notebook + train_cnn.py)
├── src/                 # utilities.py, two_layer_network.py
├── data/                # trainset.hdf5, testset.hdf5 (cats vs dogs)
├── assets/              # Figures, banners, photos
├── Makefile             # make latex → build all PDFs
├── requirements.txt
└── README.md
```

## Philosophy

> "Learning isn't remembering — it's rebuilding."

No black boxes. Every weight, every gradient, every update — traced and understood. That's the point.


## For recruiters

In five minutes, this repo shows that I:

- Understand the math behind neural networks
- Implement and debug deep learning models end-to-end
- Communicate complex ideas clearly and visually
- Learn independently, structure my work, and deliver clean results

Suggested entry points: `notebooks/birth_of_a_neuron.ipynb` for the course, then `lab/mnist` and `lab/cnn` for the case studies.


## Contribute / Connect

Found an error or an idea worth exploring? Open an issue or a PR. Learning in public too? Let's connect.

LinkedIn GitHub


Deep Learning from Scratch — built with patience, mathematics, and curiosity.
Part of WIL™ — Wide-Range Ideas Laboratory · © 2026 Pierre Chambet
