
🧠 Micrograd-Mini

Micrograd-Mini is a tiny scalar-valued autograd engine with a basic neural network framework, built from scratch and inspired by Andrej Karpathy's micrograd.

It supports reverse-mode automatic differentiation (backpropagation) and is capable of training a simple neural network (MLP) on tasks like XOR. Great for learning the internals of backprop!
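To see what reverse-mode differentiation actually computes, here is a tiny self-contained sketch (not the project's code; variable names are illustrative) that differentiates tanh(w*x + b) with respect to w by the chain rule and checks it against a finite-difference approximation:

```python
import math

# Reverse-mode autodiff applies the chain rule from the output back to
# each input. Hand-rolled check for out = tanh(w*x + b):
x, w, b = 1.5, -0.8, 0.2
pre = w * x + b
out = math.tanh(pre)

# local derivative of tanh(u) is 1 - tanh(u)^2; multiply back to w
dout_dpre = 1.0 - out * out
dout_dw = dout_dpre * x

# compare against a central finite-difference approximation
eps = 1e-6
numeric = (math.tanh((w + eps) * x + b) - math.tanh((w - eps) * x + b)) / (2 * eps)
assert abs(dout_dw - numeric) < 1e-6
```

The engine automates exactly this bookkeeping for arbitrary compositions of operations, so you never write the chain rule by hand.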


🔥 Features

  • ✅ Reverse-mode autodiff (backpropagation)
  • ✅ Dynamic computation graph (DAG)
  • ✅ Custom Value class with gradients
  • ✅ Tanh and ReLU activations
  • ✅ Basic Neuron, Layer, and MLP implementations
  • ✅ Trains on the XOR dataset

๐Ÿ“ Project Structure

micrograd-mini/
    ├── engine.py   # Core autodiff engine (Value class)
    ├── nn.py       # Neural network components
    ├── train.py    # Training loop for XOR
    ├── example.py  # Extended example using the XOR dataset
    └── README.md   # Project info

🚀 Getting Started

🔧 Requirements

  • Python 3.7+
  • No external libraries needed (pure Python)

โ–ถ๏ธ Run Training

python train.py
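The training script follows the standard loop: forward pass, loss, backward pass, gradient-descent update. A minimal self-contained illustration of that pattern (fitting a single weight so that w*x ≈ y; this is not the actual XOR training code, which uses the Value engine and an MLP):

```python
# Minimal gradient-descent loop showing the forward/loss/backward/update
# pattern. Data follows y = 2*x, so w should converge to 2.0.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
w = 0.0
lr = 0.05
for step in range(200):
    grad = 0.0
    for x, y in data:
        err = w * x - y        # forward pass and residual
        grad += 2.0 * err * x  # d(squared error)/dw by the chain rule
    w -= lr * grad             # SGD update
print(round(w, 3))  # -> 2.0
```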

📈 Example Output

--- Final Predictions after Training ---

Input: [0.0, 0.0] => Predicted: 0.01 | Target: 0.0
Input: [0.0, 1.0] => Predicted: 0.98 | Target: 1.0
Input: [1.0, 0.0] => Predicted: 0.97 | Target: 1.0
Input: [1.0, 1.0] => Predicted: 0.03 | Target: 0.0

Training complete! 🎯

🧠 Learn by Building

Want to really understand backpropagation and gradients?

  • Dive into engine.py and explore the Value class

  • Inspect how operations dynamically build a graph

  • See how .backward() traverses it to compute gradients, just like real frameworks!
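The core idea behind engine.py can be sketched in a few dozen lines. This is a simplified stand-in, not the actual class (the real engine also supports activations, subtraction, powers, etc.), but it shows the two essential pieces: each operation records a local backward rule, and backward() replays those rules in reverse topological order:

```python
class Value:
    """Minimal scalar autodiff node (a sketch; the real engine.py differs)."""
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None  # local chain-rule step, set by each op
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # addition passes the upstream gradient through unchanged
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(a*b)/da = b and d(a*b)/db = a, scaled by the upstream gradient
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topological sort so each node's gradient is complete before use
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

a = Value(2.0)
b = Value(3.0)
c = a * b + a     # builds the graph as a side effect of the arithmetic
c.backward()
print(a.grad)     # dc/da = b + 1 = 4.0
print(b.grad)     # dc/db = a = 2.0
```

Note that a appears twice in the expression, and its gradient correctly accumulates across both uses because each backward rule uses += rather than =.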

๐Ÿ™ Attribution

This project is heavily inspired by micrograd by Andrej Karpathy, licensed under the MIT License.


🪪 License

This project is licensed under the MIT License. See the LICENSE file for details.


✨ Author

Built with ❤️ by Muawiya, as part of a deep dive into AI, neural networks, and autodiff fundamentals.

๐ŸŒ Connect With Me