# Micrograd-Mini
**Micrograd-Mini** is a tiny scalar-valued autograd engine with a basic neural network framework, built from scratch and inspired by Andrej Karpathy's micrograd.

It supports reverse-mode automatic differentiation (backpropagation) and can train a simple multi-layer perceptron (MLP) on tasks like XOR. Great for learning the internals of backprop!
## Features

- Reverse-mode autodiff (backpropagation)
- Dynamic computation graph (DAG)
- Custom `Value` class with gradients
- Tanh and ReLU activations
- Basic `Neuron`, `Layer`, and `MLP` implementations
- Trains on the XOR dataset
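At the heart of these features is a scalar node that records its inputs and a local backward rule, so gradients can flow back through the graph. Below is a rough, minimal sketch of such a `Value` class; the actual `engine.py` may structure or name things differently:

```python
import math

class Value:
    """Minimal sketch of a scalar autograd node (illustrative, not the real engine.py)."""
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None  # local chain-rule step, set by each op
        self._prev = set(_children)    # parents in the computation graph

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(out)/d(self) = 1, d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # product rule: each input's gradient scales by the other's value
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,))
        def _backward():
            # d/dx tanh(x) = 1 - tanh(x)^2
            self.grad += (1 - t * t) * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topologically sort the DAG, then apply the chain rule in reverse
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# Example: gradient of y = tanh(2x) at x = 0.5 is 2 * (1 - tanh(1)^2)
x = Value(0.5)
y = (x * 2.0).tanh()
y.backward()
```

Each operation closes over its inputs and output, so calling `.backward()` on the final node replays those closures in reverse topological order.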
## Project Structure

```
micrograd-mini/
├── engine.py    # Core autodiff engine (Value class)
├── nn.py        # Neural network components
├── train.py     # Training loop for XOR
├── example.py   # Extended example using the XOR dataset
└── README.md    # Project info
```
## Getting Started

### Requirements
- Python 3.7+
- No external libraries needed (pure Python)
### Run Training

```bash
python train.py
```
### Example Output

```
--- Final Predictions after Training ---
Input: [0.0, 0.0] => Predicted: 0.01 | Target: 0.0
Input: [0.0, 1.0] => Predicted: 0.98 | Target: 1.0
Input: [1.0, 0.0] => Predicted: 0.97 | Target: 1.0
Input: [1.0, 1.0] => Predicted: 0.03 | Target: 0.0
Training complete!
```
## Learn by Building

Want to really understand backpropagation and gradients?

- Dive into `engine.py` and explore the `Value` class
- Inspect how operations dynamically build a computation graph
- See how `.backward()` traverses it to compute gradients, just like real frameworks!
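The traversal in that last step is a reverse topological order: every node's gradient is finalized before the gradients of its inputs. The idea can be sketched in isolation with a toy DAG (the names below are illustrative, not taken from `engine.py`):

```python
# Sketch: reverse-mode visit order over a DAG, given as node -> list of inputs.
def topo_order(root, children):
    """Return nodes in the order a backward pass would visit them."""
    order, visited = [], set()
    def build(v):
        if v not in visited:
            visited.add(v)
            for c in children.get(v, ()):
                build(c)
            order.append(v)  # post-order: inputs land before the node itself
    build(root)
    return list(reversed(order))  # reversed: output first, leaves last

# Toy graph for loss = (a * b) + c:
# "add" depends on "mul" and "c"; "mul" depends on "a" and "b"
children = {"add": ["mul", "c"], "mul": ["a", "b"]}
print(topo_order("add", children))  # → ['add', 'c', 'mul', 'b', 'a']
```

Because `"add"` comes before `"mul"`, the gradient flowing into the multiplication is complete before it is distributed to `a` and `b`, which is exactly why `.backward()` gives correct gradients even when a value is reused in several places.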
## Attribution
This project is heavily inspired by micrograd by Andrej Karpathy, licensed under the MIT License.
## License
This project is licensed under the MIT License. See the LICENSE file for details.
## Author
Built with ❤️ by Muawiya, as part of a deep dive into AI, neural nets, and autodiff fundamentals.
## Connect With Me

- YouTube: @Coding_Moves
- GitHub: Muawiya
- LeetCode: Moavia_Amir
- Kaggle: Moavia Amir