# This is a micrograd-mini, inspired by Andrej Karpathy.
This repository is a clean, educational reimplementation of Andrej Karpathy's micrograd project. It includes:

- A minimal autograd engine (`Value` class) built from scratch (see the sketch below)
- Neuron, Layer, and MLP abstractions built directly on top of `Value`
- A manual training loop using scalar-based backpropagation
- Visualization
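The sketch below shows what a scalar `Value` with reverse-mode autodiff typically looks like in the micrograd style. The exact class and method names here are assumptions for illustration, not necessarily the API used in this repository:

```python
# Minimal sketch of a scalar autograd Value (micrograd-style); names are illustrative.
import math

class Value:
    def __init__(self, data, _children=(), _op=''):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None   # closure that propagates grad to children
        self._prev = set(_children)
        self._op = _op

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other), '+')
        def _backward():
            self.grad += out.grad        # d(out)/d(self)  = 1
            other.grad += out.grad       # d(out)/d(other) = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other), '*')
        def _backward():
            self.grad += other.data * out.grad   # product rule
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,), 'tanh')
        def _backward():
            self.grad += (1 - t * t) * out.grad  # d/dx tanh(x) = 1 - tanh(x)^2
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse order.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# Usage: forward pass through a single "neuron", then backpropagate.
x, w, b = Value(2.0), Value(-0.5), Value(1.0)
y = (x * w + b).tanh()
y.backward()
print(y.data, x.grad, w.grad)
```

Neuron, Layer, and MLP classes then compose these `Value` objects, and the manual training loop repeats: zero the grads, run the forward pass, call `backward()` on the loss, and nudge each parameter by `-lr * grad`.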