MiniTorchBR: Autograd from scratch

A lightweight PyTorch-inspired deep learning framework built on NumPy — learn how backpropagation really works.

(Figure: MiniTorchBR neural network computation graph)

Quick Install

```bash
pip install minitorchbr
```

30-Second Example

```python
import numpy as np
from MiniTorch.core.variable import Variable

# Create tensors with gradient tracking
x = Variable(np.array([[1.0, 2.0, 3.0]]))
w = Variable(np.random.randn(3, 1))

# Forward pass — graph is built automatically
y = x @ w          # matmul
loss = (y ** 2).sum()

# Backward pass
loss.backward()

print(w.grad)      # dL/dw computed via reverse-mode AD
```
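To see what `backward()` does conceptually, here is a minimal sketch of reverse-mode AD in plain NumPy. This is an illustration, not MiniTorchBR's actual implementation: the `Tiny` class, its `square`/`sum` methods, and the topological-sort backward pass are hypothetical stand-ins for whatever `Variable` does internally. Each operation records its parents and a closure that pushes the output gradient back through the chain rule.

```python
import numpy as np

class Tiny:
    """Hypothetical minimal autodiff node (not MiniTorchBR's real Variable)."""
    def __init__(self, data, parents=()):
        self.data = np.asarray(data, dtype=float)
        self.grad = np.zeros_like(self.data)
        self.parents = parents      # upstream nodes in the graph
        self.backward_fn = None     # propagates self.grad to parents

    def __matmul__(self, other):
        out = Tiny(self.data @ other.data, (self, other))
        def bw():
            self.grad += out.grad @ other.data.T   # dL/dA = dL/dY @ B^T
            other.grad += self.data.T @ out.grad   # dL/dB = A^T @ dL/dY
        out.backward_fn = bw
        return out

    def square(self):
        out = Tiny(self.data ** 2, (self,))
        def bw():
            self.grad += 2.0 * self.data * out.grad  # d(y^2)/dy = 2y
        out.backward_fn = bw
        return out

    def sum(self):
        out = Tiny(self.data.sum(), (self,))
        def bw():
            self.grad += np.ones_like(self.data) * out.grad  # sum broadcasts 1s
        out.backward_fn = bw
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        order, seen = [], set()
        def visit(v):
            if id(v) not in seen:
                seen.add(id(v))
                for p in v.parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = np.ones_like(self.data)  # dL/dL = 1
        for v in reversed(order):
            if v.backward_fn:
                v.backward_fn()

x = Tiny([[1.0, 2.0, 3.0]])
w = Tiny([[0.5], [-1.0], [2.0]])
loss = (x @ w).square().sum()
loss.backward()
print(w.grad)  # matches the analytic gradient 2 * x.T @ (x @ w)
```

The key idea is the same as in the 30-second example above: the forward pass builds a graph of operations, and `backward()` walks it once in reverse, accumulating gradients into every node.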

Released under the MIT License.