Reverse-mode Autograd
A full reverse-mode automatic differentiation engine with dynamic computation graphs: every operation records its gradient function as it runs, so backpropagation can walk the graph in reverse.
A lightweight PyTorch-inspired deep learning framework built on NumPy — learn how backpropagation really works.
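The gradient-tracking idea can be sketched in a few lines of plain Python. This is an illustrative scalar version only — the `Var` class and its internals here are assumptions for exposition, not MiniTorch's actual implementation:

```python
class Var:
    """Minimal scalar reverse-mode autograd node (illustrative sketch,
    not MiniTorch's actual Variable class)."""

    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        # Each parent is paired with the local derivative d(self)/d(parent).
        self._parents = parents

    def __add__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, seed=1.0):
        # Reverse-mode AD: accumulate d(output)/d(self), then push the
        # chain-rule product down to each parent.
        self.grad += seed
        for parent, local_grad in self._parents:
            parent.backward(seed * local_grad)


x = Var(2.0)
y = Var(3.0)
z = x * y + x   # dz/dx = y + 1 = 4, dz/dy = x = 2
z.backward()
```

Each arithmetic operation builds a node holding its inputs and local derivatives; calling `backward()` on the output replays the graph in reverse, accumulating gradients via the chain rule.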
pip install minitorch

import numpy as np
from MiniTorch.core.variable import Variable
# Create tensors with gradient tracking
x = Variable(np.array([[1.0, 2.0, 3.0]]))
w = Variable(np.random.randn(3, 1))
# Forward pass — graph is built automatically
y = x @ w # matmul
loss = (y ** 2).sum()
# Backward pass
loss.backward()
print(w.grad) # dL/dw computed via reverse-mode AD
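For `L = sum((x @ w)**2)`, the gradient the engine should produce is `dL/dw = 2 * x.T @ (x @ w)`. A quick sanity check in pure NumPy (no MiniTorch required) compares that analytic formula against a central finite-difference estimate:

```python
import numpy as np

# For L = sum((x @ w)**2), the analytic gradient is dL/dw = 2 * x.T @ (x @ w).
# Verify it against a central finite-difference estimate.
rng = np.random.default_rng(0)
x = np.array([[1.0, 2.0, 3.0]])
w = rng.standard_normal((3, 1))

analytic = 2 * x.T @ (x @ w)

def loss(w_):
    return float(((x @ w_) ** 2).sum())

eps = 1e-6
numeric = np.zeros_like(w)
for i in range(w.shape[0]):
    w_plus, w_minus = w.copy(), w.copy()
    w_plus[i] += eps
    w_minus[i] -= eps
    # Central difference: (L(w + eps) - L(w - eps)) / (2 * eps)
    numeric[i] = (loss(w_plus) - loss(w_minus)) / (2 * eps)

assert np.allclose(analytic, numeric, atol=1e-4)
```

The same check is a useful habit when extending the engine with new ops: any `backward` implementation should agree with finite differences to within floating-point tolerance.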