
Operations API

All ops live in `MiniTorch.ops.*`. Each is a `Function` subclass supporting forward and backward passes.

Arithmetic

| Op | Import | Description |
| --- | --- | --- |
| `Add` | `from MiniTorch.ops.add import Add` | Element-wise addition |
| `Sub` | `from MiniTorch.ops.sub import Sub` | Element-wise subtraction |
| `Mul` | `from MiniTorch.ops.mul import Mul` | Element-wise multiplication |
| `Div` | `from MiniTorch.ops.div import Div` | Element-wise division |
| `Pow` | `from MiniTorch.ops.pow import Pow` | Element-wise power |
| `Neg` | `from MiniTorch.ops.neg import Neg` | Negation |
| `MatMul` | `from MiniTorch.ops.matmul import MatMul` | Matrix multiplication |

These are also available via operator overloading on `Variable` (e.g. `a + b`, `a @ b`).
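For reference, the backward rules these ops implement can be sketched in plain NumPy. This illustrates only the math; the `Function` subclasses above handle it internally:

```python
import numpy as np

a = np.array([[1.0, 2.0], [3.0, 4.0]])
b = np.array([[5.0, 6.0], [7.0, 8.0]])
grad_out = np.ones_like(a)     # upstream gradient flowing into the op

# Mul: d(a*b)/da = b, d(a*b)/db = a (scaled by the upstream gradient)
grad_a = grad_out * b
grad_b = grad_out * a

# MatMul: d(a@b)/da = grad_out @ b.T, d(a@b)/db = a.T @ grad_out
gm_a = grad_out @ b.T
gm_b = a.T @ grad_out
```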

Reductions

| Op | Import | Description |
| --- | --- | --- |
| `Sum` | `from MiniTorch.ops.sum import Sum` | Sum of all elements (scalar output) |
| `Mean` | `from MiniTorch.ops.mean import Mean` | Mean of all elements |
```python
x = Variable(np.array([[1.0, 2.0], [3.0, 4.0]]))
s = x.sum()    # Variable(10.0)
s.backward()
print(x.grad)  # [[1. 1.] [1. 1.]]
```
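`Mean` behaves the same way, except each element receives `1/N` of the upstream gradient. A plain-NumPy sketch of that rule (illustrating the math, not MiniTorch code):

```python
import numpy as np

x = np.array([[1.0, 2.0], [3.0, 4.0]])
m = x.mean()                       # 2.5

# Each of the N elements contributes 1/N to the mean,
# so the gradient is uniform:
grad_x = np.ones_like(x) / x.size  # [[0.25 0.25] [0.25 0.25]]
```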

Activations

ReLU

```python
from MiniTorch.ops.relu import relu

y = relu(x)   # max(0, x) element-wise
```

Gradient: 1 where x > 0, else 0.
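The forward and backward rules can be sketched in plain NumPy (the math only, not MiniTorch's internals):

```python
import numpy as np

x = np.array([-2.0, -0.5, 0.0, 1.5])
grad_out = np.ones_like(x)       # upstream gradient

y = np.maximum(0.0, x)           # forward: max(0, x)
grad_x = grad_out * (x > 0)      # backward: pass gradient only where x > 0
```

Note that at exactly `x == 0` the gradient is taken to be 0.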

Sigmoid

```python
from MiniTorch.ops.sigmoid import sigmoid

y = sigmoid(x)   # 1 / (1 + exp(-x))
```

Gradient: sigmoid(x) * (1 - sigmoid(x)).
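A plain-NumPy sketch of the same rule (illustrative, not the MiniTorch implementation):

```python
import numpy as np

x = np.array([-1.0, 0.0, 1.0])
s = 1.0 / (1.0 + np.exp(-x))     # forward: sigmoid

# backward: s * (1 - s); maximal (0.25) at x == 0,
# and symmetric around it
grad_x = s * (1.0 - s)
```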

Loss Functions

MSE Loss

```python
from MiniTorch.ops.mse import mse_loss

loss = mse_loss(predictions, targets)
# equivalent to ((predictions - targets) ** 2).mean()
```
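The gradient with respect to the predictions is `2 * (predictions - targets) / N`. A plain-NumPy sketch of the math (not the MiniTorch implementation):

```python
import numpy as np

pred = np.array([1.0, 2.0, 3.0])
target = np.array([1.5, 2.0, 2.0])

loss = ((pred - target) ** 2).mean()          # forward
grad_pred = 2.0 * (pred - target) / pred.size  # d(loss)/d(pred)
```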

Softmax Cross-Entropy

```python
from MiniTorch.ops.softmax_cross_entropy import softmax_cross_entropy

loss = softmax_cross_entropy(logits, labels)
```

- `logits`: `Variable` of shape `(batch, num_classes)`
- `labels`: integer class indices, shape `(batch,)`

The op fuses softmax and cross-entropy for numerical stability.
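The standard stability trick is to subtract the per-row maximum before exponentiating, since softmax is invariant to shifts. A plain-NumPy sketch of that idea (illustrative only; the fused op's internals may differ):

```python
import numpy as np

logits = np.array([[1000.0, 1001.0, 1002.0],   # naive exp() would overflow
                   [0.1, 0.2, 0.3]])
labels = np.array([2, 0])

# Shift each row by its max: softmax(z) == softmax(z - c)
shifted = logits - logits.max(axis=1, keepdims=True)
log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))

# Negative log-likelihood of the true class, averaged over the batch
loss = -log_probs[np.arange(len(labels)), labels].mean()
```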

Released under the MIT License.