Aicraft

Autograd API

#include "aicraft/autograd.h"

Reverse-mode automatic differentiation with 22 differentiable operations.

Core Function

ac_backward

void ac_backward(AcTensor *loss);

Compute gradients of loss with respect to all tensors with requires_grad = true. Performs reverse-mode autodiff through the computational graph.

How It Works

  1. Each operation records itself in a DAG (directed acyclic graph)
  2. ac_backward() topologically sorts the graph
  3. Gradients are propagated backwards through each node
  4. O(1) cycle detection prevents infinite loops

Gradient Access

AcTensor *x = ac_tensor_rand((int[]){1, 784}, 2);  // shape {1, 784}, 2 dims
x->requires_grad = true;

AcTensor *y = ac_forward_seq(net, 2, x);
ac_backward(y);

// Access gradient
AcTensor *grad = ac_grad(x);

Supported Operations

Op         Forward       Backward
Add        a + b         ∂L/∂a = ∂L/∂y, ∂L/∂b = ∂L/∂y
Mul        a * b         ∂L/∂a = ∂L/∂y · b, ∂L/∂b = ∂L/∂y · a
MatMul     a @ b         ∂L/∂a = ∂L/∂y @ bᵀ, ∂L/∂b = aᵀ @ ∂L/∂y
ReLU       max(0, x)     ∂L/∂x = ∂L/∂y · (x > 0)
Sigmoid    σ(x)          ∂L/∂x = ∂L/∂y · σ(x)(1 - σ(x))
Softmax    softmax(x)    Jacobian-vector product
Sum        Σx            ∂L/∂x = ∂L/∂y (broadcast to each element)
Mean       mean(x)       ∂L/∂x = ∂L/∂y / n
Exp        eˣ            ∂L/∂x = ∂L/∂y · eˣ
Log        ln(x)         ∂L/∂x = ∂L/∂y / x
Neg        -x            ∂L/∂x = -∂L/∂y
Reshape    reshape(x)    reshape(∂L/∂y)
Transpose  xᵀ            (∂L/∂y)ᵀ

No-Grad Context

ac_no_grad_begin();
// Operations here won't be tracked
AcTensor *pred = ac_forward_seq(net, 2, x);
ac_no_grad_end();