TensorGrad User Guide

Core Concepts

Variables and Dimensions

In TensorGrad, tensors are created with symbolic dimensions using SymPy symbols:

 from sympy import symbols
 from tensorgrad import Variable

 # Create dimension symbols
 i, j = symbols("i j")

 # Create a matrix variable
 X = Variable("X", i, j)

 # Create a symmetric matrix
 S = Variable("S", i=i, j=i).with_symmetries("i j")
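
Each Variable keeps track of its edge names, which is what later operations are matched on. Continuing the snippet above, a quick check (this assumes tensors expose their edge names through an edges attribute; the exact attribute and return type may differ between versions):

 # Inspect the edge names of the tensors defined above
 v = Variable("v", i)   # a vector with a single edge "i"
 print(v.edges)         # expected to contain "i"
 print(X.edges)         # expected to contain "i" and "j"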

Basic Operations

TensorGrad supports standard tensor operations:

 import tensorgrad.functions as F

 # Example variables (reusing the dimension symbols i, j from above)
 k = symbols("k")
 A = Variable("A", i, j)
 B = Variable("B", j, k)
 x = Variable("x", i)

 # Matrix multiplication (using @, which contracts the shared edge "j")
 C = A @ B

 # Element-wise operations
 y = F.exp(x)
 z = F.relu(x)

 # Reductions
 s = F.sum(x, ["i"])
 m = F.max(x, ["i"])

 # Softmax
 p = F.softmax(x, dim="i")
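
Multiplying tensors that share no edge names should then give a tensor (outer) product whose result keeps all edges. A small sketch under the same shared-edge convention used above; the names are illustrative, so verify the behaviour on your version:

 # No shared edges: the result keeps both "i" and "k"
 u = Variable("u", i)
 t = Variable("t", k)
 outer = u @ t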

Gradients

Computing gradients is a core feature:

 # L2 loss example: X holds the data, W the weights, Y the targets
 W = Variable("W", j, k)
 Y = Variable("Y", i, k)
 XWmY = X @ W - Y
 loss = F.frobenius2(XWmY)  # ||XW - Y||²
 grad = loss.grad(W)

 # Cross-entropy example
 logits = Variable("logits", ["C"])
 target = Variable("target", ["C"])
 ce = F.cross_entropy(logits, target, dim="C")
 grad = ce.grad(logits)
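
The returned gradient is itself a symbolic tensor and can be simplified before it is inspected or evaluated. A minimal sketch, assuming a full_simplify() method as in recent tensorgrad versions (a plain simplify() call may behave similarly); for the L2 example the simplified form should be equivalent to 2 Xᵀ(XW − Y):

 # Simplify the symbolic gradient of the L2 loss before using it
 simplified = loss.grad(W).full_simplify()  # assumed API
 print(simplified)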

Expectations

TensorGrad can compute expectations of arbitrary functions with respect to Gaussian tensors:

 from tensorgrad.extras.expectation import Expectation

 # Define the mean and covariance of the Gaussian variable X (edges i, j)
 mu = Variable("mu", i, j)
 covar = Variable("covar", i, j, i2=i, j2=j)

 # Any expression involving X, e.g. its squared Frobenius norm
 expr = F.frobenius2(X)

 # Compute E[expr] over X ~ N(mu, covar)
 E = Expectation(expr, wrt=X, mu=mu, covar=covar)
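
As a sanity check, the expectation of X itself should reduce to the mean. A sketch continuing the example above; the simplification call is an assumption and may differ between versions:

 # E[X] over X ~ N(mu, covar) should simplify to mu
 EX = Expectation(X, wrt=X, mu=mu, covar=covar)
 print(EX.full_simplify())  # expected to be equivalent to mu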

Advanced Features

Tensor Networks

TensorGrad provides a graph-based syntax for complex tensor operations:

 # Using graph syntax for tensor contractions
 expr = F.graph("""
     A -i- X0 -j- B
     X0 -k- C
     B -l- D
 """, A=A, X0=X, B=B, C=C, D=D)

Neural Network Components

Common neural network operations are supported:

 # Convolution, expressed as an unfold tensor contracted with a kernel
 data = Variable("data", ["b", "c", "w", "h"])
 # Convolution("w", "j", "w2") links the input width edge "w", the kernel
 # offset edge "j" and the output width edge "w2" (likewise for "h")
 unfold = Convolution("w", "j", "w2") @ Convolution("h", "i", "h2")
 kernel = Variable("kernel", ["c", "i", "j", "c2"])
 conv = data @ unfold @ kernel  # remaining edges: b, w2, h2, c2

 # Attention mechanism (sketch: X carries a sequence edge and a feature edge,
 # and the query- and key-side sequence edges are kept distinct, e.g. "seq_q"/"seq_k")
 query = W_q @ X
 key = W_k @ X
 value = W_v @ X
 logits = F.dot(query, key, ["inner"])        # contract the shared feature edge "inner"
 attention = F.softmax(logits, dim="seq_k")   # normalise over the key sequence edge
 output = F.dot(attention, value, ["seq_k"])  # weighted sum of the values

Integration with PyTorch

TensorGrad expressions can be evaluated using PyTorch tensors:

 import torch

 # Symbolic expression to evaluate (X has edges i, j and W has edges j, k as above)
 expr = X @ W

 # Create actual tensors, keyed by the corresponding Variables
 values = {
     X: torch.randn(3, 4, names=('i', 'j')),
     W: torch.randn(4, 2, names=('j', 'k')),
 }

 # Evaluate the expression with the supplied tensors
 result = expr.evaluate(values)
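
To check the result against plain PyTorch, the named dimensions can be aligned and the names dropped. A sketch assuming expr = X @ W as above and that evaluate returns a named tensor:

 # Compare against a direct PyTorch computation (names dropped for matmul)
 expected = values[X].rename(None) @ values[W].rename(None)
 print(torch.allclose(result.align_to('i', 'k').rename(None), expected))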

Visualization

TensorGrad can visualize tensor networks in multiple formats:

 # Generate TikZ diagram
 from tensorgrad.serializers.to_tikz import to_tikz
 tikz_code = to_tikz(expr)

 # Generate Graphviz diagram
 from tensorgrad.serializers.to_graphviz import to_graphviz
 dot_code = to_graphviz(expr)
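
The TikZ output can be dropped straight into a LaTeX document. A minimal sketch, assuming to_tikz returns the TikZ source as a string:

 # Save the TikZ source so it can be \input{} into a LaTeX document
 with open("expr_diagram.tex", "w") as f:
     f.write(tikz_code)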

For complete API documentation, see the API Reference.