Thoughts on Matrix Product States (13) — Symbolic Computation
Purpose
While running the experiments for Thoughts on Matrix Product States (12) — MNIST Classification with Tensor Networks, the calculations were far too tedious to verify by hand, so I used sympy for symbolic computation instead. Since I rarely do this and tend to forget the process, I am documenting it here as a memo.
Implementation
Import required modules
from __future__ import annotations
import numpy as np
import sympy
from opt_einsum import contract
Defining a function to construct tensors with symbols
I implemented this on a whim, so to be honest I don't remember the details very well. Since indices starting from 0 can feel awkward against mathematical notation, which usually starts from 1, I added a flag for that on the fly.
def create_symbol_tensor(
    name_base: str, shape: int | tuple[int, ...], start_at_index_1: bool = False
) -> np.ndarray:
    def create_tensor_names(name_base, shape):
        tensor_names = []

        def loop(indices: tuple[int, ...], depth: int = 0):
            for i in range(shape[depth]):
                idx = i + 1 if start_at_index_1 else i
                if depth + 1 < len(shape):
                    loop((*indices, idx), depth + 1)
                    continue
                suffix = "-".join(map(str, indices + (idx,)))
                name = name_base + "_{" + suffix + "}"
                tensor_names.append(name)

        loop((), 0)
        return tensor_names

    if isinstance(shape, int):
        shape = (shape,)
    tensor_names = create_tensor_names(name_base, shape)
    # sympy.symbols splits its input on spaces and commas, so the indices are
    # joined with hyphens first and the symbols are renamed afterwards.
    # seq=True keeps the result a tuple even when there is only one name.
    symbols = sympy.symbols(" ".join(tensor_names), seq=True)
    for sym in symbols:
        sym.name = sym.name.replace("-", ",")
    return np.array(symbols).reshape(shape)
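The hyphen-then-rename step exists because sympy.symbols splits its argument on whitespace and commas, so a comma cannot appear in a name at creation time. A minimal standalone sketch of the same trick for a 2×2 matrix:

```python
import numpy as np
import sympy

# Join indices with hyphens so sympy.symbols treats each name as one symbol,
# then rewrite the names with commas for nicer display.
names = ["a_{1-1}", "a_{1-2}", "a_{2-1}", "a_{2-2}"]
syms = sympy.symbols(" ".join(names))
for s in syms:
    s.name = s.name.replace("-", ",")
A = np.array(syms).reshape(2, 2)
print(A[0, 1])  # a_{1,2}
```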
Experiment
Matrix computation
First, I want to try a simple matrix–vector product y = Ax.
A = create_symbol_tensor("a", (4, 5), True)
x = create_symbol_tensor("x", 5, True)
y = contract("ij,j->i", A, x)
for i in range(len(y)):
    display(y[i])  # display is IPython's; use print(y[i]) outside a notebook
a_{1,1} x_{1} + a_{1,2} x_{2} + a_{1,3} x_{3} + a_{1,4} x_{4} + a_{1,5} x_{5}
a_{2,1} x_{1} + a_{2,2} x_{2} + a_{2,3} x_{3} + a_{2,4} x_{4} + a_{2,5} x_{5}
a_{3,1} x_{1} + a_{3,2} x_{2} + a_{3,3} x_{3} + a_{3,4} x_{4} + a_{3,5} x_{5}
a_{4,1} x_{1} + a_{4,2} x_{2} + a_{4,3} x_{3} + a_{4,4} x_{4} + a_{4,5} x_{5}
Looks good.
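Since the whole point is verification, a symbolic result like this can also be cross-checked by substituting concrete numbers and comparing against a purely numeric computation. A minimal sketch with a 2×2 example, using plain sympy.symbols and np.dot (which accepts object arrays):

```python
import numpy as np
import sympy

# Symbolic y = Ax for a small 2x2 case.
a11, a12, a21, a22 = sympy.symbols("a11 a12 a21 a22")
x1, x2 = sympy.symbols("x1 x2")
A = np.array([[a11, a12], [a21, a22]])
x = np.array([x1, x2])
y = A.dot(x)  # object array of sympy expressions

# Substitute concrete values and compare with the numeric result.
subs = {a11: 1, a12: 2, a21: 3, a22: 4, x1: 5, x2: 6}
y_num = [expr.subs(subs) for expr in y]
A_num = np.array([[1, 2], [3, 4]])
x_num = np.array([5, 6])
print(y_num)  # [17, 39], matching A_num @ x_num
```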
Tensor computation
I am using opt_einsum.contract here simply because I want einsum-style index notation.
shapes = [(2, 1), (1, 2, 1), (1, 2)]
x = [create_symbol_tensor(f"x^{i+1}", (2,), True) for i in range(len(shapes))]
G = [create_symbol_tensor(f"G^{i+1}", shapes[i], True) for i in range(len(shapes))]
contracted_tensor = contract("A,B,C,Aα,αBβ,βC->", *x, *G)
display(contracted_tensor.item())
\left(G^1_{1,1} x^1_{1} + G^1_{2,1} x^1_{2}\right) \left(G^2_{1,1,1} x^2_{1} + G^2_{1,2,1} x^2_{2}\right) \left(G^3_{1,1} x^3_{1} + G^3_{1,2} x^3_{2}\right)
display(sympy.expand(contracted_tensor.item()))
G^1_{1,1} G^2_{1,1,1} G^3_{1,1} x^1_{1} x^2_{1} x^3_{1} + G^1_{1,1} G^2_{1,1,1} G^3_{1,2} x^1_{1} x^2_{1} x^3_{2} + G^1_{1,1} G^2_{1,2,1} G^3_{1,1} x^1_{1} x^2_{2} x^3_{1} + G^1_{1,1} G^2_{1,2,1} G^3_{1,2} x^1_{1} x^2_{2} x^3_{2} + G^1_{2,1} G^2_{1,1,1} G^3_{1,1} x^1_{2} x^2_{1} x^3_{1} + G^1_{2,1} G^2_{1,1,1} G^3_{1,2} x^1_{2} x^2_{1} x^3_{2} + G^1_{2,1} G^2_{1,2,1} G^3_{1,1} x^1_{2} x^2_{2} x^3_{1} + G^1_{2,1} G^2_{1,2,1} G^3_{1,2} x^1_{2} x^2_{2} x^3_{2}
Even though it is a contraction of small tensors, we can see that it results in quite a few terms.
Usually, I only write it as a mathematical expression like
\sum_{A,B,C,\alpha,\beta} x^1_{A} x^2_{B} x^3_{C} G^1_{A,\alpha} G^2_{\alpha,B,\beta} G^3_{\beta,C},
so I rarely think about how it is actually computed on a machine. It makes me realize that if I don't properly estimate the number of terms or the range of values, an overflow might easily occur.
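One way to see the blow-up concretely is to count terms with sympy itself. In this sketch (the symbol names g and x are hypothetical stand-ins for the cores and input vectors), each site contributes a factor summing over one physical index, so the full expansion of n sites with physical dimension d has d**n terms:

```python
import sympy

# Each factor is a sum over one physical index of dimension d.
d, n = 2, 3
factors = []
for i in range(n):
    factors.append(
        sum(sympy.Symbol(f"g{i}_{j}") * sympy.Symbol(f"x{i}_{j}") for j in range(d))
    )
product = factors[0] * factors[1] * factors[2]
expanded = sympy.expand(product)
print(len(expanded.args))  # 8 == d**n terms, matching the expansion above
```

For realistic inputs (e.g. one site per MNIST pixel) the term count is astronomical, which is exactly why such contractions are evaluated numerically, never expanded.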
Summary
Symbolic computation is quite useful when you want to visually verify something that would be disheartening to calculate by hand.