JoeyT1994 / ITensorNumericalAnalysis.jl

MIT License

Proposal for an interface based on symbolic expressions #10

Open mtfishman opened 1 year ago

mtfishman commented 1 year ago

Related to #9, here is an outline of a proposal for an interface based around a single function (I call it qtn below, for "quantized tensor network", but I'm open to suggestions). At a high level, it accepts symbolic expressions for functions that we want to compile to a quantized tensor network, i.e. a tensor network representing the function on a uniform grid in a specified domain using a bit representation:

using ITensors
using Symbolics
using SymbolicUtils
using SymbolicUtils.Rewriters

# Convert a symbolic expression into a quantized tensor network (QTN)
function qtn(eq::Num, s::Vector{<:Index})
  # Check that the expression has a single input variable.
  @assert isone(length(Symbolics.get_variables(eq)))
  # Expand and simplify the input symbolic expression
  eq = expand(eq)
  eq = simplify(eq)
  return qtn(Symbolics.value(eq), s)
end

function qtn(eq::SymbolicUtils.BasicSymbolic, s::Vector{<:Index})
  op = !istree(eq) ? identity : operation(eq)
  args = !istree(eq) ? [eq] : arguments(eq)
  return qtn(op, args, s)
end

# Linear input: `qtn(x, s)`
function qtn(f::typeof(identity), args, s::Vector{<:Index})
  arg = only(args)
  return 0 # Placeholder: construct the QTN of the linear function `arg`
end

# Exp input: `qtn(exp(x), s)`
function qtn(f::typeof(exp), args, s::Vector{<:Index})
  arg = only(args)
  @show f
  @show arg
  @show istree(arg)
  return 1 # Placeholder: construct the QTN of `exp` applied to `arg`
end

# Polynomial term input: `qtn(x^2, s)`
function qtn(f::typeof(^), args, s::Vector{<:Index})
  @show f
  @show args
  @show istree.(args)
  return 2 # Placeholder: construct the QTN of a power `x^n`
end

# Constant input: `qtn(2, s)`
function qtn(eq::Number, s::Vector{<:Index})
  @show eq
  return 3 # Placeholder: construct the constant QTN
end

# Addition input: `qtn(exp(x) + exp(2x), s)`
function qtn(f::typeof(+), args, s::Vector{<:Index})
  @show f
  @show args
  @show istree.(args)
  return sum(arg -> qtn(arg, s), args)
end

@variables x
s = siteinds("Qubit", 4) # Change to `functioninds` (or some other name) to insert the correct tags

# Interface for making MPS from function expressions
ψ = qtn(2, s)
ψ = qtn(x, s)
ψ = qtn(exp(x), s)
ψ = qtn(exp(2x), s)
ψ = qtn(exp(x) + exp(2x), s)
ψ = qtn(x^2, s)

# Define rewrite rules
cos_rule = @rule cos(~x) => (exp(im * ~x) + exp(-im * ~x)) / 2
cosh_rule = @rule cosh(~x) => (exp(~x) + exp(-~x)) / 2
sin_rule = @rule sin(~x) => (exp(im * ~x) - exp(-im * ~x)) / (2im)
sinh_rule = @rule sinh(~x) => (exp(~x) - exp(-~x)) / 2
exp_rule = @rule exp(~x + ~y) => exp(~x) * exp(~y)
qtn_rules = Chain([cos_rule, cosh_rule, sin_rule, sinh_rule, exp_rule])
@show simplify(cos(2x); rewriter=qtn_rules)

This is based around Symbolics.jl, which seems like a good choice for this package because:

  1. It has flexible rewrite/equation simplification rules (as shown at the bottom), so we can make our own rules to try to rewrite an input expression into a form that we know how to compile into a tensor network.
  2. It seems pretty easy to use and works with general Julia functions.
  3. It supports symbolic expressions for differentiation and equation solving, which will be helpful for writing a high level interface for differential equations.

Some extensions of this syntax could be to multivariable functions, for example:

ψ = qtn(exp(x + y), [x, y], s)

Note that I pass the variables as a separate argument. In principle we could extract them from the expression with Symbolics.get_variables(eq), but passing them explicitly determines the ordering of the variables in the function.

That expression could make use of the simplification rule:

@show simplify(exp(x + y); rewriter=qtn_rules)

which we could call internally as a pre-processing step before trying to compile it down to a tensor network using the basic rules that we know.

Some other neat features we could make use of are detecting polynomials and extracting their coefficients; see for example Symbolics.degree, Symbolics.coeff, and the collect_powers function in this tutorial.
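For example, assuming a Symbolics.jl version where Symbolics.degree and Symbolics.coeff behave as documented, extracting polynomial structure could look like:

```julia
using Symbolics

@variables x
p = 3x^2 + 2x + 1

# Total degree of the polynomial expression
d = Symbolics.degree(p)

# Coefficient of a given monomial
c2 = Symbolics.coeff(p, x^2)
c1 = Symbolics.coeff(p, x)
```

A polynomial compiler backend could then dispatch on the extracted (degree, coefficient) data rather than on the raw expression tree.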

Additionally, as I mentioned above, Symbolics.jl has representations of differential operators, so we could have differential operator compilers:

qtn(Differential(x), s; order=5) # Make the finite difference operator for the first derivative to 5th order

We could make use of FiniteDifferences.jl to get coefficients for higher order finite difference operators, and use adder and subtractor MPOs along with those coefficients to generate arbitrary operators (see Section 5.3.1 of https://arxiv.org/abs/1909.06619).
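As a self-contained sketch of where those stencil coefficients come from (a hypothetical fd_weights helper, not the FiniteDifferences.jl API), the weights solve the standard Vandermonde moment conditions, which we can do exactly in rationals:

```julia
using LinearAlgebra

# Hypothetical helper: weights w such that sum_j w[j] * f(x + s_j * h)
# approximates h^m * f^(m)(x) for integer stencil offsets s_j, obtained by
# solving the Vandermonde moment conditions sum_j w[j] * s_j^k = m! * (k == m).
function fd_weights(stencil::AbstractVector{<:Integer}, m::Int)
  n = length(stencil)
  A = [Rational{Int}(s)^k for k in 0:(n - 1), s in stencil]
  b = zeros(Rational{Int}, n)
  b[m + 1] = factorial(m)
  return A \ b
end

fd_weights(-1:1, 2) # classic [1, -2, 1] second-derivative stencil
```

fd_weights(-2:2, 1) similarly recovers the 5-point first-derivative stencil [1//12, -2//3, 0, 2//3, -1//12], which is the kind of coefficient set the adder/subtractor MPO construction would consume.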

Additionally, we can have an interface to construct a QFT/DFT tensor network in a similar way:

using FFTW
qtn(fft, s)

This could lead to even fancier technology, for example a function that compiles an entire differential equation in terms of tensor networks, turning it into a tensor network problem to solve (like a boundary value problem solved with a linear solver from ITensorNetworks.jl, or an eigenvalue problem solved with DMRG). For example:

k = 12.5 # Some wavenumber to target
@variables x f(x)
D = Differential(x)
prob = qtn_problem((D^2)(f) ~ -k^2 * f, s; order=5)

which could then be solved by tensor network solvers in ITensorNetworks.jl. That would be closely related to the interface of ModelingToolkit.jl, though of course ModelingToolkit.jl doesn't use tensor networks to represent the discretization of the operator and functions, but more standard finite difference approaches.

A number of functions for constructing tensor networks for various functions and operators (differential operators, the QFT/DFT MPO, linear interpolation MPOs from https://arxiv.org/abs/1802.07259, etc.) exist in https://github.com/mtfishman/ITensorQTT.jl/tree/main. We should basically be able to port those over, but instead output a path graph/tree tensor network based on the tensor network types in ITensorNetworks.jl and whatever quantized tensor network type we design in this package, which as mentioned in #9 will also store metadata about the location of the bits in the tensor network as well as the domain of the function.

mtfishman commented 1 year ago

Note that along with symbolic expressions we can also support users passing arbitrary Julia functions:

qtn(sin, s)

f(x) = sin(cos(x))
qtn(f, s)

# Could also be anonymous
qtn(x -> sin(x^2), s)

which we can try to compile as best we can. We can even try to simplify those using Symbolics.jl, since we can do:

function qtn(f::Function, s::Vector{<:Index})
  @variables x
  eq = f(x) # Turn it into an expression
  return qtn(eq, s) # Run it through the `qtn` symbolic expression functionality
end

If we decide we don't know what to do with a function or expression, we could punt it off to a general QTN compiler backend, for example one that uses randomized SVD or interpolation methods like TCI (tensor cross interpolation), which can do their best to compile the function to a QTN by evaluating it at points.
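As a pure-Julia sketch of what such an evaluation-based fallback does (a hypothetical qtt_cores helper, independent of the eventual ITensorNetworks.jl types), here is the successive-SVD compression of the full grid of function values into quantized tensor train cores:

```julia
using LinearAlgebra

# Hypothetical sketch of an evaluation-based fallback backend: sample f on
# the dyadic grid of [0, 1) and compress the 2^n values into a tensor train
# by successive truncated SVDs. Site i carries bit (i - 1) of the grid index.
function qtt_cores(f, n::Int; cutoff=1e-12)
  v = Float64[f(k / 2^n) for k in 0:(2^n - 1)]
  cores = Array{Float64,3}[]
  r = 1
  M = reshape(v, 2, :)
  for _ in 1:(n - 1)
    F = svd(reshape(M, 2r, :))
    keep = findall(>(cutoff * F.S[1]), F.S)          # relative truncation
    push!(cores, reshape(F.U[:, keep], r, 2, length(keep)))
    M = Diagonal(F.S[keep]) * F.Vt[keep, :]          # pass the rest along
    r = length(keep)
  end
  push!(cores, reshape(M, r, 2, 1))
  return cores
end
```

A real backend would evaluate far fewer points (that is exactly what TCI avoids doing), but the output format is the same: for example, qtt_cores(exp, 8) compresses exp to bond dimension 1, since exp factorizes exactly over the bits.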

mtfishman commented 1 year ago

Some other ideas for trying to make the compiler more general would be having a backend to perform Taylor series or polynomial expansions of a function and then compile the resulting polynomial to a tensor network. That functionality is in ITensorQTT.jl here: https://github.com/mtfishman/ITensorQTT.jl/blob/f9ca92354582c6090ed40757b8ec130112eb2463/src/function_to_mps.jl#L53-L71 which is based on the fit function of Polynomials.jl. Additionally, we could look into packages like TaylorSeries.jl for a symbolic approach.
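As a minimal stand-in for the Polynomials.jl fit approach (a hypothetical poly_coeffs helper, not the actual ITensorQTT.jl code), a least-squares Vandermonde fit recovers the coefficients we would then compile term by term:

```julia
# Hypothetical sketch of a polynomial-expansion backend: least-squares
# fit of f on a grid in [0, 1], returning coefficients c[k+1] of x^k.
function poly_coeffs(f, degree::Int; npoints::Int=64)
  xs = range(0, 1; length=npoints)
  V = [x^k for x in xs, k in 0:degree]  # Vandermonde matrix
  return V \ f.(xs)                     # least-squares solve
end
```

For a function that is exactly polynomial, such as x -> 1 + 2x + 3x^2, the fit recovers the coefficients up to floating-point error; for general functions the degree and grid become accuracy knobs for the compiler.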

Relatedly, it could be powerful to expand in other ways, such as with a Fourier expansion, which we could do numerically with packages like FFTW.jl, or see if there is a way to do it symbolically in Julia. The ApproxFun.jl package could be good to try for performing various series expansions of general functions in terms of ones we can handle (they support Taylor, Fourier, Chebyshev, etc.).
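For instance, a Chebyshev expansion of the kind ApproxFun.jl computes internally can be sketched numerically with a discrete cosine sum over Chebyshev nodes (a hypothetical cheb_coeffs helper, not the ApproxFun.jl API):

```julia
# Hypothetical sketch: coefficients c[j+1] of f ≈ sum_j c[j+1] * T_j(x)
# on [-1, 1], via Gauss–Chebyshev quadrature at the n Chebyshev nodes,
# using T_j(cos θ) = cos(j θ).
function cheb_coeffs(f, n::Int)
  θ = [π * (k + 0.5) / n for k in 0:(n - 1)]
  fs = f.(cos.(θ))
  return [(j == 0 ? 1 : 2) / n *
          sum(fs[k + 1] * cos(j * θ[k + 1]) for k in 0:(n - 1)) for j in 0:(n - 1)]
end
```

Sanity check: feeding in T_2 itself, x -> 2x^2 - 1, returns coefficients that are all zero except c[3] = 1. The resulting expansion terms are exactly the kind of building blocks (cosines, i.e. sums of exponentials via the rewrite rules above) that we already know how to compile.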