Open · mtfishman opened this issue 9 months ago
ITensorFunctions.jl is another possibility.
That could work, especially if we name the core type `TensorFunction` or `ITensorFunction`.
We've changed it to `TensorNetworkFunctionals` (adding "als" so it sounds less like a tools/utils package and avoids the ODE/PDE sense of "functions"). I'll close this unless @mtfishman or @emstoudenmire want to modify/change it.
Hmm, I'm not sure I like `Functionals`, since a "functional" has a particular definition in math and computer science (https://en.wikipedia.org/wiki/Functional_(mathematics)) which I don't think matches the usage in this package.
I agree with Matt. To me, a functional is a function of another function, so it has a rather narrow and special meaning.
I was thinking the main concepts in this library would be functions of continuous variables and linear operators, such as integrals or Fourier transforms, acting on those functions. (Hopefully later the field will figure out how to do nonlinear operations such as squaring more efficiently also.)
I'm also leaning against `[I]TensorFunctions` or `[I]TensorNetworkFunctions`, since that just makes it sound like it is defining functions/operations for tensors or tensor networks (i.e. contraction, addition, etc.), and not necessarily about using tensor networks as a data format/representation of continuous functions. So I'm back to preferring ITensorNumericalAnalysis.jl. Maybe `[I]TensorFunctionRepresentations` could work as well.
Yeah, it's a difficult one to name. I agree `ITensorNetworkFunctionals` suggests an object which maps a function to a scalar, which is perhaps not quite aligned with what we want. We went with `Functionals` for now just because I think it's better than `QTTITensorNetworks`.
`TensorNetworkFunctionRepresentations` is quite nice.
`[I]TensorNumericalAnalysis.jl` is my current favorite of the suggestions at the moment (though it still has a flavor of a "utils library", which is holding me back). Or shortening `TensorNetworkFunctionRepresentations` to `[I]TensorNetworkFRs` (likewise `[I]TensorNetworkNA`).
Yes, my only issue with a name like `TensorNetworkFunctionRepresentations` is that it is very long.
When I really think about this approach deeply, I think the really "new" thing here is a technique for plugging continuous variables into tensors, i.e. being able to make $T_{ij}(x, y)$, not just $T_{ij}$. So I could even imagine `ITensorContinuousIndices`, which even suggests a design idea: it could be good to have an interface where, to a user, the bit indices are effectively "grouped" into a single large index even though they are split under the hood. (I think this idea is already being pursued to some degree in terms of the function interface.)
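For concreteness, that bit-grouping idea can be sketched independently of ITensor. This is a hypothetical Python illustration (the function names are my own, assuming a base-2 decomposition of a variable $x \in [0, 1)$ across $n$ bit indices):

```python
# Illustrative sketch (not the package's API): n binary indices b_1..b_n
# are "grouped" into one continuous variable via
#     x = sum_{k=1}^{n} b_k * 2**(-k)

def bits_to_x(bits):
    """Map a tuple of binary digits (most significant first) to x in [0, 1)."""
    return sum(b * 2.0 ** -(k + 1) for k, b in enumerate(bits))

def x_to_bits(x, n):
    """Inverse map: discretize x in [0, 1) onto n binary digits."""
    bits = []
    for _ in range(n):
        x *= 2
        b = int(x)
        bits.append(b)
        x -= b
    return tuple(bits)

print(bits_to_x((1, 0, 1)))   # 0.5 + 0.125 = 0.625
print(x_to_bits(0.625, 3))    # (1, 0, 1)
```

An interface built on this mapping could expose the single "continuous index" $x$ to users while the underlying network still carries one tensor leg per bit.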
But in addition to continuous inputs, there are also tools here for creating various families of functions, doing transformations, etc. So something like NumericalAnalysis does capture those broader aspects as well.
One point of reference is that we can think of this package as being analogous to packages like Chebfun (or Julia's equivalent, ApproxFun.jl), which represent functions in some chosen discrete basis and are built around objects that can be called like functions of continuous variables. Ours is just a different choice of discrete basis, which is a tensor network. Not sure that helps with the name, since `ITensorFun.jl` or `ITensorNetworkFun.jl` sounds strange (maybe `ITensorFunctionApproximation.jl` or `ITensorApproxFun.jl`?), but maybe that point of reference can help with thinking about interfaces for this package.
Another idea is `[I]TensorFunctionDiscretization.jl`. Also, repositories for Julia packages are supposed to end in .jl.
Okay, I have renamed it to ITensorNumericalAnalysis.jl for now. Still not set on a name, but let's keep that until we converge on something.
A few other name ideas I thought of: `ITensorContinuousVars`, `ITensorContinuum`, `ITensorContinuousInputs`.
Yeah, I think emphasizing "continuous" is a good idea. It is pretty much the fundamental part of the library, whilst the application (differential equation solving, function representation, etc.) is more flexible.
I have made it so (in PR #20) that one generates the relevant indices with a function:

```julia
s = continuous_siteinds(g::AbstractGraph; kwargs...)
```

where `kwargs` specifies things like the `map_dimension` (the number of continuous indices being represented), the `base` of the decomposition, and which indices in the `IndsNetwork` correspond to which `digit` and which `dimension`.
The returned structure `s::IndsNetworkMap` (maybe we rename it to `ContinuousIndsNetwork`) keeps track of the graph structure and the mapping between digits and `ITensor` indices. This is then what gets passed around to the functions for building operators and functions, e.g.:

```julia
fx = sin_itn(s::IndsNetworkMap; kwargs...)
d_dx = third_derivative_operator(s::IndsNetworkMap; kwargs...)
```
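As an aside, the reason a function like sine admits a compact tensor-network representation under the bit decomposition is that it is separable over the bits: with $x = \sum_k b_k 2^{-k}$, we have $e^{ix} = \prod_k e^{i b_k 2^{-k}}$, so each factor depends only on its own bit (and $\sin x = \operatorname{Im}\, e^{ix}$). A minimal Python sketch of this structure (illustrative only, not the package's `sin_itn`; the function name here is hypothetical):

```python
import cmath
import math

def sin_from_bits(bits):
    """Evaluate sin(x) for x = sum_k b_k * 2**(-k) using only per-bit
    factors: exp(i*x) factorizes as a product over the bits, which is
    the separable (low-rank) structure that makes sine cheap to store
    as a tensor network."""
    phase = 1.0 + 0.0j
    for k, b in enumerate(bits, start=1):
        phase *= cmath.exp(1j * b * 2.0 ** -k)  # factor for bit k only
    return phase.imag

bits = (1, 0, 1, 1)  # encodes x = 0.5 + 0.125 + 0.0625 = 0.6875
x = sum(b * 2.0 ** -k for k, b in enumerate(bits, start=1))
assert abs(sin_from_bits(bits) - math.sin(x)) < 1e-12
```

Each per-bit factor would live on one tensor of the network, so the whole representation of sine stays low rank regardless of how many bits (grid points) are used.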
That design sounds like a nice one. Glad you agree continuous could be a good basis for a package name. I'm not sure of the best name (it should be 'catchy' hopefully, without being confusing) and we can discuss more.
Rename `ITensorNumericalAnalysis.jl`. Any other name suggestions?