bsc-quantic / Tensors.jl

Tensors and Einstein Summation in Julia
https://bsc-quantic.github.io/Tensors.jl/
Apache License 2.0

Crashes on `svd` function #14

Closed (jofrevalles closed this 1 year ago)

jofrevalles commented 1 year ago

Summary

The `svd` function is not working for a given `Tensor`. We also lack tests for this function.

Example

julia> using Tensors

julia> using Tenet

julia> using LinearAlgebra

julia> tensor = Tensor(rand(Complex{Float64}, 4, 4), (:l, :r))
4×4 Tensor{ComplexF64, 2, Matrix{ComplexF64}}:
  0.389308+0.742522im  0.865515+0.587856im  0.0232815+0.143727im   0.31782+0.876011im
 0.0493157+0.804223im  0.576638+0.937815im   0.399075+0.126036im  0.779201+0.57991im
  0.939025+0.216369im  0.481996+0.605187im   0.306571+0.86251im   0.564622+0.782464im
  0.311294+0.837215im  0.275528+0.144134im   0.612791+0.501738im  0.794527+0.431123im

julia> svd(tensor)
ERROR: MethodError: reducing over an empty collection is not allowed; consider supplying `init` to the reducer
Stacktrace:
  [1] reduce_empty(op::Base.MappingRF{Tensors.var"#36#38"{Tensor{ComplexF64, 2, Matrix{ComplexF64}}}, Base.BottomRF{typeof(Base.mul_prod)}}, #unused#::Type{Union{}})
    @ Base ./reduce.jl:356
  [2] reduce_empty_iter
    @ ./reduce.jl:379 [inlined]
  [3] reduce_empty_iter
    @ ./reduce.jl:378 [inlined]
  [4] foldl_impl(op::Base.MappingRF{Tensors.var"#36#38"{Tensor{ComplexF64, 2, Matrix{ComplexF64}}}, Base.BottomRF{typeof(Base.mul_prod)}}, nt::Base._InitialValue, itr::Tuple{})
    @ Base ./reduce.jl:49
  [5] mapfoldl_impl(f::Tensors.var"#36#38"{Tensor{ComplexF64, 2, Matrix{ComplexF64}}}, op::typeof(Base.mul_prod), nt::Base._InitialValue, itr::Tuple{})
    @ Base ./reduce.jl:44
  [6] mapfoldl(f::Function, op::Function, itr::Tuple{}; init::Base._InitialValue)
    @ Base ./reduce.jl:170
  [7] mapfoldl
    @ ./reduce.jl:170 [inlined]
  [8] #mapreduce#263
    @ ./reduce.jl:302 [inlined]
  [9] mapreduce
    @ ./reduce.jl:302 [inlined]
 [10] #prod#269
    @ ./reduce.jl:584 [inlined]
 [11] prod(f::Function, a::Tuple{})
    @ Base ./reduce.jl:584
 [12] svd(t::Tensor{ComplexF64, 2, Matrix{ComplexF64}}; left_inds::Tuple{}, kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ Tensors ~/.julia/packages/Tensors/T8F9j/src/Numerics.jl:50
 [13] svd(t::Tensor{ComplexF64, 2, Matrix{ComplexF64}})
    @ Tensors ~/.julia/packages/Tensors/T8F9j/src/Numerics.jl:36
 [14] top-level scope
    @ REPL[5]:1
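
For context, frame [11] of the trace (`prod(f, a::Tuple{})`) is enough to reproduce the failure on its own: with `left_inds` defaulting to an empty tuple, the size computation inside `svd` ends up asking `prod` to reduce over zero indices, which Base rejects. A minimal sketch of that failure mode, independent of Tensors.jl:

```julia
# With no `left_inds`, the reduction runs over an empty tuple; Base
# cannot reduce an empty collection through an arbitrary mapping
# function (it has no identity element to fall back on) and throws,
# matching the error in the trace above.
err = try
    prod(i -> 2i, ())  # same call shape as frame [11]: prod(f, a::Tuple{})
    nothing
catch e
    e
end
println(err !== nothing)  # true: the empty reduction throws
```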
mofeing commented 1 year ago

You have to provide the `left_inds`. The current implementation doesn't work without specifying the indices.

jofrevalles commented 1 year ago

Right now it does not work even when `left_inds` are specified:

julia> using Tensors

julia> using LinearAlgebra

julia> tensor = Tensor(rand(Complex{Float64}, 4, 4), (:l, :r))
4×4 Tensor{ComplexF64, 2, Matrix{ComplexF64}}:
 0.571712+0.0823416im   0.224218+0.731893im  0.280031+0.675686im    0.873721+0.505655im
 0.533186+0.15338im     0.786835+0.137324im   0.74039+0.462276im    0.845743+0.84615im
 0.755815+0.298562im   0.0352349+0.695124im  0.952211+0.00878195im  0.751377+0.107136im
 0.541379+0.124625im    0.809924+0.193885im  0.786122+0.0119519im    0.84552+0.654829im

julia> svd(tensor; left_inds=(labels(tensor)[1],))
ERROR: UndefVarError: uuid4 not defined
Stacktrace:
 [1] svd(t::Tensor{ComplexF64, 2, Matrix{ComplexF64}}; left_inds::Tuple{Symbol}, kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
   @ Tensors ~/.julia/packages/Tensors/T8F9j/src/Numerics.jl:60
 [2] top-level scope
   @ REPL[19]:1

Here it seems we are just missing a `using UUIDs`, and UUIDs is not even listed in the Project.toml of Tensors.jl.
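
If that diagnosis is right, the fix is two-fold: import the stdlib module in `Numerics.jl` and declare UUIDs in Project.toml. A sketch (the `fresh_label` helper is hypothetical, only to illustrate the kind of call that fails at Numerics.jl line 60):

```julia
# UUIDs ships with Julia as a stdlib, but it still needs an explicit
# `using` here and a [deps] entry in the package's Project.toml.
using UUIDs: uuid4

# Hypothetical helper: generate a unique label for the new bond index
# created between the U and V factors of the SVD.
fresh_label() = Symbol(:bond_, uuid4())

println(fresh_label() isa Symbol)  # true
```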

Moreover, I think we should throw an informative error when `left_inds` is not specified, and move it from a keyword argument to a positional argument.
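
A sketch of that proposal (signature assumed, not the current Tensors.jl API): take `left_inds` positionally and fail early with a readable message instead of the opaque empty-reduction error:

```julia
# Proposed guard, sketched as a standalone function: `left_inds` is
# positional, and an empty collection is rejected up front with an
# ArgumentError instead of surfacing a reduction failure from Base.
function svd_with_check(t, left_inds)
    isempty(left_inds) &&
        throw(ArgumentError("`left_inds` must be a non-empty collection of index labels"))
    left_inds  # the actual factorization would go here
end

err = try
    svd_with_check(nothing, ())  # no left indices given
    nothing
catch e
    e
end
println(err isa ArgumentError)  # true
```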