NilsNiggemann / PMFRG.jl


Fails with N<=7 #13

Closed mmesiti closed 10 months ago

mmesiti commented 10 months ago

The following script fails:

using SpinFRGLattices, PMFRG
using SpinFRGLattices.SquareLattice

NLen = 4 # Number of nearest neighbor bonds up to which correlations are treated in the lattice. For NLen = 4, all correlations C_{ij} are zero if sites i and j are separated by more than 4 nearest neighbor bonds.
J1 = 1
J2 = 0.1
couplings = [J1, J2] # Vector of couplings: nearest neighbor coupling is J1, next-nearest is J2; all further couplings are zero. For finite further couplings, simply provide a longer array, i.e. [J1, J2, J3, ...]

System = getSquareLattice(NLen, couplings) # create a structure that contains all information about the geometry of the problem. 

Par = Params( #create a group of all parameters to pass them to the FRG Solver
    System, # geometry, this is always required
    OneLoop(), # method. OneLoop() is the default
    T = 0.5, # Temperature for the simulation.
    N = 7, # Number of positive Matsubara frequencies for the four-point vertex. Any value <= 7 reproduces the failure below.
    accuracy = 1e-3, #absolute and relative tolerance of the ODE solver.
    # For further optional arguments, see documentation of 'NumericalParams'
)

mainFile = "temp/" * PMFRG.generateFileName(Par, "_testFile") # specify a file name for main Output
flowpath = "temp/flows/" # specify path for vertex checkpoints

Solution, saved_values = SolveFRG(
    Par,
    MainFile = mainFile,
    CheckpointDirectory = flowpath,
    method = DP5(),
    VertexCheckpoints = [],
    CheckPointSteps = 3,
)
mmesiti commented 10 months ago

For N = 7, the output is:

One Loop: T= 0.500000000000000
Checkpoints saved at /home/michele/PMFRG/temp/flows/SquareLattice_NLen=4_N=7_T=0.500000000000000(v_3)
Starting solve
Time taken for output saving: 

T= 0.5 at t step: 3.253, Λ = exp(t) = 25.878    chi_1 = 0.0 chi_2 = -0.0     f_int = (-0.0,)
Max Va(2, 2, 7, 1) = 0.003741325540777
Max Vb(1, 6, 7, 1) = 0.007563084444919
Max Vc(2, 7, 1, 6) = -1.002387720929764
ERROR: LoadError: BoundsError: attempt to access 7-element Vector{Int64} at index [0]
Stacktrace:
  [1] getindex(A::Vector{Int64}, i1::Int64)
    @ Base ./essentials.jl:13
  [2] (::PMFRG.var"#givefreqs#100"{Vector{Int64}, Int64})()
    @ PMFRG ~/PMFRG/PMFRG.jl/src/Solver.jl:249
  [3] writeOutput(f_int::Vector{Float64}, gamma::Matrix{Float64}, Va::Array{Float64, 4}, Vb::Array{Float64, 4}, Vc::Array{Float64, 4}, obs::PMFRG.Observables{Float64}, Lam::Float64, Par::OneLoopParams{Float64, Geometry{Rvec_2D, StructArrays.StructArray{sumElements, 2, NamedTuple{(:ki, :kj, :m, :xk), NTuple{4, Matrix{Int64}}}, Int64}, Float64}})
    @ PMFRG ~/PMFRG/PMFRG.jl/src/Solver.jl:263
  [4] writeOutput(State::RecursiveArrayTools.ArrayPartition{Float64, Tuple{Vector{Float64}, Matrix{Float64}, Array{Float64, 4}, Array{Float64, 4}, Array{Float64, 4}}}, saved_values::DiffEqCallbacks.SavedValues{Float64, PMFRG.Observables}, Lam::Float64, Par::OneLoopParams{Float64, Geometry{Rvec_2D, StructArrays.StructArray{sumElements, 2, NamedTuple{(:ki, :kj, :m, :xk), NTuple{4, Matrix{Int64}}}, Int64}, Float64}})
    @ PMFRG ~/PMFRG/PMFRG.jl/src/Solver.jl:219
[...]
mmesiti commented 10 months ago

For various N, the relevant error line only:

N=7  ERROR: LoadError: BoundsError: attempt to access 6-element Vector{Int64} at index [0]
N=6  ERROR: LoadError: BoundsError: attempt to access 6-element Vector{Int64} at index [0]
N=5  ERROR: LoadError: BoundsError: attempt to access 6-element Vector{Int64} at index [-1]
N=4  ERROR: LoadError: BoundsError: attempt to access 6-element Vector{Int64} at index [-1]
N=3  ERROR: LoadError: BoundsError: attempt to access 6-element Vector{Int64} at index [-2]
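
Across these runs the failing index matches N ÷ 2 - 3 (giving 0, 0, -1, -1, -2 for N = 7 down to 3), which suggests the frequency-picking logic underflows for small N. Below is a minimal sketch of that failure mode; the exact expressions in Solver.jl may differ, and givefreqs_sketch, np_vec, and the index formula here are assumptions for illustration only:

# Sketch of the suspected failure mode: picking a "representative" Matsubara
# frequency index with crude integer arithmetic that underflows for small N.
function givefreqs_sketch(N::Int)
    np_vec = collect(1:N) # stand-in for the solver's frequency index vector
    f = N ÷ 2 - 3         # becomes <= 0 for N <= 7 (assumed formula, inferred from the errors above)
    return np_vec[f]      # BoundsError: index 0 for N = 7, -1 for N = 5, ...
end

givefreqs_sketch(8) # works: index 1
givefreqs_sketch(7) # BoundsError: attempt to access 7-element Vector at index [0]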
mmesiti commented 10 months ago

This is likely due to https://github.com/NilsNiggemann/PMFRG.jl/blob/master/src/Solver.jl#L237-L239

NilsNiggemann commented 10 months ago

Yes, that line is the culprit. It is a bit sloppy and essentially assumes that the number of frequencies is not too small. I thought I had replaced that code already, but it seems it's still there. It should be an easy fix, though: we just add a default fallback for when the crude logic would otherwise return negative indices (it's only used for some output printing, so it's probably not worth overengineering). I can submit a pull request with a fix soon.
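
A minimal sketch of such a fallback, assuming the printing code looks up entries in a frequency index vector like np_vec (the names safefreqindex and np_vec are hypothetical; the actual patch may differ):

# Hypothetical fallback: clamp the crude index guess into the valid range,
# so small N reuses the first (or last) available frequency instead of erroring.
safefreqindex(f::Int, np_vec::AbstractVector) =
    np_vec[clamp(f, firstindex(np_vec), lastindex(np_vec))]

np_vec = collect(1:7)
safefreqindex(0, np_vec)  # returns np_vec[1] instead of throwing a BoundsError
safefreqindex(-2, np_vec) # likewise clamped to the first entry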