adrhill / SparseConnectivityTracer.jl

Fast operator-overloading Jacobian & Hessian sparsity detection.

Benchmarks with BundleAdjustmentModels.jl #134

Open · amontoison opened this issue 2 weeks ago

amontoison commented 2 weeks ago

@adrhill @gdalle I tried to run some benchmarks on the NLS problems from BundleAdjustmentModels.jl, but I'm running into issues with Hessian sparsity detection: it takes an extremely long time on every problem. This could be a valuable set of tests for you because the problems are quite large. For the largest problems, my computer also crashes during the sparsity pattern detection of some Jacobians.

Note that I replaced norm(r) with sqrt(dot(r, r)) in the code (see this PR) so that version 0.5.x of SparseConnectivityTracer.jl can be used. Guillaume, the Jacobians/Hessians of these problems could also be useful for SparseMatrixColorings.jl.
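
As a toy illustration of that substitution (a hypothetical residual-like function, not the actual BundleAdjustmentModels code):

using LinearAlgebra

# Hypothetical illustration of the norm(r) -> sqrt(dot(r, r)) rewrite.
r = rand(3)
f_old(r) = norm(r)          # reportedly not traceable with SCT v0.5.x
f_new(r) = sqrt(dot(r, r))  # same value, built from traceable operations
@assert f_old(r) ≈ f_new(r)

The benchmark script itself: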

using BundleAdjustmentModels
using ADNLPModels
using NLPModels
using LinearAlgebra, SparseArrays  # `norm` and `sparse` are needed for the Jacobian check below
using Printf

debug = false
check_jacobian = false
df = problems_df()
# df = df[ ( df.nequ .≤ 600000 ) .& ( df.nvar .≤ 100000 ), :]
nproblems = size(df)[1]

@printf("| %21s | %7s | %7s | %22s | %15s |\n", "problem", "nequ", "nvar", "backend", "coord_residual")
for i = 1:nproblems
  name_pb = df[i, :name]

  # In debug mode, only run the smallest problem of the collection
  debug && (i == 1) && (name_pb = "problem-49-7776-pre")
  debug && (i ≠ 1) && continue

  ## BundleAdjustmentModels
  nls = BundleAdjustmentModel(name_pb)
  meta_nls = nls_meta(nls)
  Fx = similar(nls.meta.x0, meta_nls.nequ)
  rows = Vector{Int}(undef, meta_nls.nnzj)
  cols = Vector{Int}(undef, meta_nls.nnzj)
  vals = similar(nls.meta.x0, meta_nls.nnzj)
  x = rand(nls.meta.nvar)  # nls.meta.x0

  # warm-start
  residual(nls, nls.meta.x0)
  residual!(nls, nls.meta.x0, Fx)
  jac_structure_residual!(nls, rows, cols)
  jac_coord_residual!(nls, x, vals)

  # benchmarks
  start_timer = time()
  jac_coord_residual!(nls, x, vals)
  end_timer = time()
  timer_coord_residual = end_timer - start_timer
  @printf("| %21s | %7s | %7s | %22s | %6.5f seconds |\n", name_pb, nls.nls_meta.nequ, nls.meta.nvar, "BundleAdjustmentModels", timer_coord_residual)

  ## ADNLPModels
  function F!(Fx, x)
    residual!(nls, x, Fx)
  end
  nls2 = ADNLSModel!(F!, nls.meta.x0, meta_nls.nequ, nls.meta.lvar, nls.meta.uvar,
                     jacobian_residual_backend = ADNLPModels.SparseADJacobian,
                     jacobian_backend = ADNLPModels.EmptyADbackend,
                     hessian_backend = ADNLPModels.EmptyADbackend,
                     hessian_residual_backend = ADNLPModels.EmptyADbackend)
  meta_nls2 = nls_meta(nls2)
  rows2 = Vector{Int}(undef, meta_nls2.nnzj)
  cols2 = Vector{Int}(undef, meta_nls2.nnzj)
  vals2 = similar(nls2.meta.x0, meta_nls2.nnzj)

  # Warm-start
  jac_structure_residual!(nls2, rows2, cols2)
  jac_coord_residual!(nls2, x, vals2)

  # benchmarks
  start_timer = time()
  jac_coord_residual!(nls2, x, vals2)
  end_timer = time()
  timer2_coord_residual = end_timer - start_timer
  @printf("| %21s | %7s | %7s | %22s | %6.5f seconds |\n", name_pb, nls2.nls_meta.nequ, nls2.meta.nvar, "ADNLPModels", timer2_coord_residual)

  if check_jacobian
    println(meta_nls.nnzj)
    println(meta_nls2.nnzj)
    J_nls = sparse(rows, cols, vals, meta_nls.nequ, nls.meta.nvar)  # explicit dimensions so both matrices have the same size
    J_nls2 = sparse(rows2, cols2, vals2, meta_nls2.nequ, nls2.meta.nvar)
    norm(J_nls - J_nls2) |> println
    norm(J_nls - J_nls2, Inf) |> println
  end
end

Note that you need hessian_residual_backend = ADNLPModels.SparseADHessian to test the Hessian.
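
For reference, that configuration would presumably look like the following, reusing the same arguments as the constructor call in the script (an untested sketch):

# Sketch: same ADNLSModel! call as in the script, with a sparse residual-Hessian backend
# so that the Hessian sparsity detection can be benchmarked as well.
nls2 = ADNLSModel!(F!, nls.meta.x0, meta_nls.nequ, nls.meta.lvar, nls.meta.uvar,
                   jacobian_residual_backend = ADNLPModels.SparseADJacobian,
                   hessian_residual_backend = ADNLPModels.SparseADHessian,
                   jacobian_backend = ADNLPModels.EmptyADbackend,
                   hessian_backend = ADNLPModels.EmptyADbackend)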

adrhill commented 2 weeks ago

I'll investigate your benchmarks as soon as possible, but I just want to quickly comment on the following:

For the largest problems, my computer crashes during the sparsity pattern detection of the Jacobian.

We've encountered some compile-time problems in SCT v0.5.x, which we've since fixed on the main branch (more specifically in https://github.com/adrhill/SparseConnectivityTracer.jl/issues/120#issuecomment-2168186240 and in #119). We're probably going to tag v0.6.0 as soon as we merge #131.

adrhill commented 2 weeks ago

This could be a valuable set of tests for you because the problems are quite large.

Absolutely! We're actively looking for more large-scale problems to test and benchmark on!
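
If it helps to isolate the detection step from the ADNLPModels machinery, here is a rough, untested sketch of timing it directly through the ADTypes sparsity-detector interface, reusing nls, meta_nls, and F! from the script above; the weight vector v and the scalarized objective obj are hypothetical stand-ins:

using LinearAlgebra: dot
using ADTypes: jacobian_sparsity, hessian_sparsity
using SparseConnectivityTracer: TracerSparsityDetector

# Time only the sparsity detection of the residual Jacobian.
detector = TracerSparsityDetector()
Fx = similar(nls.meta.x0, meta_nls.nequ)
@time jacobian_sparsity(F!, Fx, nls.meta.x0, detector)

# For a Hessian pattern, trace a scalarized residual (a weighted sum, as a rough
# stand-in for what a residual-Hessian backend needs); `v` is a hypothetical weight vector.
v = ones(meta_nls.nequ)
obj(x) = dot(v, residual(nls, x))
@time hessian_sparsity(obj, nls.meta.x0, detector)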

amontoison commented 2 weeks ago

I will open a separate discussion/issue, but it should also be quite easy to benchmark the OPF problems of rosetta-opf (https://github.com/lanl-ansi/rosetta-opf/pull/65). The problems are quite large too, and you can also verify the sparsity pattern with JuMP.

gdalle commented 2 weeks ago

We've encountered some compile time problems in SCT v0.5.X which we've since fixed on the main branch

Note that this was mainly due to the britgas problem formulation being a literal compiler nightmare, with 4,000 lines of unrolled for loops. I think equally large problems with a more realistic formulation (using actual for loops or vectorized code) would be much less of an issue. That's why I started https://github.com/JuliaSmoothOptimizers/OptimizationProblems.jl/pull/332, but I shudder at the idea of doing this for every single problem.
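
As a toy illustration of the contrast (hypothetical, not the actual britgas formulation):

# Unrolled, "flat" formulation: every term is a separate expression the compiler must process;
# imagine thousands of such lines.
f_unrolled(x) = (x[1] - x[2])^2 + (x[2] - x[3])^2 + (x[3] - x[4])^2

# Loop-based formulation: one small body to compile, independent of problem size.
f_loop(x) = sum((x[i] - x[i + 1])^2 for i in 1:(length(x) - 1))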