ExpHP / rsp2

phonons in rust
Apache License 2.0

thread 'main' panicked at 'no lines!?' #66

Closed · colin-daniels closed this issue 5 years ago

colin-daniels commented 6 years ago

Inputs.

```
[ 107.108s][TRACE] Computing most negative eigensolutions.
[ 108.958s][WARN] trace: precomputing OPinv for shift-invert
[ 109.016s][WARN] /usr/lib/python3.7/site-packages/scipy/sparse/linalg/dsolve/linsolve.py:295: SparseEfficiencyWarning: splu requires CSC matrix format
[ 109.017s][WARN]   warn('splu requires CSC matrix format', SparseEfficiencyWarning)
[ 109.017s][WARN] trace: shift-invert call 1
[ 114.213s][WARN] trace: shift-invert call 2
[ 120.706s][WARN] trace: shift-invert call 3
[ 125.300s][WARN] trace: shift-invert call 4
[ 129.433s][WARN]  Good -- Bad (Old Wrong OrthoFail OrthoBad)
[ 129.433s][WARN]   3   --  0  ( 0    0       0        0    )
[ 129.433s][WARN]   0   --  3  ( 3    0       0        0    )
[ 129.433s][WARN]   0   --  3  ( 3    0       0        0    )
[ 129.433s][WARN]   0   --  3  ( 3    0       0        0    )
[ 129.435s][WARN] trace: trying non-shift-invert
[ 650.603s][TRACE] Done diagonalizing dynamical matrix
[ 650.605s][TRACE] ============================
[ 650.605s][TRACE] Finished diagonalization
[ 650.667s][TRACE] Computing eigensystem info
[ 650.668s][TRACE] computing EvAcousticness
[ 650.668s][TRACE] computing EvPolarization
[ 650.668s][TRACE] not computing EvLayerAcousticness due to missing requirement SiteLayers
[ 650.668s][TRACE] not computing UnfoldProbs due to missing requirement SiteLayers
[ 650.668s][TRACE] computing EvRamanTensors
thread 'main' panicked at 'no lines!?', libcore/option.rs:960:5
stack backtrace:
   0: std::sys::unix::backtrace::tracing::imp::unwind_backtrace
             at libstd/sys/unix/backtrace/tracing/gcc_s.rs:49
   1: std::sys_common::backtrace::print
             at libstd/sys_common/backtrace.rs:71
             at libstd/sys_common/backtrace.rs:59
   2: std::panicking::default_hook::{{closure}}
             at libstd/panicking.rs:211
   3: std::panicking::default_hook
             at libstd/panicking.rs:227
   4: std::panicking::rust_panic_with_hook
             at libstd/panicking.rs:511
   5: std::panicking::continue_panic_fmt
             at libstd/panicking.rs:426
   6: rust_begin_unwind
             at libstd/panicking.rs:337
   7: core::panicking::panic_fmt
             at libcore/panicking.rs:92
   8: core::option::expect_failed
             at libcore/option.rs:960
   9: rsp2_tasks::cmd::ev_analyses::columns::aligned_dot_column
  10: rsp2_tasks::cmd::ev_analyses::<impl rsp2_tasks::cmd::ev_analyses::gamma_system_analysis::GammaSystemAnalysis>::make_columns
  11: rsp2_tasks::cmd::write_eigen_info_for_machines
  12: rsp2_tasks::cmd::<impl rsp2_tasks::cmd::trial::TrialDir>::run_relax_with_eigenvectors
  13: <rsp2_lammps_wrap::low_level::mpi_helper::MpiOnDemand<D>>::install
  14: rsp2_tasks::entry_points::wrap_main_with_lammps_on_demand
  15: rsp2_tasks::entry_points::wrap_main
  16: rsp2_tasks::entry_points::_rsp2_acgsd
  17: rsp2_tasks::entry_points::rsp2
  18: std::rt::lang_start::{{closure}}
  19: std::panicking::try::do_call
             at libstd/rt.rs:59
             at libstd/panicking.rs:310
  20: __rust_maybe_catch_panic
             at libpanic_unwind/lib.rs:105
  21: std::rt::lang_start_internal
             at libstd/panicking.rs:289
             at libstd/panic.rs:392
             at libstd/rt.rs:58
  22: main
  23: __libc_start_main
  24: _start
[ 650.919s][INFO] successfully leaked tempdir at /tmp/rsp2-.IxFgIk1TLsFv
RUST_BACKTRACE=1 OMP_NUM_THREADS=1 rsp2 -c input.yaml -o rsp2-test  --force  2156.95s user 11.13s system 333% cpu 10:51.03 total
```
colin-daniels commented 6 years ago

Input that fails faster (8 seconds).

colin-daniels commented 6 years ago

Maybe this one https://gist.github.com/db7a5ffbf1d9cb9d2f19cd3a8af3068f

Edit: All of them relaxed.tar.gz

ExpHP commented 6 years ago

Note to self: Do not close without replacing this error message with something clearer.

(It happens when the Python script returns no eigenvectors.)
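A minimal sketch (all names here are hypothetical, not rsp2's actual code) of the kind of guard that could replace it on the Python side, assuming eigsh is called roughly as in the snippets later in this thread:

```
# Hypothetical guard: fail with a clear message when ARPACK converges to
# zero eigenpairs, instead of handing rsp2 an empty eigensystem and letting
# the Rust side panic with 'no lines!?'.
from scipy.sparse.linalg import eigsh, ArpackNoConvergence

def diagonalize_or_die(m, k, **kw):
    try:
        vals, vecs = eigsh(m, k=k, **kw)
    except ArpackNoConvergence as e:
        vals, vecs = e.eigenvalues, e.eigenvectors
    if len(vals) == 0:
        raise RuntimeError(
            "eigsh(k=%d) converged to zero eigenpairs; "
            "refusing to return an empty eigensystem" % k
        )
    return vals, vecs
```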

ExpHP commented 6 years ago

On the second input, k=12 is super slow, while k=300 completes almost instantly. Looks like perhaps all we need is more Lanczos vectors for speedier convergence of the non-shift-invert solver.

Edit: Nope, that's not it. Something else must have been running in the background without my realizing it. (Besides, now that I think about it, BOTH the shift-invert and non-shift-invert calls sped up, but I only changed the non-shift-invert one.)

ExpHP commented 6 years ago

This is bizarre.

The number of Lanczos vectors is irrelevant. Simply requesting more solutions apparently has an impact on convergence speed.

```
from scipy.sparse.linalg import eigsh, ArpackNoConvergence

def num_sols(*args, **kw):
    # Count the eigenpairs found, including partial results on non-convergence.
    try:
        return len(eigsh(*args, **kw)[0])
    except ArpackNoConvergence as e:
        return len(e.eigenvalues)

m = ...  # load gamma-dynmat from a failed run of 3agnr_l21

# Consistently finishes quickly and prints 300
print(num_sols(m, which='SA', k=300, ncv=601))

# Consistently takes several minutes and prints 12
print(num_sols(m, which='SA', k=12, ncv=601))
```

I looked through scipy's code, and k doesn't affect anything else besides the NEV and NCV arguments to ARPACK. Whatever is going on here must be happening somewhere inside ARPACK.
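(An aside: SciPy's documented default when ncv is omitted is min(n, max(2*k + 1, 20)), which for k=300 comes out to exactly 601, presumably why that value was pinned in both calls above. A quick check of that mapping, with an assumed matrix size n:)

```
# SciPy's documented default Lanczos basis size for eigsh when ncv is omitted.
# Holding ncv fixed at 601 in both calls above means NEV (i.e. k) is the only
# ARPACK input that differs between the fast run and the slow run.
def default_ncv(n, k):
    return min(n, max(2 * k + 1, 20))

assert default_ncv(n=2000, k=300) == 601  # the pinned value is k=300's default
assert default_ncv(n=2000, k=12) == 25    # what k=12 would get by default
```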

ExpHP commented 6 years ago

Timings of running sparse diagonalization on that input for various values of k, averaged over roughly 20 runs for each k.

A few distinct regimes are visible in the graph of runtime vs. k.

I want to obtain timings for various sizes of matrices, as well as for matrices with strongly imaginary modes vs matrices without imaginary modes, but currently do not have saved matrices of reasonable size.
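For reference, a sketch of the kind of loop that could produce such timings; this is not the actual script used, and m again stands for the saved gamma-dynmat:

```
# Sketch of a timing sweep over k, averaging several runs per value.
# Output columns match the table below: k, fraction found, mean time (s).
import time
from statistics import mean
from scipy.sparse.linalg import eigsh, ArpackNoConvergence

def bench(m, k, repeats=20):
    found, times = 0, []
    for _ in range(repeats):
        t0 = time.perf_counter()
        try:
            found = len(eigsh(m, which='SA', k=k)[0])
        except ArpackNoConvergence as e:
            found = len(e.eigenvalues)
        times.append(time.perf_counter() - t0)
    return found / k, mean(times)

m = ...  # load gamma-dynmat from the failed 3agnr_l21 run
for k in range(1, 636):
    frac, t = bench(m, k)
    print(k, "%.2f" % frac, "%.6f" % t)
```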

3agnr_l21 timings:

```
# 3agnr_l21
# Columns:
# * no. eigenvectors requested
# * no. eigenvectors found / no. requested
# * time (s)
1 0.00 0.508470 2 0.00 1.450328 3 0.00 1.938898 4 0.00 2.480602 5 0.00 2.986075 6 0.00 3.700691 7 0.00 4.492003 8 0.00 4.983302 9 0.00 5.640850 10 0.00 6.086328 11 0.00 6.735389 12 0.00 7.505434 13 0.00 8.543090 14 0.00 9.172353 15 1.00 8.667335 16 1.00 5.767140 17 1.00 4.019376 18 1.00 3.343360 19 1.00 2.468636 20 1.00 2.047024 21 1.00 1.686071 22 1.00 1.491720 23 1.00 1.379469 24 1.00 1.189349 25 1.00 1.101559 26 1.00 0.990862 27 1.00 0.944390 28 1.00 0.856163 29 1.00 0.746313 30 1.00 0.692268 31 1.00 0.682501 32 1.00 0.675753 33 1.00 0.632583 34 1.00 0.628588 35 1.00 0.621367 36 1.00 0.600776 37 1.00 0.600099 38 1.00 0.626857 39 1.00 0.598621 40 1.00 0.559405 41 1.00 0.545271 42 1.00 0.529965 43 1.00 0.536049 44 1.00 0.520075 45 1.00 0.514466 46 1.00 0.499824 47 1.00 0.495433 48 1.00 0.486652 49 1.00 0.467096 50 1.00 0.459052 51 1.00 0.460269 52 1.00 0.463918 53 1.00 0.462300 54 1.00 0.441236 55 1.00 0.436475 56 1.00 0.420033 57 1.00 0.409534 58 1.00 0.402257 59 1.00 0.388610 60 1.00 0.396931 61 1.00 0.384896 62 1.00 0.395254 63 1.00 0.402904 64 1.00 0.391312 65 1.00 0.395205 66 1.00 0.392441 67 1.00 0.405117 68 1.00 0.399002 69 1.00 0.398986 70 1.00 0.410946 71 1.00 0.419945 72 1.00 0.411955 73 1.00 0.422747 74 1.00 0.413774 75 1.00 0.430649 76 1.00 0.422488 77 1.00 0.431646 78 1.00 0.428288 79 1.00 0.432388 80 1.00 0.435259 81 1.00 0.432559 82 1.00 0.422890 83 1.00 0.421938 84 1.00 0.419406 85 1.00 0.424936 86 1.00 0.422897 87 1.00 0.419456 88 1.00 0.428291 89 1.00 0.423150 90 1.00 0.433043 91 1.00 0.424550 92 1.00 0.434614 93 1.00 0.439345 94 1.00 0.440125 95 1.00 0.442960 96 1.00 0.436559 97 1.00 0.439090 98 1.00 0.427608 99 1.00 0.443772 100 1.00 0.447756 101 1.00 0.441866 102 1.00 0.454506 103 1.00 0.444758 104 1.00 0.440793 105 1.00 0.444915 106 1.00 0.455417 107 1.00 0.464816 108 1.00 0.469926 109 1.00 0.457558 110 1.00 0.453348 111 1.00 0.456367 112 1.00 0.473747 113 1.00 0.487301 114 1.00 0.494375 115 1.00 0.497556 116 1.00 0.482784 117 1.00 0.498102 118 1.00 0.521863 119 1.00 0.504113 120 1.00 0.496892 121 1.00 0.500914 122 1.00 0.491063 123 1.00 0.490673 124 1.00 0.491226 125 1.00 0.509790 126 1.00 0.538299 127 1.00 0.495930 128 1.00 0.510692 129 1.00 0.519220 130 1.00 0.536277 131 1.00 0.528729 132 1.00 0.500255 133 1.00 0.507408 134 1.00 0.522301 135 1.00 0.528601 136 1.00 0.524753 137 1.00 0.504401 138 1.00 0.516691 139 1.00 0.522262 140 1.00 0.516131 141 1.00 0.534864 142 1.00 0.541703 143 1.00 0.543003 144 1.00 0.510810 145 1.00 0.516352 146 1.00 0.512349 147 1.00 0.524105 148 1.00 0.525820 149 1.00 0.541304 150 1.00 0.534895 151 1.00 0.532072 152 1.00 0.532532 153 1.00 0.522502 154 1.00 0.532328 155 1.00 0.537206 156 1.00 0.545803 157 1.00 0.587747 158 1.00 0.577089 159 1.00 0.596378 160 1.00 0.597768 161 1.00 0.598622 162 1.00 0.567680 163 1.00 0.565364 164 1.00 0.566146 165 1.00 0.573962 166 1.00 0.579884 167 1.00 0.575160 168 1.00 0.583519 169 1.00 0.586764 170 1.00 0.600571 171 1.00 0.611526 172 1.00 0.610073 173 1.00 0.611243 174 1.00 0.618160 175 1.00 0.613955 176 1.00 0.612335 177 1.00 0.610860 178 1.00 0.589767 179 1.00 0.586439 180 1.00 0.601889 181 1.00 0.638371 182 1.00 0.620934 183 1.00 0.666928 184 1.00 0.637884 185 1.00 0.639185 186 1.00 0.629152 187 1.00 0.619526 188 1.00 0.622072 189 1.00 0.660682 190 1.00 0.651647 191 1.00 0.642471 192 1.00 0.678454 193 1.00 0.659349 194 1.00 0.657050 195 1.00 0.669928 196 1.00 0.679502 197
1.00 0.685333 198 1.00 0.681107 199 1.00 0.698861 200 1.00 0.688828 201 1.00 0.699798 202 1.00 0.702877 203 1.00 0.709411 204 1.00 0.720194 205 1.00 0.734059 206 1.00 0.715676 207 1.00 0.675354 208 1.00 0.680779 209 1.00 0.681283 210 1.00 0.694629 211 1.00 0.695675 212 1.00 0.707893 213 1.00 0.712782 214 1.00 0.729757 215 1.00 0.741813 216 1.00 0.731052 217 1.00 0.731956 218 1.00 0.744592 219 1.00 0.746771 220 1.00 0.758330 221 1.00 0.762050 222 1.00 0.765184 223 1.00 0.779592 224 1.00 0.784375 225 1.00 0.782222 226 1.00 0.802617 227 1.00 0.794409 228 1.00 0.809080 229 1.00 0.831475 230 1.00 0.836995 231 1.00 0.786129 232 1.00 0.759648 233 1.00 0.741414 234 1.00 0.755524 235 1.00 0.737747 236 1.00 0.749774 237 1.00 0.788822 238 1.00 0.788365 239 1.00 0.804466 240 1.00 0.818607 241 1.00 0.801469 242 1.00 0.822522 243 1.00 0.821691 244 1.00 0.835522 245 1.00 0.831959 246 1.00 0.869827 247 1.00 0.816057 248 1.00 0.835737 249 1.00 0.846116 250 1.00 0.860309 251 1.00 0.852374 252 1.00 0.877507 253 1.00 0.870664 254 1.00 0.885314 255 1.00 0.874863 256 1.00 0.892787 257 1.00 0.897094 258 1.00 0.886494 259 1.00 0.897234 260 1.00 0.920457 261 1.00 0.910824 262 1.00 0.914767 263 1.00 0.910253 264 1.00 0.907794 265 1.00 0.877059 266 1.00 0.863767 267 1.00 0.882984 268 1.00 0.905266 269 1.00 0.902407 270 1.00 0.901783 271 1.00 0.874206 272 1.00 0.865620 273 1.00 0.864321 274 1.00 0.863102 275 1.00 0.865507 276 1.00 0.902334 277 1.00 0.877386 278 1.00 1.029352 279 1.00 0.964619 280 1.00 0.993774 281 1.00 0.942291 282 1.00 1.079518 283 1.00 1.084767 284 1.00 1.162321 285 1.00 1.184474 286 1.00 1.143590 287 1.00 1.206003 288 1.00 1.270785 289 1.00 1.367611 290 1.00 1.392884 291 1.00 1.349830 292 1.00 1.286284 293 1.00 1.267896 294 1.00 1.151766 295 1.00 1.075837 296 1.00 1.096500 297 1.00 1.113085 298 1.00 1.188989 299 1.00 1.121974 300 1.00 1.124875 301 1.00 1.129642 302 1.00 1.149320 303 1.00 1.141788 304 1.00 1.183679 305 1.00 1.129121 306 1.00 0.998357 307 1.00 0.976831 308 1.00 0.992951 309 1.00 1.067141 310 1.00 1.115746 311 1.00 1.150469 312 1.00 1.139332 313 1.00 1.023541 314 1.00 1.150525 315 1.00 1.139836 316 1.00 1.189507 317 1.00 1.054796 318 1.00 0.598890 319 1.00 0.610406 320 1.00 0.612698 321 1.00 0.641602 322 1.00 0.661895 323 1.00 0.659504 324 1.00 0.640518 325 1.00 0.621664 326 1.00 0.649522 327 1.00 0.693734 328 1.00 0.659872 329 1.00 0.672440 330 1.00 0.679015 331 1.00 0.712346 332 1.00 0.737574 333 1.00 0.622837 334 1.00 0.631348 335 1.00 0.658604 336 1.00 0.663732 337 1.00 0.676168 338 1.00 0.678671 339 1.00 0.670156 340 1.00 0.692737 341 1.00 0.688229 342 1.00 0.649007 343 1.00 0.663832 344 1.00 0.697337 345 1.00 0.686782 346 1.00 0.769728 347 1.00 0.685664 348 1.00 0.715858 349 1.00 0.685267 350 1.00 0.701120 351 1.00 0.712373 352 1.00 0.665757 353 1.00 0.679301 354 1.00 0.694316 355 1.00 0.671483 356 1.00 0.668677 357 1.00 0.690877 358 1.00 0.696175 359 1.00 0.697532 360 1.00 0.637185 361 1.00 0.621080 362 1.00 0.645921 363 1.00 0.682709 364 1.00 0.685657 365 1.00 0.707653 366 1.00 0.645409 367 1.00 0.620084 368 1.00 0.610969 369 1.00 0.614380 370 1.00 0.611745 371 1.00 0.622861 372 1.00 0.625353 373 1.00 0.620407 374 1.00 0.644850 375 1.00 0.702063 376 1.00 0.669586 377 1.00 0.636302 378 1.00 0.641637 379 1.00 0.648007 380 1.00 0.624829 381 1.00 0.613650 382 1.00 0.633625 383 1.00 0.625740 384 1.00 0.656011 385 1.00 0.616761 386 1.00 0.621714 387 1.00 0.620665 388 1.00 0.623385 389 1.00 0.618192 390 1.00 0.628618 391 1.00 0.624198 392 1.00 0.627515 393 1.00 0.684002 394 1.00 
0.712924 395 1.00 0.741204 396 1.00 0.654706 397 1.00 0.625561 398 1.00 0.630889 399 1.00 0.616149 400 1.00 0.625014 401 1.00 0.622146 402 1.00 0.616681 403 1.00 0.614843 404 1.00 0.627946 405 1.00 0.631969 406 1.00 0.612574 407 1.00 0.615771 408 1.00 0.619990 409 1.00 0.618745 410 1.00 0.632529 411 1.00 0.612593 412 1.00 0.624185 413 1.00 0.622300 414 1.00 0.617286 415 1.00 0.620416 416 1.00 0.623980 417 1.00 0.636133 418 1.00 0.653614 419 1.00 0.653308 420 1.00 0.620432 421 1.00 0.628057 422 1.00 0.631663 423 1.00 0.617156 424 1.00 0.621835 425 1.00 0.629240 426 1.00 0.620595 427 1.00 0.618655 428 1.00 0.615291 429 1.00 0.630203 430 1.00 0.615210 431 1.00 0.622408 432 1.00 0.629572 433 1.00 0.631359 434 1.00 0.631717 435 1.00 0.625298 436 1.00 0.688802 437 1.00 0.690375 438 1.00 0.642644 439 1.00 0.646159 440 1.00 0.640462 441 1.00 0.632286 442 1.00 0.635075 443 1.00 0.649067 444 1.00 0.661964 445 1.00 0.678760 446 1.00 0.638623 447 1.00 0.656529 448 1.00 0.636113 449 1.00 0.628960 450 1.00 0.620112 451 1.00 0.628912 452 1.00 0.626414 453 1.00 0.653945 454 1.00 0.676674 455 1.00 0.680968 456 1.00 0.663176 457 1.00 0.643536 458 1.00 0.628602 459 1.00 0.625165 460 1.00 0.630292 461 1.00 0.637163 462 1.00 0.631706 463 1.00 0.632792 464 1.00 0.637007 465 1.00 0.635475 466 1.00 0.625817 467 1.00 0.628093 468 1.00 0.631001 469 1.00 0.626060 470 1.00 0.674425 471 1.00 0.657378 472 1.00 0.645039 473 1.00 0.650800 474 1.00 0.645081 475 1.00 0.645534 476 1.00 0.636350 477 1.00 0.658454 478 1.00 0.651068 479 1.00 0.652989 480 1.00 0.654975 481 1.00 0.734638 482 1.00 0.758914 483 1.00 0.651910 484 1.00 0.654566 485 1.00 0.637436 486 1.00 0.651450 487 1.00 0.675648 488 1.00 0.645680 489 1.00 0.628518 490 1.00 0.637230 491 1.00 0.633052 492 1.00 0.653308 493 1.00 0.698085 494 1.00 0.724575 495 1.00 0.648488 496 1.00 0.663880 497 1.00 0.638852 498 1.00 0.637241 499 1.00 0.629814 500 1.00 0.633189 501 1.00 0.630859 502 1.00 0.633738 503 1.00 0.637160 504 1.00 0.637589 505 1.00 0.632903 506 1.00 0.627763 507 1.00 0.633440 508 1.00 0.643728 509 1.00 0.647543 510 1.00 0.643896 511 1.00 0.652696 512 1.00 0.642827 513 1.00 0.644082 514 1.00 0.652944 515 1.00 0.646232 516 1.00 0.657250 517 1.00 0.646171 518 1.00 0.648547 519 1.00 0.649220 520 1.00 0.636942 521 1.00 0.651301 522 1.00 0.641496 523 1.00 0.691845 524 1.00 0.719307 525 1.00 0.667723 526 1.00 0.659338 527 1.00 0.660244 528 1.00 0.663609 529 1.00 0.686166 530 1.00 0.665551 531 1.00 0.680144 532 1.00 0.670778 533 1.00 0.657130 534 1.00 0.649768 535 1.00 0.653953 536 1.00 0.662183 537 1.00 0.668591 538 1.00 0.660092 539 1.00 0.674322 540 1.00 0.659235 541 1.00 0.676062 542 1.00 0.681861 543 1.00 0.703293 544 1.00 0.670666 545 1.00 0.691259 546 1.00 0.686636 547 1.00 0.644190 548 1.00 0.645667 549 1.00 0.649858 550 1.00 0.650058 551 1.00 0.640514 552 1.00 0.640443 553 1.00 0.643668 554 1.00 0.651362 555 1.00 0.645869 556 1.00 0.648513 557 1.00 0.647370 558 1.00 0.643819 559 1.00 0.649859 560 1.00 0.651278 561 1.00 0.640410 562 1.00 0.643152 563 1.00 0.648805 564 1.00 0.646693 565 1.00 0.647399 566 1.00 0.649174 567 1.00 0.648932 568 1.00 0.651688 569 1.00 0.651131 570 1.00 0.652599 571 1.00 0.673396 572 1.00 0.728943 573 1.00 0.664728 574 1.00 0.682180 575 1.00 0.645206 576 1.00 0.683558 577 1.00 0.651920 578 1.00 0.652008 579 1.00 0.648602 580 1.00 0.652925 581 1.00 0.649950 582 1.00 0.646645 583 1.00 0.647921 584 1.00 0.661911 585 1.00 0.670066 586 1.00 0.655822 587 1.00 0.653252 588 1.00 0.642466 589 1.00 0.703847 590 1.00 0.681328 591 1.00 0.652621 
592 1.00 0.716323 593 1.00 0.686055 594 1.00 0.661412 595 1.00 0.667054 596 1.00 0.694408 597 1.00 0.670360 598 1.00 0.668796 599 1.00 0.654595 600 1.00 0.673930 601 1.00 0.689180 602 1.00 0.693758 603 1.00 0.678386 604 1.00 0.682355 605 1.00 0.667450 606 1.00 0.685868 607 1.00 0.728325 608 1.00 0.682105 609 1.00 0.675946 610 1.00 0.658253 611 1.00 0.655040 612 1.00 0.668289 613 1.00 0.665720 614 1.00 0.672703 615 1.00 0.666635 616 1.00 0.708055 617 1.00 0.677693 618 1.00 0.665741 619 1.00 0.651918 620 1.00 0.652275 621 1.00 0.658659 622 1.00 0.671620 623 1.00 0.677316 624 1.00 0.672573 625 1.00 0.683747 626 1.00 0.683572 627 1.00 0.711564 628 1.00 0.704491 629 1.00 0.703665 630 1.00 0.687402 631 1.00 0.702519 632 1.00 0.693307 633 1.00 0.682351 634 1.00 0.692373 635 1.00 0.700110
```
ExpHP commented 6 years ago

A couple of weeks ago, I tried to improve the error by throwing an error on the Python end, but I just realized two things today:

ExpHP commented 5 years ago

The `EOF while parsing` error was trivial to fix (b9c4f9d0b4a4b908585f83a81df578e82b8dcc15). I can't believe I left the issue open this long.

I'll make a new issue about the performance stuff.