msprev / fzf-bibtex

a BibTeX source for fzf
BSD 3-Clause "New" or "Revised" License

Parsing error when large author field #11

Closed: dloewenstein closed this issue 5 years ago

dloewenstein commented 5 years ago

Hi,

I received the following error, as previously reported in #3, but this time it seems to be caused by a very large author field in the BibTeX entry.

panic: runtime error: index out of range

goroutine 1 [running]:
github.com/msprev/fzf-bibtex/bibtex.parseEntry(0xc42001d002, 0x7e6, 0xc42001d002)
    /home/dloewe/go/src/github.com/msprev/fzf-bibtex/bibtex/bibtex.go:33 +0x440
github.com/msprev/fzf-bibtex/bibtex.Parse(0xc4200add60, 0xc42005e450, 0x1, 0x1, 0x5179f0, 0x517a20)
    /home/dloewe/go/src/github.com/msprev/fzf-bibtex/bibtex/bibtex.go:16 +0x144
github.com/msprev/fzf-bibtex/cache.RefreshAndDo(0x50f7ad, 0x4, 0xc42005e450, 0x1, 0x1, 0x50f6fe, 0x3, 0x5179f0, 0x517a20)
    /home/dloewe/go/src/github.com/msprev/fzf-bibtex/cache/cache.go:81 +0x1a1
github.com/msprev/fzf-bibtex/cache.ReadAndDo(0x50f7ad, 0x4, 0xc42005e450, 0x1, 0x1, 0x50f6fe, 0x3, 0x5179f0, 0x517a20)
    /home/dloewe/go/src/github.com/msprev/fzf-bibtex/cache/cache.go:94 +0x4a0
main.ls(0x50f7ad, 0x4, 0xc42005e450, 0x1, 0x1)
    /home/dloewe/go/src/github.com/msprev/fzf-bibtex/cmd/bibtex-ls/main.go:30 +0x80
main.main()
    /home/dloewe/go/src/github.com/msprev/fzf-bibtex/cmd/bibtex-ls/main.go:23 +0x6c

Original BibTeX entry:

@article{10.1093/eurheartj/eht150,
    author = {Authors/Task Force Members and Brignole, Michele and Auricchio, Angelo and Baron-Esquivias, Gonzalo and Bordachar, Pierre and Boriani, Giuseppe and Breithardt, Ole-A and Cleland, John and Deharo, Jean-Claude and Delgado, Victoria and Elliott, Perry M. and Gorenek, Bulent and Israel, Carsten W. and Leclercq, Christophe and Linde, Cecilia and Mont, Lluís and Padeletti, Luigi and Sutton, Richard and Vardas, Panos E. and ESC Committee for Practice Guidelines (CPG) and Zamorano, Jose Luis and Achenbach, Stephan and Baumgartner, Helmut and Bax, Jeroen J. and Bueno, Héctor and Dean, Veronica and Deaton, Christi and Erol, Cetin and Fagard, Robert and Ferrari, Roberto and Hasdai, David and Hoes, Arno W. and Kirchhof, Paulus and Knuuti, Juhani and Kolh, Philippe and Lancellotti, Patrizio and Linhart, Ales and Nihoyannopoulos, Petros and Piepoli, Massimo F. and Ponikowski, Piotr and Sirnes, Per Anton and Tamargo, Juan Luis and Tendera, Michal and Torbicki, Adam and Wijns, William and Windecker, Stephan and Document Reviewers and Kirchhof, Paulus and Blomstrom-Lundqvist, Carina and Badano, Luigi P. and Aliyev, Farid and Bänsch, Dietmar and Baumgartner, Helmut and Bsata, Walid and Buser, Peter and Charron, Philippe and Daubert, Jean-Claude and Dobreanu, Dan and Faerestrand, Svein and Hasdai, David and Hoes, Arno W. and Le Heuzey, Jean-Yves and Mavrakis, Hercules and McDonagh, Theresa and Merino, Jose Luis and Nawar, Mostapha M. and Nielsen, Jens Cosedis and Pieske, Burkert and Poposka, Lidija and Ruschitzka, Frank and Tendera, Michal and Van Gelder, Isabelle C. and Wilson, Carol M.},
    title = "{2013 ESC Guidelines on cardiac pacing and cardiac resynchronization therapy: The Task Force on cardiac pacing and resynchronization therapy of the European Society of Cardiology (ESC). Developed in collaboration with the European Heart Rhythm Association (EHRA)}",
    journal = {European Heart Journal},
    volume = {34},
    number = {29},
    pages = {2281-2329},
    year = {2013},
    month = {06},
    issn = {0195-668X},
    doi = {10.1093/eurheartj/eht150},
    url = {https://doi.org/10.1093/eurheartj/eht150},
    eprint = {http://oup.prod.sis.lan/eurheartj/article-pdf/34/29/2281/17895881/eht150.pdf},
}

Sequence of steps to reproduce the error (example commands below):

  1. Parse the above entry with bibtool
  2. Run bibtex-ls on the parsed file
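
Concretely, the two steps might look like this on the command line (refs.bib and parsed.bib are placeholder file names I've assumed; any equivalent bibtool invocation that rewrites the entry should do):

    bibtool -i refs.bib -o parsed.bib
    bibtex-ls parsed.bib
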
dloewenstein commented 5 years ago

The problem goes away when roughly 50% of the authors are removed; it doesn't really seem to matter which ones.

msprev commented 5 years ago

Thanks -- this is now fixed with the latest commit. It was a general issue with long BibTeX fields. I've increased the limit by 10x, so it should now cover all reasonable use cases.
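
For reference, a common way this kind of limit shows up in Go tooling is bufio.Scanner's fixed maximum token size. The sketch below is not the actual fzf-bibtex code, just a minimal illustration of raising such a cap by 10x, assuming the .bib file is read line by line with a Scanner; refs.bib is a placeholder file name.

    package main

    import (
        "bufio"
        "fmt"
        "os"
    )

    func main() {
        // Placeholder input file; substitute the path to your own .bib file.
        f, err := os.Open("refs.bib")
        if err != nil {
            panic(err)
        }
        defer f.Close()

        scanner := bufio.NewScanner(f)
        // The default maximum token size is 64 KiB (bufio.MaxScanTokenSize).
        // An author field with dozens of names can blow past a small fixed
        // buffer, so raise the cap by 10x before scanning.
        scanner.Buffer(make([]byte, bufio.MaxScanTokenSize), 10*bufio.MaxScanTokenSize)

        for scanner.Scan() {
            fmt.Println(scanner.Text()) // just echo each line here
        }
        if err := scanner.Err(); err != nil {
            panic(err)
        }
    }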

dloewenstein commented 5 years ago

Thank you! I can confirm that everything is working as it should now 👍