amiaopensource / vrecord

Vrecord is open-source software for capturing a video signal and turning it into a digital file.
https://github.com/amiaopensource/vrecord

"Memory allocation failed" #235

Closed · CSchloss385 closed this 6 years ago

CSchloss385 commented 6 years ago

Sometimes when I'm digitizing a very long tape, vrecord is unable to analyze the QCTools XML following the ingest. I think this is because the XML is too large and the computer (or at least the Mac Mini that I'm using) doesn't have the capacity to analyze such a large file. When this happens, I get an error that says "Memory allocation failed: growing nodeset hit limit." I don't think there is any real way to fix this error, since it depends on the memory of the computer you are using, but it would be helpful to have a clearer error message so the user knows what vrecord means by "memory allocation failed."

retokromer commented 6 years ago

Thank you, @CSchloss385, for reporting this! It’s not an error by vrecord, but by the OS. I’ll try to reproduce it next week on an old Mac mini. Could you possibly tell me which OS version you are running?

CSchloss385 commented 6 years ago

I am currently running OS X El Capitan, Version 10.11.6. Thanks @retokromer for looking into this!

nathanavant commented 6 years ago

Just thought I would add a data point - I am seeing this same error after digitizing longer tapes. I'm running High Sierra on a Late 2014 Mac Mini.

retokromer commented 6 years ago

Hmm… It seems to be the hardcoded limit in the xpath.c engine of libxml2:

/*
 * XPATH_MAX_NODESET_LENGTH:
 * when evaluating an XPath expression nodesets are created and we
 * arbitrary limit the maximum length of those node set. 10000000 is
 * an insanely large value which should never be reached under normal
 * circumstances, one would first need to construct an in memory tree
 * with more than 10 millions nodes.
 */
#define XPATH_MAX_NODESET_LENGTH 10000000

The insanely large value of 10 000 000 nodes is reached!
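
For scale, a rough back-of-the-envelope (one frame element plus roughly 30 tag children per analyzed frame is my assumption about the report layout):

3 h × 3600 s/h × 29.97 frames/s ≈ 323,700 frames
323,700 frames × ~31 nodes/frame ≈ 10,000,000 nodes

So a capture of a bit over three hours of NTSC video is already enough to hit the cap.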

bturkus commented 6 years ago

Go QCTools! Reaching for new highs!


retokromer commented 6 years ago

@CSchloss385 A quick-and-dirty fix is to compile xpath.c locally with a higher constant and use that build. A real fix would be to generate the XML incrementally.
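
For the quick fix, the edit is a single line in a local copy of xpath.c before rebuilding libxml2; the value below is an arbitrary example (ten times the stock limit), not a recommendation:

#define XPATH_MAX_NODESET_LENGTH 100000000  /* was 10000000 */

And as a sketch of the incremental route (not vrecord’s actual code): libxml2’s streaming xmlTextReader walks the file node by node without ever building the full in-memory tree, so the nodeset cap never comes into play. The frame element name follows the ffprobe-style layout of a QCTools report, but is an assumption here:

#include <stdio.h>
#include <libxml/xmlreader.h>
#include <libxml/xmlstring.h>

int main(int argc, char **argv)
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s report.xml\n", argv[0]);
        return 1;
    }
    /* stream the document instead of loading it whole */
    xmlTextReaderPtr reader = xmlReaderForFile(argv[1], NULL, 0);
    if (reader == NULL) {
        fprintf(stderr, "cannot open %s\n", argv[1]);
        return 1;
    }
    long frames = 0;
    int ret;
    while ((ret = xmlTextReaderRead(reader)) == 1) {
        /* count opening <frame> elements as they stream past */
        if (xmlTextReaderNodeType(reader) == XML_READER_TYPE_ELEMENT &&
            xmlStrEqual(xmlTextReaderConstName(reader), BAD_CAST "frame"))
            frames++;
    }
    xmlFreeTextReader(reader);
    if (ret != 0) {
        fprintf(stderr, "parse error in %s\n", argv[1]);
        return 1;
    }
    printf("%ld frames\n", frames);
    return 0;
}

Compile with cc sketch.c $(xml2-config --cflags --libs). Memory use stays constant no matter how long the tape is.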

retokromer commented 6 years ago

@CSchloss385 Could you possibly test the fix?

CSchloss385 commented 6 years ago

Yes, I will try to test this today.

retokromer commented 6 years ago

If it’s fine now, then please close the issue. Otherwise please report back again. Thank you!

retokromer commented 6 years ago

ping

CSchloss385 commented 6 years ago

I just tested this and I did not get any errors. I'm going to close this issue.

retokromer commented 6 years ago

Thank you, @CSchloss385, for reporting back!