Closed: timdorr closed this issue 7 years ago
@timdorr I haven't experimented with that idea, although it makes a lot of sense and I can't think of any problems off the top of my head.
I would want to be careful to keep backwards compatibility, which probably means some extra complexity to continue to support the old approach, but if we can make this a new option rather than a breaking change I'd be stoked to get a pull request!
Closing the issue. I'm still interested in this idea, but it's not being actively worked on as far as I know, and it's not a priority for us.
#98 added a SAX processor that reduced the memory footprint quite a bit, since the response XML no longer has to be converted into a Nokogiri object tree before being converted back into a simpler array of hashes. Yay!
However, the results still need to be coalesced into that array before `doc.results` can be returned from the parser. While the memory requirements for hashes aren't too crazy, it's still pretty wasteful and leads to memory bloat (which is bad, since Ruby doesn't give memory back to the OS). What would be better is to return an `Enumerable` from the search, which would let us handle the results declaratively and keep a clamp on memory usage. I'm currently playing with `Nokogiri::XML::Reader` to make this happen, but I'm curious whether there have been any experiments with this in the past. I know RETS isn't exactly the most sane standard, so are there problems with this approach? It might require some wrangling to make it work, but it seems worth the effort.