Open dfbravo opened 6 years ago
In this case, it looks like the issue is with the server. When a Limit is requested and the server has more records to provide beyond that limit, it's supposed to return a MAXROWS element which tells the client that there's more to get. Since this server isn't returning that, PHRETS believes it's done.
There are a few options for you I think:
1) Do pagination yourself (don't set the 5th param), or 2) Implement a custom parser that tells PHRETS to behave differently.
For the custom parser, you'd basically need to:
1) Make your own class which extends `\PHRETS\Parsers\Search\RecursiveOneX`
2) In your class, declare a new `continuePaginating()` method that does something like `return ($rs->getReturnedResultsCount() < $rs->getTotalResultsCount());`
3) Once you make your `Session` object, do: `$session->setParser(\PHRETS\Strategies::PARSER_SEARCH_RECURSIVE, new YourCustomParser());`
and then PHRETS should use your class for controlling automatic pagination instead of its default, standard behavior.
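Putting those three steps together, a minimal sketch might look like the following. The class name is illustrative, and the exact visibility/signature of `continuePaginating()` is an assumption based on the description above; check the parser class you're extending.

```php
<?php

use PHRETS\Strategies;

// Illustrative class name; extends the parser named in the steps above.
class AlwaysPaginateParser extends \PHRETS\Parsers\Search\RecursiveOneX
{
    // Keep requesting more pages until we've received as many records
    // as the server reports in total, rather than relying on MAXROWS.
    protected function continuePaginating($rs)
    {
        return ($rs->getReturnedResultsCount() < $rs->getTotalResultsCount());
    }
}

// After creating your Session object ($config is your PHRETS configuration):
$session = new \PHRETS\Session($config);
$session->setParser(Strategies::PARSER_SEARCH_RECURSIVE, new AlwaysPaginateParser());
```

With the custom parser registered, recursive searches (5th parameter of `Search()` set to `true`) should paginate based on record counts even when the server omits MAXROWS.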
To answer your question about memory, using automatic pagination does require more memory since PHRETS will collect all records before returning them to you. I have a proof-of-concept working which will allow the best of both worlds (PHRETS manages the retrieval of all records but memory usage is still very low) but it's not ready for release yet.
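For comparison, option 1 above (doing pagination yourself, so each page can be processed and released to keep memory low) might look roughly like this. The resource, class, query, and page size are placeholders; the Limit/Offset keys follow the RETS search convention, with offsets being 1-based.

```php
<?php

// Placeholder values; adjust for your server's resource/class/query.
$offset = 1;
$limit  = 500;

do {
    // Omitting the 5th (recursive) parameter, so PHRETS returns one page.
    $results = $session->Search('Property', 'A', '(ModificationTimestamp=2017-01-01+)', [
        'Limit'  => $limit,
        'Offset' => $offset,
    ]);

    foreach ($results as $record) {
        // Process one record at a time; it can be garbage-collected
        // once this page's Results object goes out of scope.
    }

    $offset += $results->getReturnedResultsCount();
} while ($results->getReturnedResultsCount() >= $limit);
```

Peak memory is then bounded by one page of records rather than the whole result set.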
Any update on supporting pagination? Maybe by adding a callback function to search?
@budirec pagination according to the RETS standard is already supported. Please see my earlier comment for details on why this isn't working with this particular server. RETS is a general standard that requires each vendor to do their own implementation, and with optional features like pagination, it may not be implemented in the correct way (if at all).
I'm sorry, I was referring to your second comment about the proof of concept. My goal with pagination is to save memory by processing the results one page at a time, so we don't end up with a very big result set that has to be dealt with all at once. For now, I'll try the custom parser you mentioned.
thanks
I used below code for pagination https://github.com/troydavisson/PHRETS/wiki/Connect,-download-listing-data-in-csv-format,-disconnect
But I am confused about how that pagination works. Can you please clarify the limit and offset parameters? And is this code for pagination?
Please reply as soon as possible.
Hello, Can you please send me pagination code if you have ?
Sorry, I don't have it. Currently I limit the search per zip code with multiple filters, so it never returns an unmanageable number of results.
Ok, no problem. Thanks.
Hi, I used the WP property importer plugin. I need pagination for the importer because importing all properties at once takes a long time and sometimes generates a timeout error. Can you please provide pagination code for the importer plugin?
Hello,
I am trying to use pagination with my RETS request, following the example from the video PHRETS: Logging. I thought that setting the last argument of the Search function to `true` would perform a recursive search and thus use pagination, but I'm not sure if this is true. How do I make sure my RETS request is set to use pagination? How do I know if the server is set up to use pagination?
My ultimate goal is to limit peak memory usage, and I was wondering if pagination would be a solution. Is this true? If I use pagination, would memory usage be lower than without it?
I want to make sure my application won't crash if the request is abnormally large.
Code:
Output: