Open Frozentreefrog opened 1 year ago
I haven't checked but my assumption is that it must be another header issue. I have a feeling they are trying to stop bots or they are following this repo. I'm partially concerned it may be the latter which means anything I fix will immediately be blocked.
@Alkaar: Doesn’t seem like any new headers have been introduced since the user-agent change last week. However, I’m fairly sure some folks do have their bots working correctly (somehow): looking at today’s release times, the reservations went instantly while our implementation failed. So there has to be a solution we haven’t figured out here yet. Based on that observation, I doubt they’re following this repo! Let’s figure this out!!
Could it be something IP address based? There are some GeoIp and Track API calls made prior to the Book call on the UI/site.
I also noticed a 500 error from the UI upon using the same account after the snipe errors out.
Is there anyone on here who can confirm the current implementation is still working OK for them?
Pretty sure some folks have it figured out from observing today’s release! Help!! :)
I just tested this morning and it still works for me. I'm hesitant to make an official change to hardcode user-agent as I would prefer if it were more dynamic. At least for those of you who are following this repo very closely, try to keep your user-agent up to date.
Just chiming in to confirm it also works for me after adding:
"User-Agent" -> "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36"
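For anyone unsure where this value goes: a minimal Python sketch of attaching the user-agent to the booking call (illustrative only; the repo uses its own HTTP client, so only the header name/value and the endpoint discussed in this thread are taken from the source):

```python
import urllib.request

# Hedged sketch, not the repo's actual code: attach a browser-like
# User-Agent header to the booking request.
USER_AGENT = (
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) "
    "AppleWebKit/537.36 (KHTML, like Gecko) "
    "Chrome/114.0.0.0 Safari/537.36"
)

req = urllib.request.Request(
    "https://api.resy.com/3/book",  # booking endpoint discussed in this thread
    method="POST",
    headers={"User-Agent": USER_AGENT},
)

# urllib normalizes header names, so look it up as "User-agent".
print(req.get_header("User-agent"))
```

Whatever client you use, the point is the same: the header must be present on the actual booking request, not just configured somewhere else in the app.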
Pulled the repo for the first time today.
How do you all go about checking which headers need to be added, and with what values, when the code breaks?
Where do I need to place the user agent details? I am not able to successfully get any reservations even though the bot is able to run.
In the headers. It seemed to work for me too once I did this.
@christj99 you can capture HTTP requests in your browser under the Network tab or in a software like Burp Suite, so you can manually make a reservation on Resy and then look at the api.resy.com/3/book HTTP request to see what headers it's currently sending
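To make that comparison concrete, here is a hypothetical Python sketch (placeholder values throughout; the repo's client will differ): rebuild the bot's request with your candidate headers, print what it would send, and diff that by eye against the api.resy.com/3/book request captured in the browser's Network tab:

```python
import urllib.request

# Hypothetical header set for comparison; every value below is a
# placeholder to be copied from your own captured request.
candidate_headers = {
    "User-Agent": "<copied from the captured request>",
    "Authorization": "<copied from the captured request>",
    "Content-Type": "<copied from the captured request>",
}

req = urllib.request.Request(
    "https://api.resy.com/3/book", method="POST", headers=candidate_headers
)

# Print the headers this request would carry (urllib normalizes the
# capitalization of header names).
for name, value in req.header_items():
    print(f"{name}: {value}")
```

Any header the browser sends that is missing from this printout is a candidate for what broke.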
Ah makes sense. Thanks so much @IAmWinyl !
@crookedbans or @jaipalsilla or @christj99 - have either of you gotten it to work this week? I also applied the user-agent fix, but still getting the "missing the shot" issue for a restaurant with plenty of reservation options (Coletta #58982).
I checked last week and it wasn't working for me.
I just tested and it looks busted again. I assume another user agent change and I'm 99% sure they are cracking down on bots.
Has anyone figured out the source of the breaking changes? Resybot.com is still able to pull from resy, but I am not sure if they are using the internal API or just web scraping
@Alkaar I am still looking into it but one major change I noticed was some (if not all) of the requests that were once GET are now OPTIONS.
@cnw004 I was able to get a 200 today. GET request with X-Resy-Auth-Token, Authorization, X-Resy-Universal-Auth, User-Agent and Content-Type headers
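In case it helps anyone reproduce this, here is the reported header set as a Python sketch. Only the five header names come from the comment above; every value is a placeholder to be filled in from a request captured in your own browser session (the exact value formats vary per account, so none are shown here):

```python
import urllib.request

# The five headers reported to return a 200; all values are placeholders.
working_headers = {
    "X-Resy-Auth-Token": "<your auth token>",
    "Authorization": "<your authorization value>",
    "X-Resy-Universal-Auth": "<your universal auth value>",
    "User-Agent": "<your browser's user-agent string>",
    "Content-Type": "<the content type the captured request used>",
}

req = urllib.request.Request(
    "https://api.resy.com/3/book", method="GET", headers=working_headers
)
print(len(req.header_items()))  # → 5
```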
@Alkaar any way to fix current issues ?
@jonwihl are you able to get it to work?
Adding "Cache-Control": "no-cache" to the request headers seems to fix the issue.

Explanation: print your GET request response's headers. If they contain 'X-Cache': 'Error from cloudfront', then adding "Cache-Control": "no-cache" should fix the issue.

NOTE: The "User-Agent" header had no effect during my testing.
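A small Python sketch of that diagnostic. The helper functions are hypothetical; only the header names and values ('X-Cache', 'Error from cloudfront', "Cache-Control": "no-cache") come from the comment above:

```python
# Hedged sketch: if a response carries 'X-Cache: Error from cloudfront',
# retry the request with 'Cache-Control: no-cache' added.

def needs_no_cache(response_headers: dict) -> bool:
    """True when CloudFront reports it served an error."""
    return response_headers.get("X-Cache", "").startswith("Error from cloudfront")

def amend_request_headers(request_headers: dict, response_headers: dict) -> dict:
    """Return a copy of the request headers, adding the workaround if needed."""
    headers = dict(request_headers)
    if needs_no_cache(response_headers):
        headers["Cache-Control"] = "no-cache"
    return headers

# Example: a CloudFront error response triggers the workaround.
fixed = amend_request_headers(
    {"User-Agent": "Mozilla/5.0"},
    {"X-Cache": "Error from cloudfront"},
)
print(fixed["Cache-Control"])  # → no-cache
```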
Has anyone confirmed this is working with that fix? I haven't been able to get it to work myself.
I tried adding the user-agent and "Cache-Control": "no-cache" headers and it had no effect. I still get an internal server error. Was anybody able to fix it?
Hi, it seems even after I add the user-agent header:
Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36
the bot no longer works and returns an Internal Server Error.
This seems to be a new issue; the bot was working about a week ago.
Does anyone know what the issue might be?