p1d1d1 closed this issue 10 months ago
Thanks for reporting. We didn't have any code changes in this area lately, but it's a problem we've encountered before. We'll have a look.
The data are now preprocessed and loaded into Redis correctly in the main_preprocessing branch. However, I still have a problem with the pagination on my local machine. @FStriewski, if you want to run a test, maybe it works on the server.
I've deployed your branch to the server @eliaferrari and the metadata is coming through. Let's see how it looks tomorrow, to check whether it still runs fine after the scraper has run.
Pagination also still seems to work, and the sorting looks correct to me.
Current main fails because of a pydantic error that I can't replicate on localhost. Maybe it's due to version differences or a cached .csv somewhere? I'll have a look, but probably tomorrow.
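One quick way to rule out version differences between the server and localhost is to print the installed package versions in both environments. A minimal stdlib-only sketch (the package list is an assumption about the stack, not taken from the repo):

```python
# Print installed versions of packages that might differ between
# environments; the package names here are assumptions.
from importlib.metadata import version, PackageNotFoundError

for pkg in ("pydantic", "pandas", "redis"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "not installed")
```

Running this on both machines and diffing the output should show whether pydantic (or anything else) is pinned differently.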
Not sure what the real reason behind this is, but if we remove the "lang_2" and "update" fields from the schema, it runs fine. Both fields are optional, so I'm not sure why it gets hung up on them. The types are correct and we have more optional fields, so the issue must be on the CSV or pickle side and related to some changes there.
app/redis/schema.py:
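For illustration, a minimal sketch of what the relevant part of the schema might look like; the field names other than `update` and `lang_2` are hypothetical, not copied from the actual file:

```python
# Hypothetical sketch of app/redis/schema.py -- only "update" and
# "lang_2" come from the discussion above; the rest is assumed.
from typing import Optional
from pydantic import BaseModel

class Item(BaseModel):
    id: str
    title: str
    # The two optional fields that triggered the validation error.
    # One common culprit with CSV/pickle pipelines: pandas stores
    # missing values as NaN (a float), which can fail validation for
    # Optional[str] fields, and coercion behavior differs between
    # pydantic v1 and v2 -- consistent with the version-difference
    # suspicion above.
    update: Optional[str] = None
    lang_2: Optional[str] = None

row = {"id": "1", "title": "Example", "update": None, "lang_2": None}
item = Item(**row)
print(item)
```

If NaN leakage is the cause, converting NaN to `None` before validation (e.g. with `df.where(df.notna(), None)`) would be an alternative to dropping the fields.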
I've just updated the main_preprocessing branch. The pydantic problem is now resolved by removing the two fields (update and lang_2). I've also created a temporary pickle dataframe in the main branch, which will be overwritten by the next scraper run. If there are no more open matters on this issue and the branch works on the server, I'll merge main_preprocessing for good.
Thanks for looking into it @eliaferrari 👍 I've deployed your changes to prod and it looks very promising! I'd say go ahead and merge to main.
As in the title. Please fix.