tamitchell opened 2 weeks ago
So, in terms of rate limiting: I thought that making the settings stricter on OpenAI would suffice, and that caching could be handled by React Query. I wanted to stay away from localStorage because maintaining it can be finicky depending on browser settings, but there are actually a few good reasons why I should move forward with a more explicit cache-management system, and why localStorage may come in handy now...
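To make the "finicky" part concrete, here's a minimal sketch of the kind of TTL-aware wrapper that could sit on top of localStorage; the names (`CacheEntry`, `readCache`, `writeCache`) and the `StringStore` abstraction are illustrative, not from the codebase:

```typescript
// Hypothetical TTL-aware cache helpers that could back localStorage.
interface CacheEntry<T> {
  value: T;
  expiresAt: number; // epoch ms
}

// A Storage-like interface so the same logic works against localStorage in
// the browser or a plain Map-backed stub in tests.
interface StringStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
  removeItem(key: string): void;
}

function writeCache<T>(store: StringStore, key: string, value: T, ttlMs: number): void {
  const entry: CacheEntry<T> = { value, expiresAt: Date.now() + ttlMs };
  store.setItem(key, JSON.stringify(entry));
}

function readCache<T>(store: StringStore, key: string): T | null {
  const raw = store.getItem(key);
  if (raw === null) return null;
  try {
    const entry = JSON.parse(raw) as CacheEntry<T>;
    if (Date.now() > entry.expiresAt) {
      store.removeItem(key); // expired: evict and report a miss
      return null;
    }
    return entry.value;
  } catch {
    store.removeItem(key); // corrupted entry (the "finicky" case): self-heal
    return null;
  }
}
```

The self-healing `catch` branch is the main point: localStorage entries can be cleared, truncated, or blocked by browser settings, so every read has to tolerate garbage.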
The default location is set to New York, at least until the actual geolocation data is retrieved. This inadvertently makes two calls, and even when the user refreshes, it defaults back to New York instead of their current location (unsure if that counts as a bug, but it's probably just a design mistake on my part).
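One way to avoid the double fetch on refresh is to prefer the last persisted coordinates over the hard-coded New York default. A sketch, where the `Coordinates` shape and the stored-JSON format are assumptions rather than the app's actual types:

```typescript
// Sketch: resolve the initial location from persisted data, falling back to
// the New York default only when nothing valid was stored. Field names are
// illustrative guesses.
interface Coordinates {
  lat: number;
  lon: number;
}

const NEW_YORK: Coordinates = { lat: 40.7128, lon: -74.006 };

function resolveInitialLocation(storedJson: string | null): Coordinates {
  if (storedJson !== null) {
    try {
      const parsed = JSON.parse(storedJson);
      if (typeof parsed?.lat === "number" && typeof parsed?.lon === "number") {
        return { lat: parsed.lat, lon: parsed.lon };
      }
    } catch {
      // malformed persisted data: fall through to the default
    }
  }
  return NEW_YORK; // first visit, or nothing usable was persisted
}
```

On a refresh, the app would pass in whatever was last written (e.g. `localStorage.getItem("lastLocation")`), so the New York default and its extra weather call only happen on a genuine first visit.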
Unless the weather has changed significantly (outside of some threshold), I can probably keep the same outfit recommendation and persist it between sessions too; this would save API calls to OpenAI.
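The "outside of some threshold" check could be as small as this; the snapshot fields and the 5°F / condition-change thresholds are illustrative guesses, not values from the app:

```typescript
// Sketch: only regenerate the outfit recommendation (i.e. call OpenAI again)
// when the weather has changed enough to plausibly change the outfit.
interface WeatherSnapshot {
  tempF: number;
  condition: string; // e.g. "clear", "rain", "snow"
}

function shouldRefreshOutfit(
  prev: WeatherSnapshot,
  curr: WeatherSnapshot,
  tempThresholdF = 5
): boolean {
  // A different condition (clear -> rain) almost certainly changes the outfit.
  if (prev.condition !== curr.condition) return true;
  // Otherwise only a meaningful temperature swing matters.
  return Math.abs(curr.tempF - prev.tempF) > tempThresholdF;
}
```

Persisting the last `WeatherSnapshot` alongside the cached recommendation would let this check run across sessions, so a refresh on a mild day costs zero OpenAI calls.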
Gonna push this to a feature branch since the initial setup is done, and convert the rate limiting and cache management into its own ticket.
Set up LangChain with the weather app for AI-powered clothing recommendations.
Tasks: