DanielNobbe / Forex


Speed up inference retrieval #17

Closed. DanielNobbe closed this issue 3 years ago.

DanielNobbe commented 3 years ago

When running inference right now, we need to download a lot of data every time the model runs, which takes about a minute. This is far too long. It takes this long because we download at quite a high granularity, and we are limited to 5000 samples per request.
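As a rough sanity check on where the minute goes, a back-of-the-envelope sketch (the history window, granularity, and per-request latency below are illustrative assumptions, not measured values from this repo):

```python
import math

MAX_SAMPLES_PER_REQUEST = 5000  # per-request cap mentioned above

def requests_needed(history_seconds: int, granularity_seconds: int) -> int:
    """Number of sequential API requests needed to cover the history window."""
    samples = math.ceil(history_seconds / granularity_seconds)
    return math.ceil(samples / MAX_SAMPLES_PER_REQUEST)

# e.g. two weeks of 5-second candles (illustrative numbers):
n = requests_needed(history_seconds=14 * 24 * 3600, granularity_seconds=5)
print(n)  # 49 sequential requests; at ~1s of round-trip each, that is ~a minute
```

This makes the two levers explicit: either reduce the number of samples (coarser granularity or a shorter window), or stop paying the round-trip cost on every run.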

Ideas for a fix:

DanielNobbe commented 3 years ago

Probably best to implement multiple improvements to speed this up: