EIDA / eida-statistics

Aggregated statistics of EIDA nodes
GNU General Public License v3.0

Implement a caching system #33

Closed. jschaeff closed this issue 4 months ago

jschaeff commented 1 year ago

To avoid query flooding and provide faster replies, implement caching of requests.

See https://docs.python.org/3/library/functools.html and https://realpython.com/lru-cache-python/
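
For reference, the basic pattern from the linked docs looks roughly like this (a minimal sketch; the function name and parameters are illustrative, not taken from the service code):

from functools import lru_cache

@lru_cache(maxsize=128)
def aggregate(start, end, network):
    # Expensive aggregation; all arguments must be hashable for lru_cache.
    return (start, end, network)

aggregate("2023-01", "2023-02", "FR")  # computed on the first call
aggregate("2023-01", "2023-02", "FR")  # identical arguments: served from the cache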

vpet98 commented 1 year ago

I found a way to use the cache decorators like this:

from functools import lru_cache
from pyramid.view import view_config

@view_config(route_name='dataselectrestricted', openapi=True)
def restricted(request):
    return to_call(request)

@lru_cache
def to_call(request):
    # real /restricted method code here
    ...

But I am not sure this benefits our service in any way. After testing it, I don't see it being any faster. I believe a cache would be helpful, for example, when a function is called with the same parameters over and over again, probably by another function or by itself.
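
One possible variant (a sketch only; it reuses the restricted/to_call names from the snippet above and assumes the response depends only on the query-string parameters): every incoming Pyramid request is a distinct object, so a cache keyed on the request itself will practically never hit. Keying the cache on hashable parameters extracted from the request would let identical queries share a cache entry.

from functools import lru_cache
from pyramid.view import view_config

@view_config(route_name='dataselectrestricted', openapi=True)
def restricted(request):
    # Reduce the request to a hashable, order-independent key.
    params = tuple(sorted(request.params.items()))
    return to_call(params)

@lru_cache(maxsize=128)
def to_call(params):
    # real /restricted method code here, working from params instead of the request
    ...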

jschaeff commented 1 year ago

Thank you for exploring this. I think we can put this at ultra-low priority.