PHPSocialNetwork / phpfastcache

A high-performance backend cache system intended to speed up dynamic web applications by alleviating database load. Well implemented, it can drop the database load to almost nothing, yielding faster page load times for users and better resource utilization. It is simple yet powerful.
https://www.phpfastcache.com
MIT License
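
For context, basic usage follows the PSR-6 style shown in the project documentation. A minimal sketch, assuming the bundled `files` driver with its default configuration; `buildProductListFromDatabase()` is a hypothetical placeholder for whatever expensive query the cache is shielding:

```php
<?php

use Phpfastcache\CacheManager;

require __DIR__ . '/vendor/autoload.php';

// Get a cache pool using the "files" driver (defaults to the system temp dir).
$pool = CacheManager::getInstance('files');

$item = $pool->getItem('product_list');

if (!$item->isHit()) {
    // Cache miss: build the expensive value once and keep it for 5 minutes.
    $item->set(buildProductListFromDatabase())->expiresAfter(300);
    $pool->save($item);
}

// Cache hit, or the value we just stored.
$products = $item->get();
```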

2 million data cache? #898

Closed: eskoctr closed this issue 1 year ago

eskoctr commented 1 year ago

What's your question ?

How can I cache a mysql table with 2 million rows?

References (optional)

No response

Do you have anything more you want to share? (optional)

No response

github-actions[bot] commented 1 year ago

Hello curious contributor! Since it seems to be your first contribution, make sure that you've been:

Geolim4 commented 1 year ago

How can I cache a mysql table with 2 million rows?

Short answer: Don't.

Long answer: If caching 2 million rows is the solution to some overload/timeout issue, then you are probably asking yourself the wrong question.
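
In practice the usual alternative is to cache each query or page result as its own item rather than the table itself. A minimal sketch against any PSR-6 pool, not taken from this thread; the table, the columns and the `getUsersPage()` helper are hypothetical:

```php
<?php

use Psr\Cache\CacheItemPoolInterface;

// Hypothetical helper: cache one page of results at a time instead of the whole table.
function getUsersPage(\PDO $pdo, CacheItemPoolInterface $pool, int $page, int $perPage = 100): array
{
    $item = $pool->getItem('users_page_' . $page);

    if (!$item->isHit()) {
        $stmt = $pdo->prepare('SELECT id, name, email FROM users ORDER BY id LIMIT :offset, :limit');
        $stmt->bindValue(':offset', ($page - 1) * $perPage, \PDO::PARAM_INT);
        $stmt->bindValue(':limit', $perPage, \PDO::PARAM_INT);
        $stmt->execute();

        // Keep each page for 5 minutes; only the pages actually requested end up cached.
        $item->set($stmt->fetchAll(\PDO::FETCH_ASSOC))->expiresAfter(300);
        $pool->save($item);
    }

    return $item->get();
}
```

Only the hot pages ever reach the cache, so memory stays bounded and the database is queried once per page per TTL instead of on every request.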

eskoctr commented 1 year ago

Thanks. So, is partitioned storage possible? I'm trying to do something special. For example, there is a 200 MB document, but I want to split it into 3 MB documents and combine them at query time. Is that possible?

Geolim4 commented 1 year ago

You don't need a cache; you need either a CDN or a document store (Solr, AWS, or NoDB storage).
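
For reference, the chunk-and-reassemble pattern the question describes is technically possible with any PSR-6 pool; a rough sketch follows, with hypothetical `storeChunked()` / `loadChunked()` helpers. The caveat above still applies: a cache may evict any chunk at any time, so a CDN or document store remains the better fit for 200 MB payloads.

```php
<?php

use Psr\Cache\CacheItemPoolInterface;

// Hypothetical helper: split a large string into ~3 MB chunks stored as separate items.
function storeChunked(CacheItemPoolInterface $pool, string $key, string $blob, int $chunkSize = 3 * 1024 * 1024): void
{
    $chunks = str_split($blob, $chunkSize);

    // Remember how many chunks make up the document.
    $meta = $pool->getItem($key . '_count');
    $pool->save($meta->set(count($chunks)));

    foreach ($chunks as $i => $chunk) {
        $pool->save($pool->getItem($key . '_' . $i)->set($chunk));
    }
}

// Hypothetical helper: rebuild the document, or return null if any chunk was evicted.
function loadChunked(CacheItemPoolInterface $pool, string $key): ?string
{
    $meta = $pool->getItem($key . '_count');
    if (!$meta->isHit()) {
        return null;
    }

    $blob = '';
    for ($i = 0; $i < $meta->get(); $i++) {
        $item = $pool->getItem($key . '_' . $i);
        if (!$item->isHit()) {
            return null; // one missing chunk invalidates the whole document
        }
        $blob .= $item->get();
    }

    return $blob;
}
```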

Geolim4 commented 1 year ago

Hello @eskoctr ,

I’m closing this issue for now because of (inactivity / outdated code / …).

You can always reopen it though! :) Please (update the issue / add comment / clarify …).

Regards, Georges.L