To lighten the load on the backend, let's offload some of the more processing-intensive and repetitive tasks to Celery:
1. Populating the url_queue table.
2. Rebuilding the top-20 lists so they're not constantly being refreshed live.
3. Rebuilding the various page indexes and search tables.
4. Eventually, building a daily snapshot of the node links for the force-directed graph.
5. Processing newly-submitted .onions from the public.
We'll use RabbitMQ as the broker.
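As a rough sketch of how this might be wired up, here's a minimal Celery app pointed at a local RabbitMQ broker, with a couple of the tasks above stubbed out and one scheduled via celery beat. The module name, task bodies, and schedule are illustrative assumptions, not the project's actual code:

```python
# tasks.py -- a minimal sketch; names and schedule are assumptions.
from celery import Celery

app = Celery(
    "crawler",
    broker="amqp://guest:guest@localhost:5672//",  # RabbitMQ broker URL
)

@app.task
def populate_url_queue():
    """Refill the url_queue table with onions due for a crawl (stub)."""
    ...

@app.task
def rebuild_top_20_lists():
    """Regenerate the cached top-20 lists so they aren't computed per request (stub)."""
    ...

# Rebuild the cached lists on a fixed schedule via celery beat
# instead of refreshing them live on every page load.
app.conf.beat_schedule = {
    "rebuild-top-20-hourly": {
        "task": "tasks.rebuild_top_20_lists",
        "schedule": 60 * 60,  # seconds
    },
}
```

With this layout, a worker (`celery -A tasks worker`) and a beat scheduler (`celery -A tasks beat`) run alongside the web app, and the backend only enqueues work instead of doing it inline.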