Charcoal-SE / halflife

Metasmoke back-end analysis client
Apache License 2.0

Chat is very slow #1

Open tripleee opened 6 years ago

tripleee commented 6 years ago

Frequently, when a post is reported, you would like to see the first results of the analysis very quickly: is the domain name blacklisted? Does the URL tail match any keywords? Only then should the more detailed analysis be reported.

The chat interface throttles sending to one message per second. Some messages could be collected into multi-line chat messages (though then you cannot use formatting -- no bold, italics, links, etc.) to make reports appear more quickly.
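
A rough sketch of the batching idea; `send_chat_message` is a hypothetical stand-in for whatever the actual chat client exposes, not an existing halflife function:

```python
# Sketch only: group plain-text report lines into multi-line messages so
# fewer messages pass through the one-message-per-second throttle.
# Multi-line chat messages render as preformatted text, so markup is lost.
def send_batched(lines, send_chat_message, max_lines=5):
    for i in range(0, len(lines), max_lines):
        send_chat_message("\n".join(lines[i:i + max_lines]))
```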

Ultimately, I'm thinking the back-end queries should be done with some sort of async framework, so that multiple queries can be pending at the same time and you don't have to wait for them to execute serially before you get the result for the one you actually care about for a particular post.
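
A minimal sketch of that idea with the standard library's asyncio, assuming the individual checks can be wrapped as coroutines; `check_domain_blacklist`, `check_url_keywords`, and `report` are hypothetical names, not existing halflife code:

```python
import asyncio

async def check_domain_blacklist(post):
    ...  # e.g. query metasmoke or a local blacklist

async def check_url_keywords(post):
    ...  # e.g. match the URL tail against watched keywords

async def analyze(post, report):
    checks = [check_domain_blacklist(post), check_url_keywords(post)]
    # Report each result as soon as its check finishes, rather than
    # waiting for the whole pipeline to run serially.
    for finished in asyncio.as_completed(checks):
        report(await finished)
```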

tripleee commented 6 years ago

https://pawelmhm.github.io/asyncio/python/aiohttp/2016/04/22/asyncio-aiohttp.html is inspiring, but perhaps not the primary candidate for the architecture now that Python has native async support.
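
For comparison, roughly the same pattern with native async/await and aiohttp; the URLs below are placeholders, not real metasmoke endpoints:

```python
import asyncio
import aiohttp

async def fetch(session, url):
    async with session.get(url) as response:
        return await response.json()

async def fetch_all(urls):
    # All requests are in flight concurrently; gather preserves order.
    async with aiohttp.ClientSession() as session:
        return await asyncio.gather(*(fetch(session, url) for url in urls))

# asyncio.run(fetch_all(["https://example.invalid/a", "https://example.invalid/b"]))
```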