the-rarbg opened 9 months ago
Any way we can use Postgres and assign a table to write to, so it writes fields like seeders, leechers, and completed according to the info hash?
I've looked into that, but it's a pretty hefty performance hit on the server. Another solution, which I'm applying to my own new website, is using the API to retrieve seeder and leecher statistics and show them directly on the website. If you really want this pushed to the database, I would rather make a separate Python script (it could even be a separate command in the tracker software) that requests data from the API using a list of hashes or IDs, without interfering with the tracker back-end code itself.
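A minimal sketch of what such a separate script could look like. The endpoint URL, the request shape, and the batch size are all assumptions, not the tracker's actual API:

```python
import json
from urllib import request

# Hypothetical stats endpoint; the real API path and payload may differ.
API_URL = "http://127.0.0.1:8080/api/stats"
CHUNK = 100  # assumed maximum hashes per request

def chunk_hashes(hashes, size=CHUNK):
    """Split a list of info hashes into request-sized batches."""
    return [hashes[i:i + size] for i in range(0, len(hashes), size)]

def fetch_stats(hashes):
    """POST one batch of hashes to the (assumed) stats API, return parsed JSON."""
    body = json.dumps({"info_hashes": hashes}).encode()
    req = request.Request(
        API_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        return json.load(resp)
```

A cron job could then loop over `chunk_hashes(all_hashes)`, call `fetch_stats` per batch, and write the results to Postgres, keeping the tracker itself untouched.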
It could be created in a way that persists data after some time, say 1 or 5 hours, so we don't have to run a separate scraper script that fetches data whose stats haven't changed.
Or we could publish an event at a regular interval for hashes whose stats have changed.
The reason being: we could give the user fresh info when viewing a torrent, without having to create another process that scrapes data that hasn't changed.
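The "publish only what changed" idea above can be sketched by diffing the current stats against the last snapshot; only new or changed hashes get emitted. The data layout (a dict of `(seeders, leechers, completed)` tuples) is an assumption for illustration:

```python
def changed_stats(previous, current):
    """Return {info_hash: stats} for entries that are new or have changed."""
    return {h: s for h, s in current.items() if previous.get(h) != s}

# Tuples are (seeders, leechers, completed); hashes are placeholders.
prev = {"aaa": (10, 2, 5), "bbb": (3, 1, 0)}
curr = {"aaa": (10, 2, 5), "bbb": (4, 1, 0), "ccc": (1, 0, 0)}

events = changed_stats(prev, curr)
# "aaa" is unchanged and skipped; only "bbb" and "ccc" would be published.
```

Run on an interval, this keeps the event stream (and any downstream DB writes) proportional to the churn, not to the total number of torrents.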
I'll look into adding an optional function like that. I've mainly focused on the public tracking, so I'll have to see if I can make some time for that function. However, it could still be a performance hit on your database server if there are many changes; just keep that in mind.
Yes, understandable. But suppose I have a million torrents, with a table containing the columns info_hash, seeders, leechers, filename, completed. To keep it up to date I'd have to scrape all the data (a million hits on the tracker), since I might not know which info has changed even though it's in RAM, and then make a million hits on the database to update.
But in this case we can set up a batch process that writes to the DB after, say, a day or two, in batches of, say, 10,000, at a specific time.
It's just a tradeoff.
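A rough sketch of that batched write, using `executemany` to flush many rows in one statement instead of a million single updates. `sqlite3` stands in for Postgres here purely so the example is self-contained, and the table/column names are assumptions; on Postgres the equivalent would be an `INSERT ... ON CONFLICT DO UPDATE` upsert:

```python
import sqlite3

def flush_batch(conn, rows):
    """Write one batch of (info_hash, seeders, leechers, completed) rows.

    INSERT OR REPLACE is SQLite's simple upsert; with Postgres you would
    use INSERT ... ON CONFLICT (info_hash) DO UPDATE instead.
    """
    conn.executemany(
        "INSERT OR REPLACE INTO torrents "
        "(info_hash, seeders, leechers, completed) VALUES (?, ?, ?, ?)",
        rows,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE torrents ("
    "info_hash TEXT PRIMARY KEY, seeders INT, leechers INT, completed INT)"
)
flush_batch(conn, [("aaa", 10, 2, 5), ("bbb", 4, 1, 0)])
flush_batch(conn, [("aaa", 12, 3, 6)])  # later flush updates the existing row
```

The scheduler would accumulate changed rows in memory and call `flush_batch` every N hours in chunks of ~10,000, which is the tradeoff described above: staler stats in exchange for far fewer database hits.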
Any update on this? If this were in Redis instead of a disk-based DB, it would be much better, I think.
It's purely memory-based, using a BTreeMap. I've not gotten around to it, as I contracted the flu 2 days ago, so I'm kind of stuck with a headache, but I'll work on it when I have the time for it. I would indeed rather use something like Redis or Meilisearch to update those statistics, but I'll still look into it.
Sorry for the delay.
I've been pondering this function, and I would rather implement Meilisearch for it, since it's a search engine, built for frequent changes and such. I've been overloading MySQL doing this exact thing, and it performs badly done this way. I will create an implementation for writing to either Redis or Meilisearch, but it will probably be planned for a v3.3 release. v3.2.2 is being worked on right now with minor features and changes; the idea we want here is a pretty big change, so I'll plan it for a v3.3.x release.
Makes sense. You could also use RediSearch; it's good.