Power2All / torrust-actix

A multi-functional lightweight BitTorrent Tracker

Any way we can use Postgres and assign a table to write to? #27

Open · the-rarbg opened this issue 9 months ago

the-rarbg commented 9 months ago

Is there any way we can use Postgres and assign a table to write to, so that the tracker writes fields like seeders, leechers, and completed into it, keyed by info hash?

Power2All commented 9 months ago

> Is there any way we can use Postgres and assign a table to write to, so that the tracker writes fields like seeders, leechers, and completed into it, keyed by info hash?

I've looked into that, but it's a pretty hefty performance hit on the server. Another solution, which I'm applying to my own new website, is using the API to retrieve the seeders and leechers statistics and show them directly on the website. If you really want this pushed to a database, I would rather make a separate "python" script or similar (it could even be a separate command in the tracker software) that requests data from the API using a list of hashes or IDs, without interfering with the tracker back-end code itself.
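
For illustration, a minimal Python sketch of that separate scraper script. The API base URL, endpoint path, token parameter, and response field names here are assumptions for the sketch, not the actual torrust-actix API; check the project's API documentation for the real routes.

```python
import requests

TRACKER_API = "http://127.0.0.1:8080/api"  # assumed base URL, not a documented default
API_TOKEN = "changeme"                     # assumed auth token parameter

def fetch_stats(info_hashes):
    """Yield (info_hash, seeders, leechers, completed) for each hash."""
    for info_hash in info_hashes:
        # Hypothetical endpoint: GET /api/torrent/<info_hash>?token=...
        resp = requests.get(
            f"{TRACKER_API}/torrent/{info_hash}",
            params={"token": API_TOKEN},
            timeout=10,
        )
        resp.raise_for_status()
        data = resp.json()
        yield (
            info_hash,
            data.get("seeders", 0),
            data.get("leechers", 0),
            data.get("completed", 0),
        )

if __name__ == "__main__":
    for row in fetch_stats(["<info_hash_1>", "<info_hash_2>"]):
        print(row)
```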

the-rarbg commented 9 months ago

It could be built in a way that persists data after some time, say every 1 or 5 hours, so we don't have to run a separate scraper script that fetches data for torrents whose stats haven't changed.

Or we could publish an event at a regular interval for each hash whose stats have changed.
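
A rough sketch of that idea in Python (all names here are illustrative, not part of the tracker): the announce path marks a hash as dirty whenever its stats change, and a timer flushes only the dirty hashes, so unchanged torrents never touch the database.

```python
import threading
import time

dirty = set()                  # info hashes whose stats changed since the last flush
dirty_lock = threading.Lock()

def mark_changed(info_hash):
    """Call from the announce path whenever seeders/leechers/completed change."""
    with dirty_lock:
        dirty.add(info_hash)

def flush_loop(persist, interval_seconds=3600):
    """Every interval, persist only the hashes that actually changed."""
    while True:
        time.sleep(interval_seconds)
        with dirty_lock:
            batch = list(dirty)
            dirty.clear()
        for info_hash in batch:
            persist(info_hash)  # e.g. one UPDATE against Postgres
```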

the-rarbg commented 9 months ago

The reason being that we can give the user fresh info when they view a torrent, without having to create another process that scrapes data that hasn't changed.

Power2All commented 9 months ago

> The reason being that we can give the user fresh info when they view a torrent, without having to create another process that scrapes data that hasn't changed.

I'll look into adding an optional function like that. I've mainly focused on public tracking, so I'll have to see if I can make some time for that function. However, it could still be a performance hit on your database server if there are many changes, so keep that in mind.

the-rarbg commented 9 months ago

Yes, understandable. But suppose I have a million torrents in a table with columns info_hash, seeders, leechers, filename, and completed. To keep it up to date I would have to scrape all the data (a million hits on the tracker), since I might not know which info has changed even though it's in RAM, and then make a million hits on the database to update it.

But in this case we can set up a batch process that writes to the DB after, say, a day or two, in batches of, say, 10,000, at a specific time.

It's just a tradeoff.
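
A minimal sketch of that batch process, assuming a hypothetical `torrents` table with those columns and using psycopg2's `execute_values` to upsert in chunks of 10,000 instead of a million single-row statements:

```python
import psycopg2
from psycopg2.extras import execute_values

UPSERT_SQL = """
    INSERT INTO torrents (info_hash, seeders, leechers, completed)
    VALUES %s
    ON CONFLICT (info_hash) DO UPDATE
    SET seeders = EXCLUDED.seeders,
        leechers = EXCLUDED.leechers,
        completed = EXCLUDED.completed
"""

def flush_batches(rows, batch_size=10_000):
    """rows: iterable of (info_hash, seeders, leechers, completed) tuples."""
    conn = psycopg2.connect("dbname=tracker user=tracker")  # assumed DSN
    with conn, conn.cursor() as cur:
        buf = []
        for row in rows:
            buf.append(row)
            if len(buf) >= batch_size:
                execute_values(cur, UPSERT_SQL, buf, page_size=1000)
                buf.clear()
        if buf:
            execute_values(cur, UPSERT_SQL, buf, page_size=1000)
```

This keeps the database write load proportional to how many torrents actually changed per interval, rather than to announce traffic.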

the-rarbg commented 8 months ago

Any update on this? If this were in Redis instead of a disk-based DB, it would be much better, I think.

Power2All commented 8 months ago

> Any update on this? If this were in Redis instead of a disk-based DB, it would be much better, I think.

It's purely memory based, using a BTreeMap. I haven't gotten around to it, as I contracted the flu 2 days ago, so I'm kind of stuck with a headache, but I will work on it when I have the time for it. I would indeed rather use something like Redis or Meilisearch to update those statistics, but I'll still look into it.

Sorry for the delay.
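
For the Redis option, a sketch of what pushing per-torrent stats could look like; the `torrent:<info_hash>` key naming is an assumption, not anything the tracker defines.

```python
import redis

r = redis.Redis(host="127.0.0.1", port=6379)

def push_stats(info_hash, seeders, leechers, completed):
    """Store stats in a Redis hash under torrent:<info_hash>; O(1) per update."""
    r.hset(
        f"torrent:{info_hash}",
        mapping={
            "seeders": seeders,
            "leechers": leechers,
            "completed": completed,
        },
    )
```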

Power2All commented 4 months ago

I've been pondering this function, and I would rather implement Meilisearch for it, since it's a search engine and built to handle many changes like this. I've overloaded MySQL doing this exact thing, so doing it that way would be bad. I will create an implementation for writing to either Redis or Meilisearch, but it will probably be planned for a v3.3 release. v3.2.2 is being worked on right now with minor features and changes; the idea we want here is a pretty big change, so I'll plan it for a v3.3.x release.

the-rarbg commented 3 months ago

Makes sense. You could also use RediSearch; it's good.
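
Building on the RediSearch suggestion, a sketch of indexing the same `torrent:*` hashes so a website can query and sort by stats. This assumes a Redis server with the RediSearch module loaded and redis-py 4.x-style module paths; the index name and fields are illustrative.

```python
import redis
from redis.commands.search.field import NumericField
from redis.commands.search.indexDefinition import IndexDefinition, IndexType
from redis.commands.search.query import Query

r = redis.Redis(host="127.0.0.1", port=6379)

# Create an index over the torrent:* hashes; ignore the error if it exists.
try:
    r.ft("idx:torrents").create_index(
        (
            NumericField("seeders", sortable=True),
            NumericField("leechers", sortable=True),
            NumericField("completed", sortable=True),
        ),
        definition=IndexDefinition(prefix=["torrent:"], index_type=IndexType.HASH),
    )
except redis.exceptions.ResponseError:
    pass  # index already exists

# Example query: the 10 best-seeded torrents.
top = r.ft("idx:torrents").search(
    Query("*").sort_by("seeders", asc=False).paging(0, 10)
)
print([doc.id for doc in top.docs])
```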