this seems to be mainly an issue of falling behind after the records have grown considerably... the Chrome tab actually gets behind where the SaltyBet live match is at.
resetting the Chrome tab after a number of failed bets would remedy this
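something like this could automate that from the extension's background page (a rough sketch only; the failure counter and threshold are invented for illustration, not actual Saltbot code):

```typescript
// Hypothetical sketch: reload the SaltyBet tab once too many bets fail.
// `failedBets` and MAX_FAILED_BETS are illustrative names, not real Saltbot code.
const MAX_FAILED_BETS = 3;
let failedBets = 0;

function onBetMissed(): void {
    failedBets += 1;
    if (failedBets >= MAX_FAILED_BETS) {
        failedBets = 0;
        // Find the SaltyBet tab and reload it so it catches back up to the live match.
        chrome.tabs.query({ url: "*://*.saltybet.com/*" }, (tabs) => {
            if (tabs.length > 0 && tabs[0].id !== undefined) {
                chrome.tabs.reload(tabs[0].id);
            }
        });
    }
}
```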
I am a bit surprised character lookup is slow, since it is now a constant-time hash lookup. It is likely that Chrome has to load the whole character stats list at once (for a new match) and then doesn't keep any of it cached to speed up the next lookup. From what I've seen in the code, the character data is huge and doesn't use Chrome's internal database storage, but something else not designed for big data.
Given that we know each character's record is built from that character's last 15 matches (in the main branch?), there are diminishing returns to keeping very old match data around to feed the character stats list.
If the performance is too slow for you, a workaround is to go into the match data and delete the oldest range of time until you are happy with the speed. This removes older characters you have rarely seen recently (which shrinks the individual character records) and, by matchmaking chances, keeps the regular characters' records fresh. Again, only the last 15 matches of each character are remembered.
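Something like the following sketch is the idea, assuming each stored record carries a numeric timestamp (which may not match Saltbot's actual record format):

```typescript
// Sketch: drop match records older than a cutoff, assuming each record
// has a numeric `timestamp` field (an assumption, not confirmed Saltbot code).
interface MatchRecord {
    timestamp: number;
    // ...other fields
}

function trimOldRecords(records: MatchRecord[], maxAgeDays: number): MatchRecord[] {
    const cutoff = Date.now() - maxAgeDays * 24 * 60 * 60 * 1000;
    return records.filter((record) => record.timestamp >= cutoff);
}
```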
hmm
222k records and counting... we need to figure out how to get the records under control.
not everyone has a supercomputer running this.
conversely, a beneficial side effect of the massive records is that matches that are easier to look up, because the outcome is obvious... get entered faster... which is artificially inflating the consistency of correct bets
I think a solution to this problem is to use IndexedDB instead of the local chrome storage.
https://developer.mozilla.org/en-US/docs/Web/API/IndexedDB_API/Using_IndexedDB
It's not a full-blown database, but it would certainly be faster to add just one record after each match instead of replacing the existing array of 222,000 entries with an array of 222,001.
The existing match data in the local chrome storage could be imported into the new IndexedDB database when the extension is installed.
Mozilla says that IndexedDB can also be used in Web Workers, so the lock-up while calculating new chromosomes could also be fixed once everything is migrated to IndexedDB.
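Conceptually, the per-match write would look something like this (a rough sketch; the database and store names are placeholders, not actual Saltbot code):

```typescript
// Sketch: append a single match record to an IndexedDB object store
// instead of rewriting the whole array. "saltbot" and "matches" are placeholders.
function addMatchRecord(record: object): void {
    const request = indexedDB.open("saltbot", 1);
    request.onupgradeneeded = () => {
        // autoIncrement gives each record its own key, so inserts stay O(1).
        request.result.createObjectStore("matches", { autoIncrement: true });
    };
    request.onsuccess = () => {
        const db = request.result;
        const tx = db.transaction("matches", "readwrite");
        tx.objectStore("matches").add(record);
        tx.oncomplete = () => db.close();
    };
}
```

Because each record gets its own auto-incremented key, an insert touches only that one entry instead of serializing the whole array again.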
I'll be honest, I probably won't begin working on this issue until the middle of August since I have to finish the first draft of my Master's thesis by then.
I might have another idea. Why not create a second storage set with aggregated data for all characters? That would give us VERY quick access to fully averaged stats and match records, and we could update it alongside the match recording after a single long initialization that imports the data from the existing records. We already have everything worked out to compute average win times etc.; we just need to make a storage set for all that data and re-derive the averages when new data comes in, by multiplying them out by the match total and then accounting for the new data.
Maybe some format like
Char, win record (e.g. AAAAUABB), loss record, avg odds, avg win time, avg loss time, other relevant data
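The update step would be roughly this (a sketch; the field and function names are invented for illustration):

```typescript
// Sketch: update a character's aggregated stats in place after one new win,
// using a running average instead of recomputing from all records.
// Field and function names here are invented for illustration.
interface CharacterStats {
    matches: number;
    wins: number;
    avgOdds: number;
    avgWinTime: number;
}

function recordWin(stats: CharacterStats, odds: number, winTime: number): void {
    // "Multiply out" each old average by its count, add the new value, re-divide.
    stats.avgOdds = (stats.avgOdds * stats.matches + odds) / (stats.matches + 1);
    stats.avgWinTime = (stats.avgWinTime * stats.wins + winTime) / (stats.wins + 1);
    stats.matches += 1;
    stats.wins += 1;
}
```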
still ongoing, even after upgrading to 8 GB of RAM
I'm experiencing this now as well, except I'd say most bets after the first never make it for me. I guess my dataset reached the same out-of-control breaking point as calexil's. It also took over 2 minutes per generation the last time I updated the genetic weights.
I suppose I could trim my dataset, but I hate to do that since having a ton of data seems to be what makes the bot more accurate. I see there was a "trim (or ignore or not store) 0-time matches and matches over a certain age" option mentioned -- did that ever get implemented, or is it still in possible development?
still in the works after the refactoring @PapaNovember
I've been super busy with streaming, and recon is reeling from his thesis
we'll be getting back on it in the spring probably
the memory usage and CPU usage are out of hand, we really need to do something
I've got an update for you.
During the summer when I started with my Master's thesis, I lost all motivation for working on Saltbot.
Today I've found new motivation by trying to learn TypeScript using Saltbot. I've converted all JavaScript files to TypeScript and I'm currently working on making it run again.
While this may not provide the urgently needed performance improvements for you, TypeScript helps me immensely with its typing support, because Visual Studio can now immediately tell me about errors in the code.
As soon as I get Saltbot running again on my computer, I will try to implement the change from chrome.storage.local to IndexedDB.
Just a warning beforehand: because of the giant pull request from August, there were many code changes in strategy.js, including the introduction of a new chromosome part. I don't fully understand all the requested code changes yet, but decided to merge many parts anyway.
Saltbot may behave strangely after I push the commit.
0e95b6c4f54b6d1c54da79ce81ee18df6cfbe7b6 has the change from chrome.storage.local to IndexedDB for match records.
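For anyone curious, the core of that migration presumably looks something like this simplified sketch (not the actual commit code; the storage key and store names are placeholders):

```typescript
// Simplified sketch of the one-time migration: read the old record array
// from chrome.storage.local and bulk-insert it into IndexedDB.
// "records_v1", "saltbot", and "matches" are placeholder names.
function migrateRecords(): void {
    chrome.storage.local.get("records_v1", (items) => {
        const records: object[] = items["records_v1"] ?? [];
        const request = indexedDB.open("saltbot", 1);
        request.onupgradeneeded = () => {
            request.result.createObjectStore("matches", { autoIncrement: true });
        };
        request.onsuccess = () => {
            const db = request.result;
            const tx = db.transaction("matches", "readwrite");
            const store = tx.objectStore("matches");
            for (const record of records) {
                store.add(record);
            }
            tx.oncomplete = () => db.close();
        };
    });
}
```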
Because of the change to TypeScript, you need to compile the code using `npm install` and `npm run build`, as documented in the README file.
@calexil Are you willing to test this version? There may be hidden bugs, but on my side, match processing now works lightning fast.
@reconman I'll test it today and report back
hmm, the match data says it's imported, but nothing is showing up in the table.
also there's a warning in the console:
`document not ready yet, trying again in 500 milliseconds...`
glad I made a backup, because it's totally not working anymore
the records and chromo never import
I am going to try and let the import happen while in monk mode, to see if that's the issue.
nope, the data says it's imported, but it is not
RESOLVED, bets are placed lightning fast now
on a Celeron 1.5 GHz dual-core system with 4 GB of RAM, the bot cannot process the data fast enough to make a prediction, place the bet, and submit it before the timer runs out, approximately 50% of the time
the first bet it makes is extremely fast, like 4 seconds... every bet after that takes over 45 seconds to formulate and enter, and appears to be held up by the recording of the match record
@reconman are the records inserted with some kind of sorting... or just most recent at the bottom?
from what I can tell it's the latter, but that doesn't explain why it takes like 25 seconds to record the data...
it's a single line of text... like 100 bytes max ... :confused: