Closed muziqaz closed 3 weeks ago
The ability to specify date ranges (with time) for the Work Units section is missing. That would let you exclude previous WU testing runs. In my case, at 100 WUs a day, the poor web UI takes 8-22 seconds to resize the browser window with (23 days) x (100 WUs/day) displayed, zoomed to 30%:
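To make the request concrete, here is a minimal sketch of the kind of date-range (with time) filtering being asked for. The record fields (`assigned`, `tpf_s`) are illustrative assumptions, not the actual v8 client API:

```python
from datetime import datetime

# Hypothetical WU records; field names are illustrative, not the real v8 schema.
wus = [
    {"project": 12345, "assigned": datetime(2024, 5, 1, 9, 30), "tpf_s": 10},
    {"project": 12345, "assigned": datetime(2024, 5, 20, 14, 0), "tpf_s": 60},
]

def in_range(wu, start, end):
    """Keep only WUs assigned inside [start, end] -- date AND time, so a
    testing run earlier the same day can still be excluded."""
    return start <= wu["assigned"] <= end

start = datetime(2024, 5, 10, 0, 0)
end = datetime(2024, 5, 31, 23, 59)
recent = [wu for wu in wus if in_range(wu, start, end)]
print(len(recent))  # only the May 20 WU survives the filter
```

A filter like this, applied before the table is rendered, would also shrink the row count the UI has to lay out, which is the source of the 8-22 second resize delay described above.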
[Edited] An 'OS' column is missing from the upper Work Unit Averages section, so you can't really tell which PC ran the WUs. You would probably also want an option to segregate the results by 'Machine' name or by 'OS':
For reference, with all the columns turned on at 70% zoom, the Machines page looks like this for my setup:
Ah, so Averages are split per resource.
I also noticed a few inconsistencies; for example, on a mobile screen (iPhone 12) the table shifts beyond the edge of the screen.
I was not able to test all fields because the SAVE button on the settings page is always disabled.
Good input, but this should be split into multiple issues. The two main things I'm getting from this:
Note: if you click on the WU info icon, the WU details page now has a Logged Credits section. This is built from the same data as https://apps.foldingathome.org/wu.
> I also noticed a few inconsistencies; for example, on a mobile screen (iPhone 12) the table shifts beyond the edge of the screen.
>
> I was not able to test all fields because the SAVE button on the settings page is always disabled.
Please file separate GitHub issues with more details for these problems.
I'll split this into separate issues once I'm home.
I'm also looking at it now, and have this crazy suggestion: why not get rid of the Work Unit Averages section, keep Recent Work Units History, and rename it to Work Units History or something simpler?
Then have each project clickable to expand and list all the WUs (with RCGs) completed for that project, with all the specific info: whether it completed or failed, why it failed, etc.
And now I see Logged Credits, nice one.
I would like to exclude core 0xfe from the tables.
I know I am using a super-wide monitor as an example, but I believe the situation would be similar with 16:9 aspect-ratio monitors. I feel like the Work Units section needs its own width-scaling option. Or maybe columns could adjust automatically depending on the length of the content within them, similar to how Excel auto-sizes cell widths.

Work Unit Averages will not suffice for those of us who are benchmarking new projects for public release. When a project is pushed to internal testing, quite often the researcher (for one reason or another) has the WUs set to a certain step length, so in the beginning one frame takes 10s to complete. We try these initial WUs and suggest that the researcher increase the WU length by increasing the number of steps within the simulation. The researcher recreates the project under the same project number and pushes it to us again. We run it, and this time the TPF is 60s. The v8 Work Unit Averages records both WUs under the same project, so the average TPF comes out as 35s, which is incorrect: we want to ignore the initial WUs in our Base Credit suggestions and use only the 60s-TPF WUs. HFM allowed us to wipe old WUs from the benchmark section of the app, so that it would record only the newly regenerated project data and calculate averages from that alone. So, in v8 we would need an option to see the individual WUs completed under a project (and be able to remove unwanted ones) while still seeing the automatically calculated averages.

It would also help us (testers) a lot to have a column telling us whether a WU completed successfully or, if it failed, what the cause of the failure was. Sometimes a project runs stable, and then suddenly there is one gen or run which is unstable; with the above info we could point out to researchers that certain gens/runs are failing. That also helps with certain hardware failing on certain projects/WUs. That's where being able to see each WU completed under a project would help.

It would be easier on the eye, and probably helpful, if Work Unit Averages and Recent Work Unit History had similar column selections. For example, Work Unit Averages has a Resources column which shows which CPU or GPU completed the project, while Recent History shows Machine. I am also wondering how that Resources column will look once I start adding the rest of my machines to be monitored. Since every machine will have a different CPU set-up and different GPUs, will they just be listed in that column one after another? Either way, an option to see each individual WU under a project would sort that out.

Also, the screenshot shows the PPD column displaying PPD values at random.

You have this little gem available to us: https://apps.foldingathome.org/wu. It would be super nice to have it incorporated into the Work Units section. When checking completed/failed WUs under a certain project, we could click a link next to each WU that takes us to the page above with the WU's RCG already filled in; that way we can quickly check whether a problematic WU has been completed by others. I think this link would be helpful just next to failed WUs, since we don't really need to check up on successfully completed ones.

And, as I mentioned elsewhere, the Work Unit Averages TPF column needs more precise TPF values.

Sorry for the long write-up.
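The averaging problem described above can be shown with a toy sketch (the numbers match the 10s/60s example; field names are illustrative): a plain mean over every WU recorded under the project number blends the pre-regeneration frames with the post-regeneration ones, while filtering to the latest batch gives the figure actually wanted for Base Credit suggestions.

```python
# Illustrative TPF samples for one project number: two early 10s-per-frame
# test WUs, then WUs from the regenerated project at 60s per frame.
wus = [
    {"tpf_s": 10, "batch": "initial"},
    {"tpf_s": 10, "batch": "initial"},
    {"tpf_s": 60, "batch": "regenerated"},
    {"tpf_s": 60, "batch": "regenerated"},
]

# Averaging everything under the project number, as v8 does today:
naive_avg = sum(w["tpf_s"] for w in wus) / len(wus)       # (10+10+60+60)/4 = 35.0

# Dropping the stale initial WUs first, as HFM's benchmark wipe allowed:
kept = [w for w in wus if w["batch"] == "regenerated"]
filtered_avg = sum(w["tpf_s"] for w in kept) / len(kept)  # 60.0

print(naive_avg, filtered_avg)
```

Per-WU visibility plus the ability to remove individual rows would make the filtered figure achievable without losing the automatically calculated average.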