I think that all this data is available in the step function in which we convert marker location into model values:
We should be able to log everything needed there
This step function runs at 60 Hz (ideally).
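For illustration only (a minimal sketch with hypothetical names, not the sim's actual code), logging from that step function could look like:

// Minimal sketch only; markerToModelValue and the other names here are hypothetical.
const loggedRows = [];

// Stand-in for the sim's conversion of a marker location into a model value.
function markerToModelValue( markerPosition ) {
  return markerPosition.y;
}

// Called ~60 times per second by the animation loop (dt in seconds).
function step( dt, markerPosition ) {
  loggedRows.push( {
    time: Date.now(), // epoch milliseconds, for later synchronization
    dt: dt,           // ~1/60 s at the ideal rate
    modelValue: markerToModelValue( markerPosition )
  } );
}

step( 1 / 60, { x: 0.2, y: 0.6 } ); // example call, as the loop would make it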
I have written code before that spits out a CSV, and could add an options dialog entry to do that. My guess is that it would take an hour to get a one-off version snapshot that could do this for you. I will hold off until our meeting though.
From today's meeting, I will try to get something to @BLFiedler for him to poke around with by the end of next week.
I'll be talking to EM tomorrow (8/18) and can confirm if anything here should be prioritized for the short-term.
Today @BLFiedler said we can "push this out for a while." Marking as deferred.
This is being revived for this quarter. Reassigning to put on radar, but we may want to move it to a new issue that discusses the specific implementation and/or PhET-iO usage.
@BLFiedler, would you please put any deadlines for the next quarter or so, even general ones, into this issue?
Looking like week of March 9th for a usable prototype.
Sounds good! Looks like there are about 3 weeks left before this deadline. I will get started now on an MVP.
After instrumenting the sim over in https://github.com/phetsims/ratio-and-proportion/issues/351, here is an example of the data stream as you change one of the hands.
The model.targetRatioProperty will be in the data stream whenever it changes. Its initial value is given (along with everything else) in the initialState event that is emitted on startup. You can play with it further by looking at the console output; a bit nicer and colorized: https://phet-dev.colorado.edu/html/ratio-and-proportion/1.1.0-dev.2/phet-io/ratio-and-proportion_all_phet-io.html?postMessageOnError&phetioStandalone&phetioConsoleLog=colorized&phetioEmitHighFrequencyEvents=false
I'll head over to https://github.com/phetsims/tangible/issues/4 now.
https://github.com/phetsims/tangible/issues/4 is going to take a bit more work. I'll keep you posted.
Here is my work in progress:
I have been able to commit work for this issue, over in https://github.com/phetsims/tangible/issues/4.
Master now supports marker input.
To use it, add the ?tangible query parameter. You can use this version to test: https://phet-dev.colorado.edu/html/ratio-and-proportion/1.1.0-dev.4/phet/ratio-and-proportion_en_phet.html?tangible
Good luck!
Okay, it magically started working (edit: magic = effort by MK). I did manage to get detection and movement in 1.1.0-dev.5 (!), though I did not try going back to dev.4. It definitely loses the marker quite frequently when in motion, and messing with configs did not seem to help on the webcam demo.
To catch the thread up: @BLFiedler and I both found this morning that marker input wasn't working on 1.1.0-dev.4, and I created https://github.com/project-beholder/beholder-detection/issues/4 as a best guess at what the problem is. I'm glad it is working for you though!
@BLFiedler, here is an updated version that should track much better than it did before.
Still to do for this issue:
There is a ?heightInPixels=X query parameter, which can help calibrate the height, but it is non-trivial to do so. Ideally we want to work out the best way to tell remote study participants to calibrate their system in an easy way. It is highly recommended to use https://bayes.colorado.edu/dev/phettest/tangible/demo.html to calibrate your system, since it has a video for you to use. Perhaps we will give this out to participants too if we can't embed the video into the sim in time.
Take a look on master, or with something like the ?heightInPixels=X query parameter mentioned above.
Quick answers:
Primarily though, given our desire for some data, it might be good (and good enough) to consider how to get data export for RaP (hand position, toggle states, success states, sound states, etc.) through PhET-iO, so we can have folks play around on, say, a tablet and still get that data out.
From design meeting today, @BLFiedler will take a look at the actual items that he wants in the data stream.
To test master with tangible: https://bayes.colorado.edu/dev/phettest/ratio-and-proportion/ratio-and-proportion_en.html?brand=phet&tangible
It is highly recommended to have the demo open in a different tab so you can check whether the markers are working: https://bayes.colorado.edu/dev/phettest/tangible/demo.html
And to test the PhET-iO data stream on master:
Current marker numbers:
Alright - these are probably not exact, and in most cases I think we care about the "newValue" for each, but we'd want an output of the following:
If possible, it might also be nice to have a reference for the thresholds as well, so we can recreate what sounds are being played when? Might be fine just to have the values, but it may be important to log dynamically when they change? Not sure what the value of that would be.
Nope, forgot to assign. Sorry! @'ing you, @zepumph.
Version needed by next Friday, so I should get a working, recording version by this Friday.
If possible, it might also be nice to have a reference for the thresholds as well, so we can recreate what sounds are being played when?
Would there be any values that aren't included in this set of constants?
So long as those values are all we need, here is how you access everything you asked for. In general, any phetioID below that is on the discoverScreen is also on the createScreen. Unless otherwise stated, the event will have a key phetioID which will match the provided phetioID when a change occurs.
time: included in each event, like this:
{
  "index": 36,
  "time": 1619815786464,
  "type": "model",
  "phetioID": "phetioEngine.phetioElementAddedEmitter",
  "name": "emitted"
}
targetRatioProperty: ratioAndProportion.discoverScreen.model.targetRatioProperty
tupleProperty (antecedent and consequent): ratioAndProportion.discoverScreen.model.ratio.tupleProperty
unclampedFitnessProperty (the fitness value being used by the model): ratioAndProportion.discoverScreen.model.unclampedFitnessProperty
tickMarkViewProperty (NONE, VISIBLE, or VISIBLE_WITH_TEXT): ratioAndProportion.discoverScreen.view.tickMarkViewProperty
Screen ID (not sure what the value of this was from the console): ratioAndProportion.general.model.screenProperty, which has values of the screen phetioID, like ratioAndProportion.discoverScreen
lockedProperty (if on Create screen): ratioAndProportion.createScreen.model.ratio.lockedProperty
tickMarkRangeProperty (if on Create screen): ratioAndProportion.createScreen.view.tickMarkRangeProperty
constant values: look for phetioEngine.phetioElementAddedEmitter emitting an event where data.phetioID is "ratioAndProportion.global.model.rapConstants". Then data.state will be the constant values on startup.
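For example, here is a hypothetical Node sketch that filters a saved console log down to a few of the phetioIDs above (the filename and the one-JSON-event-per-line format are assumptions; adjust to your capture):

const fs = require( 'fs' );

// phetioIDs we care about, from the list above.
const WANTED = [
  'ratioAndProportion.discoverScreen.model.targetRatioProperty',
  'ratioAndProportion.discoverScreen.model.ratio.tupleProperty',
  'ratioAndProportion.discoverScreen.model.unclampedFitnessProperty'
];

const rows = fs.readFileSync( 'events.json.log', 'utf8' )
  .split( '\n' )
  .filter( line => line.trim().length > 0 )
  .map( line => JSON.parse( line ) )
  .filter( event => WANTED.includes( event.phetioID ) )
  .map( event => ( {
    time: event.time,
    phetioID: event.phetioID,
    newValue: event.data && event.data.newValue // change events carry a newValue
  } ) );

console.log( rows );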
We can also capture a complete phetioState every so often (for other data studies we often do every second or so). This way we would have a snapshot of all these values every second in addition. Would that be helpful?
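A rough sketch of that idea (getCompleteState is a hypothetical stand-in, not the actual PhET-iO API):

// Hypothetical stand-in for however the complete phetioState is obtained.
function getCompleteState() {
  return {}; // ...all instrumented values...
}

const snapshots = [];
setInterval( () => {
  snapshots.push( { time: Date.now(), state: getCompleteState() } );
}, 1000 ); // once per second, as we often do for data studies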
You can look at the current values and customize the above phetioIDs by playing with studio here: https://phet-dev.colorado.edu/html/ratio-and-proportion/1.1.0-dev.11/phet-io/wrappers/studio/
Then from there I can create a link to record with. I am presuming that we can use phet-dev to do the study.
https://phet-io.colorado.edu/devguide/ can be a good guide for general PhET-iO understanding.
The ?phetioConsoleLog=json query parameter outputs the data stream as plain JSON in the console.
Add instrumentation for movingInDirectionProperty and inProportionProperty
I created a parsing engine to give @BLFiedler a table-like format to interpret. I started with a test:
I started with a small data log in which I manipulated every control in the sim:
ration-and-proportion-test.json.log
Then I wrote a Node script to turn this into a format that the R package jsonlite can stream in (just remove the .txt suffix).
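Roughly, the conversion could look like this (a hypothetical sketch, not the actual script; it assumes the raw log parses as a JSON array of events, since jsonlite's stream_in() wants one JSON object per line):

const fs = require( 'fs' );

// Hypothetical sketch: read the raw log as a JSON array of events and
// rewrite it as NDJSON (one JSON object per line) for jsonlite's stream_in().
const events = JSON.parse( fs.readFileSync( 'ration-and-proportion-test.json.log', 'utf8' ) );
fs.writeFileSync( 'parsedJSONOutput.json', events.map( e => JSON.stringify( e ) ).join( '\n' ) );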
Then I wrote a small R script to make it into a csv:
# Requires the jsonlite package: install.packages("jsonlite")
library(jsonlite)

# Trivial example: stream in a raw log (one JSON object per line)
mydata <- stream_in(file("test.json"))
print(mydata)

# Stream in the parsed output from the Node script and write it out as a CSV
test1 <- stream_in(file("parsedJSONOutput.json"))
print(test1)
write.csv(test1, "data.csv", row.names = FALSE)
It created a csv like this:
@BLFiedler, I hope that this helps get you started.
Great! At first glance, it looks good to me. I think the data parsing is probably something we can refine later if I'm having any trouble, but for now let's go ahead and get a test link I can try out before deployment on Friday.
Just a couple minutes out on that!
For the purposes of this study, this is complete and I will close. Further exploration of movement related data output can happen in new issues.
Note: This issue is not tied to RaP's publication; it's just the most relevant sim we have currently in design. For use during the meeting scheduled for 8/17.
We'd like to be able to extract positional data for sim elements (e.g., draggables), positional data of related tangibles (e.g., fiducial markers [mechamarkers]), or other sim state/trigger information (e.g., goal states, auditory events, etc.) for further statistical and qualitative analysis.
Frequency of data would be tied to whatever the limits of each mode are (e.g., mechamarkers might be 50-60 Hz). The ultimate result requires synchronization across all obtained data. There are some synchronization issues that may be addressed at this stage (universal timestamps?), but much of it may need to happen in post-processing; see the sketch below.
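As a sketch of that post-processing idea (hypothetical; it assumes both streams carry a universal epoch-millisecond timestamp in a time field):

// Merge two timestamped streams (e.g., sim events and marker samples)
// into one timeline by their universal epoch-millisecond timestamps.
function mergeByTime( streamA, streamB ) {
  return streamA.concat( streamB ).sort( ( a, b ) => a.time - b.time );
}

// Made-up example data:
const simEvents = [ { time: 1619815786464, source: 'sim' } ];
const markerSamples = [ { time: 1619815786450, source: 'marker', x: 0.2, y: 0.6 } ];
console.log( mergeByTime( simEvents, markerSamples ) );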
I don't have too much more insight into how exactly we get this information, but ideally the end product that we do our analysis on looks something like this (I imagine this is a bit simplified, or there are better alternatives for the specified units, but hopefully it gets the idea across):