bastianilso opened this issue 1 year ago
Today I managed to:
@aldsanms Great. Then I think what remains is:
1) showing the remaining possible movement in white (like in the mockup)
and maybe that is it for "head rotation v1" (the first version). In next versions we can make it interactive, but for this one maybe we should start with a static visualization.
When the code is ready, please create a pull request.
Today, I continued this issue. Here is a screenshot of the final render:
The code is written so that the style is easy to modify; we can discuss afterwards and I can adapt it to your preferences.
I posted my code on GitHub but couldn't make a pull request.
Here is a link to the GitHub directory: https://github.com/med-material/d3-rshiny-vis/tree/main/Head_orientation_stat/headMovementStat01
Moreover, it is now possible for us to visualize the player's field of vision during their game, based on the data in a CSV. Here is an example: https://github.com/med-material/d3-rshiny-vis/issues/2#issuecomment-1298574014
I think this issue can be closed.
Preprocessing step in R (dplyr) for wrapping the head orientation yaw from [0, 360) into (-180, 180]:
D <- D %>% mutate(
  HeadCameraRotEulerY_wrap = ifelse(HeadCameraRotEulerY > 180, HeadCameraRotEulerY - 360, HeadCameraRotEulerY)
)
Now: using the column HeadCameraRotEulerY, show how the user rotated their head left and right.