osrf / handsim

HAPTIX Simulation Utilities

Support video showing experiment where we use phantom objects #85

Closed: osrf-migration closed this issue 9 years ago

osrf-migration commented 9 years ago

Original report (archived issue) by David Kluger (Bitbucket: DKluger).

The original report had attachments: P201302_combined stim and record_with phone audio_with face block_with captions_with ref_v20131230_1200_dp.mp4


I apologize if this is not the correct place for me to put this, but I couldn't find a better option on the handsim Bitbucket page.

For the phantom object function, I have attached a video of one of our experimental sessions using our old virtual prosthetic limb to show you what we are talking about. In this experiment, the volunteer is not looking at the screen (though he does look in other trials), and he is instructed to flex the virtual fingers to a close or a far target. Our motor decode interprets his finger location, and stimulation is applied, invoking a sensory perception, when his fingers are within the targets. When all fingers are in the correct location, the targets change from red to green, and an audible beep is heard once all fingers have been in the correct location for ~1 second. The volunteer then indicates whether he thinks the targets are close or far. We would like to recreate this experiment in the new VREs.

Note that the targets do not impede the motion of the robot's fingers at all, which is why we want the "phantom" objects. If implementing the overlap function is not feasible, there is a workaround we can use: an hx_read_sensors() call inside an if statement to detect when the motors are within a small range of target positions, with a target sphere placed roughly where that range puts the fingertip. That sphere would have collisions turned off, but would change color when the finger motor is within the range of target positions. This workaround does introduce some error between where the target is located in virtual space and which finger motor positions actually put the fingertip inside the target. This is why it would be nice, but not 100% necessary, to be able to precisely determine when the robot's fingers are overlapping with a target.
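
For reference, here is a minimal sketch of that workaround, assuming the haptix-comm C client API (hx_connect, hx_read_sensors, and the motor_pos field of hxSensor). The motor index, target range, and dwell count are made-up placeholders, and the actual recoloring of the non-colliding target sphere is experiment-specific, so it is only indicated in a comment:

```c
#include <stdio.h>
#include <unistd.h>
#include <haptix/comm/haptix.h>

/* Placeholder values: which motor to watch and what counts as "in the target". */
#define FINGER_MOTOR  2      /* hypothetical index of the monitored finger motor */
#define TARGET_MIN    0.8f   /* lower bound of the target position range */
#define TARGET_MAX    1.0f   /* upper bound of the target position range */
#define DWELL_STEPS   100    /* ~1 second at a 10 ms polling period */

int main(void)
{
  hxSensor sensor;
  int stepsInTarget = 0;

  /* Connect to the simulated limb; NULL/0 uses the default host and port. */
  if (hx_connect(NULL, 0) != hxOK)
  {
    fprintf(stderr, "hx_connect failed\n");
    return -1;
  }

  while (1)
  {
    /* Read the current motor state without sending a command. */
    if (hx_read_sensors(&sensor) != hxOK)
      continue;

    float pos = sensor.motor_pos[FINGER_MOTOR];

    if (pos >= TARGET_MIN && pos <= TARGET_MAX)
    {
      /* In range: this is where the experiment code would turn the
         non-colliding target sphere green (not shown here). */
      if (++stepsInTarget == DWELL_STEPS)
      {
        printf("Held the target for ~1 s: play the audible beep here.\n");
        break;
      }
    }
    else
    {
      /* Out of range: reset the dwell counter (and turn the target red again). */
      stepsInTarget = 0;
    }

    usleep(10000);  /* poll at roughly 100 Hz */
  }

  hx_close();
  return 0;
}
```

The position error mentioned above comes from the fact that this checks motor positions rather than the fingertip's actual location in the scene, which is why a true overlap query would still be preferable.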

osrf-migration commented 9 years ago

Original comment by Jackie K (Bitbucket: jacquelinekay).


Thanks for sharing the video! It's always productive for us to learn more about how teams will set up their experiments.

The Bitbucket page is for tracking issues with the software. Creating a new issue just to share information, rather than report an issue, could clutter our issue tracker. In the future, you can email haptix@osrfoundation.org to contact the OSRF developers on the HAPTIX project. Email is probably a more appropriate venue for this kind of discussion.

osrf-migration commented 9 years ago

Original comment by Jackie K (Bitbucket: jacquelinekay).


Not an issue, just discussion