
FreemooVR

FreemooVR - a standalone, hackable, versatile, composable, perspective-correct virtual reality engine for freely moving animals

FreemooVR is a virtual reality engine built on OpenSceneGraph. It supports arbitrary projection geometry and calibration methods for use in many scientific studies. It was described in the following paper:

John R Stowers*, Maximilian Hofbauer*, Renaud Bastien, Johannes Griessner⁑, Peter Higgins⁑,
Sarfarazhussain Farooqui⁑, Ruth M Fischer, Karin Nowikovsky, Wulf Haubensak, Iain D Couzin,
Kristin Tessmar-Raible✎, Andrew D Straw✎.
Virtual reality for freely moving animals. Nature Methods 2017. DOI: 10.1038/nmeth.4399

FreemooVR is a fork of, and the successor to, freemovr. It removes all ROS dependencies, improves support for manipulating and displaying OSG files, adds an extra 'o' back to the name to enable the awesome cow logo, and makes the software a simpler and more extensible base upon which to build custom VR setups.

FreemooVR has a remote IPC interface (ZMQ) and a Python API, which allow it to be used to build custom VR assays and experiments. Virtual environments are designed in Blender and loaded into FreemooVR in .osg format.

Calibration tooling was also consolidated around arbitrary geometry display models and integrated into the repository.

Installation (Ubuntu 18.04)

Display server

Python interface

Theory of operation

A moving observer has a pose within a global coordinate frame. Objects within the global frame may also move or be updated (e.g. a moving grating). Six camera views with a fixed relationship to the observer are used to build a cube map, showing the scene surrounding the observer without regard to the projection surface.

This cube map is then projected onto a 3D shape model of the display surface. From there, this image is warped to the physical display output.
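For intuition, the per-pixel warp can be sketched as follows. This is only a conceptual illustration: the real pipeline runs on the GPU via OpenSceneGraph, and `pixel_to_surface` / `sample_cubemap` are hypothetical stand-ins for the calibration data and the rendered cube map.

```python
import numpy as np

def view_direction(observer_pos, surface_point):
    """Unit vector from the observer to a point on the display surface."""
    d = np.asarray(surface_point, float) - np.asarray(observer_pos, float)
    return d / np.linalg.norm(d)

def warp_to_display(observer_pos, pixel_to_surface, sample_cubemap, width, height):
    """For each display pixel: calibrated 3D surface point -> view direction -> cube map colour."""
    image = np.zeros((height, width, 3))
    for y in range(height):
        for x in range(width):
            p = pixel_to_surface(x, y)  # 3D point on the display surface, or None if unlit
            if p is not None:
                image[y, x] = sample_cubemap(view_direction(observer_pos, p))
    return image
```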

Running FreemooVR

The single executable ./bin/display_server runs the VR 'display server' software.

The display server node runs locally on the computer(s) connected to the physical display. During a typical experiment, it will be running a stimulus plugin (typically StimulusOSG or StimulusOSG2).

A VR experiment updates one or more (see MultiServerBaseZMQ) display servers based on the observer's current position. Given the scenegraph and the calibrated screen layout, each display server computes the images shown on the projectors.
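For illustration, a minimal closed-loop update might look like the sketch below. ServerBaseZMQ and set_position() are part of the Python API mentioned above, but the import path, constructor arguments, exact set_position() signature and the get_observer_position() tracking callback used here are assumptions, not the definitive API.

```python
import time
from freemoovr import ServerBaseZMQ  # assumed import path

# Assumption: the client is constructed with the display server's ZMQ endpoint.
server = ServerBaseZMQ("tcp://127.0.0.1:5556")

def get_observer_position():
    """Hypothetical tracking callback returning the animal's (x, y, z) in world coordinates."""
    return (0.0, 0.0, 0.1)

# Push the observer's pose to the display server each frame; the display server
# then renders the calibrated projector images for that viewpoint.
while True:
    x, y, z = get_observer_position()
    server.set_position(x, y, z)  # exact argument form may differ (e.g. a single tuple)
    time.sleep(1 / 120.0)         # update rate depends on the tracking system
```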

Other tips

Developing and Testing

The specific details of tracking and calibration, and any additional coordinate systems they must share, are defined by, and are a function of, the downstream software using FreemooVR; for example:

Nevertheless, the coordinate system of the OSG file and what is sent with ServerBaseZMQ.set_position() are internally consistent (see test_coord_system.py).

Showing OSG Files

There are two stimulus plugins for showing OSG files, StimulusOSG and StimulusOSG2. The two have slightly different features and limitations, stemming from how Blender/OSG represent animations; which one you should use depends on what you are doing in your OSG file.

When Blender exports an animation into an OSG file, it bakes the coordinates in.

This means that a node whose animation has been started renders incorrectly when moved in StimulusOSG. This is fixed in StimulusOSG2, which inserts a new root node into the scenegraph in a way that allows baked animations to be moved.

Conversely, if you have a very simple scene without animation, you should define all your virtual objects in one OSG file and use StimulusOSG to move/show/hide them individually. See example.

There are some additional features only available in StimulusOSG2, such as fading an object in or out. For an example of these, see multiple_objects_animation.

Glossary

Display Coordinates - the native pixel indices on a physical display. These are 2D.

World Coordinates - the 3D coordinates in lab space of physical (or simulated) points. (May also be represented as a 4D homogeneous vector x,y,z,w with nonzero w.)

Physical Display - a physical device capable of emitting a large, rectangular block of pixels. It has display coordinates - the 2D locations of each pixel. (A physical display does not have world coordinates used for the VR mathematics. A virtual display, on the other hand, does have world coordinates.)

Virtual Display - a model of a physical display which relates world coordinates to display coordinates. The model consists of a linear pinhole projection model, a non-linear warping model for lens distortions, a viewport used to clip valid display coordinates, the 3D display surface shape in world coordinates, and luminance masking/blending (see the projection sketch after this glossary). Note that a physical display can have multiple virtual displays, for example if a projector shines onto mirrors that effectively create multiple projections.

Viewport - vertices of a polygon defining the projection region in display coordinates (x0,y0,x1,y1,...). It is used to limit the region of the physical display used to illuminate a surface. (The FreemooVR Viewport corresponds to a 2D polygon onto which the image of the projection screen is shown.)

Display Surface - a physical, 2D manifold in 3D space which is illuminated by a physical display (either by projection or direct illumination like an LCD screen).
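To illustrate how the linear pinhole part of a virtual display relates world coordinates to display coordinates, here is a minimal sketch. The 3x4 matrix P is a made-up placeholder; a real virtual display additionally applies the non-linear lens warping, viewport clipping and masking/blending described above.

```python
import numpy as np

def world_to_display(P, world_xyz):
    """Project a 3D world point to 2D display coordinates with a 3x4 pinhole matrix P."""
    X = np.append(np.asarray(world_xyz, float), 1.0)  # homogeneous world point (x, y, z, 1)
    u, v, w = P @ X                                    # homogeneous display coordinates
    return np.array([u / w, v / w])                    # dehomogenize (assumes w != 0)

# Placeholder pinhole matrix: focal length 800 px, principal point at (512, 384).
P = np.array([[800.0,   0.0, 512.0, 0.0],
              [  0.0, 800.0, 384.0, 0.0],
              [  0.0,   0.0,   1.0, 0.0]])

print(world_to_display(P, (0.1, -0.05, 1.0)))  # -> display (pixel) coordinates
```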