SMITE is a toolbox for using eye trackers from SMI GmbH with Python, specifically offering integration with PsychoPy. A Matlab version that integrates with Psychtoolbox is also available from https://github.com/dcnieho/SMITE.
For questions, bug reports or to check for updates, please visit www.github.com/marcus-nystrom/SMITE.
SMITE is licensed under the Creative Commons Attribution 4.0 (CC BY 4.0) license.
`demos/read_me.py` shows a minimal example of using the toolbox's functionality.
Tested on Windows using PsychoPy with Python 2.7. Also tested with PsychoPy3 (Python 3.6), but see the issues below.
If you know what you are doing, install SMITE using `pip install py_smite` or `python -m pip install py_smite`.
If you use a standalone PsychoPy installation, do the following steps:

1. Navigate to `C:\Program Files\PsychoPy` (or wherever you installed PsychoPy) and open a command prompt in the same folder as where you find `python.exe` (this should be the main PsychoPy install folder). The command prompt should then start with something like `C:\Program Files\PsychoPy>`.
2. Run `python -m pip install py_smite --upgrade`.
Then run `read_me.py` from the 'examples' folder. Reading through `read_me.py` should provide a good starting point for most users of SMITE.
As demonstrated in the demo scripts, the toolbox is configured through the following interface:

`settings = SMITE.get_defaults('trackerName')`
Supported tracker names are `HiSpeed`, `RED`, `REDm`, `RED250mobile`, `REDn_Scientific`, and `REDn_Professional`.
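A typical pattern is to fetch the defaults for your tracker model, adjust any fields you need, and then hand the settings object to the toolbox when connecting. The sketch below illustrates this; note that the module import name and the `SMITE.SMITE(settings)` constructor are assumptions, not confirmed API — see `demos/read_me.py` for the exact names in your installation.

```python
# Sketch: configure the toolbox before connecting.
# ASSUMPTIONS: the import name ('SMITE') and the constructor name
# ('SMITE.SMITE') are illustrative only -- check demos/read_me.py.
try:
    import SMITE
except ImportError:
    SMITE = None  # toolbox not installed; this remains a sketch

def make_tracker(tracker_name='REDm'):
    """Fetch default settings for a supported tracker model,
    optionally tweak them, and create a SMITE instance."""
    settings = SMITE.get_defaults(tracker_name)
    # ...modify fields on `settings` here, before connecting...
    return SMITE.SMITE(settings)  # hypothetical constructor name
```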
The following method calls are available on a SMITE instance:
| Call | Inputs | Outputs | Description |
|---|---|---|---|
| `get_options()` | | | Get settings |
| `init()` | | | Connect to the SMI eye tracker and initialize it according to the requested settings |
| `is_connected()` | | | Report status of the connection to the eye tracker |
| `calibrate()` | | | Do participant setup, calibration and validation |
| `start_recording()` | | | Start recording eye-movement data to idf file |
| `start_buffer()` | | | Start recording eye-movement data into buffer for online use |
| `send_message()` | | | Insert message into idf file |
| `get_latest_sample()` | | sample: struct array | Get most recent data sample |
| `consume_buffer_data()` | | list of samples | Get data from the online buffer. The returned samples are removed from the buffer |
| `peek_buffer_data()` | | list of samples | Get data from the online buffer. The returned samples remain in the buffer |
| `stop_buffer()` | | | Stop recording data into buffer |
| `stop_recording()` | | | Stop recording data into idf file |
| `save_data()` | | | Save idf file to specified location |
| `de_init()` | | | Close connection to the eye tracker and clean up |
| `set_begaze_trial_image()` | | | Put specially prepared message in idf file to notify BeGaze what stimulus image/video belongs to a trial |
| `set_begaze_key_press()` | | | Put specially prepared message in idf file that shows up as a keypress in BeGaze |
| `set_begaze_mouse_click()` | | | Put specially prepared message in idf file that shows up as a mouse click in BeGaze |
| `start_eye_image_recording()` | | | Start recording eye images to file. Not supported on `RED250mobile`, `REDn_Scientific`, and `REDn_Professional` |
| `stop_eye_image_recording()` | | | Stop recording eye images to file |
| `set_dummy_mode()` | | | Enable dummy mode, which allows running the program without an eye tracker connected |
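To tie the calls above together, here is a hedged sketch of one recording session using only the methods in the table. How the `tracker` instance is obtained, and the exact argument shapes for `send_message()` and `save_data()`, are assumptions; `demos/read_me.py` shows the real usage.

```python
# Sketch of a minimal recording session using the methods listed above.
# ASSUMPTIONS: the way `tracker` is constructed and the arguments to
# send_message()/save_data() are illustrative only.
try:
    import SMITE
except ImportError:
    SMITE = None  # toolbox not installed; sketch only

def run_session(tracker, use_buffer=False):
    # set_dummy_mode() could be called here to develop without hardware.
    tracker.init()                   # connect with the requested settings
    tracker.calibrate()              # participant setup, calibration, validation
    tracker.start_recording()        # samples now go into the idf file
    if use_buffer:
        tracker.start_buffer()       # also stream samples for online use
    tracker.send_message('trial_onset')   # timestamped marker in the idf file
    # ...show stimuli; poll gaze online if desired...
    sample = tracker.get_latest_sample()
    if use_buffer:
        samples = tracker.consume_buffer_data()  # drains the online buffer
        tracker.stop_buffer()
    tracker.stop_recording()
    tracker.save_data('my_recording')  # write the idf file to disk
    tracker.de_init()                  # close the connection and clean up
```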
ToDos (current discrepancies between the paper and the toolbox):