Closed by laurentperrinet 7 years ago
I just saw that part of this is already implemented (the recording in data_paddle_position_
). However, since we are going to run several sessions, I recommend putting everything in a single hierarchical file, identified by a date for example:
import os, time
timeStr = time.strftime("%Y-%m-%d_%H%M%S", time.localtime())
print(timeStr)
# e.g. timeStr = '2017-05-24_154708'
observer = 'thys'
def exp_name(self, mode, observer, block, timeStr):
    return os.path.join(self.params_exp['datadir'], timeStr + '_' + observer + '_' + str(block) + '.npy')
cf. what we do in https://github.com/chloepasturel/AnticipatorySPEM/blob/master/aSPEM.py
I have made the necessary changes! Which gives this:
I am closing the issue, but if anything ever needs to be added or modified, don't hesitate to reopen it!
I would recommend that you store the data in a pandas DataFrame (it's also easy to dump to CSV) and keep all the data in one file.
For that, it would be necessary to make sure you have a proper calibration (do you run one before recording?) plus time synchronization between the computer and the eye tracker.
I did not do any calibration besides the eye tracker's own, done with its software. The ball and paddle have the same timestamp and their coordinates are in the same reference frame, but apparently that isn't the case for the eye tracker (it seems the eye tracker takes the top-left corner of the screen as its origin, whereas the ball and paddle take the top-left corner of the game window as origin).
As for the timestamps, it needs to be verified, but I think they are all synchronized. Beware, though: the eye tracker returns time in seconds whereas the ball/paddle time is in milliseconds.
Now my question: how do I do a proper calibration?
First, unify the data in one file with one unit for time (seconds). To sync the eye tracker and your program, there is always a solution (in the form of a series of "handshake" messages from the computer to the eye tracker).
Second, to perform calibration beyond that of the eye tracker (which I do not know), you should perhaps run a little "gymnastics" routine before every level by showing dots on the screen at known positions: typically the corners of your arena plus the center. You would ask people to fixate each dot and move on to the next one once they have fixated long enough (say 1 second) within a small window (say 3 degrees of visual angle). If this is what the eye tracker already does, use those functions, but report the coordinates in the same unit as those for the ball/paddle.
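The "gymnastics" step above can be turned into a coordinate mapping. As a minimal sketch (assuming we have already recorded the mean gaze position on each of the five dots; the function names here are made up), a least-squares affine fit maps raw eye coordinates into game-window pixels:

```python
import numpy as np

def fit_affine(eye_xy, screen_xy):
    """Least-squares affine map from raw eye coordinates to game-window pixels.

    eye_xy, screen_xy: (N, 2) arrays of matched points (N >= 3), e.g. the mean
    gaze recorded on each calibration dot and that dot's known game position.
    """
    eye_xy = np.asarray(eye_xy, dtype=float)
    screen_xy = np.asarray(screen_xy, dtype=float)
    # augment with a constant column so the fit includes a translation term
    A = np.hstack([eye_xy, np.ones((len(eye_xy), 1))])
    # solve A @ M ~= screen_xy in the least-squares sense; M is (3, 2)
    M, *_ = np.linalg.lstsq(A, screen_xy, rcond=None)
    return M

def apply_affine(M, eye_xy):
    """Map raw eye samples through the fitted transform."""
    eye_xy = np.asarray(eye_xy, dtype=float)
    A = np.hstack([eye_xy, np.ones((len(eye_xy), 1))])
    return A @ M
```

With five dots (four corners plus the center) the fit is over-determined, which gives a little robustness against a noisy fixation on one dot.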
The Eye Tribe calibration software resembles this: https://www.youtube.com/watch?v=dpUSrWfHols It is an old version of the software, but the idea is there. So I think we don't need to recalibrate after this, no?
Regarding the dataframe, the Eye Tribe gives us a 23-column data sheet. I don't see how to keep only the wanted columns and group them with the other data. I also don't see how to combine data generated in different classes: the X and Y ball coords come from the Ball class, the X and Y paddle coords come from the Paddle class, and the eye-tracker data comes from the main. The solution might be easy, but I can't seem to figure it out.
Another thing that bothers me: pandas.DataFrame lets us create a dataframe object, but we want it written to a file, no? So what extension will we use for a dataframe? Does it even matter?
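One way to answer both questions, as a sketch: since the main loop can see the ball, the paddle and the current eye sample, it is the natural place to merge them into one row per frame, keeping only the columns we want; at the end, `to_csv` writes the DataFrame out (to pandas the extension is just a convention, and `.csv` stays readable anywhere). The attribute names (`.x`, `.y`) and the eye-sample keys below are assumptions to adapt to the actual classes:

```python
import pandas as pd

rows = []  # one dict per game frame, filled from the main loop

def record_frame(t, ball, paddle, eye_sample):
    # ball/paddle attribute names and eye_sample keys are hypothetical
    rows.append({
        'time': t,                      # seconds, one unit everywhere
        'ball_x': ball.x, 'ball_y': ball.y,
        'paddle_x': paddle.x, 'paddle_y': paddle.y,
        'gaze_x': eye_sample['avg_x'],  # keep only the wanted columns
        'gaze_y': eye_sample['avg_y'],
    })

def save(filename):
    # pandas builds the columns from the dict keys automatically
    df = pd.DataFrame(rows)
    df.to_csv(filename, index=False)
    return df
```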
I solved the data-origin problem (X and Y coords) by setting the size of the game window to the resolution of the screen we will use. This way, all the delivered coordinates (from the EyeTribe and those given by the game itself) have the top-left corner of the screen as their origin.
As for having the same timestamp for each of the three variables, I don't know how to do that yet. I actually thought the times were already synchronized.
It's a recurring problem as soon as you have several computers on a network. Other people have run into it before us: https://stackoverflow.com/questions/37837421/timing-issues-psychopy-pygaze-eyetribe-eyetracker-multithreading for example.
I would do a simple handshake at the start of the game by requesting just one frame from the EyeTribe before starting, while also recording the local time. The two timestamps then let us realign everything.
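A minimal sketch of that handshake, assuming the local clock is `time.time()` and that we can request a single tracker frame before the game starts; it also converts the game's millisecond clock to seconds, so everything ends up in one unit on one clock:

```python
import time

def sync_offset(tracker_time_s):
    """Offset between the eye tracker's clock and the local clock.

    tracker_time_s: timestamp (in seconds) of a single frame requested
    from the tracker just before the game starts -- the "handshake".
    """
    local_time_s = time.time()
    return local_time_s - tracker_time_s

def to_local_s(tracker_time_s, offset):
    """Re-express a tracker timestamp on the local clock, in seconds."""
    return tracker_time_s + offset

def game_ms_to_s(game_time_ms):
    """The game clock ticks in milliseconds; store everything in seconds."""
    return game_time_ms / 1000.0
```

This only corrects a constant offset, not clock drift; for a session of a few minutes that is usually good enough, and repeating the handshake before each block would catch any drift.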
Whenever we analyze the data, we will need to compare the eye data to the actual movement of objects in the scene. Therefore, we need to record their positions and attributes as a function of time:
I suspect that pandas DataFrames will be ideal for storing all these objects along with the eye-tracking data: http://pandas.pydata.org/pandas-docs/stable/timeseries.html
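For the analysis step, `pandas.merge_asof` can align the nearest eye sample to each game frame once both streams share the same time unit and clock. A sketch with made-up numbers (game frames at ~60 Hz, eye samples at ~30 Hz; the column names are hypothetical):

```python
import pandas as pd

# both streams already expressed in seconds on the same clock
game = pd.DataFrame({'time': [0.000, 0.016, 0.033],
                     'ball_x': [100, 104, 108]})
eye = pd.DataFrame({'time': [0.001, 0.030],
                    'gaze_x': [98.0, 107.0]})

# for each game frame, take the closest earlier-or-equal eye sample;
# both frames must be sorted on the key column
merged = pd.merge_asof(game, eye, on='time', direction='backward')
```

Frames with no preceding eye sample get NaN in the gaze columns, which makes dropped tracker frames easy to spot in the analysis.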