AR-Eye-Tracking-Toolkit / ARETT-R-Package

R Package for the Augmented Reality Eye Tracking Toolkit for Head Mounted Displays (ARETT)
MIT License

classify_idt function results in NaNs and can't continue #1

Open thomasgoodge opened 1 year ago

thomasgoodge commented 1 year ago

Hi,

I'm trying to classify fixations in some eye tracking data from the HoloLens using the classify_idt function but I'm running into an error.

Error in if (idt_angleDeg > dispersion_threshold) { : missing value where TRUE/FALSE needed

with the warning

In addition: Warning message: In acos(idt_scalar/(idt_iVector_length * idt_jVector_length)) : NaNs produced
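
In case it's useful context, the same pair of messages is easy to reproduce outside the toolkit whenever the value passed to acos falls outside [-1, 1] (toy numbers below, not my actual data, and the degree conversion is just my guess at what the function does internally):

# toy illustration only: acos() returns NaN for arguments outside [-1, 1],
# and the following if() then has no TRUE/FALSE value to test
idt_scalar           <- 1.0000002   # slightly "too large", e.g. from floating point error
idt_iVector_length   <- 1
idt_jVector_length   <- 1
dispersion_threshold <- 1.6         # arbitrary value for the example

idt_angleDeg <- acos(idt_scalar / (idt_iVector_length * idt_jVector_length)) * 180 / pi
#> Warning message: In acos(...) : NaNs produced

if (idt_angleDeg > dispersion_threshold) {
  # never reached
}
#> Error in if (idt_angleDeg > dispersion_threshold) { : missing value where TRUE/FALSE needed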

I am looping through a series of data frames and this only happens for some of my data files and not others. I have gone through the files that worked and the ones that didn't and can't see any difference between them, or why the function would be unable to calculate idt_angleDeg. Any help you could offer would be greatly appreciated!

thomasgoodge commented 1 year ago

I can see it's specifically when adding new points to the classification window, lines 236-252 in the function code. It gets to row 9047 and then returns a NaN. I've checked the data frame and the raw data file and there doesn't seem to be anything different about this row, nor is it at the end of the recording, as there are still ~7000 observations left to run through. Any ideas?

sekapp commented 1 year ago

Hi,

Sadly, I can't really answer this question/report without access to the data and some extensive detective work.

Based on your description, I assume the issue is within the acos function and that it does not produce a result for the given value(s). Maybe you can look at your row 9047 and the ones before it, check the values of idt_scalar, idt_iVector_length and idt_jVector_length used in line 243, and manually calculate the acos result. That way it will hopefully become obvious where the issue lies. If you already have NaNs at that point, or the values are indeed invalid for the acos function, maybe you can follow the calculations back up the script and compare the interim results for your row 9047 against those of the previous rows to see what differs.
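
As a rough sketch of that manual check (the variable names are the ones from the script, the numbers are only placeholders you would replace with the values you actually see at row 9047):

# placeholder values -- substitute the interim results from your row 9047
idt_scalar         <- 0.99987
idt_iVector_length <- 0.99990
idt_jVector_length <- 0.99991

idt_ratio <- idt_scalar / (idt_iVector_length * idt_jVector_length)
is.nan(idt_ratio)   # TRUE would mean the problem is already further upstream
abs(idt_ratio) > 1  # TRUE would explain the "NaNs produced" warning
acos(idt_ratio)     # NaN here reproduces the error you are seeing

If the interim values are hard to get at, debug(classify_idt) or a temporary browser() call around line 243 should let you inspect them interactively while the function runs.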

Based on what you observe, you might then be able to either fix the raw data (e.g. by omitting lines where recording errors occurred) or add additional validity checks to the script which skip or fill in invalid cases.
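
For the second option, one possible shape for such a validity check (only a sketch, not tested against the package; clamping treats tiny floating point overshoots as 0 or 180 degrees, while the is.finite() guard skips genuinely broken samples):

idt_ratio <- idt_scalar / (idt_iVector_length * idt_jVector_length)

if (!is.finite(idt_ratio)) {
  # recording error or zero-length vector: skip this sample instead of crashing
  next
}

# clamp small numerical overshoots such as 1.0000001 back into the valid acos() range
idt_ratio    <- max(-1, min(1, idt_ratio))
idt_angleDeg <- acos(idt_ratio) * 180 / pi   # degree conversion assumed here

The next statement assumes this sits inside the loop over the classification window. If you would rather clean the raw data instead, dropping samples the device itself flagged as invalid (e.g. via the gazeHasValue column, if your export contains it) before calling classify_idt would have a similar effect.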