acrandal / RevEng_PAJ7620

PAJ7620 gesture sensor Arduino driver and user API
https://acrandal.github.io/RevEng_PAJ7620/
MIT License

Cursor mode to track continuous moves of rotating objects #68

Open federico77 opened 1 year ago

federico77 commented 1 year ago

Hello, I understand that this might go off topic pretty fast, but I am trying to understand whether a gesture recognition sensor and library like these can recognize continuous ball rotations in any direction, like you would expect from, for example, a trackball.

To be clearer: would the gesture recognition chip in cursor mode behave like the optical navigation chip in a mouse? The sensor documentation isn't very extensive in explaining this specific mode.

To my understanding, in cursor mode the chip simply treats the space it sees as (at least) a two-dimensional field, where an object moving within the visible boundaries is reported as being positioned at specific X/Y coordinates. If this is the case, would it be possible to trick it into a different behavior?
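
For reference, this is roughly what I mean in code (a minimal sketch; I'm assuming the setCursorMode() / isCursorInView() / getCursorX() / getCursorY() API from this repo's cursor demo):

```cpp
#include "RevEng_PAJ7620.h"

RevEng_PAJ7620 sensor = RevEng_PAJ7620();

void setup()
{
  Serial.begin(115200);
  if (!sensor.begin())            // returns 0 if the device isn't found on I2C
  {
    Serial.println("PAJ7620 init failed");
    while (true) { }
  }
  sensor.setCursorMode();         // report object position instead of gestures
}

void loop()
{
  if (sensor.isCursorInView())    // an occluding object is over the array
  {
    Serial.print(sensor.getCursorX());
    Serial.print(",");
    Serial.println(sensor.getCursorY());
  }
  delay(50);
}
```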

acrandal commented 7 months ago

I completely missed this issue submission and only found it a whole year later. I'm sorry that I didn't catch this sooner.

I can see what you're attempting to do, though I don't think this sensor has the resolution to accomplish the described goal. Mice and trackballs use lasers with very fast, accurate feedback of changes in distance/surface color to detect motion (the ball rolling or the surface moving in front of the laser).

This sensor is a two-dimensional array of infrared sensors. Its feedback is relatively slow and low resolution. The only output of the device that I have been able to find so far is the result of the hardware's embedded centroid clustering, which determines the center of mass of an occluding object (your finger/hand, etc.).

If you could get the actual sensor data values from the 30x30 IR sensor array, you could get closer to an image of the object in view. That's the subject of my open issue about Image Mode, which the documentation mentions, but I haven't been able to figure out how to get all 900 data points via I2C. Even if you had those values, they would be unlikely to give you the resolution to detect a rolling object in view unless that object had large protrusions or significant heat/surface differences. Even then, it's not a very fast sensor (IR generally isn't).

I would look more closely at how mice track moving surfaces, and even consider pulling apart a laser mouse and building a frame to hold the laser in front of your rolling surface while watching the output of the mouse. There's not much hardware in a mouse, so it might fit into a reasonably small space.

SinanAkkoyun commented 7 months ago

I'm glad you still responded to the issue. I need to build a very small eye tracker and thought the image mode would be ideal.

That's why I wanted to ask you how exactly one can get the images via I2C (you mentioned you couldn't quite get all the data points; could you please share more information?)

acrandal commented 7 months ago

So, the hardware of the PAJ7620 does a lot of the gesture analysis internally. It's designed to abstract away the 30x30 sensor array and return only interpretations of the motion in front of the sensor (changes in the object's position), or just the location of the object.
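
To make that concrete, here's roughly what the driver's gesture-mode output looks like (condensed from the style of the library's gesture demo; readGesture() only ever returns an interpretation, never raw array data):

```cpp
#include "RevEng_PAJ7620.h"

RevEng_PAJ7620 sensor = RevEng_PAJ7620();

void setup()
{
  Serial.begin(115200);
  sensor.begin();                           // I2C init; defaults to gesture mode
}

void loop()
{
  Gesture gesture = sensor.readGesture();   // GES_NONE when nothing is detected

  switch (gesture)
  {
    case GES_UP:            Serial.println("UP");            break;
    case GES_DOWN:          Serial.println("DOWN");          break;
    case GES_CLOCKWISE:     Serial.println("CLOCKWISE");     break;
    case GES_ANTICLOCKWISE: Serial.println("ANTICLOCKWISE"); break;
    default: break;   // the 30x30 data behind this decision never surfaces
  }
  delay(100);
}
```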

My driver can give you the estimated centroid of the object in view, a rough scale (not a measurement in mm) of distance, the rough size (how many of the 30x30 array's sensors your object is covering up), and some scale information about the brightness of the object (I don't know whether that's IR/heat or surface reflection). That's the lowest-level data I've managed to get out of the device so far. The internal hardware uses those values over time to determine which "gesture" is being performed.
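
A sketch of polling those values (I'm going from memory here, so treat getObjectSize() / getObjectBrightness() as assumptions to verify against RevEng_PAJ7620.h, and note they may only be meaningful in certain modes):

```cpp
// Assumes 'sensor' was already initialized as in the sketches above.
void printObjectData()
{
  Serial.print("size: ");
  Serial.print(sensor.getObjectSize());         // rough count of covered elements
  Serial.print("  brightness: ");
  Serial.println(sensor.getObjectBrightness()); // uncalibrated brightness scale
}
```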

The actual 30x30 data values are something I didn't get time to try to reverse engineer. That's a lot of data to be moving over the I2C bus, which isn't well suited to image-scale data at high frequency. The chip itself does have a 4-wire SPI bus, but I have only been using the 4-pin breakout boards, which give me access to I2C.
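
As a back-of-envelope estimate (assuming one byte per sensor element, which is a guess since the register layout is unknown):

```cpp
// Best-case I2C frame time for the full 30x30 array
const uint32_t SENSOR_COUNT  = 30UL * 30UL;   // 900 elements
const uint32_t BITS_PER_BYTE = 9;             // 8 data bits + 1 ACK bit

float frameMillis(uint32_t busHz)
{
  return (SENSOR_COUNT * BITS_PER_BYTE * 1000.0f) / busHz;
}

// frameMillis(100000) ~= 81 ms  -> ~12 fps at standard-mode I2C
// frameMillis(400000) ~= 20 ms  -> ~49 fps at fast-mode I2C, and that's
// before register addressing and bank switching eat into the budget
```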

If you're going to do eye tracking... there might be some possibility of using the raw brightness values, if you can get the raw 30x30 sensor data. I just don't know how to do it.

The company that makes the sensor (PixArt) did respond to email. I asked them about "gaming mode" and they got back to me with some documentation. You could potentially contact them directly.

SinanAkkoyun commented 7 months ago

Thank you very much! I was also wondering about the gaming mode; I'll try to contact them about it. Perhaps there is a way to get the raw image via SPI :)

Intuitively, do you think that tracking the gaze as an object is feasible with your current library's position output?

acrandal commented 7 months ago

My first instinct is no, you won't be able to track the gaze with the current data available.

I use some gaze tracking with AR headsets and have run projects using devices like the Microsoft eye-tracking bar for monitors. Those use a much higher fidelity camera system to vision-process out the current eye position for tracking. Even those aren't always very accurate. The PAJ7620 just won't have that same level of visual fidelity, even if you can get the 30x30 array out.

The PAJ7620's raw sensor values are likely very similar to the output of the Panasonic Grid-EYE devices. Those provide a grid (8x8) of thermal sensor outputs: https://na.industrial.panasonic.com/products/sensors/sensors-automotive-industrial-applications/lineup/grid-eye-infrared-array-sensor

https://rutronik-tec.com/panasonic-infrared-array-sensor-grid-eye/

We found the Grid-EYE to be very slow as a tracking sensor. It's designed for human-body-scale environments, though up close it can handle smaller objects. The sensor values reacted quite slowly as a person moved through its view. Our applications were smart-home technologies, where tracking people with a low-fidelity sensor like this would have given us a more private approach than cameras, but the 8x8 field of view didn't provide enough data, and the reaction speed was too slow for what our research needed.
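
For comparison, here's how little code a full Grid-EYE frame takes (a minimal sketch using Adafruit's AMG88xx driver for the AMG8833 Grid-EYE; any Grid-EYE library would look similar):

```cpp
#include <Adafruit_AMG88xx.h>

Adafruit_AMG88xx amg;
float pixels[AMG88xx_PIXEL_ARRAY_SIZE];   // 8x8 = 64 temperatures in deg C

void setup()
{
  Serial.begin(115200);
  if (!amg.begin())                       // default I2C address 0x69
  {
    Serial.println("AMG88xx not found");
    while (true) { }
  }
}

void loop()
{
  amg.readPixels(pixels);                 // one full 8x8 thermal frame
  for (int i = 0; i < AMG88xx_PIXEL_ARRAY_SIZE; i++)
  {
    Serial.print(pixels[i]);
    Serial.print(((i + 1) % 8 == 0) ? "\n" : "\t");
  }
  Serial.println();
  delay(100);                             // the device tops out at ~10 fps
}
```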
