pyaillet / ft6x36-rs

Minimal Rust FT6x36 driver implementation
Apache License 2.0
4 stars · 0 forks

Coordinates returned are random #2

Closed — brianmay closed this issue 2 years ago

brianmay commented 2 years ago

Trying to use this with this hardware:

https://www.makerfabs.com/esp32-3.5-inch-tft-touch-capacitive-with-camera.html

The info returned is:

Touch screen info: Ft6x36Info { chip_id: Ft6236u, firmware_id: 3, panel_id: 17, release_code: 1 }

This is with the latest git version from the main branch.

Seems to detect presses ok, both 1 and 2 fingers.

But I can't make any sense of the X,Y coordinates. For example, if I touch and hold the top right corner of the screen I get (polling at 0.5 s):

RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 255, y: 58 }), p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 255, y: 58 }), p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 255, y: 58 }), p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 8, y: 31 }), p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 8, y: 30 }), p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 15, y: 135 }), p2: None }

How are the coordinates supposed to work? Where is (0,0), and which way do positive values go?

pyaillet commented 2 years ago

Thanks for the report.

When I use it with my TWatch I have the same Touch screen info:

Touch screen info: Ft6x36Info { chip_id: Ft6236u, firmware_id: 3, panel_id: 17, release_code: 1 }

And I get this kind of events:

RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 231, y: 210 }), p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 231, y: 206 }), p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 231, y: 209 }), p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 231, y: 211 }), p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Release, p1: None, p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Release, p1: None, p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 78, y: 51 }), p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 30, y: 46 }), p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 9, y: 46 }), p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 4, y: 46 }), p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Release, p1: None, p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 20, y: 34 }), p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 20, y: 34 }), p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Release, p1: None, p2: None }

The coordinates are (0,0) at the bottom right corner and go up to (240,240) at the top left corner.

  1. What should the size of your panel be?
  2. Were you able to test it with an Arduino program (like this one, for example)? Did it work, and what were the reported coordinates?
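Given the inverted origin described above ((0,0) at the bottom right), remapping to a conventional top-left origin is just a subtraction on each axis. A minimal sketch, assuming raw coordinates that span 0..=240 (the function name and signature are illustrative, not crate API):

```rust
/// Flip a raw FT6x36 point whose origin is the bottom-right corner
/// into a point with the origin at the top-left corner.
/// `max_x`/`max_y` are the largest raw values the panel reports.
fn flip(x: u16, y: u16, max_x: u16, max_y: u16) -> (u16, u16) {
    (max_x - x, max_y - y)
}
```

With the TWatch readings above, `flip(231, 210, 240, 240)` would give `(9, 30)`.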
brianmay commented 2 years ago

The Arduino app works fine. Not sure how it sets the screen orientation.

One corner is (28,58) and the other corner is (290,461).

The screen is 320x480.

Unfortunately I think the coordinates here have already been translated into screen pixels, so I'm not sure it is going to help, except to confirm that the hardware works.

I'm a bit puzzled why I can't get all the way from (0,0) through to (320,480), but I'm not really concerned about that here. The lines drawn go all the way to the edge.

brianmay commented 2 years ago

My bad, just noticed that the touch driver code is included in the example, and it looks like it doesn't do any translations:

https://github.com/Makerfabs/Project_Touch-Screen-Camera/blob/master/example/touch_draw_v2/FT6236.cpp

pyaillet commented 2 years ago

Could you share your code ?

Maybe it's a problem regarding I2c initialization.

brianmay commented 2 years ago

It is a bit of a mess right now, but basically the relevant code is here:

https://github.com/brianmay/robotica-remote-rust/blob/main/src/boards/makerfab.rs#L60-L79

(this requires changing the default feature in Cargo.toml to compile)

pyaillet commented 2 years ago

I will add some methods to improve diagnostic capabilities according to this. I'll let you know when it's done, I will probably need you to run some tests then.

pyaillet commented 2 years ago

Could you try with the current master, and paste the result of debugging get_diagnostics()? Could you also try calling get_touch_event_iter(), which gets the info register by register rather than in one communication?

On my side, I get:

Touchscreen diagnostics Diagnostics { power_mode: 0, g_mode: 1, lib_version: 12298, state: 1, control_mode: 0 }
Touchscreen irq triggered RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 36, y: 213 }), p2: None }
Touchscreen irq triggered iter RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 36, y: 213 }), p2: None }

I am sorry it's a bit complicated; the datasheet is not really complete, and I'm quite new to embedded and Rust too.

brianmay commented 2 years ago

No need to apologise :-) - Thanks for writing this code...

I get:

Touch screen info: Diagnostics { power_mode: 0, g_mode: 1, lib_version: 12298, state: 1, control_mode: 0 }
get_touch_event_iter: RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 25, y: 214 }), p2: None }
get_touch_event: RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 25, y: 214 }), p2: None }

The values always agree.

Sometimes I seem to notice a pattern, but then the pattern disappears and I get nonsense values again.

brianmay commented 2 years ago

I have been looking at your code and comparing it with the working Arduino code. I can't see anything significantly different. A few small things, though:

None of this explains the problems I have been having. Will keep digging. But I felt I should report back here regardless.

pyaillet commented 2 years ago

Ok, thank you very much for your comment. I checked again and I think I identified the problem.

If you check this playground you will understand. I thought the operator precedence was the same, but apparently it's not.

Could you try the latest version on master?

If it's working for you, I will make a new release to fix the bug.
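For the record, the likely shape of such a bug (a hypothetical reconstruction, not the actual driver code): in Rust, `<<` binds tighter than `&`, so masking the high nibble of a coordinate register before shifting needs explicit parentheses.

```rust
/// Hypothetical: assemble a 12-bit touch coordinate from the two
/// I2C register bytes (high nibble in `hi`, full low byte in `lo`).
fn coord_buggy(hi: u8, lo: u8) -> u16 {
    // Parses as `hi as u16 & (0x0F << 8)`: the mask becomes 0x0F00,
    // which clears every bit of `hi` (a u8 has no bits above 7),
    // so the high nibble is silently lost.
    (hi as u16 & 0x0F << 8) | lo as u16
}

fn coord_fixed(hi: u8, lo: u8) -> u16 {
    ((hi as u16 & 0x0F) << 8) | lo as u16
}
```

With hi = 0x01 and lo = 0x2C, the buggy version returns 44 while the fixed one returns 300 — a bug of this shape would also explain readings that look random but never exceed 255.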

brianmay commented 2 years ago

Hmmm. To think I had been staring at that code, and I didn't notice the problem...

Lower left: (35, 45). Upper right: (286, 463).

This looks a lot better now.

Will have to remap these values to match the screen orientation. And then see if they correspond with the positions of the objects I have on screen. Will do that tomorrow.

In the meantime though: Thanks!
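The remapping mentioned above depends on how the panel is mounted relative to the display. A sketch of one possibility, assuming the raw axes need a 90° clockwise rotation (the rotation direction and any axis flips would have to be determined empirically for the actual board):

```rust
/// Hypothetical: rotate a raw touch point 90 degrees clockwise so it
/// lands in display coordinates. `raw_h` is the raw Y-axis extent.
fn rotate_cw(x: u16, y: u16, raw_h: u16) -> (u16, u16) {
    // After a clockwise quarter turn, the old Y axis (inverted)
    // becomes the new X axis and the old X axis becomes the new Y.
    (raw_h - 1 - y, x)
}
```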

brianmay commented 2 years ago

Just one other little problem... Not sure what your plans are regarding the API, but I note that the struct fields of the returned object are all private :-(

pyaillet commented 2 years ago

Yes, I hoped to expose some more qualified events, but as it's not really possible to get them directly from the hardware, I should expose the RawTouchEvent, at least for now.

I will also group the TouchType with the Point, as it's exposed for each point.

Feel free to comment or give suggestions on this approach.

brianmay commented 2 years ago

I understand that this is early code still, and to expect API changes.

At the moment I just have simple buttons to press, so the X,Y coordinates are enough. My simple code does mean that if I touch an empty area of the screen and slide my finger over a button, the button gets activated. I haven't decided if this is good or bad.
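One way to avoid that slide-in activation is to require the touch to both start and end inside the button before firing it. A minimal sketch (types and names are illustrative, not from this project):

```rust
/// Hypothetical axis-aligned button rectangle in screen coordinates.
struct Button { x: u16, y: u16, w: u16, h: u16 }

impl Button {
    fn contains(&self, px: u16, py: u16) -> bool {
        px >= self.x && px < self.x + self.w
            && py >= self.y && py < self.y + self.h
    }
}

/// Fire only if both the initial press and the release landed inside,
/// so a finger sliding in from an empty area does not trigger it.
fn activates(btn: &Button, press: (u16, u16), release: (u16, u16)) -> bool {
    btn.contains(press.0, press.1) && btn.contains(release.0, release.1)
}
```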

Shame the gesture detection isn't working. Looks like it is the same for me too; at least I am seeing a lot of gesture_id: NoGesture in the `RawTouchEvent`.

What do you mean by the TouchType?

pyaillet commented 2 years ago

What do you mean by the TouchType?

The chip can report whether it's just an initial press, an ongoing contact, or a release, though you might have seen that most of the time only Contact is reported.
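Since mostly Contact is reported, the initial press can be synthesized on the application side by tracking whether a finger was already down. A minimal sketch of that idea (not crate API):

```rust
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
enum TouchPhase { Press, Contact, Release }

/// Turns a stream of "is a point currently reported?" flags into
/// logical phases: the first contact after an idle period is a Press.
struct PhaseTracker { down: bool }

impl PhaseTracker {
    fn new() -> Self { PhaseTracker { down: false } }

    fn feed(&mut self, has_point: bool) -> TouchPhase {
        match (self.down, has_point) {
            (false, true) => { self.down = true; TouchPhase::Press }
            (true, true) => TouchPhase::Contact,
            (_, false) => { self.down = false; TouchPhase::Release }
        }
    }
}
```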

brianmay commented 2 years ago

Sounds like grouping this could be a good idea.

pyaillet commented 2 years ago

Ok, I've done it and added some tests. If this is ok for you, I will prepare a new release (0.2) and close this issue.

brianmay commented 2 years ago

It looks good to me.

pyaillet commented 2 years ago

The new version has been released: https://crates.io/crates/ft6x36/0.2.0