Closed brianmay closed 2 years ago
Trying to use this with this hardware:
https://www.makerfabs.com/esp32-3.5-inch-tft-touch-capacitive-with-camera.html
The info returned is:
```
Touch screen info: Ft6x36Info { chip_id: Ft6236u, firmware_id: 3, panel_id: 17, release_code: 1 }
```
This is with the latest git version from the main branch.
It seems to detect presses OK, with both 1 and 2 fingers, but I can't make any sense of the X,Y coordinates: for example, touching and holding the top-right corner of the screen (polling at 0.5 s) gives values that look like nonsense to me.
How are the coordinates supposed to work? Where is (0,0), and which way do positive values go?
Thanks for the report.
When I use it with my TWatch I get the same touch screen info:
```
Touch screen info: Ft6x36Info { chip_id: Ft6236u, firmware_id: 3, panel_id: 17, release_code: 1 }
```
And I get events like these:
```
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 231, y: 210 }), p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 231, y: 206 }), p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 231, y: 209 }), p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 231, y: 211 }), p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Release, p1: None, p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Release, p1: None, p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 78, y: 51 }), p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 30, y: 46 }), p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 9, y: 46 }), p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 4, y: 46 }), p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Release, p1: None, p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 20, y: 34 }), p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 20, y: 34 }), p2: None }
RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Release, p1: None, p2: None }
```
The coordinates are (0,0) at the bottom-right corner and go up to (240,240) at the top-left corner.
The Arduino app works fine. Not sure how it sets the screen orientation.
One corner is (28,58) and the other corner is (290,461).
The screen is 320x480.
Unfortunately I think the coordinates here have already been translated into screen pixels, so I'm not sure this is going to help, except to confirm that the hardware works.
I'm a bit puzzled why I can't get all the way from (0,0) through to (320, 480), but I'm not really concerned: the lines drawn go all the way to the edge.
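For what it's worth, if the raw readings did need stretching to cover the full 320x480 panel, a linear rescale would do it; a sketch using the corner values above (a hypothetical helper, not part of the driver):

```rust
/// Map a raw reading in [raw_min, raw_max] onto [0, screen_max].
fn rescale(raw: i32, raw_min: i32, raw_max: i32, screen_max: i32) -> i32 {
    let clamped = raw.clamp(raw_min, raw_max);
    (clamped - raw_min) * screen_max / (raw_max - raw_min)
}

fn main() {
    // Corners seen via the Arduino app: (28, 58) and (290, 461).
    assert_eq!(rescale(28, 28, 290, 320), 0);
    assert_eq!(rescale(290, 28, 290, 320), 320);
    assert_eq!(rescale(461, 58, 461, 480), 480);
}
```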
My bad, I just noticed that the touch driver code is included in the example, and it looks like it doesn't do any translations.
Could you share your code?
Maybe it's a problem with the I2C initialization.
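One driver-independent way to rule out the bus itself is to read the chip's ID registers directly: the FT6x36 answers at I2C address 0x38, and the `panel_id: 17` above matches the vendor ID register (0xA8) reading 0x11. A minimal sketch using embedded-hal 0.2 traits:

```rust
// Bus sanity check: read the FT6x36 ID registers directly over I2C.
// Register addresses follow the FT62xx datasheet.
use embedded_hal::blocking::i2c::WriteRead;

const FT6X36_ADDR: u8 = 0x38;

fn read_ids<I2C: WriteRead>(i2c: &mut I2C) -> Result<(u8, u8, u8), I2C::Error> {
    let mut buf = [0u8; 1];
    i2c.write_read(FT6X36_ADDR, &[0xA3], &mut buf)?; // chip ID
    let chip_id = buf[0];
    i2c.write_read(FT6X36_ADDR, &[0xA6], &mut buf)?; // firmware version
    let firmware_id = buf[0];
    i2c.write_read(FT6X36_ADDR, &[0xA8], &mut buf)?; // panel/vendor ID (0x11 expected)
    Ok((chip_id, firmware_id, buf[0]))
}
```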
It is a bit of a mess right now, but basically the relevant code is here:
https://github.com/brianmay/robotica-remote-rust/blob/main/src/boards/makerfab.rs#L60-L79
(this requires changing the default feature in Cargo.toml to compile)
I will add some methods to improve the diagnostic capabilities based on this. I'll let you know when it's done; I will probably need you to run some tests then.
Could you try with the current master and paste the result of debugging `get_diagnostics()`?
Could you also try calling `get_touch_event_iter()`, which gets the info register by register rather than in one communication?
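For reference, the two calls differ roughly like this at the wire level; a sketch with embedded-hal 0.2 traits, assuming the usual 15-byte report covering registers 0x00..=0x0E:

```rust
use embedded_hal::blocking::i2c::WriteRead;

const ADDR: u8 = 0x38;

// One communication: a single auto-incrementing burst read from 0x00.
fn read_report_burst<I2C: WriteRead>(i2c: &mut I2C) -> Result<[u8; 15], I2C::Error> {
    let mut report = [0u8; 15];
    i2c.write_read(ADDR, &[0x00], &mut report)?;
    Ok(report)
}

// Register by register: one write_read transaction per register.
fn read_report_by_register<I2C: WriteRead>(i2c: &mut I2C) -> Result<[u8; 15], I2C::Error> {
    let mut report = [0u8; 15];
    for (reg, byte) in report.iter_mut().enumerate() {
        let mut buf = [0u8; 1];
        i2c.write_read(ADDR, &[reg as u8], &mut buf)?;
        *byte = buf[0];
    }
    Ok(report)
}
```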
On my side, I get:
```
Touchscreen diagnostics Diagnostics { power_mode: 0, g_mode: 1, lib_version: 12298, state: 1, control_mode: 0 }
Touchscreen irq triggered RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 36, y: 213 }), p2: None }
Touchscreen irq triggered iter RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 36, y: 213 }), p2: None }
```
I am sorry it's a bit complicated; the datasheet is not really complete, and I'm quite new to embedded development and to Rust too.
No need to apologise :-) - Thanks for writing this code...
I get:
```
Touch screen info: Diagnostics { power_mode: 0, g_mode: 1, lib_version: 12298, state: 1, control_mode: 0 }
get_touch_event_iter: RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 25, y: 214 }), p2: None }
get_touch_event: RawTouchEvent { device_mode: Working, gesture_id: NoGesture, touch_type: Contact, p1: Some(Point { x: 25, y: 214 }), p2: None }
```
The values always agree.
Sometimes I seem to notice a pattern, but then the pattern disappears and I get nonsense values again.
I have been looking at your code and comparing it with the working Arduino code. I can't see anything significantly different. Small things, though:

- `report[2]` should probably be masked as `report[2] & 0x07`, since the upper bits are reserved (generally they seem to be 0, though).
- `report[3]` / `Pn_XH` carries an event flag (press down vs lift up) which your code ignores; your code uses the values from `report[2]` / `TD_STATUS` (number of touch points) instead.

None of this explains the problems I have been having. Will keep digging, but I felt I should report back here regardless.
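A sketch of the parsing described above, following the FT6x36 report layout (`report[i]` holds register `0x0i`; event-flag values per the FT62xx datasheet):

```rust
// Sketch of the register parsing discussed above.
fn parse(report: &[u8]) -> (u8, u8) {
    // report[2] / TD_STATUS: the number of touch points lives in the
    // low bits; the upper bits are reserved, hence the mask.
    let touch_points = report[2] & 0x07;

    // report[3] / Pn_XH: bits 7..6 carry the event flag for point 1
    // (0b00 press down, 0b01 lift up, 0b10 contact); bits 3..0 are the
    // X coordinate's high nibble.
    let event_flag = report[3] >> 6;

    (touch_points, event_flag)
}
```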
Ok, thank you very much for your comment. I checked again and I think I identified the problem.
If you check this playground you will understand: I thought the operator precedence was the same, but I guess it's not.
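The playground link isn't preserved here, so as an illustration only (hypothetical values, and an assumption about the exact expression involved): in Rust, `<<` binds tighter than `&`, so combining a masked high nibble with a low byte needs explicit parentheses.

```rust
fn main() {
    let high: u16 = 0xA3; // e.g. the Pn_XH register byte
    let low: u16 = 0x45;  // e.g. the Pn_XL register byte

    // Without parentheses this parses as `high & (0x0F << 8) | low`.
    let wrong = high & 0x0F << 8 | low;
    // Mask first, then shift, then combine.
    let right = (high & 0x0F) << 8 | low;

    assert_eq!(wrong, 0x0045);
    assert_eq!(right, 0x0345);
}
```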
Could you try the latest version on master? If it works for you, I will make a new release to fix the bug.
Hmmm. To think I had been staring at that code, and I didn't notice the problem...
Lower left: (35, 45); upper right: (286, 463).
This looks a lot better now.
Will have to remap these values to match the screen orientation. And then see if they correspond with the positions of the objects I have on screen. Will do that tomorrow.
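Something along these lines should do for the remap; a sketch with a hypothetical Orientation enum (which flips and swaps are needed depends on how the panel is mounted):

```rust
const WIDTH: u16 = 320;
const HEIGHT: u16 = 480;

#[derive(Clone, Copy)]
enum Orientation {
    Portrait,
    PortraitFlipped,
    Landscape,
}

// Map a raw touch coordinate into the display's coordinate system.
fn remap(x: u16, y: u16, o: Orientation) -> (u16, u16) {
    match o {
        Orientation::Portrait => (x, y),
        // Rotated 180 degrees: both axes inverted.
        Orientation::PortraitFlipped => (WIDTH - 1 - x, HEIGHT - 1 - y),
        // Rotated 90 degrees: axes swapped, one inverted.
        Orientation::Landscape => (y, WIDTH - 1 - x),
    }
}
```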
In the meantime though: Thanks!
Just one other little problem... Not sure what your plans are regarding the API, but I note that the struct elements of the returned object are all private :-(
Yes, I had hoped to expose some more qualified events, but as it's not really possible to get them directly from the hardware, I will expose the RawTouchEvent, at least for now.
I will also group the TouchType with the Point, as it's exposed for each point.
Feel free to comment or give suggestions on this approach.
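One possible shape for that grouping, just to make the proposal concrete (a sketch, not necessarily the final API):

```rust
#[derive(Debug)]
pub enum TouchType {
    Press,
    Contact,
    Release,
}

#[derive(Debug)]
pub struct Point {
    pub x: u16,
    pub y: u16,
}

// The TouchType travels with each point instead of with the event.
#[derive(Debug)]
pub struct TouchPoint {
    pub touch_type: TouchType,
    pub point: Point,
}

#[derive(Debug)]
pub struct TouchEvent {
    pub p1: Option<TouchPoint>,
    pub p2: Option<TouchPoint>,
}
```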
I understand that this is early code still, and to expect API changes.
At the moment I just have simple buttons to press, so X,Y coordinates are enough. My simple code does mean that if I touch an empty area of the screen and slide my finger over a button, the button gets activated. Haven't decided if this is good or bad.
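If sliding onto a button shouldn't activate it, one option is to arm the button only when the initial press lands inside it; a hypothetical sketch:

```rust
struct Button {
    x: u16,
    y: u16,
    w: u16,
    h: u16,
    armed: bool,
}

impl Button {
    fn contains(&self, px: u16, py: u16) -> bool {
        px >= self.x && px < self.x + self.w && py >= self.y && py < self.y + self.h
    }

    /// Call on the first contact of a touch sequence.
    fn on_press(&mut self, px: u16, py: u16) {
        self.armed = self.contains(px, py);
    }

    /// Call on release; fires only if the press started on this button.
    fn on_release(&mut self, px: u16, py: u16) -> bool {
        let fired = self.armed && self.contains(px, py);
        self.armed = false;
        fired
    }
}
```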
Shame the gesture detection isn't working; it looks like it is the same for me too. At least I am seeing a lot of `gesture_id: NoGesture` in the `RawTouchEvent`.
What do you mean by the TouchType?
> What do you mean by the TouchType?
The chip can report whether it's just an initial press, an ongoing contact, or a release. Though you might have seen that most of the time only Contact is reported.
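Since mostly Contact is reported, press and release edges can also be derived on the host side by watching transitions in the event stream; a minimal sketch:

```rust
#[derive(Debug, PartialEq)]
enum Edge {
    Press,
    Move,
    Release,
    Idle,
}

struct EdgeDetector {
    down: bool, // was a finger on the panel last time we looked?
}

impl EdgeDetector {
    // `point` is Some(..) while a finger is reported (a Contact event)
    // and None otherwise (a Release event, or no event at all).
    fn feed(&mut self, point: Option<(u16, u16)>) -> Edge {
        match (self.down, point.is_some()) {
            (false, true) => {
                self.down = true;
                Edge::Press
            }
            (true, true) => Edge::Move,
            (true, false) => {
                self.down = false;
                Edge::Release
            }
            (false, false) => Edge::Idle,
        }
    }
}
```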
Sounds like grouping this could be a good idea.
Ok, I've done it and added some tests. If this is OK with you, I will prepare a new release (0.2) and close this issue.
It looks good to me.
The new version has been released: https://crates.io/crates/ft6x36/0.2.0