Description
On Android and iOS the focal (for pinch) and anchor (for rotation) points are always in the coordinate space of the view, while on web they are not. They appear to be in the window coordinate space, though I haven't verified this thoroughly.
https://github.com/software-mansion/react-native-gesture-handler/assets/21055725/78632cfc-24f2-43e1-8f8e-afd8a0e4b0ae
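Assuming the web values really are in window coordinates as described above, a workaround could translate the reported point back into the view's local space by subtracting the view's window position (e.g. obtained via `measureInWindow`). The helper below is a hypothetical sketch for illustration, not part of the library:

```typescript
// Hypothetical helper: convert a focal/anchor point reported in window
// coordinates (the behavior observed on web) into the view's local
// coordinate space, matching what Android/iOS report.
// `viewPageX`/`viewPageY` would come from something like
// View.measureInWindow on the view wrapped by the GestureDetector.
function toViewCoords(
  focalX: number,
  focalY: number,
  viewPageX: number,
  viewPageY: number,
): { x: number; y: number } {
  return { x: focalX - viewPageX, y: focalY - viewPageY };
}

// Example: a focal point at window (320, 480) on a view whose top-left
// corner sits at window (300, 450) maps to local (20, 30).
const local = toViewCoords(320, 480, 300, 450);
console.log(local); // { x: 20, y: 30 }
```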
Steps to reproduce
The above snippet will render a red point at the reported focal/anchor point (swap pinch/rotation in the detector prop to see the other gesture). The point should only change based on the position of the pointer relative to the blue square, but it also depends on the square's position on the screen, which causes the point to move outside the view.
Snack or a link to a repository
https://github.com/software-mansion/react-native-gesture-handler ❤️
Gesture Handler version
main branch
React Native version
0.74.1
Platforms
Web
JavaScript runtime
None
Workflow
Expo managed workflow
Architecture
Paper (Old Architecture)
Build type
Debug mode
Device
None
Device model
No response
Acknowledgements
Yes