plainheart opened this issue 4 years ago
I think using the UA is a better way: /(Mobile|Android|Windows Phone)/.test(navigator.userAgent) to judge isTouchDevice, and window.TouchEvent to judge hasTouch.
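For reference, a minimal sketch of that UA-based check. The regex and the isTouchDevice/hasTouch names follow the Highcharts approach mentioned in this thread; the helper functions themselves are hypothetical, and the UA string and window object are passed in so the check can be exercised outside a browser:

```javascript
// Hypothetical helpers sketching the Highcharts-style checks.

// isTouchDevice: guessed from the user agent string.
function isTouchDevice(userAgent) {
  return /(Mobile|Android|Windows Phone)/.test(userAgent);
}

// hasTouch: the environment exposes the TouchEvent constructor.
function hasTouch(win) {
  return typeof win.TouchEvent !== 'undefined';
}
```

In a browser you would call these as `isTouchDevice(navigator.userAgent)` and `hasTouch(window)`.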
On Jun 29, 2020, at 1:29 PM, Zhongxiang.Wang (notifications@github.com) wrote:
Brief
There are some issues reporting that echarts can't properly detect whether a touch-screen device supports touch events or not: #7406, #9301, #12166, #12823.
The current detection in zrender/src/core/env.js (https://github.com/ecomfe/zrender/blob/master/src/core/env.js#L149) is not compatible with all kinds of devices.
I also searched on StackOverflow (https://stackoverflow.com/questions/4817029/whats-the-best-way-to-detect-a-touch-screen-device-using-javascript), but it seems there is no single best way.
Therefore, for better compatibility, should we allow developers to configure this manually according to what they know about their devices? For example, we could add a new option to zrender and echarts to specify whether touch events are supported. I have no such touch device to test on, though.
ECharts
var zr = this._zr = zrender.init(dom, {
    renderer: opts.renderer || defaultRenderer,
    devicePixelRatio: opts.devicePixelRatio,
    width: opts.width,
    height: opts.height,
    touchEventsSupported: opts.touchEventsSupported
});
ZRender overrides env.touchEventsSupported if the developer has specified it manually:
if (opts.touchEventsSupported != null) {
    env.touchEventsSupported = !!opts.touchEventsSupported;
}
Maybe this looks a bit weird. Of course, it would be better if there were a good way to detect a touchable device.
Others
Highcharts uses /(Mobile|Android|Windows Phone)/.test(navigator.userAgent) to judge isTouchDevice and window.TouchEvent to judge hasTouch.
@plainheart @wf123537200 It's indeed a headache. I used to fix this kind of issue for some Windows touch devices that support both touch and mouse (like the Surface). But it's still difficult to cover all of the cases, and what makes things worse is that there is often no such device at hand to test on.
So I think providing an option for users to control this is a practical approach. Not neat, but it probably works.
I'm not sure what that option should look like, though.
Like you said, the option could be opt.touchEventsSupported. But at present, even if touchEventsSupported is set to true, zrender still may not use touch events (if pointer events are detected as supported). So would it be a better idea to make the option more straightforward, like:

opt: {
    nativeEventSystem: 'touch' | 'pointer' | 'mouse'
}

If nativeEventSystem is specified as 'touch', we manually set env.touchEventsSupported = true and env.pointerEventsSupported = false.
If nativeEventSystem is specified as 'pointer', we manually set env.pointerEventsSupported = true.
If nativeEventSystem is specified as 'mouse' or not specified, we manually set env.touchEventsSupported = false and env.pointerEventsSupported = false.

Not sure about that. @pissang @Ovilia any better ideas?
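The mapping just described could be sketched as a small pure function. The nativeEventSystem values and env flag names come from this thread; the function itself is a hypothetical illustration, not existing zrender code:

```javascript
// Hypothetical sketch: translate the proposed `nativeEventSystem`
// option into the zrender env flags discussed in this thread.
function resolveEventSystem(nativeEventSystem) {
  switch (nativeEventSystem) {
    case 'touch':
      return { touchEventsSupported: true, pointerEventsSupported: false };
    case 'pointer':
      return { pointerEventsSupported: true };
    case 'mouse':
    default:
      // 'mouse' or unspecified: disable both.
      return { touchEventsSupported: false, pointerEventsSupported: false };
  }
}
```

The returned flags would then be assigned onto `env` before zrender binds its DOM event handlers.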
I don't think I understand why developers should need to configure touchEventsSupported. It seems to me that it depends on the device of the users who visit the website with the charts, rather than on the developers who made the charts. So how should a developer configure this option to work for both touch devices and non-touch devices?
The user agent looks like the right way to solve this. If there are specific devices that don't work as expected, for example ones not covered by the touch UA check, I think it's better to learn about them from the bug issues users open; we can then ask them for more information, such as the UA, and see if there is a better solution.
Yes, this new option is suitable for those who are developing their own internal projects, especially for LSDS, but not for most users. To be frank, it seems that the UA is still not enough to solve problems like #12823. Eventually, we need to find the best way to detect touch-screen devices in code, not via an option.
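For what it's worth, a common feature-detection heuristic combines several signals instead of relying on a single check. A hedged sketch (this is not what zrender currently does; the window and navigator objects are injected so the logic can be tested outside a browser):

```javascript
// Hypothetical combined heuristic: any one positive signal marks the
// environment as touch-capable. `win` and `nav` stand in for the
// browser's `window` and `navigator`.
function detectTouch(win, nav) {
  return ('ontouchstart' in win)
    || (typeof win.TouchEvent !== 'undefined')
    || ((nav.maxTouchPoints || 0) > 0);
}
```

Even this can misfire on hybrid devices like the Surface, which is exactly the class of device this thread is struggling with.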
We had the same problem on some Windows touch devices. Since we use custom imports with webpack, I added this for testing (TypeScript):
import zRenderEnv from 'zrender/lib/core/env';
zRenderEnv.touchEventsSupported = true;
This works nicely and touch is fixed on these devices. The question: is there any downside to force-enabling this?
On a Microsoft Surface Hub we had to disable pointerEventsSupported in addition to enabling touchEventsSupported:
import zRenderEnv from 'zrender/lib/core/env';
zRenderEnv.touchEventsSupported = true;
zRenderEnv.pointerEventsSupported = false;