I love the onboarding setup. You've done a nice job. The problem I have is that the UI doesn't respond to taps, only swipes. This creates an accessibility issue for users who only have access to a keyboard or who have difficulty performing swipe gestures.
In addition, Google's automated testing that runs when you submit the application will get stuck because taps are not recognized, so it can't get any further into the app. This workaround won't help with your own automated instrumented tests, though, because the Instrumentation used here will clash with the test instrumentation.
Here's a workaround, cobbled together from a few Stack Overflow pages, that simulates the swipe so the screen can be advanced past. Basically, in your MainActivity you would put the following:
// Imports needed at the top of MainActivity.kt:
// android.app.Instrumentation, android.os.Build, android.os.SystemClock,
// android.util.DisplayMetrics, android.view.InputDevice, android.view.MotionEvent,
// androidx.lifecycle.lifecycleScope, kotlinx.coroutines.Dispatchers,
// kotlinx.coroutines.launch, kotlinx.coroutines.withContext

override fun onTouchEvent(event: MotionEvent?): Boolean {
    // Injects a synthetic swipe (a DOWN, a series of MOVEs, then an UP) off the main thread.
    suspend fun fling(
        fromX: Float, toX: Float, fromY: Float,
        toY: Float, stepCount: Int
    ) {
        withContext(Dispatchers.Default) {
            val inst = Instrumentation()
            val downTime = SystemClock.uptimeMillis()
            var eventTime = SystemClock.uptimeMillis()
            var y = fromY
            var x = fromX
            val yStep = (toY - fromY) / stepCount
            val xStep = (toX - fromX) / stepCount

            // Press down at the start point.
            var motion = MotionEvent.obtain(
                downTime, eventTime,
                MotionEvent.ACTION_DOWN, fromX, fromY, 0
            )
            if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.HONEYCOMB_MR1) {
                motion.source = InputDevice.SOURCE_TOUCHSCREEN
            }
            inst.sendPointerSync(motion)

            // Move toward the end point in stepCount increments.
            for (i in 0 until stepCount) {
                y += yStep
                x += xStep
                eventTime = SystemClock.uptimeMillis()
                motion = MotionEvent.obtain(
                    downTime, eventTime + stepCount,
                    MotionEvent.ACTION_MOVE, x, y, 0
                )
                if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.HONEYCOMB_MR1) {
                    motion.source = InputDevice.SOURCE_TOUCHSCREEN
                }
                inst.sendPointerSync(motion)
            }

            // Lift up at the end point.
            eventTime = SystemClock.uptimeMillis() + stepCount.toLong() + 2
            motion = MotionEvent.obtain(
                downTime, eventTime,
                MotionEvent.ACTION_UP, toX, toY, 0
            )
            if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.HONEYCOMB_MR1) {
                motion.source = InputDevice.SOURCE_TOUCHSCREEN
            }
            inst.sendPointerSync(motion)
        }
    }

    // When the user lifts their finger after a tap, fake a short leftward swipe
    // starting at the right edge, across the vertical middle of the screen,
    // so the onboarding page advances.
    if (event?.action == MotionEvent.ACTION_UP) {
        val dm = DisplayMetrics()
        windowManager.defaultDisplay.getMetrics(dm)
        val halfHeight = (dm.heightPixels / 2).toFloat()
        val width = dm.widthPixels.toFloat()
        lifecycleScope.launch {
            fling(width - 1, width / 2, halfHeight - 1, halfHeight - 1, 5)
        }
    }
    return super.onTouchEvent(event)
}
Tested on API 29.
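Since keyboard-only users were the original concern, the same trick could also be hooked up to key events. This is only a rough sketch I haven't tested, and it assumes the fling helper above has been pulled out of onTouchEvent into a member function of MainActivity so it can be reused (it also needs android.view.KeyEvent imported):

override fun onKeyUp(keyCode: Int, event: KeyEvent?): Boolean {
    // Treat DPAD-right or Enter as "advance", mirroring the tap workaround above.
    if (keyCode == KeyEvent.KEYCODE_DPAD_RIGHT || keyCode == KeyEvent.KEYCODE_ENTER) {
        val dm = DisplayMetrics()
        windowManager.defaultDisplay.getMetrics(dm)
        val halfHeight = (dm.heightPixels / 2).toFloat()
        val width = dm.widthPixels.toFloat()
        lifecycleScope.launch {
            fling(width - 1, width / 2, halfHeight - 1, halfHeight - 1, 5)
        }
        return true
    }
    return super.onKeyUp(keyCode, event)
}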