Genymobile / scrcpy

Display and control your Android device
Apache License 2.0

[Suggestion] Using a web client instead of the SDL app #313

Open pabloko opened 5 years ago

pabloko commented 5 years ago

Hello, I firstly want to say thanks to the dev of this project, from which I learned interesting stuff.

As I'm developing a similar project, I focused on the communication of data/video and rendering. Since my current platform is web-driven, I wanted to be able to control the device from a web app, so I set up a simple server (far simpler than scrcpy), just using MicroHTTPD to serve some HTTP endpoints:

- /stream -> launches, on a thread, the process `screenrecord --bit-rate 1M --size 1280x720 --output-format=h264 -`; the output stream of this process is tied to the input stream of the HTTP request, serving a chunked stream of H.264 (just like scrcpy does, but with less hassle)
- /control?tapx=x&tapy=y -> launches `input tap x y`, simple...
- /control?text=txt -> launches `input text txt`, simple...
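
For illustration, here is a minimal sketch of such a server, assuming the NanoHTTPD library (presumably what "MicroHTTPD" refers to); the class name and port are made up, and only the endpoints above come from the description:

```java
import fi.iki.elonen.NanoHTTPD;
import java.io.IOException;

public class ScreenServer extends NanoHTTPD {

    public ScreenServer() throws IOException {
        super(8088);
        start(SOCKET_READ_TIMEOUT, false);
    }

    @Override
    public Response serve(IHTTPSession session) {
        try {
            if ("/stream".equals(session.getUri())) {
                // Tie screenrecord's raw H.264 output to a chunked HTTP response.
                Process p = Runtime.getRuntime().exec(new String[] {
                        "screenrecord", "--bit-rate", "1M", "--size", "1280x720",
                        "--output-format=h264", "-" });
                return newChunkedResponse(Response.Status.OK, "video/h264",
                        p.getInputStream());
            }
            if ("/control".equals(session.getUri())) {
                String x = session.getParms().get("tapx");
                String y = session.getParms().get("tapy");
                if (x != null && y != null) {
                    Runtime.getRuntime()
                            .exec(new String[] { "input", "tap", x, y }).waitFor();
                }
                String text = session.getParms().get("text");
                if (text != null) {
                    Runtime.getRuntime()
                            .exec(new String[] { "input", "text", text }).waitFor();
                }
                return newFixedLengthResponse("OK");
            }
            return newFixedLengthResponse(Response.Status.NOT_FOUND, MIME_PLAINTEXT,
                    "Not found");
        } catch (IOException | InterruptedException e) {
            return newFixedLengthResponse(Response.Status.INTERNAL_ERROR,
                    MIME_PLAINTEXT, e.toString());
        }
    }
}
```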

One can just use ffmpeg/ffplay to access this stream ("ffplay -f h264 -i http://127.0.0.1:8088/stream"), but I wanted to draw on the web, remember, so I'm using Broadway.js, which does H.264 decoding in the client browser from any given raw data. We have to take a break here: for this specific case, trying to consume the HTTP feed with XHR would require writing a custom input source for XHR, but we have the new Fetch API lying around, which we can use to constantly pump new data from the open HTTP request, then give it to the Broadway player and render it to a canvas.

https://imgur.com/a/0mYstz5

The input consists of an always-focused input element that processes all key events and makes XHR calls to the control endpoint. Clicks are also sent, transformed from virtual-screen to real coordinates.

https://i.imgur.com/IPOXI2X.gifv

The code: Java https://pastebin.com/RQQUvaE9, HTML https://pastebin.com/mscEYvA8

This sample demonstrates that web rendering is possible even without raw socket/WebSocket usage, and Broadway does the job while outperforming ffmpeg in all my tests.

One could, for example, render directly in an Electron app, while still having an easy API to work with other commands/shells/tools in a friendlier graphical mode.

Disclaimer: obviously this code is POC grade and not usable beyond testing; I just wanted to share a different point of view on the rendering process and enable a better UI.

Again many thanks!

dondiscaya0531 commented 5 years ago

Sounds great. Hope this project will remain free. I tested similar projects with paid subscriptions, like Vysor, and was still unsatisfied. I've been looking for something that can fully control a device wirelessly, with or without a third-party app on Android, and that will remain free.

pabloko commented 5 years ago

@dondiscaya0531 Yeah, what you really want is a bit more intricate than just deploying an app; see the example of TeamViewer. They have a main application and then an add-on app for the different mobile vendors. Those "add-on" apps are signed with the platform key (the key used to sign the firmware of the ROM), so they get privileged access. This means Sony, Samsung, etc. would have to sign your add-on through some kind of agreement. It's that, or being root, or having the platform keys for some specific ROM flashed on the device. Obviously you could think about using the signed TeamViewer add-on, calling its service and binding to its AIDL interface, but it checks the public key of the caller app to prevent tampering.

Imagine going through this ask/review/sign/publish process with that many firmware vendors; that's why there's no good remote control outside adb connectivity on Android.

rom1v commented 5 years ago

Thank you for sharing these details, it seems cool :+1:

IIUC, you use a web client, but still need to be connected via adb?

One major issue I see is security: this exposes to anyone, via a socket, features that require shell permissions on the device (seeing the screen and injecting events).

pabloko commented 5 years ago

@rom1v Well, as I said, the implementation is POC grade; I just wanted to test web rendering with Broadway.js, so I don't have a problem listening on my LAN for control commands. In my specific case, I don't need adb, because I'll use my app signed against the platform key of our custom ROM, so it will run with system privileges anyway. For example, if you pick any emulator without the Play Store, you can use this https://puu.sh/BUBFy/b238b95a39.jks keystore (alias: android_platform, passwords: platform) in order to run as system.

Apart from that, I'm wondering if I can spend some of my nonexistent free time on a fork with Electron as the client; maybe it's worth a try, as I'll have to do it anyway for my project.

pabloko commented 5 years ago

Please forgive the off-topic, but in the course of my tests I've come to a non-root remote control that actually works remotely.

Android AccessibilityServices can actually inject input, though not by inserting input events; instead, you can access all the nodes of the focused window, then dispatch some action (click, edit text...) on a specific view.

I actually resolve x and y coordinates by finding the element whose rect is behind that position, as sketched below. Apart from that, the service connects to online WebSockets that I use as C&C.
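
As an illustration of that hit-testing, here is a minimal sketch (a hypothetical helper, not the author's code) that walks the node tree of the focused window and clicks the deepest clickable node whose screen rect contains the point:

```java
import android.accessibilityservice.AccessibilityService;
import android.graphics.Rect;
import android.view.accessibility.AccessibilityNodeInfo;

public class TapInjector {

    public static boolean tap(AccessibilityService service, int x, int y) {
        AccessibilityNodeInfo root = service.getRootInActiveWindow();
        AccessibilityNodeInfo target = findNodeAt(root, x, y);
        return target != null
                && target.performAction(AccessibilityNodeInfo.ACTION_CLICK);
    }

    private static AccessibilityNodeInfo findNodeAt(AccessibilityNodeInfo node,
                                                    int x, int y) {
        if (node == null) return null;
        Rect bounds = new Rect();
        node.getBoundsInScreen(bounds);
        if (!bounds.contains(x, y)) return null;
        // Prefer the deepest matching descendant over the current node.
        for (int i = 0; i < node.getChildCount(); i++) {
            AccessibilityNodeInfo hit = findNodeAt(node.getChild(i), x, y);
            if (hit != null) return hit;
        }
        return node.isClickable() ? node : null;
    }
}
```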

It also has video via MediaProjection (which asks for permission on each boot). I didn't put the video in the screencap because it's confusing. Here's a capture with it:

So, right there is a non-root remote control solution, if you don't mind high latency. Probably good enough for software support and remote configuration. By the way, as my tests progress, it's still usable with the screen off (wtf). It seems accessibility services give a wide range of options to get views, like uiautomator, and interact with them; they also provide some handy events for window focus changes, and injection of global actions like back, home, menu, screenshot, showing the notification bar, the reboot dialog...

rom1v commented 5 years ago

Android AccessibilityServices can actually inject input, though not by inserting input events; instead, you can access all the nodes of the focused window, then dispatch some action (click, edit text...) on a specific view.

Do the edit text events have the same limitations as in the InputManager? (see https://github.com/Genymobile/scrcpy/issues/37)

if you don't mind high latency

What is the main cause of the high latency?

it's still usable with the screen off

You mean the screen is off on the device, but is "on" on your computer?

pabloko commented 5 years ago

Do the edit text events have the same limitations as in the InputManager

This isn't raw input; you have to look for an editable view at the desired position, iterating from the root view and excluding views by bounds and by not being editable. Once you find an EditText or similar, you can perform:

```java
// Set the text of the node found under the cursor.
Bundle arguments = new Bundle();
arguments.putCharSequence(AccessibilityNodeInfo.ACTION_ARGUMENT_SET_TEXT_CHARSEQUENCE,
        string_sent_to_control);
nearestNodeToMouse.performAction(AccessibilityNodeInfo.ACTION_SET_TEXT, arguments);
```

What is the main cause of the high latency?

The same as in the last issue: views are constantly changing, moving, and disappearing. Each interaction requires getting the root view of the focused window and iterating to find the affected view, so it takes some time. Apart from that, the feel of the control is very different from raw input, since you are restricted to interactions with the current root view only. I think it can of course be properly coded; in my case, I needed to consume the minimum bandwidth possible for remote support. Other users, like game players, will find this option totally useless.

As my tests are ongoing, my app is working through NAT: the service connects to a WebSocket server (always connected while the internet is accessible) and waits for control commands. I've tuned MediaProjection to just take static captures that are resized; those captures are only requested on redraw events. Another thing I didn't know about MediaProjection: if you check "don't ask again" on the cast permission popup, the grant persists across reboots, which is nice. The only setup needed for this app is: grant the camera permission, enable the accessibility service, and allow MediaProjection capture with "don't ask again" checked.

https://puu.sh/BVI0w/e2e2d01819.mp4
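
A minimal sketch of the static-capture approach described above, assuming a MediaProjection already granted by the user (the class and display names are illustrative):

```java
import android.graphics.PixelFormat;
import android.hardware.display.DisplayManager;
import android.media.Image;
import android.media.ImageReader;
import android.media.projection.MediaProjection;

public class StaticCapper {
    private final ImageReader reader;

    public StaticCapper(MediaProjection projection, int width, int height, int dpi) {
        // Mirror the display into an ImageReader at a reduced resolution.
        reader = ImageReader.newInstance(width, height, PixelFormat.RGBA_8888, 2);
        projection.createVirtualDisplay("static-caps", width, height, dpi,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
                reader.getSurface(), null, null);
    }

    // Called only when the remote client requests a redraw.
    public Image grab() {
        return reader.acquireLatestImage(); // null if nothing new was drawn
    }
}
```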

This still needs tons of enhancements, but it's the only possible non-root, truly remote control over the internet, and it should work on Android 5+.

pabloko commented 5 years ago

@rom1v I've been thinking about latency since you asked; I polished the view finder a bit and messed with MediaCodec hell, and yeah, the latency is gone.

https://puu.sh/BWJ7c/b85d9b82bd.mp4

I just broadcast the raw H.264 NAL unit packets and feed them to the client browser over a WebSocket channel, rendering with Broadway.js; I also draw some boxes on the client for accessible views and broadcast events between both clients. This encoder is quite similar to the one used in scrcpy, but I used the async mode of MediaCodec, with getOutputBuffer/releaseOutputBuffer in the onOutputBufferAvailable callback, instead of continuously dequeuing the output buffer. Anyway, the produced buffer is the same, and rendering should be totally compatible with both, whether the source is MediaProjection or a display mirrored via SurfaceControl; only the first doesn't need root. Probably wrapping up a repo for it, or even offering it as a service...
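
For reference, a minimal sketch of that asynchronous pattern; startEncoder() and sendOverWebSocket() are hypothetical names, and configure()/createInputSurface() are elided:

```java
import android.media.MediaCodec;
import android.media.MediaFormat;
import java.io.IOException;
import java.nio.ByteBuffer;

public class AsyncEncoder {

    public static MediaCodec startEncoder() throws IOException {
        MediaCodec encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        // setCallback() must be invoked before configure(); the input Surface
        // (fed by MediaProjection or SurfaceControl) is created afterwards
        // with createInputSurface().
        encoder.setCallback(new MediaCodec.Callback() {
            @Override
            public void onOutputBufferAvailable(MediaCodec codec, int index,
                                                MediaCodec.BufferInfo info) {
                ByteBuffer buffer = codec.getOutputBuffer(index);
                byte[] nalu = new byte[info.size];
                buffer.get(nalu); // copy out before returning the codec-owned buffer
                sendOverWebSocket(nalu);
                codec.releaseOutputBuffer(index, false);
            }

            @Override
            public void onInputBufferAvailable(MediaCodec codec, int index) {
                // unused: input comes from a Surface, not from byte buffers
            }

            @Override
            public void onError(MediaCodec codec, MediaCodec.CodecException e) {
                e.printStackTrace();
            }

            @Override
            public void onOutputFormatChanged(MediaCodec codec, MediaFormat format) {
                // the client decodes raw NAL units, so the format change is ignored
            }
        });
        return encoder;
    }

    private static void sendOverWebSocket(byte[] nalu) {
        // transport elided: push the NAL unit to the WebSocket channel
    }
}
```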

But for sure I will make a fork of scrcpy with Electron; it deserves a better client (don't get me wrong, the current client is awesome, but hard to edit) where less experienced users can add more tools, like uiautomator, take recordings, macros... It seems that combining Electron with adbkit (https://github.com/openstf/adbkit) is the perfect choice to recreate the client.

rom1v commented 5 years ago

This isn't raw input; you have to look for an editable view at the desired position, iterating from the root view and excluding views by bounds and by not being editable. Once you find an EditText or similar […]

OK, thank you for the details.

This encoder is quite similar to the one used in scrcpy, but I used the async mode of MediaCodec, with getOutputBuffer/releaseOutputBuffer in the onOutputBufferAvailable callback, instead of continuously dequeuing the output buffer

What are the benefits? I'm interested because I hesitated between synchronous and asynchronous here.

I will make a fork of scrcpy with Electron

Will it still push a server to the device via adb? Will it require an Android app (apk)? I'm interested in your progress :)

take recordings

Just for info, there will be recording https://github.com/Genymobile/scrcpy/pull/292 :wink: (it already "works", but it needs more work to be properly integrated).

pabloko commented 5 years ago

Hey @rom1v, as for the sync/async encoder integration, I found the async approach simpler, but one way or another the data is obtained the same way; it's just that I'm a total noob at Android/Java and I don't know how to properly manage threads and thread-safe data exchange. By the way, I didn't make a shallow copy of the buffer data like the IO.writeFully implementation in scrcpy; do you think I still need to do that? Apart from that, I found async very convenient and easy to manage, since the service where this stuff lives is an AccessibilityService and I don't control its execution, as it's part of the Android core API. I'd be glad to hear your thoughts about using sync or async.

Will it still push a server to the device via adb? Will it require an Android app (apk)? I'm interested in your progress :)

As the scrcpy server codebase needs at least the shell user to work, it has to be launched via adb. I'll keep the server as-is and just work on the client: reimplement the adb stuff with the adbkit lib, and reimplement drawing and event broadcasting.

By the way, the scrcpy server could be simplified to the extreme if you piped the video through stdout and invoked it through adb exec-out, and, for input, sent input through the monkey server (find its protocol here: https://github.com/aosp-mirror/platform_development/blob/master/cmds/monkey/README.NETWORK.txt). AFAIK reflection is used in this project due to the latency of input commands, but this server is mostly immediate.
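
For illustration, a tiny sketch of that monkey protocol (commands per README.NETWORK.txt), assuming monkey was started on the device with `monkey --port 1080` and the port was forwarded with `adb forward tcp:1080 tcp:1080`:

```java
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class MonkeyTap {
    public static void main(String[] args) throws Exception {
        try (Socket socket = new Socket("127.0.0.1", 1080);
             Writer out = new OutputStreamWriter(socket.getOutputStream(),
                     StandardCharsets.UTF_8)) {
            out.write("tap 540 960\n"); // inject a tap at (540, 960)
            out.write("type hello\n");  // type a string
            out.write("quit\n");        // end the session
            out.flush();
        }
    }
}
```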

Just for info, there will be recording #292 😉 (it already "works", but it needs more work to be properly integrated).

Nice job. Didn't you think about saving the framebuffer, so you can later pick some part of it? This kind of work is why I'd prefer to have an Electron client: such features are way easier to integrate, the user could use them easily through the UI, and adding controls and windows won't be a pain... I'd like to have some package explorer, file explorer, command shell... you know.

Maybe, if for some reason you prefer to keep the C+SDL client, it could use a scripting engine like Lua; provided with enough API, plugins could be added.

As for my project, I'm testing it on real devices without root. I'm quite satisfied with the result: https://puu.sh/BWO7i/5fd1e19141.mp4

rom1v commented 5 years ago

I didn't make a shallow copy of the buffer data like the IO.writeFully implementation in scrcpy

How did you write the data to the socket?

As the scrcpy server codebase needs at least the shell user to work, it has to be launched via adb. I'll keep the server as-is and just work on the client: reimplement the adb stuff with the adbkit lib, and reimplement drawing and event broadcasting.

So why do you need MediaProjection or AccessibilityServices?

By the way, the scrcpy server could be simplified to the extreme if you piped the video through stdout and invoked it through adb exec-out

My first PoC did this. There are several drawbacks compared to a socket; among them:

- adb sometimes prints garbage on stdout (for example if the daemon is not started) which is interleaved with the actual stream.

this kind of work is why I'd prefer to have an Electron client: such features are way easier to integrate, the user could use them easily through the UI, and adding controls and windows won't be a pain...

That's true: SDL has no widgets, and currently scrcpy outputs many things to the console which would be better on the screen (but we could print text using SDL).

However, I would be very surprised if Electron/JavaScript had the same performance (framerate, latency, start time, CPU, memory…).

Alternatively, it could be also written in Qt.

pabloko commented 5 years ago

How did you write the data to the socket?

I just read a byte[] from the ByteBuffer and send it to the WebSocket channel. I read in the comments that some devices would need a shallow copy of the data.

So why do you need MediaProjection or AccessibilityServices?

Because they don't need adb. I can use it over wireless/4G without any special requirement apart from enabling the accessibility service and allowing screen recording forever.

adb sometimes prints garbage on stdout (for example if the daemon is not started) which is interleaved with the actual stream.

It's good you mention it, because it helps me figure out why I was getting bad video using this method in my first PoCs too.

I would be very surprised if Electron/JavaScript had the same performance

I can assure you the rendering performance is top-notch, even though I'm using the worst implementation of Broadway.js (no worker thread, no WebGL, just decoding bitmaps to a canvas on the main thread). I'm quite sure the performance limit in this case is in the server and the network capability.

If you want to test my app, send me a message; you could access it from the internet.

rom1v commented 5 years ago

If you want to test my app, send me a message; you could access it from the internet.

Is it open source? Could you publish it on github or somewhere?

pabloko commented 5 years ago

Is it open source? Could you publish it on github or somewhere?

Well, I started it on Saturday... so it's still too immature to publish; I was talking about sending you a private link to test one of my devices.

At this moment I'm working on the protocol to save all the bandwidth I can. I think I'll have to do multiple tests using netspeed and netdelay on the emulator, as I want to serve different video quality depending on the network speed. I've been looking at https://github.com/facebook/network-connection-class to classify network capability, but this implementation is based on RX, and it seems I have to use TrafficStats.getTotalTxBytes()... Maybe you know a better way to do this?
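
For reference, a naive interval-based TX sampler around TrafficStats.getTotalTxBytes() (the class and callback names are made up); as the author notes in a later comment, this simple approach falls short when the stream only sends data on visible changes:

```java
import android.net.TrafficStats;
import android.os.Handler;
import android.os.Looper;

public class TxBandwidthSampler {
    private static final long INTERVAL_MS = 1000;

    private final Handler handler = new Handler(Looper.getMainLooper());
    // getTotalTxBytes() may return TrafficStats.UNSUPPORTED (-1) on some devices.
    private long lastTxBytes = TrafficStats.getTotalTxBytes();

    public void start() {
        handler.postDelayed(this::sample, INTERVAL_MS);
    }

    private void sample() {
        long now = TrafficStats.getTotalTxBytes();
        long bytesPerSec = (now - lastTxBytes) * 1000 / INTERVAL_MS;
        lastTxBytes = now;
        onThroughput(bytesPerSec * 8); // bits per second sent by the whole device
        handler.postDelayed(this::sample, INTERVAL_MS);
    }

    protected void onThroughput(long bitsPerSec) {
        // e.g. pick a stream quality (bitrate/size/fps) from the estimate
    }
}
```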

pabloko commented 5 years ago

Hey @rom1v, I have been reading this issue and noticed your question about the deployment strategy:

Will it still push a server to the device via adb? Will it require an Android app (apk)?

I've tried a different approach, spawning a server just for input, with a local socket that is used by the main process, which is connected to WebSockets. With this implementation the user has to spawn the process via adb on each boot, but once that is done, the process keeps running forever, accepting input from the network service. I've added this as a second input layer to be used alongside accessibility, as I can handle custom keyboard layouts without hassle, and I can still control the device with limited features if the user doesn't start the daemon.

The apk magic goes something like:

```
export base=/system && export CLASSPATH=$(pm path es.pabloko.remotecontroltester2 | cut -d : -f2) && app_process / es.pabloko.remotecontroltester2.MainActivity &
```
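
A minimal sketch of what such an input-only daemon's entry point might look like (the socket name and command format are made up): an app_process-spawned server, running as uid 2000, listening on an abstract local socket and relaying one-line commands to `input`:

```java
import android.net.LocalServerSocket;
import android.net.LocalSocket;
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class InputDaemon {
    public static void main(String[] args) throws Exception {
        // Abstract-namespace socket; lives until the process is killed.
        LocalServerSocket server = new LocalServerSocket("remote-input");
        while (true) {
            LocalSocket client = server.accept();
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(client.getInputStream()))) {
                String line;
                while ((line = in.readLine()) != null) {
                    // e.g. "tap 100 200" -> runs `input tap 100 200` as shell
                    Runtime.getRuntime().exec(("input " + line).split(" ")).waitFor();
                }
            }
        }
    }
}
```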

This is a test on a Sony Xperia Z5 without adb or root, connected to an internet server: https://puu.sh/BZsI9/f0833d3ea4.mp4

Given that this process runs as uid 2000, it can perform lots of tasks apart from sending input; for example, I've added a minimal terminal, which was quite fast to implement.

I'm still devising a strategy to evaluate network quality in order to serve different stream formats; the one shown in the examples is for 3G, with heavy downscaling (/4), 1M bitrate and 10 fps. The problem is that the stream only serves data when there are visible changes or I-frames, so measuring traffic counters over time intervals won't do the trick in this case... that method requires constant upload/download to measure throughput, which isn't our case. I'll try to find a better approach...

NickAcPT commented 5 years ago

This looks amazing. Have there been any updates on this?

drauggres commented 5 years ago

Hello. I made some changes to scrcpy and added WebSocket support. And here is a simple web client for it (in an early stage).

While I was working on it, scrcpy kept moving forward, and now there are conflicts. If you are interested, I'll fix them and send a PR.

rom1v commented 5 years ago

@drauggres I like such experiments; they are interesting, but I don't see how they could be integrated upstream. They provide alternatives to the way scrcpy works. How would you see it?

sheshnath1st commented 4 years ago

@drauggres I have tried your linked code and it runs successfully from the command line, but no page opens by default after it starts.

drauggres commented 4 years ago

@drauggres I have tried your linked code and it runs successfully from the command line, but no page opens by default after it starts.

The HTTP server is listening on port 8000; open http://127.0.0.1:8000 if you are running it on the same machine. Please ask questions about ws-scrcpy directly on the ws-scrcpy issues page, so we won't bother others in this thread.

Maciejszuchta commented 4 years ago

@pabloko I want to ask you a question. Does your stream support streaming video longer than 3 minutes (180 s)? screenrecord is killed after this time limit.

MartinRGB commented 3 years ago

Hello. I made some changes to scrcpy and added WebSocket support. And here is a simple web client for it (in an early stage).

While I was working on it, scrcpy kept moving forward, and now there are conflicts. If you are interested, I'll fix them and send a PR.

@drauggres @rom1v

(The window with black titlebar is a BrowserWindow)

I'm developing an ADB Electron app for regular users (they may know little about programming). At first I tried to offer an executable file to install libs like FFmpeg & SDL, but in my country the installation may take 10 minutes or more, and it would confuse them.

So I decided to render it in the browser. I used part of @drauggres's code. It's very portable, and there is no need to install libs via brew. I agree with the idea of providing an 'official' web client. It's helpful for developers who build products for regular users.

bhasalma commented 3 years ago

@pabloko Thanks for sharing; it sounds good. Could it be possible to publish it on GitHub or somewhere?

drauggres commented 3 years ago

I was thinking about implementing a web client for the original scrcpy-server with WebUSB instead of a WebSocket server. But after a quick search on GitHub, it turned out that this has already been done:

https://github.com/yume-chan/ya-webadb/tree/master/packages/demo

kudos to @yume-chan