jszigetvari opened this issue 7 years ago
Yesterday I ditched 0.3.7 and installed the latest version via git. The behavior is the same. :(
If I understand correctly, you would like to cast your desktop video (?) to your Chromecast. If so, this should work:
python mkchromecast.py --video --screencast
Is video what you want?
Hello Muammar,
Thank you for your answer!
You are correct, I would like to cast my (KDE) desktop working area to my Chromecast. (I guess that casting only one full HD display will be possible, but that is no problem.)
I assume that you are saying that it is only possible to cast a video capture of my desktop with mkchromecast. That is no problem, I'm only experimenting, and would like to see it working in the first place.
Unfortunately your suggestion did not work for me, as the 0.3.7.1 version of mkchromecast does not seem to recognize the --screencast option:
$ python mkchromecast.py --video --screencast
usage: mkchromecast.py [-h] [--alsa-device ALSA_DEVICE] [-b BITRATE]
[--chunk-size CHUNK_SIZE] [-c CODEC] [--config]
[--control] [--debug] [-d]
[--encoder-backend ENCODER_BACKEND] [--host HOST]
[-i INPUT_FILE] [-n NAME] [--notifications] [-r]
[--reboot] [--reconnect] [--resolution RESOLUTION] [-s]
[--sample-rate SAMPLE_RATE] [--seek SEEK]
[--segment-time SEGMENT_TIME] [--source-url SOURCE_URL]
[--subtitles SUBTITLES] [-t] [--update] [-v] [--video]
[--volume] [-y YOUTUBE]
mkchromecast.py: error: unrecognized arguments: --screencast
Hello Muammar,
Sorry for the misunderstanding. I noticed that the feature you were talking about is only available on the devel branch. I've switched to it, and mkchromecast started up okay.
$ mkchromecast --name janos-chromecast --control --notifications --video --screencast
Mkchromecast v0.3.8
Starting Video Cast Process...
PID of main process: 11928
PID of streaming process: 11932
* Running on http://0.0.0.0:5000/ (Press CTRL+C to quit)
Information about janos-chromecast
DeviceStatus(friendly_name=u'janos-chromecast', model_name=u'Chromecast', manufacturer=u'Google Inc.', api_version=(1, 0), uuid=UUID('8532def4-d073-c6b4-1d93-6fb2526ebd72'), cast_type='cast')
Status of device janos-chromecast
CastStatus(is_active_input=False, is_stand_by=True, volume_level=1.0, volume_muted=False, app_id=u'E8C28D3C', display_name=u'Backdrop', namespaces=[u'urn:x-cast:com.google.cast.debugoverlay', u'urn:x-cast:com.google.cast.cac', u'urn:x-cast:com.google.cast.sse'], session_id=u'4b4c27ce-f743-4dd8-bb7b-9c3f356b93ac', transport_id=u'4b4c27ce-f743-4dd8-bb7b-9c3f356b93ac', status_text=u'')
The IP of janos-chromecast is: X.Y.Z.56
Your local IP is: X.Y.Z.13
The media type string used is: video/mp4
Cast media controller status
CastStatus(is_active_input=False, is_stand_by=True, volume_level=1.0, volume_muted=False, app_id=u'CC1AD845', display_name=u'Default Media Receiver', namespaces=[u'urn:x-cast:com.google.cast.debugoverlay', u'urn:x-cast:com.google.cast.broadcast', u'urn:x-cast:com.google.cast.media'], session_id=u'771a3eb4-8cf0-47aa-bd1f-93db97e2715f', transport_id=u'771a3eb4-8cf0-47aa-bd1f-93db97e2715f', status_text=u'Ready To Cast')
Controls:
=========
Volume Up: u
Volume Down: d
Pause Casting: p
Resume Casting: r
Quit the Application: q or Ctrl-C
ffmpeg version 3.2.4-1build2 Copyright (c) 2000-2017 the FFmpeg developers
built with gcc 6.3.0 (Ubuntu 6.3.0-8ubuntu1) 20170221
configuration: --prefix=/usr --extra-version=1build2 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --enable-gpl --disable-stripping --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libebur128 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmp3lame --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libopencv --enable-libx264 --enable-shared
libavutil 55. 34.101 / 55. 34.101
libavcodec 57. 64.101 / 57. 64.101
libavformat 57. 56.101 / 57. 56.101
libavdevice 57. 1.100 / 57. 1.100
libavfilter 6. 65.100 / 6. 65.100
libavresample 3. 1. 0 / 3. 1. 0
libswscale 4. 2.100 / 4. 2.100
libswresample 2. 3.100 / 2. 3.100
libpostproc 54. 1.100 / 54. 1.100
[x11grab @ 0x55e84ea5d100] Stream #0: not enough frames to estimate rate; consider increasing probesize
Input #0, x11grab, from ':0.0+0,0':
Duration: N/A, start: 1504549022.881362, bitrate: N/A
Stream #0:0: Video: rawvideo (BGR[0] / 0x524742), bgr0, 1920x1080, 25 fps, 1000k tbr, 1000k tbn, 1000k tbc
[libx264 @ 0x55e84ea66400] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 AVX2 LZCNT BMI2
[libx264 @ 0x55e84ea66400] profile Constrained Baseline, level 4.0
[libx264 @ 0x55e84ea66400] 264 - core 148 r2748 97eaef2 - H.264/MPEG-4 AVC codec - Copyleft 2003-2016 - http://www.videolan.org/x264.html - options: cabac=0 ref=1 deblock=0:0:0 analyse=0:0 me=dia subme=0 psy=1 psy_rd=1.00:0.00 mixed_ref=0 me_range=16 chroma_me=1 trellis=0 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=0 threads=4 lookahead_threads=4 sliced_threads=1 slices=4 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=0 keyint=60 keyint_min=6 scenecut=0 intra_refresh=0 rc_lookahead=0 rc=crf mbtree=0 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 vbv_maxrate=10000 vbv_bufsize=20000 crf_max=0.0 nal_hrd=none filler=0 ip_ratio=1.40 aq=0
Output #0, mp4, to 'pipe:1':
Metadata:
encoder : Lavf57.56.101
Stream #0:0: Video: h264 (libx264) ([33][0][0][0] / 0x0021), yuv420p, 1920x1080, q=-1--1, 25 fps, 12800 tbn, 25 tbc
Metadata:
encoder : Lavc57.64.101 libx264
Side data:
cpb: bitrate max/min/avg: 10000000/0/0 buffer size: 20000000 vbv_delay: -1
Stream mapping:
Stream #0:0 -> #0:0 (rawvideo (native) -> h264 (libx264))
Press [q] to stop, [?] for help
X.Y.Z.56 - - [04/Sep/2017 20:17:02] "GET /stream HTTP/1.1" 200 -
Cleaning up /tmp/...q=12.0 size= 33485kB time=00:02:32.24 bitrate=1801.8kbits/s speed= 1x
[Done]
Streaming a video copy of my desktop now works! Thank you for your help so far. There is something else: due to the overhead of encoding and streaming, there is about a 5-7 second delay relative to real time. Is there some way to decrease this delay, for example by choosing a less CPU-intensive video codec? My laptop is pretty new, with a Core i5-7440HQ CPU and 32 GB of RAM.
Furthermore, do you think it would be possible to add a way of telling mkchromecast which display (like an active xrandr output) you wish to cast to the Chromecast?
Thank you!
There is something else: due to the overhead of encoding and streaming, there is about a 5-7 second delay relative to real time.
Yes, that is, unfortunately, an issue right now :(. More work has to be done to decrease the delay.
Furthermore, do you think it would be possible to add a way of telling mkchromecast which display (like an active xrandr output) you wish to cast to the Chromecast?
That sounds interesting, but I have no clue how it could be possible to capture only one display. As long as ffmpeg supports it for its inputs, it is very likely we could do something about it.
Well, after a minimal amount of digging, it seems that the only viable solution is to use offsets to capture a specific monitor/display. On my multi-monitor setup, both monitors belong to X display 0 and X screen 0 (:0.0 for short), so there seems to be no way to address the individual monitors by themselves, only by adding offsets within the X screen and using video_size.
For more information, please check: https://trac.ffmpeg.org/wiki/Capture/Desktop
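As a rough sketch of that offset approach (the output names and geometries below are only examples, not from my actual setup): xrandr prints each connected output's geometry as WIDTHxHEIGHT+X+Y, and those numbers map directly onto x11grab's -video_size and the +X,Y part of the input, so something like this should grab only the second monitor:
$ xrandr | grep ' connected'
eDP-1 connected primary 1920x1080+0+0 (normal left inverted right x axis y axis) 344mm x 194mm
HDMI-1 connected 1920x1080+1920+0 (normal left inverted right x axis y axis) 527mm x 296mm
$ ffmpeg -f x11grab -framerate 25 -video_size 1920x1080 -i :0.0+1920,0 -f matroska -
Here +1920,0 starts the capture where the HDMI output begins inside the combined X screen.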
You're going to need some duct tape, some old VGA or HDMI cables, a pocket knife, and some 180 ohm resistors. It is very simple to capture one display when it does not exist :+1: Display 1 and 2, and cast to 3 like me. ;)
:0 and :0.1 are your displays, and :0.2 is the display to cast to:
export DISPLAY=:0.2
mkchromecast --name janos-chromecast --control --notifications --video --screencast
And you're done :)
Then cast it by using a dummy VGA plug :) http://blog.zorinaq.com/the-5-second-vga-dummy-plug/ Here is the info for the X config: https://wiki.archlinux.org/index.php/multihead
I was saying that all my attached monitors are combined into a single :0.0; they do not show up as separate :0.1 or :0.2, unfortunately. At least that's what xrandr tells me.
I switched to a single screen, and this is what I use to grab just the third screen. Since I have an NVIDIA card, I can send it a fake EDID and make it believe that a monitor is connected :p
ffmpeg -f x11grab -framerate 30 -s 1366x768 -i :0.0+3286,0 -f matroska -
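For reference, my guess at the arithmetic: the +3286,0 part is simply the horizontal pixel offset at which that third screen starts inside the combined X screen, e.g. if the first two screens were 1920 and 1366 pixels wide, 1920 + 1366 = 3286.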
I'm in Linux using --video --screencast and it works, but I don't hear any sound and I don't see any sink in pulseaudio when casting video. Why?
I'm in Linux using --video --screencast and it works, but I don't hear any sound and I don't see any sink in pulseaudio when casting video. Why?
@edulix The ffmpeg command is not capturing audio. When doing a screencast, this is what is happening:
python3 mkchromecast.py --video --command 'ffmpeg -f x11grab -r 25 -s 1920x1080 -i :0.0+0,0 -vcodec libx264 -preset ultrafast -tune zerolatency -maxrate 10000k -bufsize 20000k -pix_fmt yuv420p -g 60 -f mp4 -max_muxing_queue_size 9999 -movflags frag_keyframe+empty_moov pipe:1' -s
I added a new --command flag so that you can use whichever ffmpeg command you like. Now, according to https://trac.ffmpeg.org/wiki/Capture/Desktop, you need to add either -f alsa -ac 2 -i hw:0 or -f pulse -ac 2 -i default to the ffmpeg command, for ALSA and PulseAudio respectively.
python3 mkchromecast.py --video --command 'ffmpeg -f pulse -ac 2 -i default -f x11grab -r 25 -s 1920x1080 -i :0.0+0,0 -vcodec libx264 -preset ultrafast -tune zerolatency -maxrate 10000k -bufsize 20000k -pix_fmt yuv420p -g 60 -f mp4 -max_muxing_queue_size 9999 -movflags frag_keyframe+empty_moov pipe:1' -s
This worked for me with pulse.
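In case anyone prefers ALSA, the equivalent command should presumably just swap the audio input flags mentioned above (untested on my side):
python3 mkchromecast.py --video --command 'ffmpeg -f alsa -ac 2 -i hw:0 -f x11grab -r 25 -s 1920x1080 -i :0.0+0,0 -vcodec libx264 -preset ultrafast -tune zerolatency -maxrate 10000k -bufsize 20000k -pix_fmt yuv420p -g 60 -f mp4 -max_muxing_queue_size 9999 -movflags frag_keyframe+empty_moov pipe:1' -s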
A simple mkchromecast --video --screencast fails... but
python3 mkchromecast.py --video --command 'ffmpeg -f pulse -ac 2 -i default -f x11grab -r 25 -s 1920x1080 -i :0.0+0,0 -vcodec libx264 -preset ultrafast -tune zerolatency -maxrate 10000k -bufsize 20000k -pix_fmt yuv420p -g 60 -f mp4 -max_muxing_queue_size 9999 -movflags frag_keyframe+empty_moov pipe:1'
works for me, thanks a lot!
Why are all those parameters necessary?
FYI: I discovered that the Google Chrome browser can cast the full desktop on Linux (and not only a tab) with lower latency (mkchromecast: 8 seconds, Chrome: 0.5 seconds).
If some of you are interested, please check out this https://github.com/muammar/mkchromecast/commit/b6d0b3be4ba1cc54d936a591a5181ad9c7093ac6.
Hello Muammar,
I'm using (K)Ubuntu 17.04 (with the KDE desktop, no GNOME) with mkchromecast 0.3.7 and a single Chromecast 2. I use my desktop environment with two displays, but during my tests I tried everything both with the external display connected and without. (I use a laptop.) I have tried streaming a YouTube video with mkchromecast, and it worked great. My main goal is streaming my desktop to the Chromecast device, but I have always failed to do so.
When I start mkchromecast in debug mode, I get the following:
When this happens, the screen of the HDMI monitor displays "Mkchromecast v0.3.7" with a progress bar almost at 100%.
My other attempt was to use mkchromecast in tray mode. There, the tray icon showed up okay, and on right click the usual menu appeared, with "Stop Casting" and "Search For Google Cast Devices". When I selected "Search for..." the tray icon disappeared, and the list was shown in the console, with no way of selecting anything. The debug output is below:
If this is not a bug, could you give some hints how to make it work? Thank you!
Regards, János