Closed huguesdevimeux closed 3 years ago
I prefer using `manim.play` over `SpecialScene.play`.
And I would also love for this to open the doors for twitch streaming, but I'm not sure how that would work exactly.
> I prefer using `manim.play` over `SpecialScene.play`.
Of course, me too. It was just a demonstration.
And concerning the Twitch live feature, it can be done with ffmpeg: https://stackoverflow.com/a/54273877 (as an example).
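For concreteness, a rough sketch of the approach from the linked answer: raw frames are piped into ffmpeg on stdin, encoded, and pushed to Twitch over RTMP. The resolution, framerate, and `<stream_key>` placeholder below are illustrative assumptions, not actual manim settings:

```shell
# Sketch only: encode raw RGBA frames from stdin and push them to Twitch.
# <stream_key> is a placeholder for your own Twitch stream key.
ffmpeg -f rawvideo -pixel_format rgba -video_size 1920x1080 -framerate 30 -i - \
       -c:v libx264 -preset veryfast -pix_fmt yuv420p \
       -f flv rtmp://live.twitch.tv/app/<stream_key>
```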
I would love it if I could run this as a transparent overlay on my webcam video (perhaps via a v4l2loopback fake webcam). Then I could be having jitsi, zoom, skype, etc. calls and then just randomly start generating formulae and other mobjects around my face :)
It will be possible with FFMPEG as well, like with normal video generation.
I think it might be possible to actually implement that with OpenCV as well, by rerouting the movie pipe and stuff
> It will be possible with FFMPEG as well, like with normal video generation.
Right, you'd use ffmpeg to merge the two videos (webcam and manim), but then you'd still have to pipe that through a video loopback device so that it looks just like another webcam for all the chat programs (like jitsi or zoom) to pick up. Otherwise, how do you tell your chat program to use that video stream instead of the webcam?
A few years back I hacked together something like this to do screensharing in qubesos (capture in dom0 with ffmpeg -> rpc pipe into AppVM -> use ffmpeg to convert video for v4l2loopback -> select loopback webcam in your chat program). ... let's just say it was a 'proof of concept' :D
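The merge-and-loopback step described above could be sketched as follows, assuming Linux with the v4l2loopback module loaded; the device numbers and overlay filename are placeholders, not anything manim produces today:

```shell
# Create a fake webcam at /dev/video9 (placeholder device number):
#   sudo modprobe v4l2loopback video_nr=9
# Overlay a (transparent) manim render on the real webcam feed and write the
# result to the loopback device, which chat programs then see as a webcam:
ffmpeg -f v4l2 -i /dev/video0 -i manim_overlay.mov \
       -filter_complex overlay \
       -pix_fmt yuv420p -f v4l2 /dev/video9
```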
I'm going to temporarily close this, as I'm not going to work on it for a while.
If someone is interested in it, feel free to re-open.
I have successfully implemented this livestreaming my dudes.
I don't know exactly how pull requests work or how I can share my own development of this, so I'm willing to spew words if anyone's interested.
@NeoPlato I can't wait to see what you did!
> I have successfully implemented this livestreaming my dudes. I don't know exactly how pull requests work or how I can share my own development of this, so I'm willing to spew words if anyone's interested.
Hello! That sounds like really good news!
In order to have a meaningful discussion about your implementation, we will need to actually see your changes. The easiest way of doing this is pushing a branch containing your implementation to a fork of the ManimCommunity/manim repository:
- Visit https://github.com/ManimCommunity/manim and hit "Fork" to create your own copy of the repository
- Use git to push your changes to your copy,
- and then open a corresponding pull request.
I don't know your background when it comes to git; if you are having troubles with any of these steps please let us know.
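The three steps above can be sketched on the command line. Everything below runs locally in a throwaway directory; the branch name, file, and commit message are made up for illustration, and the actual push (which needs your fork's URL and credentials) is left as a comment:

```shell
# Work in a throwaway directory so this demo touches nothing real.
cd "$(mktemp -d)"

# In practice, clone your fork instead of initialising a fresh repository:
#   git clone https://github.com/<your-username>/manim.git && cd manim
git init -q demo && cd demo
git config user.email "you@example.com"
git config user.name "Your Name"

# Commit your changes on a dedicated feature branch.
git checkout -q -b livestreaming
echo "# livestream prototype" > livestream.py
git add livestream.py
git commit -q -m "Add livestreaming prototype"

git branch --show-current   # prints: livestreaming

# Finally, push the branch to your fork and open the pull request on GitHub:
#   git push -u origin livestreaming
```

The `-u` (`--set-upstream`) flag links the local branch to the remote one, so subsequent pushes are just `git push`.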
Well, I tried on my own. I am way lost.
So I've been editing the files that came with the zip file I downloaded from ManimCommunity. The master branch gave me problems with pango and cairocffi and the like, so I didn't even touch it and moved to a GeometryAdditions one.
Took me three hours to translate the working code from 3b1b manim to ManimCommunity. And I don't even know how the `git push` command is supposed to work. I tried cloning the fork so I could simply paste all the new files there, but it's bothersome as well.
And that was the end of my patience, solo of course.
`fatal: not a git repository (or any of the parent directories): .git`
What do I do?
I'm sorry you are having trouble with this @NeoPlato. Unfortunately, git is far from intuitive and is definitely an (unavoidable) hurdle when contributing to manim.
Can you tell us how you cloned this fork? After cloning it, the directory where you cloned it should be a git repository, and the message you got should not have appeared. Are you sure you are running the commands from within the right directory?
Also, would you be able to tell us what part of this page gave you trouble, if any? This will help us improve our documentation for the future 🙂
I survived 🙂
Okay, it would be nice if the branch part had an explicit example, e.g. `git push -u origin master`. Granted, anyone who's really lost will rush to those instead.
Given that https://github.com/NeoPlato/manim-livestream exists on the one hand, and the interactivity / render-preview features of the OpenGL renderer on the other, this issue can be closed.
This issue is mostly a reminder for myself, and a place to discuss what we want and what we don't.
First of all, livestreaming with manim looks very powerful to my eyes. The gif demonstrating it was taken from a previous PR in 3b1b/manim meant to fix this feature: https://github.com/3b1b/manim/pull/786. Unfortunately, everything about livestreaming got broken at some point and was eventually removed, except the streaming settings, ruins of the old world. Everything was removed in https://github.com/3b1b/manim/pull/985/, so if you want to see how it used to be implemented, go there.
TL;DR: how could it work? Well, it is actually pretty simple, as `ffmpeg` has a lot of livestreaming features.

Proposal:
Here is what I propose to (re)implement, and how.
- `manim --livestreaming` will pop an interactive Python shell dedicated to receiving manim `play`/`wait`/etc. calls. Pretty much what is shown in the gif above.
- With a regular `Scene` object, one has to write a `Scene` definition and a `construct` method in order to write `play` calls, so it is not possible to live-render a single `play` call. The best idea would be to have a `SpecialScene` that would include `ThreeDScene`, `GraphScene`, etc. Indeed, there will be no scene definition in the shell, so the user won't be able to specify which type of scene they want. See stream_starter.py in this PR here to see how it used to work (it was pretty much what I described, except using the old and ugly CONFIG dict). So, it would be possible to interact directly with `SpecialScene` (renamed here to `manimlive`).
- The problem is about variables. Going back in the video stream is fairly easy thanks to scene caching ( ;) ), but if we decide that when one goes back all the variables are reset to their original values, then it won't be so easy. Same for adding an existing scene with all the variables that the new scene will bring. (Note: the last idea is likely impossible, but it is worth trying.)
- Configuration via the `.cfg` file. The idea is to have a `[streaming]` section in the `.cfg`.
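As a sketch of what such a section could look like (only the `[streaming]` section name comes from this proposal; every key below is a hypothetical placeholder, not a real manim option):

```ini
[streaming]
# All keys are illustrative assumptions, not actual manim settings.
streaming_protocol = rtmp
streaming_ip = 127.0.0.1
streaming_port = 1234
```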