Open ollyde opened 4 years ago
Why is startImageStream insufficient? Is it just the audio that is missing? Why would it be faster?
Yes, `startImageStream` was actually used at Flutter Live (https://youtu.be/OAEWySye0BQ?t=1463) but doesn't exist in the source code: https://github.com/flutter/plugins/blob/master/packages/camera/lib/camera.dart

Perhaps it was just renamed to `startCameraStream`? It returns a `CameraImage`:

```dart
/// A single complete image buffer from the platform camera.
///
/// This class allows for direct application access to the pixel data of an
/// Image through one or more [Uint8List]. Each buffer is encapsulated in a
/// [Plane] that describes the layout of the pixel data in that plane. The
/// [CameraImage] is not directly usable as a UI resource.
///
/// Although not all image formats are planar on iOS, we treat 1-dimensional
/// images as single planar images.
```
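For reference, a minimal sketch of consuming `startImageStream` and flattening a `CameraImage` into one byte buffer (the helper function and the hand-off step are assumptions for illustration; the plugin's API surface may differ between versions):

```dart
import 'dart:typed_data';
import 'package:camera/camera.dart';

/// Sketch: start the image stream and flatten each frame's planes
/// into a single contiguous Uint8List (e.g. to hand to an encoder).
Future<void> streamFrames(CameraController controller) async {
  await controller.startImageStream((CameraImage image) {
    // Concatenate the bytes of every plane (YUV planes on Android,
    // typically a single BGRA plane on iOS).
    final List<int> all = <int>[];
    for (final Plane plane in image.planes) {
      all.addAll(plane.bytes);
    }
    final Uint8List frameBytes = Uint8List.fromList(all);
    // frameBytes now holds this frame's raw pixel data — but note
    // there is still no audio, which is the gap this issue is about.
  });
}
```

This also illustrates why `startImageStream` alone isn't enough for RTMP: you get raw, per-frame pixel data with no audio track and no timestamps suitable for muxing.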
Any updates on this?
@safdar99 we had to create our own live stream libraries and camera library for devices.
do you have a sample?
I can’t give that code out, unfortunately. But there are some great packages you can base yours on, like flutter rtmp.
Any updates on this? Or if anyone has any leads on how to implement I'd be happy to try and help out here.
GStreamer? But then how would GStreamer talk to Skia to draw textures?
`startImageStream` was added to the Flutter camera plugin, but it isn't sufficient. We really need a `startByteStream` that also includes audio and is much faster. Many of us are trying to create live streams from our apps, and at the moment it's impossible.

`startVideoRecording` locks the file; as soon as, for example, FlutterFFmpeg tries to read the file for live RTMP streaming, it stops the camera.
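To make the request concrete, here is a hypothetical sketch of what a `startByteStream` API could look like. Everything below — the method names, the packet type, the callback shape — is invented for illustration and does not exist in the camera plugin:

```dart
import 'dart:typed_data';

/// Hypothetical: a muxed audio+video byte stream, which is what RTMP
/// publishing needs and what startImageStream cannot provide today.
enum SamplePacketType { videoFrame, audioSamples }

class SamplePacket {
  const SamplePacket(this.type, this.bytes, this.presentationTimeUs);
  final SamplePacketType type;
  final Uint8List bytes;        // raw or encoded media bytes
  final int presentationTimeUs; // timestamp for A/V sync
}

/// Hypothetical extension of the plugin's camera controller: video
/// frames and audio samples arrive interleaved on one callback,
/// already timestamped, so they could be fed straight to an RTMP
/// muxer without ever touching a locked file on disk.
abstract class ByteStreamCamera {
  Future<void> startByteStream(void Function(SamplePacket packet) onPacket);
  Future<void> stopByteStream();
}
```

The key design point is the shared timestamp: without interleaved, timestamped audio and video from the platform side, apps are forced into the file-based `startVideoRecording` path that this issue describes as unusable for live streaming.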