Sorry for spamming you with PRs :smile: I'll leave it up to you if you want to implement this or not.
This allows the user to provide a custom pre-processor for each frame before it is sent off to `Image` for rendering. It takes a class instead of a simple function typedef so that stateful pre-processors can be implemented. It also allows the implementation to drop frames entirely.
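For illustration, the contract is small: subclass `MjpegPreprocessor`, override `process`, and return either the (possibly modified) frame bytes or `null` to drop the frame. A minimal sketch of a stateful pre-processor that halves the frame rate might look like this (the class name is made up for illustration):

```dart
import 'package:flutter_mjpeg/flutter_mjpeg.dart';

/// Hypothetical example: keeps every second frame and drops the rest.
class HalfRateMjpegPreprocessor extends MjpegPreprocessor {
  int _count = 0;

  @override
  List<int>? process(List<int> frame) {
    // Returning null drops the frame; returning the byte list passes it on.
    final keep = _count.isEven;
    _count++;
    return keep ? frame : null;
  }
}
```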
We added this because, with the current implementation of `flutter_mjpeg`, iOS 16 simply crashes when the URL of a 2k@12fps or 4k@10fps stream is provided. (Yes, these are not really use cases for MJPEG, but people do try...) This allowed us to build a custom fps-limiting pre-processor on top of this library to prevent these crashes.
I don't want to upstream that code, as it is very specific, so I opted for this generic implementation instead. The code I actually wrote on top of it looks like this (as an example):
```dart
import 'package:flutter_mjpeg/flutter_mjpeg.dart';

class LimitImpactMjpegPreprocessor extends MjpegPreprocessor {
  /// When reading the JPEG header, sizes from 0 up to this value are considered valid.
  static const int _frameSizeLimit = 16384; // px

  /// Limit of frames per second depending on resolution.
  /// Stored as resolution pixel product : 1000ms / max fps for faster access.
  /// E.g. streams with a size over 320x240 are displayed with 60fps max.
  final Map<int, int> _fpsLimits = {
    320 * 240: (1000 / 60).floor(), // QVGA
    640 * 480: (1000 / 40).floor(), // VGA
    720 * 640: (1000 / 20).floor(), // HD
    1080 * 720: (1000 / 12).floor(), // Full HD
    2560 * 1440: (1000 / 2).floor(), // 2k
    3840 * 2160: (1000 / 1).floor(), // 4k
  };

  bool _frameMeasured = false;
  int _frameWidth = 0;
  int _frameHeight = 0;
  int _frameDelayMin = 0;
  final Stopwatch _fwatch = Stopwatch();

  @override
  List<int>? process(List<int> frame) {
    // If this is the first frame, attempt to decode its dimensions
    // and derive the matching minimum frame delay.
    if (!_frameMeasured) {
      _readImageDimensions(frame);
      _applyFpsLimit();
      _frameMeasured = true;
      if (_frameDelayMin != 0) _fwatch.start();
    }

    // If a frame limiter is in place, measure the time between frames
    // and drop the frame (return null) if it arrives too early.
    if (_frameDelayMin != 0) {
      if (_fwatch.elapsedMilliseconds > _frameDelayMin) {
        _fwatch.reset();
        return frame;
      } else {
        return null;
      }
    }
    return frame;
  }

  /// Scans the JPEG buffer for the SOF0 marker (0xFF 0xC0) and reads the
  /// frame height and width from the header.
  void _readImageDimensions(List<int> frame) {
    for (var i = 0; i < frame.length - 10; i++) {
      if (frame[i] == 0xFF && frame[i + 1] == 0xC0) {
        final height = (frame[i + 5] << 8) + frame[i + 6];
        final width = (frame[i + 7] << 8) + frame[i + 8];
        if (height > 0 && height < _frameSizeLimit) _frameHeight = height;
        if (width > 0 && width < _frameSizeLimit) _frameWidth = width;
        return;
      }
    }
  }

  /// Picks the minimum frame delay for the largest resolution threshold
  /// that the measured frame size reaches.
  void _applyFpsLimit() {
    final pixelProduct = _frameWidth * _frameHeight;
    for (final limitProduct in _fpsLimits.keys) {
      if (pixelProduct >= limitProduct) {
        _frameDelayMin = _fpsLimits[limitProduct] ?? 0;
      }
    }
  }
}
```
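Hooking a pre-processor into the widget is then a single constructor argument. The sketch below assumes the new parameter ends up being called `preprocessor` and uses a placeholder stream URL:

```dart
import 'package:flutter/widgets.dart';
import 'package:flutter_mjpeg/flutter_mjpeg.dart';

Widget buildCameraStream() {
  return Mjpeg(
    isLive: true,
    stream: 'http://camera.local/stream.mjpeg', // placeholder URL
    preprocessor: LimitImpactMjpegPreprocessor(), // assumed parameter name
  );
}
```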