Open beeblebrox opened 10 years ago
I thought a bit about that, but it's really hard to implement. I understand the request and I can imagine situations where this is really helpful. The problem is that the DataSource interface itself is kind of static, or at least gives the impression that the underlying thing is not changing. So all classes using DataSource would have to use it in a way that can handle an ever-changing size (such as a socket receiving an H.264 stream). Still thinking ...
I agree with everything you said. Other than the streaming limitation, mp4parser is so far a great fit for our use cases. If writing the raw file to disk becomes an issue, I'm hopeful I'll be given time to implement a streaming track implementation. A nice middle ground would be if the H.264 parsing mechanics and helpers were split out of the existing TrackImpl and accepted smaller, static byte ranges or buffers. Of course, that may not be an easy task either. If this were made available, it should be fairly trivial for me to write a streaming track impl; although not zero-copy, it would still only need to keep chunks of the data in memory.
Soon, I'll also be looking at this for the audio tracks.
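To make the "keep only chunks of the data in memory" idea concrete, here is a minimal sketch in plain Java. All names are illustrative - none of this is mp4parser API - it just shows a bounded window that retains the most recent chunks of a stream and evicts older ones:

```java
import java.util.ArrayDeque;

// Illustrative sketch (not part of mp4parser): a bounded window of the
// most recently streamed chunks. Older chunks are evicted so memory use
// stays proportional to maxChunks rather than to the whole stream.
class ChunkWindow {
    private final ArrayDeque<byte[]> chunks = new ArrayDeque<>();
    private final int maxChunks;
    private long bytesDropped = 0; // bytes evicted from the front

    ChunkWindow(int maxChunks) {
        this.maxChunks = maxChunks;
    }

    void append(byte[] chunk) {
        chunks.addLast(chunk);
        while (chunks.size() > maxChunks) {
            bytesDropped += chunks.removeFirst().length;
        }
    }

    // Number of bytes currently resident in memory.
    long residentBytes() {
        long total = 0;
        for (byte[] c : chunks) total += c.length;
        return total;
    }

    long bytesDropped() {
        return bytesDropped;
    }
}
```

A real streaming track impl would additionally need to track which absolute stream offsets the window covers, but the memory-bounding principle is the same.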
Hi sannies,
I'm getting a NullPointerException at
H264TrackImpl h264Track = new H264TrackImpl(new FileDataSourceImpl(android.os.Environment.getExternalStorageDirectory().getAbsolutePath() + "/test.h264"));
but the file is there on the device, so what is the problem?
Thanks in advance.
Please include the stack trace; your error description is about as clear as 'my car doesn't work'.
This is the log output:
01-24 19:26:52.438: D/dalvikvm(25792): Late-enabling CheckJNI
01-24 19:26:52.508: E/path is(25792): <><>/storage/emulated/0/test.h264
01-24 19:26:52.518: D/AndroidRuntime(25792): Shutting down VM
01-24 19:26:52.518: W/dalvikvm(25792): threadid=1: thread exiting with uncaught exception (group=0x41554ba8)
01-24 19:26:52.518: E/AndroidRuntime(25792): FATAL EXCEPTION: main
01-24 19:26:52.518: E/AndroidRuntime(25792): Process: com.example.decodingh264tomp4, PID: 25792
01-24 19:26:52.518: E/AndroidRuntime(25792): java.lang.RuntimeException: Unable to start activity ComponentInfo{com.example.decodingh264tomp4/com.example.decodingh264tomp4.MainActivity}: java.lang.NullPointerException
01-24 19:26:52.518: E/AndroidRuntime(25792): at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2195)
01-24 19:26:52.518: E/AndroidRuntime(25792): at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2245)
01-24 19:26:52.518: E/AndroidRuntime(25792): at android.app.ActivityThread.access$800(ActivityThread.java:135)
01-24 19:26:52.518: E/AndroidRuntime(25792): at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1196)
01-24 19:26:52.518: E/AndroidRuntime(25792): at android.os.Handler.dispatchMessage(Handler.java:102)
01-24 19:26:52.518: E/AndroidRuntime(25792): at android.os.Looper.loop(Looper.java:136)
01-24 19:26:52.518: E/AndroidRuntime(25792): at android.app.ActivityThread.main(ActivityThread.java:5017)
01-24 19:26:52.518: E/AndroidRuntime(25792): at java.lang.reflect.Method.invokeNative(Native Method)
01-24 19:26:52.518: E/AndroidRuntime(25792): at java.lang.reflect.Method.invoke(Method.java:515)
01-24 19:26:52.518: E/AndroidRuntime(25792): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:779)
01-24 19:26:52.518: E/AndroidRuntime(25792): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:595)
01-24 19:26:52.518: E/AndroidRuntime(25792): at dalvik.system.NativeStart.main(Native Method)
01-24 19:26:52.518: E/AndroidRuntime(25792): Caused by: java.lang.NullPointerException
01-24 19:26:52.518: E/AndroidRuntime(25792): at com.googlecode.mp4parser.authoring.tracks.H264TrackImpl$SliceHeader.
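For what it's worth, an NPE inside H264TrackImpl$SliceHeader often suggests the parser hit bytes it didn't expect, rather than a missing file. A hedged, illustrative sanity check (AnnexBCheck is a hypothetical helper, not part of mp4parser) is to verify the input really starts with an Annex-B start code (00 00 01 or 00 00 00 01):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;

// Hypothetical helper: a raw Annex-B H.264 stream normally begins with a
// start code 00 00 01 or 00 00 00 01. A file that exists but is actually,
// say, an MP4 (box size + 'ftyp') can still fail inside H264TrackImpl.
class AnnexBCheck {
    static boolean looksLikeAnnexB(Path file) throws IOException {
        try (InputStream in = Files.newInputStream(file)) {
            byte[] head = new byte[4];
            int n = in.read(head);
            if (n < 3) return false;
            boolean threeByte = head[0] == 0 && head[1] == 0 && head[2] == 1;
            boolean fourByte = n == 4 && head[0] == 0 && head[1] == 0
                    && head[2] == 0 && head[3] == 1;
            return threeByte || fourByte;
        }
    }
}
```

If this returns false for /storage/emulated/0/test.h264, the file is probably not a raw H.264 elementary stream.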
Hi sannies,
is that log output clear enough?
I'd like to add a "me too" for this.
We're currently using a heavily customised version of a (very!!) old release of the IsoParser, and are trying to upgrade to the latest version. However, our use case means that we need to rewrite MP4 metadata "on the fly" - preferably while it's travelling across the network as an InputStream and has not yet been written to disk.
Our customised version reads all header boxes up to the mdat box into memory and leaves the rest "on the wire". It then modifies the header and streams the mdat box afterwards; the original IsoFile class made this approach not too difficult. However, not only has the new IsoParser replaced InputStream with a DataSource, but this DataSource must now be seekable! A network stream is obviously not seekable.
I think we've come up with some kind of "file-backed cache" approach here, but it feels as if we're fighting a losing battle to bridge what we need the IsoParser to do and where the IsoParser is currently heading. Does the IsoParser need to use a seekable data source? Could there be a way of using it that doesn't require the source to be seekable, please?
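For reference, the "file-backed cache" approach can be sketched in plain java.nio: spool the network stream to a temporary file, then open a seekable channel over it, which a FileDataSourceImpl-style wrapper could use. StreamSpooler is a hypothetical name; this trades disk space and up-front latency for seekability:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.nio.file.StandardOpenOption;

// Hypothetical sketch of a file-backed cache: the whole stream is copied
// to a temp file first, so random access (seeking) becomes possible.
class StreamSpooler {
    static FileChannel spoolToSeekableChannel(InputStream network) throws IOException {
        Path tmp = Files.createTempFile("mp4-cache-", ".tmp");
        tmp.toFile().deleteOnExit();
        // Blocks until the stream is fully drained to disk.
        Files.copy(network, tmp, StandardCopyOption.REPLACE_EXISTING);
        return FileChannel.open(tmp, StandardOpenOption.READ);
    }
}
```

The obvious downside, and the reason it feels like a losing battle, is that nothing can be parsed until the entire stream has arrived.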
Hi, the traditional MP4 format is heavily bound to files and was never meant to be streamed in any way. You can easily see that if you look at the Chunk Offset Box: it contains offsets from the beginning of a file, which more or less requires keeping the file accessible. I moved the API strongly in the direction of file-based I/O with traditional MP4 in mind.
The picture changed with fragmented MP4 files, as they only require a small piece of the whole file to be read to make sense, and streaming use cases are now more prominent. Unfortunately this recent movement towards fragmented MP4s - which can, in a sense, be streamed - is not reflected in the API, which still sticks to the underlying file model. I'm trying to move it in that direction with the package com/mp4parser/streaming, which will allow connecting some data source with a data sink, but things take time, especially if the project doesn't fund you - 90% of the big restructuring/modernizing tasks are funded solely by my personal time, and this slows things down as I have to pay rent.
Stay tuned. I'm still working on it. (You reminded me and I'll work on it today.)
Best Regards, Sebastian
I'd like to be able to implement a DataSource that can be passed to the track impls and that may have data-pending semantics. Currently DataSource, and its use in the track impls, requires the complete size of the data to be known in advance. My use case is that I am streaming the data from a source. In that case the size is unknown to me, so the current DataSource semantics require me to stream all the data from the source before processing it.
Am I missing something that should allow this?
If not, I propose a new constructor on the track impls that would take an NIO byte channel. Alternatively, a new ByteBufferDataSource interface type could be used to construct these impls, gracefully handling partial data until the ByteBufferDataSourceImpl indicates EOF by some other means. That construction may not make sense for all track types, but it would for, e.g., the H264 track.
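A sketch of what such an interface might look like, with a trivial in-memory implementation for illustration. All names here are hypothetical - this is the proposal, not an existing mp4parser API:

```java
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.ArrayDeque;
import java.util.Queue;

// Hypothetical interface: a data source whose total size is unknown
// until the producer signals end-of-stream by some other means.
interface ByteBufferDataSource {
    // Next buffer of available data, or null if nothing is buffered yet.
    ByteBuffer nextChunk() throws IOException;

    // True once EOF has been signalled and all buffered data is consumed.
    boolean isEof();
}

// Trivial in-memory implementation, used here only for illustration.
class QueueDataSource implements ByteBufferDataSource {
    private final Queue<ByteBuffer> pending = new ArrayDeque<>();
    private boolean eof = false;

    void offer(ByteBuffer chunk) { pending.add(chunk); } // producer side
    void signalEof() { eof = true; }                     // producer side

    @Override public ByteBuffer nextChunk() { return pending.poll(); }
    @Override public boolean isEof() { return eof && pending.isEmpty(); }
}
```

A consumer would keep calling nextChunk() (treating null as "data pending, try again") until isEof() returns true, so it never needs to know the total size up front.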