ar1st0crat / NWaves

.NET DSP library with a lot of audio processing functions
MIT License
453 stars 71 forks

WaveFile class closes the underlying audio stream #32

Closed cldeba closed 3 years ago

cldeba commented 3 years ago

Hello!

I don't know if this behavior is intentional, but to me it does not seem entirely logical. When I read an audio stream using the WaveFile class, the stream is closed automatically afterwards by the underlying BinaryReader. Thus, when I instantiate another WaveFile object from the same Stream instance, I get this exception: "Cannot access a closed stream".

As a hint: the BinaryReader class supports leaving the stream open (see https://docs.microsoft.com/en-us/dotnet/api/system.io.binaryreader.-ctor?view=netcore-3.1#System_IO_BinaryReader__ctor_System_IO_Stream_System_Text_Encoding_System_Boolean_).
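For illustration, a minimal sketch of that constructor in use (the file name and header read here are only examples):

```csharp
using System.IO;
using System.Text;

// Read the RIFF chunk ID but leave the stream open for further use.
using (var stream = new FileStream("1.wav", FileMode.Open))
{
    using (var reader = new BinaryReader(stream, Encoding.UTF8, leaveOpen: true))
    {
        var chunkId = new string(reader.ReadChars(4));   // "RIFF"
    }

    // Thanks to leaveOpen: true, the stream is still usable here,
    // so it can be rewound and read again.
    stream.Seek(0, SeekOrigin.Begin);
}
```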

Kind regards from Austria! Clemens

P.S.: I am a big fan of your work! I really like your coding style as it makes it very easy to debug.

ar1st0crat commented 3 years ago

Hi!

Yes, you definitely have a point! I simply didn't focus much on loading/saving data, and I didn't consider any scenario other than "open one file -> load data -> close file -> forget about it :-)". I'll take your remark into account, and in the next version the stream won't be closed in WaveFile methods. Moreover, I think I'm going to add another overloaded constructor that will simply accept a plain array of bytes as the parameter.
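Purely as a sketch of how the proposed byte-array overload might be used (the exact signature isn't final), alongside a workaround that already works today by wrapping the bytes in a MemoryStream:

```csharp
using System.IO;
using NWaves.Audio;

byte[] bytes = File.ReadAllBytes("1.wav");

// Hypothetical usage of the proposed overload:
// var waveFile = new WaveFile(bytes);

// Equivalent workaround available right now:
using (var stream = new MemoryStream(bytes))
{
    var waveFile = new WaveFile(stream);
}
```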

Thanks for your kind words! ))

Regards, Tim

ar1st0crat commented 3 years ago

Hello again, I've committed a fix.

A couple of notes:

While I agree that there are reasons why the underlying stream should not be closed in the WaveFile constructor, I don't think your case is one of them (I mean, the way you use the class). In fact, WaveFile is not intended to be a "wrapper around the stream" or to acquire any resource (it doesn't implement the IDisposable interface, for example). It's more like a "constructor of signals in memory based on data from the stream", and its lifetime is not synchronized with the stream's.

Once you've constructed these signals from a wav file (i.e. filled the WaveFile object with data), you don't need to worry about the underlying data stream anymore; simply work with the signals in memory.

Thus, there's no need to do this (instantiate more than one WaveFile object from the same stream, if I understood you correctly):

DiscreteSignal s1, s2;

using (var stream = new FileStream("1.wav", FileMode.Open))
{
    var waveFile = new WaveFile(stream);
    s1 = waveFile[0];

    // ...

    stream.Seek(0, SeekOrigin.Begin);

    var waveFile2 = new WaveFile(stream);
    s2 = waveFile2[0];
}

The code above works OK now; however, I'd do this instead:

WaveFile waveFile;

using (var stream = new FileStream("1.wav", FileMode.Open))
{
    waveFile = new WaveFile(stream);
}

DiscreteSignal s1 = waveFile[0];
DiscreteSignal s2 = s1.Copy();      // get the copy *before* any processing

// process s1 and s2
// ...

Regards, Tim