Closed kurddt closed 5 years ago
Hi,
While I really appreciate the effort, you're still covering up obvious standards violations. This kind of broken data must not be covered up deep in the stack; whether it should be interpreted as unsigned is a decision for the end user to make.
I'm sorry, but I won't merge this. If supporting >32k samples per trace is really important to you, it can be handled in the Python bit. Covering it up by silently reinterpreting broken data is not going to fly.
If you don't mind me asking, what kind of survey is this with 60k samples?
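For what it's worth, a minimal sketch of what "handling it in the Python bit" could look like: reinterpreting the stored signed 16-bit value as unsigned at read time, as an explicit user-level choice rather than deep in the stack. The helper name `as_unsigned16` is hypothetical, not part of any existing API.

```python
def as_unsigned16(value):
    """Reinterpret a signed 16-bit integer as unsigned (two's complement).

    Hypothetical user-side helper: the stored field stays signed in the
    core reader; the caller opts in to the unsigned interpretation.
    """
    return value & 0xFFFF

# A sample count of 60000 stored in a signed 16-bit field reads back
# as -5536; the caller can explicitly recover the intended value.
assert as_unsigned16(-5536) == 60000
```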
I understand your point. I will see what I can do on the Python side.
The data comes from continuous recording, with each trace being one minute of data.
> I understand your point. I will see what I can do on the Python side.
It might be slightly involved to get all the details right, but I'm available if you need help, or if you want to hand it off. Regardless, I'm very thankful for this contribution.
> The data comes from continuous recording, with each trace being one minute of data.
Interesting. Good thing you cut it after one minute, because you only have ~5.5k more samples to spare. :----------)
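The arithmetic behind that quip, assuming a 1 ms sample interval (implied by ~60k samples per one-minute trace): an unsigned 16-bit sample count tops out at 65535.

```python
# Headroom left in an unsigned 16-bit sample-count field for a
# one-minute trace at an assumed 1 ms sampling interval.
samples_per_minute = 60 * 1000   # 60000 samples
uint16_max = 2**16 - 1           # 65535, the unsigned 16-bit ceiling
headroom = uint16_max - samples_per_minute
assert headroom == 5535          # i.e. ~5.5k samples to spare
```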
I will close this PR then, since #408 seems to be a better starting point.
> It might be slightly involved to get all the details right, but I'm available if you need help, or if you want to hand it off. Regardless, I'm very thankful for this contribution.
You are welcome. I'm not sure when I'll get to it or how much time it will take me, so if you want to implement it, please go ahead.
This allows the field to be treated as a 16-bit unsigned integer.
Resolves #407