danielhrisca / asammdf

Fast Python reader and editor for ASAM MDF / MF4 (Measurement Data Format) files
GNU Lesser General Public License v3.0

"ValueError: field occurs more than once" in append, for version 3.00 #1051

Closed: morbult closed this issue 2 months ago

morbult commented 4 months ago

Hi Daniel,

I came across a bunch of version 3.00 MDF files that raise this type of error. It seems to occur when the master signal of a group has the same name as another signal in that same group, in this case 'Time'.
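
For illustration, a file with this layout can probably be produced roughly like this (a minimal sketch against asammdf's Signal/append/save API; the master channel that append writes has to be renamed to 'Time' by hand afterwards):

import numpy as np
from asammdf import MDF, Signal

timestamps = np.arange(10, dtype='f8')
signals = [
    # a value channel named 'Time' plus two dummy channels
    Signal(samples=np.arange(10, dtype='f8'), timestamps=timestamps, name=name)
    for name in ('Time', 'dummy1', 'dummy2')
]
with MDF(version='3.00') as mdf:
    mdf.append(signals)  # append also writes a separate master (time) channel
    mdf.save('test.mdf', overwrite=True)
# renaming that master channel to 'Time' afterwards yields the problematic layout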

I was able to create a sample file (test.zip) that reproduces the error: I saved a file with asammdf and then manually renamed its master signal. Using the sample file, this reproduces the error:

import asammdf
with asammdf.MDF('test.mdf') as mdf:
    channels = [('Time', 0, 1), ('dummy1', 0, 2)]
    mdf.to_dataframe(channels)
Traceback (most recent call last):

  Cell In[34], line 4
    mdf.to_dataframe(channels)

  File ~\AppData\Local\anaconda3\Lib\site-packages\asammdf\mdf.py:4305 in to_dataframe
    mdf = self.filter(channels)

  File ~\AppData\Local\anaconda3\Lib\site-packages\asammdf\mdf.py:2067 in filter
    cg_nr = mdf.append(

  File ~\AppData\Local\anaconda3\Lib\site-packages\asammdf\blocks\mdf_v3.py:2014 in append
    types = dtype(types)

ValueError: field 'Time' occurs more than once

Here is the output of mdf.info():

{'author': '',
 'department': '',
 'project': '',
 'subject': '',
 'version': '3.00',
 'groups': 1,
 'group 0': {'cycles': 10,
  'comment': 'Python',
  'channels count': 4,
  'channel 0': 'name="Time" type=master',
  'channel 1': 'name="Time" type=value',
  'channel 2': 'name="dummy1" type=value',
  'channel 3': 'name="dummy2" type=value'}}

I would be very thankful if this case could be handled without raising an error.
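
In the meantime, a possible user-side workaround seems to be fetching the Signal objects with MDF.select and building the frame by hand, which skips the filter()/append() path where the error is raised (a sketch, assuming select itself is unaffected and the selected channels share one timebase):

import asammdf
import pandas as pd

with asammdf.MDF('test.mdf') as mdf:
    channels = [('Time', 0, 1), ('dummy1', 0, 2)]
    # fetch the Signal objects directly instead of going through filter()
    signals = mdf.select(channels)
    # build the dataframe manually; all signals assumed to share a raster
    df = pd.DataFrame(
        {sig.name: sig.samples for sig in signals},
        index=signals[0].timestamps,
    )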

danielhrisca commented 4 months ago

@morbult please try the development branch code
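
For anyone else landing here, the development code can usually be installed straight from the repository, e.g. (assuming the default branch carries the fix; building the cutils C extension needs a working compiler):

python -m pip install git+https://github.com/danielhrisca/asammdf.git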

morbult commented 4 months ago

I am not able to install the development branch because the cutils build fails, but if you tested it and it works on the sample file, it should be good.

Thanks for the quick fix! I hope the next version will be out soon :)