peter-t-fox opened 4 years ago
One way to do this is to construct a Python Table/Array object from the C++ Array/Table object (not yet merged); then it should work.
However, this should wait until a more complicated Python example is available and tested, e.g. writing several tables and arrays into a single HDF5 file.
In C++, both the standalone `test_array_local()` and `test_dp_array()` are available, but `test_dp_array()` (using the standard pipeline) is commented out. The error is always that the "Group" already exists, so it never gives me the chance to call `get_write_group()` again to write the metadata in `write_array()`. In Python, the HDF5 file is opened in "a" mode, while in the C++ standalone local test it is opened in "w" mode. Also, in C++ I use `create_group()` while the Python side uses `require_group()`; those small differences may be the source of the bug.
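A minimal sketch of the difference described above, assuming the Python side uses h5py (the file path and group name here are illustrative only): `create_group()` raises if the group already exists, while `require_group()` is idempotent and returns the existing group.

```python
import os
import tempfile
import h5py

path = os.path.join(tempfile.mkdtemp(), "example.h5")

with h5py.File(path, "a") as f:
    f.create_group("data")           # first call succeeds
    try:
        f.create_group("data")       # second call: "name already exists" error
    except ValueError as e:
        print("create_group failed:", e)
    g = f.require_group("data")      # idempotent: returns the existing group
    print("require_group ok:", g.name)
```

This matches the symptom: a C++ path using the `create_group()`-style call will fail on the second write to the same group, while the Python path using `require_group()` will not.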
Presuming the HDF5 write-mode interpretation is consistent between the Python and C++ implementations, "w" would mean truncation, which I don't think is what we want: we may want to write multiple components to a single file.
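A short sketch of that mode distinction, again assuming h5py semantics (dataset names here are made up): "a" opens read/write and creates the file if needed, preserving existing contents across opens, so several components can be written in turn; "w" truncates the file on every open.

```python
import os
import tempfile
import h5py

path = os.path.join(tempfile.mkdtemp(), "components.h5")

with h5py.File(path, "a") as f:        # write first component
    f.create_dataset("array_1", data=[1, 2, 3])
with h5py.File(path, "a") as f:        # reopen in "a": contents preserved
    f.create_dataset("table_1", data=[4, 5, 6])
with h5py.File(path, "a") as f:
    print(sorted(f.keys()))            # ['array_1', 'table_1']

with h5py.File(path, "w") as f:        # reopen in "w": file truncated
    print(sorted(f.keys()))            # []
```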
The `read/write_table` and `read/write_array` wrappers are working in a "standalone mode", i.e. they can read/write their respective data types from test files or Python objects. However, they are not integrated with the Python data pipeline API implementation. In particular, the `write_table` wrapper has a problem attaching attributes (e.g. units) to a written table.
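For reference, a hypothetical sketch of how such attributes could be attached with h5py (the group/dataset layout and the `units` attribute name are assumptions, not the wrapper's actual schema): HDF5 attributes hang off the dataset or group handle, so they must be set on the object returned by the write, not on the file.

```python
import os
import tempfile
import h5py

path = os.path.join(tempfile.mkdtemp(), "table.h5")

with h5py.File(path, "a") as f:
    grp = f.require_group("my_table")                  # one group per table (assumed layout)
    col = grp.create_dataset("speed", data=[1.0, 2.0])
    col.attrs["units"] = "m/s"                         # attribute attached to the column

with h5py.File(path, "r") as f:
    print(f["my_table/speed"].attrs["units"])          # m/s
```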