aldanor / hdf5-rust

HDF5 for Rust
https://docs.rs/hdf5
Apache License 2.0

Can't find "lib/hdf5/plugin" #200

Closed jatentaki closed 2 years ago

jatentaki commented 2 years ago

I've tried building hdf5 on Ubuntu 20.04 with both an apt-get-installed `libhdf5-dev` and a conda-installed `hdf5`. The project builds, but at runtime it panics with ``called `Result::unwrap()` on an `Err` value: H5Dread(): can't read data: can't open directory: /usr/lib/x86_64-linux-gnu/hdf5/plugins`` and ``called `Result::unwrap()` on an `Err` value: H5Dread(): can't read data: can't open directory: /home/jatentaki/miniconda3/lib/hdf5/plugin``, respectively. With apt-get, `/usr/lib/x86_64-linux-gnu/hdf5` contains only `serial`; with miniconda, `/home/jatentaki/miniconda3/lib/hdf5` doesn't exist at all (let alone the `plugin` subdirectory). What could I be doing wrong? I looked at the CI code for how to set up the environment variables, and I don't think I'm missing anything.

aldanor commented 2 years ago

It's a problem with HDF5 itself... which HDF5 versions are you using (the Ubuntu one and the conda one)?

mulimoen commented 2 years ago

It would also be helpful to know which filter(s) you are trying to use, and how you are testing the crate.

jatentaki commented 2 years ago

@aldanor Let's focus on Ubuntu, since that's what I'd prefer to use anyway. It's this:

jatentaki@drozd:~$ apt-cache policy libhdf5-dev
libhdf5-dev:
  Installed: 1.10.4+repack-11ubuntu1
  Candidate: 1.10.4+repack-11ubuntu1
  Version table:
 *** 1.10.4+repack-11ubuntu1 500
        500 http://ch.archive.ubuntu.com/ubuntu focal/universe amd64 Packages
        100 /var/lib/dpkg/status

@mulimoen The dataset is compressed with LZF. As for how I'm testing it: I'm not sure exactly what you'd like to know. I'm developing a Python extension (via PyO3) to load data for deep learning. This is at a very early stage, so all I have is a single function exposed to Python, `get_hdf5(path: str)`, which looks like this (some boilerplate skipped):

fn read_hdf5(path: &str) -> Result<Array<f32, Ix3>, hdf5::Error> {
    let file = hdf5::File::open(path)?;
    let dataset = file.dataset("dataset")?;
    dataset.read::<f32, Ix3>()
}

#[pymodule]
fn rustdata(_py: Python<'_>, m: &PyModule) -> PyResult<()> {
    #[pyfn(m)]
    fn get_hdf5<'py>(
        py: Python<'py>,
        path: &str,
    ) -> &'py PyArray3<f32> {
        read_hdf5(path).unwrap().into_pyarray(py)
    }

    Ok(())
}

and then I test it from Python as `rustdata.get_hdf5(position_path)`, which is where the error occurs. I can confirm that this code works if I point it at an uncompressed HDF5 file.

mulimoen commented 2 years ago

@jatentaki It seems the plugins are not installed, or not in an expected directory. Have you tried registering the LZF filter in this crate by enabling the `lzf` feature?
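For reference, a sketch of what enabling that feature might look like in `Cargo.toml` (the version number here is illustrative; check the crate's README for the features available in your release):

```toml
[dependencies]
# The "lzf" feature compiles in and registers the LZF filter with the HDF5
# library, so LZF-compressed datasets can be read without a plugin directory.
hdf5 = { version = "0.8", features = ["lzf"] }
```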

jatentaki commented 2 years ago

@mulimoen OK, it turns out I didn't have the `lzf` feature enabled. Enabling it fixes the issue.
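For anyone landing here later: if you're unsure which filter a dataset was compressed with, the `h5dump` tool from the HDF5 command-line utilities can show the filter pipeline. A sketch, assuming a file named `data.h5` containing a dataset named `dataset` (both names are placeholders for your own):

```shell
# Print the dataset's storage properties (header only, no data); the FILTERS
# block lists each filter by ID — LZF's registered filter ID is 32000.
h5dump -p -H -d /dataset data.h5
```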