HDFGroup / hsds

Cloud-native, service-based access to HDF data
https://www.hdfgroup.org/solutions/hdf-kita/
Apache License 2.0

Example Requests and Authentication #362

Closed - JattMones closed this issue 1 week ago

JattMones commented 1 month ago

Hi, I'd like to use HSDS to store my website's HDF5 files. I'd like to be able to add new h5 files, update their attributes, and remove h5 files. I'd like all my app users to be able to do this using an API key and a curl request.

Could you give an example of how I can set up an API key and perform POST, PUT/UPDATE (for attribute updates), GET, and DELETE curl requests using the API key?

Can you cover how I can change the API key, or whether I can add multiple keys with different user access (i.e. having a different key for each user or group that uses my site, so that they can only edit their own files)?

Thanks!

jreadey commented 1 month ago

Hi @JattMones - thanks for your questions. First, I should make it clear that (currently at least) you can't just point HSDS at a bucket of H5 files; you need to import them with the hsload utility. By default, hsload will convert all the data to the HSDS schema. Alternatively, you can use "hsload --link", which will convert just the metadata but leave the dataset data in the original file.
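
For illustration, usage looks roughly like this (the file, bucket, and domain names here are made up):

# convert all data to the HSDS schema
$ hsload mydata.h5 /home/myuser/mydata.h5

# import just the metadata; dataset data stays in the source file,
# so the file needs to remain accessible in the bucket
$ hsload --link s3://mybucket/mydata.h5 /home/myuser/mydata.h5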

Secondly, HSDS doesn't use API keys per se. Requests can include an auth header with username and password (OAuth is also supported). Once authenticated, the actions a given user can take are determined by the ACLs (Access Control Lists) of the domain (i.e. HSDS file). There's a blog post that goes into some detail on how this works: https://www.hdfgroup.org/2015/12/serve-protect-web-security-hdf5/.
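
As a rough sketch of how per-user access works (the hostname and usernames below are made up, and you should check the REST API docs for the exact request body), ACLs can be read and updated over the REST API:

# list the ACLs on a domain
$ curl -X GET -u admin:password --header "X-Hdf-domain: /shared/tall.h5" \
  http://hsds.example.com/acls

# give someuser read and update (but not delete) access to the domain
$ curl -X PUT -u admin:password --header "X-Hdf-domain: /shared/tall.h5" \
  --header "Content-Type: application/json" \
  -d "{\"read\": true, \"update\": true, \"delete\": false}" \
  http://hsds.example.com/acls/someuser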

Finally, this repo covers use of the REST API: https://github.com/HDFGroup/hdf-rest-api. For example, to put/update an attribute, the curl request would be:

$ curl -X PUT -u username:password --header "X-Hdf-domain: /shared/tall.h5" \
  --header "Content-Type: application/json" \
  -d "{\"shape\": 2, \"type\": {\"class\": \"H5T_COMPOUND\", \"fields\": [{\"type\": \"H5T_STD_I32LE\", \"name\": \"temp\"}, {\"type\": \"H5T_IEEE_F32LE\", \"name\": \"pressure\"}]}, \"value\": [[55, 32.34], [59, 29.34]]}" \
  hsdshdflab.hdfgroup.org/groups/g-45f464d8-883e-11e8-a9dc-0242ac12000e/attributes/attr_compound

There are a bunch of new features that haven't been added to the docs yet (we are in the process of updating them), but all the basic operations should be there.

Depending on your environment, it might be more convenient to use the hs tools for things like hsrm /home/foo.h5 rather than constructing a curl request. We don't have an hs tool for adding an attribute yet, but that wouldn't be hard to do.
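
For example (domain names made up; the hs tools come with the h5pyd package):

# list the contents of a folder
$ hsls /home/myuser/

# create a new, empty domain
$ hstouch /home/myuser/newfile.h5

# remove a domain
$ hsrm /home/myuser/foo.h5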

Hope this helps! Let me know if you have more questions or anything seems unclear.

JattMones commented 1 month ago

Thanks @jreadey, that does help. If you could explain a little more about how an h5 bucket is structured from the top level down, with domains, groups, and datasets, that would also be helpful.

For example, what would each item below be considered in the following hierarchy?

Would an ACL group be associated with the whole domain (i.e. if the domain was something like filename.h5)? Would a datatype be associated with the domain as well, and return that someDataSet is a dataset, and that subtier and tier1 are groups?

jreadey commented 1 month ago

No - a bucket consists of folders (equivalent to directories) and domains (equivalent to HDF5 files), and folders may have other folders or domains under them.
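
For example, a bucket's layout might look like this (names made up):

/home/                        <-- folder
/home/myfolder/               <-- folder
/home/myfolder/mydomain.h5    <-- domain (equivalent to an HDF5 file)
/home/myfolder/otherdata.h5   <-- domain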

The domain object itself (e.g. /home/myfolder/mydomain.h5/.domain.json) contains just a root group uuid and any ACLs that belong to the domain. The actual HDF objects of a domain are stored under the "db" key. This is a bit indirect, but it has the benefit that you can move or rename domains without having to move all the objects in the domain.
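
For instance, a GET on the domain itself returns its root group uuid, which is what you'd then use in object-level requests like the attribute example above:

$ curl -u username:password --header "X-Hdf-domain: /shared/tall.h5" \
  hsdshdflab.hdfgroup.org/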

This is documented here: https://github.com/HDFGroup/hsds/blob/master/docs/design/obj_store_schema/obj_store_schema_v2.md, but under normal circumstances it shouldn't matter how the data is stored if you are accessing content through the HDF REST API or the HDF APIs (REST VOL or h5pyd).

jreadey commented 1 week ago

Closing this issue - please reopen if there are follow-on questions.