Closed: andyfaff closed this issue 3 years ago
@bmaranville could you look into this? The example datafile from https://www.reflectometry.org/projects/file_formats/tasks/ws_2021-06_text/, as held in the file https://github.com/reflectivity/orsopy/blob/main/tests/test_example.ort, does not validate against the reduced data schema held in this repository. So either the schema is incorrect, or the input JSON is incorrect.
I've been using https://www.jsonschemavalidator.net/ to do the validation (a sketch for reproducing the check locally follows the JSON below). The initial validation produces these errors:
Message: Invalid type. Expected String but got Object. Schema path: #/definitions/DataSource/properties/owner/type
Message: Required properties are missing from object: sample. Schema path: #/definitions/Experiment/required
Message: Required properties are missing from object: omega, wavelength. Schema path: #/definitions/Measurement/required
Message: Required properties are missing from object: facility, experimentID, experimentDate, title. Schema path: #/definitions/DataSource/required
Against the input JSON of:
{ "creator": { "name": "G. User", "affiliation": "PSI", "time": "2020-02-03T14:37:15", "computer": "lnsa17.psi.ch" }, "data_source": { "owner": { "name": "T. Proposer", "affiliation": "The Institute (TI)", "contact": "t_proposer@institute.org" }, "experiment": { "facility": "Paul Scherrer Institut, SINQ", "ID": "2020 0304", "date": "2020-02-03T14:37:15", "title": "Generation of input for formatting purposes", "instrument": "Amor", "probe": "neutrons" }, "sample": { "name": "Ni1000", "description": [ { "amb": "air" }, { "layer": { "material": "Ni", "thickness": "100 nm" } }, { "sub": "Si" } ] }, "measurement": { "scheme": "angle- and energy-dispersive", "instrument_settings": { "sample_rotation": { "alias": "mu", "unit": "deg", "value": 0.7 }, "detector_rotation": { "alias": "mu", "unit": "deg", "value": 1.4 }, "incident_angle": { "unit": "deg", "min": 0.4, "max": 1.0, "resolution": { "type": "constant", "unit": "deg", "value": 0.01 } }, "wavelength": { "unit": "angstrom", "min": 3.0, "max": 12.5, "resolution": { "type": "proportional", "value": 0.022 } }, "polarisation": "+" }, "data_files": [ { "file": "amor2020n001925.hdf", "created": "2020-02-03T14:37:15" }, { "file": "amor2020n001926.hdf", "created": "2020-02-03T14:37:15" }, { "file": "amor2020n001927.hdf", "created": "2020-02-03T14:37:15" } ], "references": [ { "file": "amor2020n001064.hdf", "created": "2020-02-03T14:37:15" } ] } }, "reduction": { "software": "eos.py", "call": "eos.py -Y 2020 -n 1925-1927 -y 9,55 ni1000 -O -0.2 -r 1064 -s 1 -i -a 0.005 -e", "comment": "corrections performed by normalisation to measurement on reference sample", "corrections": [ "footprint", "incident intensity", "detector efficiency" ] }, "columns": [ { "name": "Qz", "unit": "1/angstrom", "dimension": "WW transfer" }, { "name": "R", "dimension": "reflectivity" }, { "name": "sR", "dimension": "error-reflectivity" }, { "name": "sQz", "unit": "1/angstrom", "dimension": "resolution-WW transfer" } ], "data_set": "spin_up" }
This seems to be at least partly fixed by #10.
I haven't updated the schemas since the June meeting - they are due for an update!
Fixed in #10