Closed craig-willis closed 3 years ago
Overall, this build looks good. The only significant issue I hit was publishing to DataONE Dev: a mismatch in file MD5s. I don't think this is on our end:
Failed to publish to DataONE with "There was a fatal error while uploading {fname}. Please contact the support team." Tried two different examples. Looking at gwvolman logs:
```
<description>The checksum calculated from the saved local file is
E583EFC6491C6200EE9C0AB4CA980EE1. But it doesn't match the value
from the system metadata 13a45db266c3dd93e2208650a87f1d09.
</description>
```
The MD5 in the manifest is right and matches what I get locally, so this may be a problem on the D1 side. The {fname} replacement in the error message could also be fixed.
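To double-check which side disagrees, the manifest-style digest can be reproduced locally. A minimal sketch (the helper name, chunk size, and the commented path/value are illustrative, not part of gwvolman):

```python
import hashlib

def file_md5(path, chunk_size=8192):
    """Compute a file's MD5 hex digest in chunks, matching the manifest checksums."""
    md5 = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            md5.update(chunk)
    return md5.hexdigest()

# Compare against the value recorded in the manifest (illustrative):
# assert file_md5("data/file.csv") == "13a45db266c3dd93e2208650a87f1d09"
```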
If I add a dataset by DOI and export, the fetch.txt contains DOIs with no resolver, so I can't run locally.
If I rename a version, the version's timestamp is updated (the old version becomes the newer one).
Not reproducible, but I've seen the progress indicator go into ERROR when starting an instance; it recovers and continues. Almost like a fleeting state change.
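For the fetch.txt issue, one possible local workaround (a sketch, not a proposed exporter fix; function names are illustrative) is to rewrite bare `doi:` URIs to the public doi.org resolver before fetching:

```python
def resolvable(uri):
    """Map a bare doi: URI to the public doi.org resolver; pass other URLs through."""
    if uri.startswith("doi:"):
        return "https://doi.org/" + uri[len("doi:"):]
    return uri

def fix_fetch_line(line):
    """fetch.txt lines are 'URL LENGTH PATH'; rewrite only the URL field."""
    url, length, path = line.split(maxsplit=2)
    return " ".join((resolvable(url), length, path))
```

For example, `fix_fetch_line("doi:10.18739/A29G5GD0V 1024 data/file.csv")` yields a line with an HTTPS URL that bag tooling can actually fetch.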
Notes:
Test Plan
https://github.com/whole-tale/wt-design-docs/issues/new?template=TEST_PLAN.md
Note: For all tests, repeat for supported browser/OS combinations.
Preconditions:
Splash page
Authentication
[x] Basic login flow
[x] Basic logout flow
[x] Return-route for non-logged in users
Navigation
Tale Dashboard
Preconditions:
Assumes production Tales present (e.g., LIGO, materials, etc).
No running instances
[x] General
[x] Search
[x] View tale
[x] Launch instance
Managing Data
Preconditions
Empty home directory
No registered data
[x] Register General
[x] Register DataONE data
10.5065/D6862DM8
[x] Dataverse
[x] Globus/MDF
Run Tale
Preconditions:
No running Tale instances
[x] General
[x] Interact tab
[x] Files tab
[x] Metadata tab - Owned Tale
Published Location reads "This Tale has not been published"
Environment dropdown menu
License dropdown (CC4 and CC0)
[x] Metadata tab - Non-Owned Tale (e.g., Public LIGO Tale)
[x] Home
[x] External data (doi:10.18739/A29G5GD0V)
[x] Tale Workspace
[x] Files - Non-Owned Tale
[x] Export Tale
`pip install bdbag` and `bdbag --validate full .`
Settings
[x] Default State
[x] Connect to Zenodo
[x] Connect to Dataverse
[x] Connect to DataONE
[x] Confirm tokens retained across logins #370
Use the `/user/me` endpoint to confirm tokens are still present
Tale Creation
[x] Create JupyterLab Tale
[x] Create RStudio Tale
[x] Compose Jupyter Notebook Tale
[x] Compose JupyterLab Tale
[x] Compose JupyterLab with Spark
[x] Compose MATLAB Desktop Tale
Run `multiplicative_arima_example_script.m`, confirm outputs
Run `multiplicative_arima_example_livescript.m`, confirm outputs
[ ] Compose MATLAB Jupyter Kernel Tale
Run `multiplicative_arima_example.ipynb`, confirm outputs match `multiplicative_arima_example.html`
[ ] Compose MATLAB Linux Desktop Tale
Run `multiplicative_arima_example_script.m`, confirm outputs
[x] Compose STATA Desktop Tale
Run `example.do`, confirm outputs
[x] Compose STATA Jupyter Kernel Tale
Run `example.ipynb`, confirm outputs match `example.html`
[x] Compose OpenRefine Tale
[x] Too many instances
Analyze in Whole Tale
These test cases cover potential situations that can occur when importing datasets from Dataverse.
[x] Import dataset from Dataverse
Replication Data for: "Agricultural Fires and Health at Birth"
Confirm the `Input data` section matches the uri with `Data Source` appended
Confirm the `Create New Tale` button is disabled
Click `Create New Tale`
science
[x] Import dataset from DataONE: READ-ONLY
Fire influences on forest recovery and associated climate feedbacks in Siberian Larch Forests, Russia
Confirm the `Selected data` section matches the uri with `Data Source` appended
Confirm `Rstudio` is selected in the Environments widget
Confirm the `Create New Tale` button is enabled
Click `Create New Tale`
science
[x] Import from DataONE: READ-WRITE
Confirm the `Selected data` section matches the uri with `Data Source` appended
Confirm the `Create New Tale` button is disabled
Select READ/WRITE
Click `Create New Tale`
[ ] Import from DataONE: alternate sites
Click `Create New Tale`
Tale metadata tests
The purpose of these tests is to confirm that the metadata files (manifest.json, environment.json, LICENSE) we generate are correct.
`manifest.json` file under `metadata/`
`wt` context present
`@id` references the correct Tale ID
`schema:author`
`createdBy`
`aggregates`
`aggregates`
`wt:usesDataset`
`dct:hasVersion` is present and matches your created version
`rdflib` can parse:

```python
import urllib.parse

# Register the arcp scheme so urllib can resolve relative arcp URIs
urllib.parse.uses_relative.append('arcp')
urllib.parse.uses_netloc.append('arcp')

from rdflib import Graph

g = Graph().parse(source="manifest.json", format="json-ld")
print(g.serialize(format='turtle', indent=2).decode())
```