HEAD currently isn't constructing DAG links to readme components.
Steps to reproduce:

```
$ qri add nyc-transit-data/turnstile_daily_counts_2020
$ qri log
# make sure latest hash is /ipfs/QmaZJjtbSQVgeZSAamP4UpvVvDG7x7wHntAKSFbtxfssdU
$ qri get body nyc-transit-data/turnstile_daily_counts_2020
```
If you're running `qri connect`, that last command will just hang. If you're offline, you'll get `merkledag: not found`.
Running `qri get body nyc-transit-data/turnstile_daily_counts_2020 --log-all` while offline trips the debug call on line 40 of `base.OpenDataset`:

https://github.com/qri-io/qri/blob/c21419b3a0df46e69c923ffcc8bcdbeb2690f31e/base/dataset.go#L38-L43

That means qri can't find a hash it needs. Let's use IPFS to investigate.
Ok, listing links shows a `readme.md` file, which should be the file we need. What gives? Well, `base.OpenDataset` looks at `dataset.readme.scriptPath` in a dataset to find the IPFS path. Let's look at it:
So, scriptPath thinks the value is `QmYZMAGE5VVDDGpEj7wWQgQAyBM76Q31iSXm15D8YqR9Jd`, but the hash of `readme.md` (the script) is `QmXU7gg17hyErNyo5Cr8ASJcbadFFEyMfMaaKLB7rYcj8R`. Those are different. Bad.
Because we've cloned this dataset from somewhere else, we don't have `QmYZMAGE5VVDDGpEj7wWQgQAyBM76Q31iSXm15D8YqR9Jd` in our local repo.
Finally, let's look at the contents of `readme.md` in that DAG:

... that doesn't look like a readme file 🤷‍♀️. Time to go digging around in `base/dsfs/dataset.go`.

We've run into this problem before with default viz hashes not getting pinned, and have added hacks like this to get around it:

https://github.com/qri-io/qri/blob/c21419b3a0df46e69c923ffcc8bcdbeb2690f31e/base/dataset.go#L45-L58
This is a pervasive problem in our stack that's now causing major UX issues, and it has gotten worse in recent revisions, creeping into readme when it used to affect only viz.
Steps to fix:
[ ] add a test that confirms file structure & names
[ ] add an all-component test that confirms DAG structure pins all hashes, using an IPFS node
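The second checklist item could start from an invariant like the one below: gather every component path a saved dataset records and assert that each one resolves in the store the save produced. This is a rough sketch with a map-backed store standing in for the IPFS node; the types and helper names are made up for illustration, not qri's actual test helpers:

```go
package main

import (
	"fmt"
	"sort"
)

// blockstore stands in for the IPFS node's local repo.
type blockstore map[string][]byte

// dataset lists the component paths a saved dataset records.
// (Field names are illustrative, not qri's real struct.)
type dataset struct {
	readmeScriptPath string
	vizScriptPath    string
	bodyPath         string
}

// componentPaths gathers every hash the dataset claims to reference.
func componentPaths(ds dataset) map[string]string {
	return map[string]string{
		"readme.scriptPath": ds.readmeScriptPath,
		"viz.scriptPath":    ds.vizScriptPath,
		"bodyPath":          ds.bodyPath,
	}
}

// missingPins returns the component names whose hashes aren't in the
// store — exactly the bug above: a recorded path that was never pinned.
func missingPins(store blockstore, ds dataset) []string {
	var missing []string
	for name, hash := range componentPaths(ds) {
		if _, ok := store[hash]; hash != "" && !ok {
			missing = append(missing, name)
		}
	}
	sort.Strings(missing)
	return missing
}

func main() {
	store := blockstore{
		"QmBody":   []byte("body"),
		"QmReadme": []byte("# readme"),
	}
	ds := dataset{
		readmeScriptPath: "QmOrphan", // recorded but never linked/pinned
		bodyPath:         "QmBody",
	}
	fmt.Println(missingPins(store, ds)) // reports the unpinned component
}
```

A real version of this test would run the check against an actual IPFS node after `qri save`, per component, so a regression in any one component (viz, readme, or future ones) fails loudly.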