Now that we have `audit` functionality merged into master of dandi-archive, it would be great to… Could we hack up quickly the following: if the `DANDI_TESTS_AUDIT_JSON` env var is set:

- Do not just use docker image, install `master` of dandi-archive into it (I think we do smth similar already)
- Before cleaning up the env in the fixture taking care about docker compose, connect to postgres DB and dump `Audit` table as a list of json records into the file pointed to by `DANDI_TESTS_AUDIT_JSON` env var
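A rough sketch of how that hook could be wired into the session-scoped docker compose fixture (fixture and helper names here are illustrative, not necessarily dandi-cli's actual ones):

```python
import os

import pytest


def dump_audit_table(path: str) -> None:
    # Placeholder: write the Audit table as a list of JSON records to `path`.
    # Concrete psql- and psycopg2-based sketches appear later in the thread.
    raise NotImplementedError


@pytest.fixture(scope="session")
def docker_compose_setup():
    ...  # bring up the docker compose environment as the existing fixture does
    yield
    # Before tearing the environment down, dump the Audit table if requested.
    path = os.environ.get("DANDI_TESTS_AUDIT_JSON")
    if path is not None:
        dump_audit_table(path)
    ...  # tear down the docker compose environment
```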
> Do not just use docker image, install `master` of dandi-archive into it (I think we do smth similar already)
The Docker image is built from dandi-archive's master already.
> Before cleaning up the env in the fixture taking care about docker compose, connect to postgres DB and dump `Audit` table as a list of json records into the file pointed to by `DANDI_TESTS_AUDIT_JSON` env var
And then what? Is this feature going to be used as part of the GitHub Actions CI — in which case, should the file be uploaded as a build artifact?
Also, how exactly is the dump supposed to be performed? `pg_dump` doesn't have a JSON option, and I do not want to add a Python database library to dandi-cli's test dependencies just for this feature, which I honestly don't expect to be used very much.
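(For reference: while `pg_dump` indeed has no JSON output format, `psql` by itself can produce one, since Postgres can aggregate rows server-side with `json_agg`/`row_to_json`. A rough sketch, in which the connection details and the `audit_auditrecord` table name are assumptions:)

```python
import subprocess

# Have psql emit the whole table as a single JSON array, so no Python
# database driver is needed. COALESCE makes an empty table yield "[]".
out = subprocess.run(
    [
        "psql", "-h", "localhost", "-U", "postgres", "-d", "django",
        "-At", "-c",
        "SELECT COALESCE(json_agg(row_to_json(t)), '[]')"
        " FROM audit_auditrecord t",
    ],
    check=True,
    capture_output=True,
    text=True,
).stdout

with open("audit.json", "w") as f:
    f.write(out)
```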
> > Before cleaning up the env in the fixture taking care about docker compose, connect to postgres DB and dump `Audit` table as a list of json records into the file pointed to by `DANDI_TESTS_AUDIT_JSON` env var
>
> And then what? Is this feature going to be used as part of the GitHub Actions CI — in which case, should the file be uploaded as a build artifact?
nah, just locally ATM
> Also, how exactly is the dump supposed to be performed? `pg_dump` doesn't have a JSON option, and I do not want to add a Python database library to dandi-cli's test dependencies just for this feature, which I honestly don't expect to be used very much.
could you just add it to some optional extras_require (e.g. `dbdump`)?
@yarikoptic
> > Also, how exactly is the dump supposed to be performed? `pg_dump` doesn't have a JSON option, and I do not want to add a Python database library to dandi-cli's test dependencies just for this feature, which I honestly don't expect to be used very much.
>
> could you just add it to some optional extras_require (e.g. `dbdump`)?
You're focusing on the wrong part of my comment. How do you want the dump to be performed? Or are you trying to say that it should be done all in Python?
I thought that indeed we could just connect to the DB from Python and perform the dump. Is that too tricky?
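For illustration, a minimal sketch of that pure-Python route using `psycopg2` (which would be exactly the extra test dependency in question; connection parameters and the `audit_auditrecord` table name are assumptions):

```python
import json

import psycopg2
import psycopg2.extras

# Connect, fetch the Audit table rows as dicts, and serialize them as a
# JSON list, matching what DANDI_TESTS_AUDIT_JSON is meant to receive.
conn = psycopg2.connect(
    host="localhost", dbname="django", user="postgres", password="postgres"
)
with conn, conn.cursor(cursor_factory=psycopg2.extras.RealDictCursor) as cur:
    cur.execute("SELECT * FROM audit_auditrecord")
    rows = cur.fetchall()

with open("audit.json", "w") as f:
    # default=str covers timestamps and other non-JSON-native column types
    json.dump(rows, f, indent=2, default=str)
```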
@yarikoptic If you're OK with CSV instead of JSON, it would be far easier to just do a `SELECT` via `psql` instead of futzing with DB libraries.
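A sketch of that route, letting `COPY ... WITH (FORMAT csv)` do the quoting (connection details and the table name are again assumptions):

```python
import subprocess

# COPY ... TO STDOUT WITH (FORMAT csv, HEADER) performs proper CSV quoting
# of embedded commas and quotes server-side; psql just relays the stream.
with open("audit.csv", "w") as f:
    subprocess.run(
        [
            "psql", "-h", "localhost", "-U", "postgres", "-d", "django",
            "-c",
            "COPY (SELECT * FROM audit_auditrecord)"
            " TO STDOUT WITH (FORMAT csv, HEADER)",
        ],
        check=True,
        stdout=f,
        text=True,
    )
```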
sure, csv would be just fine! note though that `details` is a json record and so would be full of `,`; within csv those should be handled appropriately (not just dumped via an ad-hoc `print`).
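That comes for free from standard CSV quoting, in both Postgres's CSV `COPY` format and Python's `csv` module. A quick self-contained check (the row values here are made up):

```python
import csv
import io
import json

# A JSON-valued field with embedded commas survives a CSV round trip,
# unlike naive print()-style comma joining.
details = {"user": "someuser", "metadata": {"tags": ["a", "b"]}}
buf = io.StringIO()
csv.writer(buf).writerow([1, "create_asset", json.dumps(details)])

(row,) = csv.reader(io.StringIO(buf.getvalue()))
assert json.loads(row[2]) == details  # round-trips intact
```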
:rocket: Issue was released in `0.63.1` :rocket: