THE BRANCH FOR THIS ROLLOUT IS: kg2.10.1c
THE ARAX-DATABASES.RTX.AI DIRECTORY FOR THIS ROLLOUT IS: /home/rtxconfig/KG2.10.1
Sprint changelog link: (Changelog)
Prerequisites
ssh access
To complete this workflow, you will need ssh access to:
[x] arax-databases.rtx.ai
[x] the self-hosted ARAX/KG2 instance, arax.ncats.io (see example configuration information below)
[x] the self-hosted PloverDB instances, kg2cploverN.rtx.ai
[x] the self-hosted Neo4j instances for KG2c, kg2canonicalizedN.rtx.ai
[x] the self-hosted CI/CD instance, cicd.rtx.ai
[x] the webserver for downloading the KG2c "lite" JSON file, kg2webhost.rtx.ai
GitHub access
[x] write access to the RTXteam/PloverDB project area
[x] write access to the RTXteam/RTX project area
[x] write access to the ncats/translator-lfs-artifacts project area (not critical, but needed for some final archiving steps; Amy Glen and Sundar Pullela have access)
AWS access
You will need:
[x] access to the AWS Console (you'll need an IAM username; ask Stephen Ramsey about getting one)
[x] IAM permission to start and stop instances in EC2 via the AWS Console
[x] access to the S3 bucket s3://rtx-kg2/ (ask Stephen Ramsey for access)
Slack workspaces
You will also need access to the following Slack workspaces:
[x] ARAXTeam (subscribe to #deployment)
[x] NCATSTranslator (subscribe to #devops-teamexpanderagent)
Example ssh config for setting up login into arax.ncats.io:
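For example, a minimal ~/.ssh/config entry might look like this (the username and key path are placeholders; use your own):

```
Host arax.ncats.io
    User myusername
    IdentityFile ~/.ssh/id_rsa
```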
1. Build and load KG2c:
[x] merge master into the branch being used for this KG2 version (which would typically be named like KG2.X.Yc). Record this issue number in the merge message.
[x] update the four hardcoded biolink version numbers in the branch (as needed):
[x] in code/UI/OpenAPI/python-flask-server/openapi_server/openapi/openapi.yaml (github; local)
[x] in code/UI/OpenAPI/python-flask-server/KG2/openapi_server/openapi/openapi.yaml (github; local)
[x] in code/UI/OpenAPI/specifications/export/ARAX/1.5.0/openapi.yaml (github)
[x] in code/UI/OpenAPI/specifications/export/KG2/1.5.0/openapi.yaml (github)
[ ] build a new KG2c on buildkg2c.rtx.ai from the branch (how-to is here)
[x] before starting the build:
[x] make sure there is enough disk space available on arax-databases.rtx.ai (need at least 100G, ideally >120G). Delete old KG2 database directories as needed (warn the team on Slack in advance).
[x] make sure to choose to build a new synonymizer in kg2c_config.json, as described in the how-to
[ ] after the build is done, verify it looks ok:
[x] node_synonymizer.sqlite should be around 8-15 GB
[ ] make sure node_synonymizer.sqlite's last modified date is today (or whatever day the build was run)
[ ] make sure kg2c_lite.json.gz's last modified date is today (or whatever day the build was run)
[ ] the entire build runtime (synonymizer + KG2c) shouldn't have been more than 24 hours
[ ] the synonymizer and KG2c artifacts should have been auto-uploaded into the proper directory on arax-databases.rtx.ai (/home/rtxconfig/KG2.X.Y)
[ ] verify the correct KG2 version was uploaded by running this query: match (n {id:"RTX:KG2c"}) return n
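The same check can be run from the command line against the Neo4j HTTP API rather than the browser; this is a sketch with an assumed endpoint, port, and credentials:

```shell
# Build the transactional-Cypher payload for the version check.
payload='{"statements": [{"statement": "match (n {id:\"RTX:KG2c\"}) return n"}]}'
# Confirm the payload is well-formed JSON before sending it anywhere.
echo "$payload" | python3 -m json.tool > /dev/null && echo "payload OK"
# Then POST it to the KG2c Neo4j instance (host/credentials are placeholders):
#   curl -u neo4j:PASSWORD -H 'Content-Type: application/json' \
#        -d "$payload" http://kg2canonicalizedN.rtx.ai:7474/db/neo4j/tx/commit
```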
[ ] update RTX/code/config_dbs.json in the branch:
[ ] update the synonymizer version number/path
[ ] update the fda_approved_drugs version number/path
[ ] update the autocomplete version number/path
[ ] update the meta_kg version number/path
[ ] update the kg2c sqlite version number/path
[ ] update the KG2pre and KG2c Neo4j endpoints
[ ] copy the kg2c_lite_2.X.Y.json.gz file (available in the S3 bucket as s3://rtx-kg2/kg2c_lite.json.gz, but CHECK THE DATE AND MD5 HASH TO BE SURE YOU ARE NOT GETTING AN OLD FILE) to the directory /home/ubuntu/nginx-document-root/ on kg2webhost.rtx.ai
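Since an old file here silently breaks the rollout, the md5 check is worth scripting. A minimal sketch (demonstrated on a throwaway file; in real use the expected checksum comes from the build log):

```shell
# Return success only if the file's md5 matches the expected checksum.
verify_md5() {
  local file="$1" expected="$2"
  [ "$(md5sum "$file" | cut -d' ' -f1)" = "$expected" ]
}

# Demo on a throwaway file (b1946... is the known md5 of "hello\n"):
printf 'hello\n' > /tmp/kg2_demo.bin
verify_md5 /tmp/kg2_demo.bin b1946ac92492d2347c6235b4d2611184 && echo "md5 OK"
```

In real use, download the file (e.g., `aws s3 cp s3://rtx-kg2/kg2c_lite.json.gz .`), run `verify_md5` against the recorded checksum, and only then `scp` it to kg2webhost.rtx.ai.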
[ ] load the new KG2c into Plover (how-to is here)
[ ] start the new self-hosted PloverDB on kg2cploverN.rtx.ai:
[ ] ssh ubuntu@kg2cploverN.rtx.ai
[ ] cd PloverDB && git pull origin kg2.X.Yc
[ ] if you have not yet built the 2.X.Y docker image/container on this instance, run:
[ ] ./run.sh ploverimage2.X.Y plovercontainer2.X.Y "sudo docker" (takes about an hour)
[ ] otherwise, simply run:
[ ] sudo docker start plovercontainer2.X.Y (takes about five minutes)
[ ] verify that Plover's regression tests pass, and fix any broken tests (note: tests must use canonical curies!); from any instance/computer, run:
[ ] cd PloverDB
[ ] pytest -v test/test.py --endpoint http://kg2cploverN.rtx.ai:9990
[ ] update config_dbs.json in the branch for this KG2 version in the RTX repo to point to the new Plover for the 'dev' maturity level
2. Rebuild downstream databases:
The following databases should be rebuilt and copies of them should be put in /home/rtxconfig/KG2.X.Y on arax-databases.rtx.ai. Please use this kind of naming format: mydatabase_v1.0_KG2.X.Y.sqlite.
[ ] XDTD database @chunyuma (may be skipped - depends on the changes in this KG2 version)
NOTE: As databases are rebuilt, RTX/code/config_dbs.json will need to be updated to point to their new paths! Push these changes to the branch for this KG2 version, unless the rollout of this KG2 version has already occurred, in which case you should push to master (but first follow the steps described here).
3. Update the ARAX codebase:
All code changes should go in the branch for this KG2 version!
[ ] regenerate the KG2c test triples file in the branch for this KG2 version
[ ] ensure the new KG2c Neo4j is currently running
[ ] check out the branch and pull to get the latest changes (this is important for ensuring the correct KG2c Neo4j is used)
[ ] push the regenerated file to RTX/code/ARAX/KnowledgeSources/RTX_KG2c_test_triples.json
[ ] update Expand code as needed
[ ] update any other modules as needed
[ ] test everything together:
[ ] check out the branch and pull to get the latest changes
[ ] locally set force_local = True in ARAX_expander.py (to avoid using the old KG2 API)
[ ] then run the entire ARAX pytest suite (i.e., pytest -v)
[ ] address any failing tests
[ ] update the KG2 and ARAX version numbers in the appropriate places (in the branch for this KG2 version)
[ ] Bump version on line 12 in RTX/code/UI/OpenAPI/python-flask-server/openapi_server/openapi/openapi.yaml (github; local); the major and minor release numbers are kept synchronous with the TRAPI version; just bump the patch release version (least significant digit)
[ ] Bump version on line 12 in RTX/code/UI/OpenAPI/python-flask-server/KG2/openapi_server/openapi/openapi.yaml (github; local); the first three digits are kept synchronous with the KG2 release version
[ ] Bump version on line 4 in RTX/code/UI/OpenAPI/python-flask-server/RTX_OA3_TRAPI1.4_ARAX.yaml (github; local); same as for the ARAX openapi.yaml file
[ ] Bump version on line 4 in RTX/code/UI/OpenAPI/python-flask-server/RTX_OA3_TRAPI1.4_KG2.yaml (github; local); same as for the KG2 openapi.yaml file
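The patch-digit bump described above can be sketched mechanically; this demo operates on a throwaway file rather than the real openapi.yaml files (in those, the version sits on the line numbers noted in each step):

```shell
# Create a stand-in file with a version line like the real YAML files have.
printf 'version: 1.5.0\n' > /tmp/openapi_demo.yaml

# Increment the least-significant (patch) digit of the "version:" line.
awk -F. '/^version:/ {printf "%s.%s.%d\n", $1, $2, $3+1; next} {print}' \
  /tmp/openapi_demo.yaml > /tmp/openapi_demo.new
cat /tmp/openapi_demo.new   # version: 1.5.1
```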
4. Pre-upload databases:
Before rolling out, we need to pre-upload the new databases (referenced in config_dbs.json) to arax.ncats.io and the ITRB SFTP server. These steps can be done well in advance of the rollout; it doesn't hurt anything to do them early.
[ ] make sure arax.ncats.io has at least 100G of disk space free; delete old KG2 databases to free up space as needed (before doing this, warn the team on the #deployment Slack channel on the ARAXTeam workspace)
[ ] copy the new databases from arax-databases.rtx.ai to arax.ncats.io:/translator/data/orangeboard/databases/KG2.X.Y; example for KG2.8.0:
[ ] ssh myuser@arax.ncats.io
[ ] cd /translator/data/orangeboard/databases/
[ ] mkdir -m 777 KG2.8.0
[ ] scp rtxconfig@arax-databases.rtx.ai:/home/rtxconfig/KG2.8.0/*2.8.0* KG2.8.0/
[ ] upload the new databases and their md5 checksums to ITRB's SFTP server using the steps detailed here
5. Rollout new KG2c version to arax.ncats.io development endpoints
[ ] Notify the #deployment channel in the ARAXTeam Slack workspace that you are rolling out a new version of KG2c to the various arax.ncats.io development endpoints. Provide the KG2c version number in this notification.
[ ] for the RTXteam/RTX project, merge the master branch into the branch for this KG2 version. Record the RTX issue number (for the KG2c rollout checklist issue) in the merge message.
[ ] for the RTXteam/RTX project, merge this KG2 version's branch back into the master branch. Record this issue number in the merge message.
[ ] to roll master out to a specific ARAX or KG2 endpoint named /EEE, you would do the following steps:
[ ] If you are offsite, log into your office VPN (there are strict IP address block restrictions on client IPs that can ssh into arax.ncats.io)
[ ] Log in to arax.ncats.io: ssh arax.ncats.io (you need to have previously set up your username, etc. in ~/.ssh/config; see the top of this issue template for an example)
[ ] Enter the rtx1 container: sudo docker exec -it rtx1 bash
[ ] Become user rt: su - rt
[ ] Go to the directory of the code repo for the EEE endpoint: cd /mnt/data/orangeboard/EEE/RTX
[ ] Make sure it is on the master branch: git branch (should show * master)
[ ] Stash any updated files (this is IMPORTANT): git stash
[ ] Update the code: git pull origin master
[ ] Restore updated files: git stash pop
[ ] If there have been changes to requirements.txt, make sure to do pip3 install -r code/requirements.txt
[ ] Become superuser: exit (exiting out of your shell session as user rt should return you to a root user session)
[ ] Restart the service: service RTX_OpenAPI_EEE restart
[ ] View the STDERR logfile as the service starts up: tail -f /tmp/RTX_OpenAPI_EEE.elog
[ ] Test the endpoint via the web browser interface to make sure it is working
[ ] Query the KG2c version by entering this TRAPI query JSON into the browser UI: {"nodes": {"n00": {"ids": ["RTX:KG2c"]}}, "edges": {}} (it should return 1 result and the name of that node gives the KG2c version that is installed in the PloverDB that is being queried by the endpoint)
[ ] look up RTX:KG2 in the Synonyms tab in the UI
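The KG2c-version check above can also be run from the command line instead of the browser UI; this is a sketch (the /EEE path segment is a placeholder for the endpoint name, and note that the API expects the query graph wrapped in a TRAPI message, whereas the UI accepts it bare):

```shell
# The TRAPI message wrapping the query graph from the step above.
query='{"message": {"query_graph": {"nodes": {"n00": {"ids": ["RTX:KG2c"]}}, "edges": {}}}}'
# Confirm the JSON is well-formed before sending it.
echo "$query" | python3 -m json.tool > /dev/null && echo "query OK"
# Then POST it to the endpoint, e.g.:
#   curl -s -X POST https://arax.ncats.io/EEE/api/arax/v1.4/query \
#        -H 'Content-Type: application/json' -d "$query"
```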
[ ] roll master out to the various arax.ncats.io development endpoints. Usually in this order:
[ ] devED
[ ] kg2beta
[ ] beta
[ ] kg2test
[ ] test
[ ] devLM
[ ] inside the Docker rtx1 container, run the pytest suite on the various ARAX development endpoints (that means devED, devLM, test, and beta):
[ ] cd /mnt/data/orangeboard/EEE/RTX/code/ARAX/test && pytest -v
[ ] update our CI/CD testing instance with the new databases:
[ ] ssh ubuntu@cicd.rtx.ai
[ ] cd RTX
[ ] git pull origin master
[ ] If there have been changes to requirements.txt, make sure to do ~/venv3.9/bin/pip3 install -r requirements.txt
[ ] sudo bash
[ ] mkdir -m 777 /mnt/data/orangeboard/databases/KG2.X.Y
[ ] exit
[ ] ~/venv3.9/bin/python3 code/ARAX/ARAXQuery/ARAX_database_manager.py --mnt --skip-if-exists --remove_unused
[ ] run a Test Build through GitHub Actions, to ensure that the CI/CD is working with the updated databases; all of the pytest tests that are not skipped, should pass
6. Final items/clean up:
[ ] update the current RTX GitHub changelog issue (add the rollout of this KG2 version as a changelog item)
[ ] delete the kg2.X.Yc branch in the RTX repo (since it has been merged into master at this point)
[ ] turn off the old KG2c version's neo4j instance (if it has not already been turned off; it is likely to have been turned off when the old KG2c was rolled out)
[ ] determine what is the DNS A record hostname for kg2-X-Zc.rtx.ai (where Z is one less than the new minor release version): run nslookup kg2-X-Zc.rtx.ai (it will return either kg2canonicalized.rtx.ai or kg2canonicalized2.rtx.ai; we'll call it kg2canonicalizedN.rtx.ai).
[ ] message the #deployment channel in the ARAXTeam Slack workspace that you will be stopping the kg2canonicalizedN.rtx.ai Neo4j endpoint
[ ] ssh ubuntu@kg2-X-Zc.rtx.ai
[ ] sudo service neo4j stop
[ ] In the AWS console, stop the instance kg2canonicalizedN.rtx.ai
[ ] turn off the old KG2c version's Plover instance (if it has not already been turned off during the previous KG2c roll-out; under normal circumstances, we turn off the self-hosted PloverDB for the new KG2c during clean-up)
[ ] Determine what is the DNS A record hostname for kg2-X-Zcplover.rtx.ai (where Z is one less than the new minor release version): run nslookup kg2-X-Zcplover.rtx.ai (it will return either kg2cplover.rtx.ai, kg2cplover2.rtx.ai, or kg2cplover3.rtx.ai; we'll call it kg2cploverN.rtx.ai).
[ ] message the #deployment channel in the ARAXTeam Slack workspace that you will be stopping the kg2-X-Zcplover.rtx.ai PloverDB service
[ ] Log into kg2cploverN.rtx.ai: ssh ubuntu@kg2cploverN.rtx.ai
[ ] Stop the PloverDB container: sudo docker stop plovercontainer2.X.Z (if you are not sure of the container name, use sudo docker container ls -a to get the container name).
[ ] turn off the new KG2pre version's Neo4j instance (Coordinate with the KG2pre team before doing this)
[ ] deploy new PloverDB service into ITRB CI that is backed by the new KG2c database:
[ ] merge PloverDB main branch into kg2.X.Yc branch (if main has any commits ahead of kg2.X.Yc). Reference this issue (via its full GitHub URL) in the merge message.
[ ] merge PloverDB kg2.X.Yc branch into main branch. Reference this issue (via its full GitHub URL) in the merge message.
[ ] update kg_config.json in the main branch of the Plover repo to point to the new kg2c_lite_2.X.Y.json.gz file (push this change)
[ ] wait about 60 minutes for Jenkins to build the PloverDB project and deploy it to kg2cploverdb.ci.transltr.io
[ ] verify the CI Plover is running the new KG2 version by running the following test and inspecting the command line output: cd PloverDB && pytest -vs test/test.py -k test_version --endpoint https://kg2cploverdb.ci.transltr.io
[ ] run Plover tests to verify it's working: cd PloverDB && pytest -v test/test.py --endpoint https://kg2cploverdb.ci.transltr.io
[ ] run the ARAX pytest suite with the NCATS endpoint plugged in (locally change the URL in RTX/code/config_dbs.json and set force_local = True in Expand)
[ ] if all tests pass, update RTX/code/config_dbs.json in the master branch to point to the ITRB Plover endpoints (all maturity levels): (dev: kg2cploverdb.ci.transltr.io; test: kg2cploverdb.test.transltr.io; prod: kg2cploverdb.transltr.io)
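After this change, the relevant fragment of config_dbs.json would look roughly like the following (the key names here are illustrative only; follow the file's actual schema):

```json
"plover": {
  "dev": "https://kg2cploverdb.ci.transltr.io",
  "test": "https://kg2cploverdb.test.transltr.io",
  "prod": "https://kg2cploverdb.transltr.io"
}
```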
[ ] push the latest master branch code commit to the various endpoints on arax.ncats.io that you previously updated (this is in order to get the changed config_dbs.json file) and restart ARAX and KG2 services
[ ] check the Test Build (CI/CD tests) to make sure all non-skipped pytest tests have passed
[ ] turn off the self-hosted plover endpoint for the new version of KG2c
[ ] message the #deployment channel to notify people what you are about to do
[ ] ssh ubuntu@kg2cploverM.rtx.ai
[ ] sudo docker container ls -a (gives you the name of the container; assume it is plovercontainer2.X.Y)
[ ] sudo docker stop plovercontainer2.X.Y
[ ] verify once more that ARAX is still working properly, even with the self-hosted new-KG2c-version PloverDB service turned off
[ ] delete the kg2.X.Yc branch in the PloverDB repo (since it has been merged into main at this point)
[ ] upload the new kg2c_lite_2.X.Y.json.gz file to the translator-lfs-artifacts repo (ask Amy Glen or Sundar Pullela, who have permission to do this)
[ ] upload the new kg2_nodes_not_in_sri_nn.tsv file to the translator-lfs-artifacts repo
7. Roll-out to ITRB TEST
[ ] In GitHub, for the RTXteam/RTX project, merge master to itrb-test. Record this issue number in the merge message.
[ ] In GitHub, for the RTXteam/PloverDB project, merge main to itrb-test.
[ ] Tag the release using the master branch of RTXteam/RTX project.
[ ] Tag the release using the main branch of RTXteam/PloverDB project.
[ ] Via a message in the #devops-teamexpanderagent channel in the NCATSTranslator Slack workspace, put in a request to @Sarah Stemann to open a ticket to re-deploy ARAX, RTX-KG2, and PloverDB to ITRB test
[ ] Monitor the #devops-teamexpanderagent channel to follow the roll-out of the updated services in ITRB test (i.e., to see if there are any errors reported by ITRB)
[ ] Check proper functioning of kg2cploverdb.test.transltr.io
[ ] from any git checkout of the RTXteam/PloverDB project's main branch, do: cd PloverDB && pytest -v test/test.py --endpoint https://kg2cploverdb.test.transltr.io
[ ] Check proper functioning of kg2.test.transltr.io (look at the message log debug messages to verify that it is indeed querying kg2cploverdb.test.transltr.io)
[ ] Check proper functioning of arax.test.transltr.io (look at the message log debug messages to verify that ARAX-Expand is indeed querying kg2.test.transltr.io)
8. Roll-out to ITRB PRODUCTION
[ ] In GitHub, for the RTXteam/RTX project, merge master to production. Record this issue number in the merge message.
[ ] In GitHub, for the RTXteam/PloverDB project, merge main to production.
[ ] Via a message in the #devops-teamexpanderagent channel in the NCATSTranslator Slack workspace, put in a request to @Sarah Stemann to open a ticket to re-deploy ARAX, RTX-KG2, and PloverDB to ITRB production
[ ] Monitor the #devops-teamexpanderagent channel to follow (i.e., to see if there are any errors reported by ITRB) the roll-out of the updated services in ITRB production (this could take several days, as there is a formal approval process for deployments to ITRB production)
[ ] Check proper functioning of kg2cploverdb.transltr.io
[ ] Check proper functioning of kg2.transltr.io (look at the message log debug messages to verify that it is indeed querying kg2cploverdb.transltr.io)
[ ] Check proper functioning of arax.transltr.io (look at the message log debug messages to verify that ARAX-Expand is indeed querying kg2.transltr.io)
NOTE: To create a new issue based on this template, simply go to: https://github.com/RTXteam/RTX/issues/new?template=kg2rollout.md