NLPSandbox.io is an open platform for benchmarking modular natural language processing (NLP) tools on both public and private datasets. Academics, students, and industry professionals are invited to browse the available tasks and participate by developing and submitting an NLP Sandbox tool.
This repository provides an example implementation of the [NLP Sandbox PHI Deidentifier API] written in Python-Flask. An NLP Sandbox PHI deidentifier takes a clinical note (text) as input and outputs a list of predicted Protected Health Information (PHI) annotations, along with a version of the clinical note in which the PHI elements are anonymized using one of the selected strategies (masking character or annotation-type mask). This NLP Sandbox tool depends on other tools to identify PHI elements (see the section Specification).
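As a rough illustration of the two anonymization strategies named above, here is a minimal sketch. The annotation format is simplified for this example; the real request and response types are defined by the NLP Sandbox schemas.

```python
# Sketch of the two anonymization strategies: masking character vs.
# annotation-type mask. Annotations use a simplified dict format here.

def mask_with_character(text, annotations, mask_char="*"):
    """Replace each annotated PHI span with the masking character."""
    # Process spans from right to left so earlier offsets stay valid.
    for ann in sorted(annotations, key=lambda a: a["start"], reverse=True):
        start, end = ann["start"], ann["start"] + ann["length"]
        text = text[:start] + mask_char * (end - start) + text[end:]
    return text

def mask_with_type(text, annotations):
    """Replace each annotated PHI span with its annotation type."""
    for ann in sorted(annotations, key=lambda a: a["start"], reverse=True):
        start, end = ann["start"], ann["start"] + ann["length"]
        text = text[:start] + f"[{ann['type']}]" + text[end:]
    return text

note = "John Smith was admitted on 2020-01-01."
annotations = [
    {"start": 0, "length": 10, "type": "NAME"},
    {"start": 27, "length": 10, "type": "DATE"},
]

print(mask_with_character(note, annotations))
# ********** was admitted on **********.
print(mask_with_type(note, annotations))
# [NAME] was admitted on [DATE].
```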
This tool is provided as a starting point for NLP developers who work in Python to package their own PHI deidentifier as an NLP Sandbox tool (see the section Development). That section also describes how to generate a tool "stub" with openapi-generator for 50+ programming languages and frameworks. This repository includes a GitHub CI/CD workflow that lints, tests, builds, and pushes a Docker image of this tool to the Synapse Docker Registry. The image of this example tool can be submitted as-is to NLPSandbox.io to benchmark its performance -- just don't expect high performance!
Preview this NLP Sandbox tool at https://phi-deidentifier.nlpsandbox.io.
See nlpsandbox/phi-deidentifier-app for instructions on how to deploy the NLP Sandbox PHI Deidentifier example, including its React web client.
The command below starts this NLP Sandbox PHI deidentifier locally.
docker compose up --build
You can stop the container run with Ctrl+C, followed by docker compose down.
Create a Conda environment.
conda create --name phi-deidentifier python=3.9
conda activate phi-deidentifier
Install and start this NLP Sandbox PHI deidentifier.
cd server && pip install -r requirements.txt
python -m openapi_server
This NLP Sandbox tool provides a web interface that you can use to annotate clinical notes. This web client has been automatically generated by openapi-generator. To access the UI, open a new tab in your browser and navigate to one of the following addresses, depending on whether you are running the tool with Docker (production) or Python (development).
This section describes how to develop your own NLP Sandbox PHI deidentifier in Python-Flask and other programming languages and frameworks. This example tool is also available in Java in the GitHub repository nlpsandbox/phi-deidentifier-example-java.
Depending on the language and framework you want to develop with:
You can also use a different code repository hosting service, such as GitLab or Bitbucket.
This repository includes a GitHub CI/CD workflow that lints, tests, builds, and pushes a Docker image of this tool to the Synapse Docker Registry. Only images that have been pushed to the Synapse Docker Registry can be submitted to NLPSandbox.io benchmarks for now.
After creating your GitHub repository, you need to configure the CI/CD workflow if you want to benefit from automatic lint checks, tests, and Docker builds. Configure the following values in your repository settings:

- SYNAPSE_USERNAME: Your Synapse.org username.
- SYNAPSE_TOKEN: A personal access token that has the permissions View, Download, and Modify.
- docker_repository: Set to docker.synapse.org/<synapse_project_id>/<docker_image>, where:
  - <synapse_project_id> is the Synapse ID of a project you have created on Synapse.org.
  - <docker_image> is the name of your image/tool.

This repository includes a Dependabot configuration that instructs GitHub to let you know when an update is available for one of your dependencies (e.g. Python, Node, Docker). Dependabot will automatically open a PR when an update is available. If you have configured the CI/CD workflow that comes with this repository, the workflow will run automatically and notify you if the update breaks your code. You can then resolve the issue before merging the PR, hence making the update effective.
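As an illustration, the docker_repository value described above can be assembled like this. The helper function, project ID, and image name are made up for this example.

```python
# Illustrative only: assembles a docker_repository value from a Synapse
# project ID and an image name. The inputs below are hypothetical.

def docker_repository(synapse_project_id, docker_image):
    return f"docker.synapse.org/{synapse_project_id}/{docker_image}"

print(docker_repository("syn12345678", "my-phi-deidentifier"))
# docker.synapse.org/syn12345678/my-phi-deidentifier
```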
For more information on Dependabot, please visit the GitHub page [Enabling and disabling version updates].
The development of new NLP Sandbox tools is streamlined by using the openapi-generator to generate tool "stubs" for more than 50 programming languages and frameworks. Here a PHI deidentifier stub refers to an initial implementation that has been automatically generated by openapi-generator from the NLP Sandbox PHI Deidentifier API specification.
Run the command below to list the languages and frameworks supported by the openapi-generator (under the section SERVER generators).
npx @openapitools/openapi-generator-cli list
Generate the PHI deidentifier stub from an empty GitHub repository (here in Python-Flask):
mkdir server
npx @openapitools/openapi-generator-cli generate \
-g python-flask \
-o server \
-i https://nlpsandbox.github.io/nlpsandbox-schemas/phi-deidentifier/latest/openapi.json
where the option -i refers to the OpenAPI specification of the [NLP Sandbox PHI Deidentifier API]. The URL is composed of different elements:

- phi-deidentifier - The type of NLP Sandbox tool to generate. The list of all the NLP Sandbox tool types available is defined in the NLP Sandbox schemas.
- latest - The latest stable version of the NLP Sandbox schemas. This token can be replaced by a specific release version x.y.z of the [NLP Sandbox schemas].

The NLP Sandbox schemas are updated after receiving contributions from the community. For example, the Patient schema may in the future include additional information that NLP Sandbox tools can leverage to generate more accurate predictions.
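As an illustration, the specification URL can be assembled from these elements. The helper function and the x.y.z version shown are made up for this example; only the latest URL matches the command above.

```python
# Illustrative helper: builds the OpenAPI specification URL from the tool
# type and schema version tokens described above. Not part of the tool.

def spec_url(tool_type, version="latest"):
    return (
        "https://nlpsandbox.github.io/nlpsandbox-schemas/"
        f"{tool_type}/{version}/openapi.json"
    )

print(spec_url("phi-deidentifier"))
# https://nlpsandbox.github.io/nlpsandbox-schemas/phi-deidentifier/latest/openapi.json
print(spec_url("phi-deidentifier", "1.2.0"))  # pin a specific (hypothetical) release
```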
After an update of the NLP Sandbox schemas, NLPSandbox.io will only evaluate tools that implement the latest version of the schemas. It is therefore important to keep your tools up to date and re-submit them so that they continue to appear in the leaderboards and to be used by the community.
This GitHub repository includes a workflow that checks daily if a new release of the NLP Sandbox schemas is available, in which case a PR will be created. Follow the steps listed below to update your tool.
Checkout the branch created by the workflow.
git checkout
Re-run the same openapi-generator command you used to generate the tool stub. If you started from an existing tool implementation like the one included in this GitHub repository, run the following command to update your tool to the latest version of the NLP Sandbox schemas (this command is defined in package.json).
npm run generate:server:latest
Review the updates made to this tool in the NLP Sandbox schemas CHANGELOG.
Review and merge the changes. If you are using VS Code, this step can be performed relatively easily using the section named "Source Control". This section lists the files that have been modified by the generator. When clicking on a file, VS Code shows side-by-side the current and updated version of the file. Changes can be accepted or rejected at the level of an entire file or for a selection of lines.
Submit your updated tool to NLPSandbox.io.
If you started from an existing tool implementation like the one included in this GitHub repository, run the following command to lint and test your tool.
npm run lint
npm run test
For Python-Flask tools:
Maintainers are required to follow the procedure below when creating a new release. Releases are created with the npm package [release-it]. The tool version appears in the following files:

- package.json (updated automatically by release-it)
- README.md
- docker-compose.yml
- server/openapi_server/controllers/tool_controller.py
npm run release -- major --ci --no-npm --dry-run
npm run release -- minor --ci --no-npm --dry-run
npm run release -- patch --ci --no-npm --dry-run
npm run release -- major --ci --no-npm
npm run release -- minor --ci --no-npm
npm run release -- patch --ci --no-npm
The NLP Sandbox promotes the development of tools that are re-usable, reproducible, portable and cloud-ready. The table below describes how preventing a tool from connecting to remote server contributes to some of these tool properties.
Property | Description |
---|---|
Reproducibility | The output of a tool may not be reproducible if the tool depends on external resources that, for example, may no longer be available in the future. |
Security | A tool may attempt to upload sensitive information to a remote server. |
The Docker Compose configuration included with this GitHub repository (docker-compose.yml) prevents the tool container from establishing remote connections. This is achieved through the use of an internal Docker network and an Nginx container placed in front of the tool container. One benefit is that you can test your tool locally and ensure that it works correctly even without access to the internet. Note that when a tool is evaluated on NLPSandbox.io, additional measures are put in place to prevent it from connecting to remote servers.
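A minimal sketch of this isolation pattern in Docker Compose, with hypothetical service names and the image placeholder from the CI/CD section; the actual configuration shipped with this repository is in docker-compose.yml.

```yaml
# Sketch only: a network marked "internal" has no route to the outside
# world, so the tool container cannot reach remote servers. Only the
# Nginx container, also attached to the default network, is exposed.
services:
  phi-deidentifier:
    image: docker.synapse.org/<synapse_project_id>/<docker_image>
    networks:
      - isolated
  nginx:
    image: nginx:alpine
    ports:
      - "80:80"
    networks:
      - isolated
      - default

networks:
  isolated:
    internal: true
```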
This repository uses semantic versioning to track the releases of this tool. This repository uses "non-moving" GitHub tags, that is, a tag will always point to the same git commit once it has been created.
The artifact published by the CI/CD workflow of this GitHub repository is a Docker image pushed to the Synapse Docker Registry. This table lists the image tags pushed to the registry.
Tag name | Moving | Description |
---|---|---|
latest | Yes | Latest stable release. |
edge | Yes | Latest commit made to the default branch. |
edge-<sha> | No | Same as above, with the reference to the git commit. |
<major>.<minor>.<patch> | No | Stable release. |
You should avoid using a moving tag like latest
when deploying containers in
production, because this makes it hard to track which version of the image is
running and hard to roll back.
Visit nlpsandbox.io for instructions on how to submit your NLP Sandbox tool and evaluate its performance.
Thinking about contributing to this project? Get started by reading our contribution guide.