leoek / material-safety-search

Search engine for Material Safety Datasheets based on Solr
https://mss.leoek.tech
GNU General Public License v2.0

MSDS Search

Search engine for Material Safety Datasheets based on Solr

Live Demo

Status

Build status badges: Backend, Frontend, Deployment

Report and Interim Presentations

The report and the interim presentations can be found in docs/.


How to Run

NOTE: Read the following section carefully!

NOTE: Installation instructions for the software required to run MSDS Search:

NOTE: Installation instructions for the software required to develop MSDS Search:

NOTE: If you have questions about the following docker commands, you can find their documentation in the official Docker documentation. These are the basic commands which you will probably need:

  1. docker-compose up -d: create and start containers, volumes and networks, then detach your console.
  2. docker-compose logs -f: view the logs and attach your console to them.
  3. docker-compose down: stop and delete all containers.
  4. docker-compose down -v: stop and delete all containers and remove all data volumes.
  5. The -f flag selects a specific compose file. This repository contains several compose files for different purposes.
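
Since every command repeats the same -f flag, it can help to wrap it once in a small helper function. This is only a sketch, not part of the repository; the dc name is made up here, and docker-compose.local.yml is just one of the compose files shipped in this repo:

```shell
#!/bin/sh
# Hypothetical helper: pin the compose file once, then reuse it.
# The leading `echo` makes this a dry run; drop it (and add sudo if
# needed) to actually execute the commands.
dc() {
  echo docker-compose -f docker-compose.local.yml "$@"
}

dc up -d      # create and start containers, volumes and networks
dc logs -f    # follow the logs
dc down -v    # stop everything and delete the data volumes
```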

Quick Start

  1. Clone the repository: git clone https://github.com/leoek/material-safety-search.git
  2. cd material-safety-search
  3. Download the submodules: git submodule update --init --recursive
  4. If you have access to https://hub.docker.com/r/materialsafetysearch/private/
    1. Log in to Docker Hub: docker login
    2. Use our prebuilt docker images: sudo docker-compose -f docker-compose.staging.yml up -d
    3. [Optional] View logs: sudo docker-compose -f docker-compose.staging.yml logs -f mss-server
  5. If you do not have access to the prebuilt docker images
    1. Build and run the images on your local machine (which might take a while): sudo docker-compose -f docker-compose.local.yml up -d
    2. [Optional] View logs: sudo docker-compose -f docker-compose.local.yml logs -f mss-server
  6. Wait for the dataset to be indexed; the search engine is already usable while only part of it is indexed. (You can check the progress in the logs.)

There are two options available to run this search engine:

  1. Use our prebuilt docker images: sudo docker-compose -f docker-compose.staging.yml up -d
  2. Build and run the images on your local machine (which might take a while): sudo docker-compose -f docker-compose.local.yml up -d
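
The choice between the two options can be sketched as a small shell snippet. This is an illustration only; the HAVE_PREBUILT variable is invented here, and the commands are echoed rather than executed:

```shell
#!/bin/sh
# Pick the compose file depending on access to the prebuilt images.
# HAVE_PREBUILT is a hypothetical switch: set it to "yes" only after a
# successful `docker login` for materialsafetysearch/private.
HAVE_PREBUILT=${HAVE_PREBUILT:-no}

if [ "$HAVE_PREBUILT" = "yes" ]; then
  COMPOSE_FILE=docker-compose.staging.yml   # prebuilt images
else
  COMPOSE_FILE=docker-compose.local.yml     # build locally (slower)
fi

# Dry run: print the commands instead of running them.
echo "sudo docker-compose -f $COMPOSE_FILE up -d"
echo "sudo docker-compose -f $COMPOSE_FILE logs -f mss-server"
```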

ATTENTION: The Solr container does not require any login credentials and therefore must not be exposed to the public. Our docker-compose.local.yml configuration exposes all ports for debugging purposes; never use that configuration in a production environment.
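
One way to keep Solr off the public interface in a production setup is to bind its port to the loopback address only (or to drop the ports mapping entirely, so the container is reachable just from the internal compose network). A hedged sketch of such an override; the solr service name is an assumption, and 8983 is Solr's default port:

```yaml
# docker-compose.override.yml (sketch, not part of this repository):
# bind the Solr admin port to localhost only.
services:
  solr:
    ports:
      - "127.0.0.1:8983:8983"
```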

Production Setup / Deployment


OpenApi Specification

The API is defined by an [OpenAPI specification](api.yml).


Development / Building from Source

Check that these files are present:

  1. The dataset
  2. fscMap.txt
  3. fsgMap.txt

Development Workflow

Building inside a Docker Container:

Requirements:

  1. Docker-ce 15+
  2. docker-compose 1.16+

Procedure:

  1. sudo docker-compose -f docker-compose.local.yml up -d
  2. Solr will be available at http://localhost:8983
  3. The Backend will be available at http://localhost:8080
  4. The Frontend will be available at http://localhost:80

Backend

Requirements:

  1. Java JDK 10
  2. Docker-ce
  3. docker-compose 1.16+

Procedure:

  1. You will need a Solr instance for development. The easiest way to get one up and running that is configured for local development is docker-compose up -d.
  2. Our backend is a Spring Boot application which uses Gradle as its build tool. You can build and run it with ./gradlew bootRun (theoretically the .bat wrapper should work on Windows, but so far nobody has had the nerves to mess with Windows).
  3. Solr will be available at http://localhost:8983
  4. The Backend will be available at http://localhost:8080

Frontend

The frontend can be found within the frontend/ folder. Check out the frontend Readme.

Requirements:

  1. Yarn 1.8+
  2. Node 8+

Procedure:

  1. Run yarn to obtain the dependencies.
  2. Run yarn start to start the development server.
  3. The Frontend will be available at http://localhost:3000

NOTES:

  1. The backend checks whether documents have already been imported to Solr. If the Solr core is not empty, it skips the import.
  2. Solr data lives in a named docker volume, which prevents losing the Solr core with the indexed documents between container restarts.
  3. The Solr data volume can be deleted with the -v flag: docker-compose down -v
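
Putting these notes together: to force a full re-import, delete the data volume and recreate the containers; the backend will then find an empty core and index the dataset again. A sketch (commands are echoed rather than executed; docker-compose.local.yml is assumed):

```shell
#!/bin/sh
# Force Solr to re-index: removing the named data volume (-v) empties
# the core, so the backend's import check finds it empty on the next
# start and re-imports the dataset.
run() { echo "$@"; }   # dry run; replace `echo` with e.g. `sudo` to execute

run docker-compose -f docker-compose.local.yml down -v  # delete containers and the data volume
run docker-compose -f docker-compose.local.yml up -d    # recreate; backend re-imports
```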

Postman

Postman can be used to talk to the backend directly. Import the supplied collection and environments and you should be good to go.