
Surveysystem

System for on-line and off-grid survey preparation and submissions, released under the BSD 2-Clause "Simplified" License.

Data storage

This system deliberately uses a simple data storage scheme in the interests of robustness, and to make it easier to scale up and down and to analyse the data it collects.

The structure is relatively simple: stale sessions can simply be deleted via the file system, and surveys can be added, updated or removed just as easily.

Environment Variables

SURVEY_HOME (required):

All data lives in SURVEY_HOME. This variable is required and must contain an absolute path to the backend directory (no trailing slash).

SURVEY_PYTHONDIR (optional):

Optionally, you can define an external Python controller path via SURVEY_PYTHONDIR. This must be an absolute directory path; the backend will look for <SURVEY_PYTHONDIR>/nextquestion.py. This is recommended for more complex analysis requirements. If the variable is not defined, the backend falls back to the local <SURVEY_HOME>/python/nextquestion.py (see structure).

SURVEY_FORCE_PYINIT (tests only!):

If SURVEY_FORCE_PYINIT is set to "1", the Python interpreter is re-initialised on every Python C-API function call.

Use this only in testing or development environments: it substantially slows down the application and might cause side effects inside your Python controller.

SS_TRUSTED_MIDDLEWARE:

Registers a trusted authentication middleware source. For details, see authorisation-and-middleware.md.

SS_LOG_FILE:

Path to a writable custom log file.
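
A typical setup might export these variables before starting the backend. This is only a sketch; the paths below are placeholders, not defaults shipped with the project:

export SURVEY_HOME=/var/www/surveysystem/backend      # required, absolute path, no trailing slash
export SURVEY_PYTHONDIR=/opt/survey-controllers       # optional external Python controller directory
export SS_LOG_FILE=/var/log/surveysystem/backend.log  # optional custom log file
# SURVEY_FORCE_PYINIT=1 is intended for tests only and should stay unset in production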

Installation (backend)

This system requires Python >= 3.8 and clang. Additionally, zlib and bmake are required for compiling kcgi. To install on Ubuntu:

sudo apt-get install clang make
sudo apt-get install python3.8 python3.8-dev
sudo apt-get install zlib1g-dev bmake

Tests additionally require Lighttpd and Curl:

sudo apt-get install curl lighttpd

Then make sure to build and install kcgi:

git submodule init
git submodule update
cd backend/kcgi
./configure
sudo bmake install

Then create folders for logs:

mkdir surveysystem/backend/logs
sudo chmod 777 surveysystem/backend/logs
mkdir surveysystem/backend/testlog
sudo chmod 777 surveysystem/backend/testlog
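
The commands above only install dependencies and kcgi; the build step for the surveysystem backend itself is not covered here. Assuming the backend is built with the Makefile in its own directory (an assumption, check the repository for the exact targets), the step would look roughly like:

cd surveysystem/backend
make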

Overview

[Diagram: surveysystem architecture]

REST API

Note that the following section reflects the current state of development and is subject to change.

Endpoint

Paths and queries

| method | endpoint | response | description |
| --- | --- | --- | --- |
| GET | / | index | not used, returns 204 no content |
| GET | /session?surveyid | text: sessionid | create a session and retrieve the generated session id |
| POST | /session?surveyid&sessionid | text: sessionid | create a session with a given session id (uuidv4) |
| GET | /questions?sessionid | json: next questions | get the next questions to answer (questions, progress, status) |
| POST | /answers?sessionid 1) 2) | json: next questions | answer previous questions; format: serialised answers 5) (lines of colon-separated values) in the request body |
| POST | /answers?sessionid&answer 1) | json: next questions | answer a single previous question; format: serialised answer 5) (colon-separated values) |
| POST | /answers?sessionid&{uid1}={value1}&{uid2}={value2} 1) | json: next questions | answer previous questions by ids and values; format: question id = answer value |
| DELETE | /answers?sessionid 3) | json: next questions | delete the last answers (roll back to the previous questions) |
| DELETE | /answers?sessionid&questionid 3) | json: next questions | delete the last answers up to and including the given question id (rollback) |
| GET | /analysis?sessionid 4) | json | get the analysis based on your answers |
| GET | /status(?extended) | status 200/204, no content | system status; use the extended param to check correct configuration and paths |

The survey model is sequential: POST /answers requests must submit answers for question ids in exactly the same order as the questions were received. Similarly, DELETE /answers requests must submit question ids in exactly the reverse order. A rough example of this flow is sketched below.
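
As an illustration only (the base URL, survey id and answers file are placeholders; the serialised answer format is defined in the data serialisation docs, not here):

# placeholder base URL; the actual path prefix depends on how the CGI is deployed
BASE="http://localhost/surveyapi"

# create a session for a survey (the survey id is a placeholder)
SESSION=$(curl -s "$BASE/session?surveyid=examplesurvey")

# fetch the next questions for that session
curl -s "$BASE/questions?sessionid=$SESSION"

# submit serialised answers (one per line, in the order the questions were
# received) from a local file; see the data serialisation docs for the format
curl -s -X POST --data-binary @answers.txt "$BASE/answers?sessionid=$SESSION"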

Documentation

Documentation files live in docs. The most important documents cover the backend data serialisation model and the backend session life cycle.