This is a case study of an incubator, with the purpose of understanding the steps and processes involved in developing a digital twin system. The incubator is an insulated container that can heat its contents and hold a target temperature, but cannot cool.
To understand what a digital twin is, we recommend you read/watch one or more of the following resources:
Goal: The goal of this document is to provide users with a basic overview of the digital twin incubator and enable them to run it on their computers.
Audience: This documentation is targeted at users who are comfortable running programs from the command line and are able to install Python and run Python scripts.
The documentation is all contained in a single document for simplicity and ease of maintenance. If the user wishes to see the documentation in another format, we recommend cloning this repository and using Pandoc to convert it.
Searching the documentation can be done with the browser's search function or with GitHub's search feature at the top of the screen.
A digital twin is a software system that supports the operation of a cyber-physical system (CPS), called the physical twin. The following documentation therefore describes the physical twin first and then the digital twin.
The overall purpose of the system is to reach a certain temperature within a box and keep that temperature regardless of the box's contents.
The system consists of:
Hardware Elements
To read a temperature from the temperature sensors, look up their IDs in /sys/bus/w1/devices/.
If the IDs are not visible there, check that the wires are well connected. Also check that the kernel modules are active:
pi@incubatorpi:~/source/software $ lsmod | grep w1
w1_therm 28672 0
w1_gpio 16384 0
wire 36864 2 w1_gpio,w1_therm
To read a value from a sensor, one can run: cat /sys/bus/w1/devices/10-0008039ad4ee/w1_slave
The sensors were connected according to this online guide.
Each ID in the /sys/bus/w1/devices folder corresponds to a particular sensor. Follow the wires of the temperature sensors, and you will find a paper label with a number on it.
The ID to sensor number mapping is:

| ID | Sensor Number |
|---|---|
| 10-0008039ad4ee | 1 |
| 10-0008039b25c1 | 2 |
| 10-0008039a977a | 3 |
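For illustration, the readings and the mapping above can be combined in a few lines of Python. This is only a sketch of direct sysfs access, not the project's low_level_driver_server.py, and it assumes the standard w1_therm output format.

```python
# Illustrative sketch: read the 1-wire temperature sensors directly from sysfs
# and label them with the sensor numbers from the table above.
# Assumes the standard w1_therm format, whose second line ends with "t=<millidegrees>".
from pathlib import Path

SENSOR_NUMBER = {
    "10-0008039ad4ee": 1,
    "10-0008039b25c1": 2,
    "10-0008039a977a": 3,
}

def read_temperature(sensor_id: str) -> float:
    """Return the temperature in degrees Celsius for the given 1-wire ID."""
    raw = Path(f"/sys/bus/w1/devices/{sensor_id}/w1_slave").read_text()
    millidegrees = int(raw.splitlines()[1].rsplit("t=", 1)[1])
    return millidegrees / 1000.0

if __name__ == "__main__":
    for sensor_id, number in SENSOR_NUMBER.items():
        print(f"Sensor {number} ({sensor_id}): {read_temperature(sensor_id):.2f} °C")
```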
The simplest setup is to connect a keyboard, screen, and mouse to the Pi, set up a hotspot with your phone, and connect the Pi to it. Ensure that the RabbitMQ server is accessible from the Pi, and you should be able to run everything. We do not recommend running InfluxDB on the Pi itself. To gather data on the Pi, we recommend running start_csv_data_recorder, which stores a set of CSV files locally with the data from low_level_driver_server.py. On another computer that has access to the RabbitMQ and InfluxDB servers, you can run start_influx_data_recorder.py to forward the data to the InfluxDB server.
You can SSH into the Pi. The username and password are physically located on the Pi.
A .obj file is available at: figures/incubator_plant.obj.
Ongoing development of the CAD model is at incubator
Most models are implemented using the oomodellingpython package.
Models can be organized as plant models, controller models, or physical twin models (the latter couple plant and controller together):
On the Raspberry Pi:
PS software> python -m incubator.physical_twin.low_level_driver_server
PS software> python -m startup.start_controller_physical
The DT follows a service-based architecture, with different services communicating with each other via a RabbitMQ message exchange running in a Docker container. Each service is started with a Python script, and most services refer to software/startup.conf for their configuration.
The code that starts the services is in software/startup. It is possible to start all services (except the 3D visualization service) from the same script software/startup/start_all_services.py or each service individually.
The services (and their starting scripts) currently implemented are:
We use the C4 Model to document the software architecture.
All communication goes through the RabbitMQ server, even the communication between the Controller and the Plant. The Controller communicates with the Plant by listening to and sending RabbitMQ messages to low_level_driver_server.py.
Anyone else interested in this communication, such as the digital twin, can listen to it as well.
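As an example of how such a listener can be written, the sketch below subscribes to messages on the exchange using the plain pika client rather than the project's own communication helpers. The exchange name and routing key are assumptions for illustration; the actual values are defined in software/startup.conf.

```python
# Minimal sketch: eavesdrop on the RabbitMQ traffic between controller and plant.
# Assumptions (not taken from the repository): the broker runs on localhost with
# default credentials, the topic exchange is named "Incubator_AMQP", and the
# driver publishes its state under "incubator.record.driver.state".
# Check software/startup.conf for the actual exchange, routing keys, and credentials.
import json
import pika

EXCHANGE = "Incubator_AMQP"                    # assumption
ROUTING_KEY = "incubator.record.driver.state"  # assumption

connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()

# Bind an anonymous, exclusive queue to the (already declared) exchange so we
# receive a copy of every message matching the routing key.
queue = channel.queue_declare(queue="", exclusive=True).method.queue
channel.queue_bind(exchange=EXCHANGE, queue=queue, routing_key=ROUTING_KEY)

def on_message(ch, method, properties, body):
    print(f"[{method.routing_key}] {json.loads(body)}")

channel.basic_consume(queue=queue, on_message_callback=on_message, auto_ack=True)
print("Listening... press Ctrl+C to stop.")
channel.start_consuming()
```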
The most complex behavior is the self-adaptation, as it involves most implemented DT components.
This is introduced in the following paper, which we recommend you read:
The main components and their dependencies are:
Legend:
The following shows the main stages involved in a self-adaptation:
In particular, these stages are followed by the SelfAdaptationManager:
The following diagram shows the main interactions between the main entities that participate in the self-adaptation process. It is assumed that an anomaly has occurred due to the lid being open.
It is possible to run the digital twin on your computer, with or without a connection to the physical twin.
You are advised to read all documentation carefully before acting on any instruction.
python -m venv venv
# Windows (PowerShell):
.\venv\Scripts\Activate.ps1
# Linux/macOS:
source venv/bin/activate
pip install wheel
pip install -r ./requirements.txt
software$ python -m startup.start_all_services
time execution_interval elapsed heater_on fan_on room box_air_temperature state
19/11 16:17:59 3.00 0.01 True False 10.70 19.68 Heating
19/11 16:18:02 3.00 0.03 True True 10.70 19.57 Heating
19/11 16:18:05 3.00 0.01 True True 10.70 19.57 Heating
19/11 16:18:08 3.00 0.01 True True 10.69 19.47 Heating
19/11 16:18:11 3.00 0.01 True True 10.69 19.41 Heating
Make sure you can successfully start the DT framework before running the unit tests.
To run the unit tests, open a terminal in software, and run:
CLIMODE = "ON"
python -m unittest discover -v incubator/tests -p "*.py"
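Alternatively, the same run can be scripted in Python. This is only a convenience sketch; it assumes CLIMODE is read from the environment and that the script is executed from the software directory.

```python
# Convenience sketch: run the unit tests programmatically, from the software directory.
# Assumption: the tests read CLIMODE from the environment; if your setup defines it
# elsewhere (e.g. a config file), set it there instead.
import os
import unittest

os.environ["CLIMODE"] = "ON"

suite = unittest.defaultTestLoader.discover("incubator/tests", pattern="*.py")
unittest.TextTestRunner(verbosity=2).run(suite)
```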
Make sure you can successfully start the DT framework and run the unit tests before attempting to run the integration tests.
The script run_integration_tests.ps1 contains the instructions.
We assume the reader is broadly familiar with the Godot engine.
C:\path\to\your\godot4.exe --rendering-driver opengl3
The Incubator contains several scripts for interacting with the DT as it is running live. The scripts can be found in software/cli.
The scripts can be executed by following the pattern: python -m cli.<script_name>.
For instance, to generate dummy data, the following script can be executed: python -m cli.generate_dummy_data. Notice that some scripts require extra parameters.
Certain configurations of the Incubator support anomaly detection to determine if the styrofoam lid has been removed from the box.
When running a mocked version of the PT, the behavior of removing the lid can be simulated through the software/cli/mess_with_lid_mock.py script.
In practice, the simulation is accomplished by changing the $G_{box}$ parameter of the mocked PT. $G_{box}$ represents the rate of energy transfer between the air inside the box and the air outside the box.
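To see why scaling $G_{box}$ mimics removing the lid, consider an illustrative lumped-parameter heat balance (a sketch for intuition only; the plant models in the repository may differ in detail):

$$
C_{air}\,\frac{dT_{box}}{dt} = P_{heater}\,u_{heater} - G_{box}\,\left(T_{box} - T_{room}\right)
$$

Here $T_{box}$ is the box air temperature, $T_{room}$ the room temperature, $u_{heater} \in \{0,1\}$ the heater state, and $C_{air}$, $P_{heater}$ illustrative heat-capacity and heater-power parameters. Multiplying $G_{box}$ by a factor $N > 1$ increases the energy lost to the surroundings, which is what removing the lid does physically.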
Executing the script:
The script can be executed by running python -m cli.mess_with_lid_mock <N>, where $N$ is a positive integer to multiply the original $G_{box}$ with.
For instance, python -m cli.mess_with_lid_mock 100 simulates a greater loss of energy to the outside (lid off), and python -m cli.mess_with_lid_mock 1 simulates the original behavior (lid on).
If any errors show up during the startup process, check that the RabbitMQ and InfluxDB servers have started correctly.
The following instructions were used to configure these services for the first time and may help you test them:
To start the RabbitMQ server, run the following in software/incubator/communication/installation:
# start and install rabbitmq container
PS software\incubator\communication\installation> docker-compose up --detach --build
[MACOS: docker compose up --detach --build]
# Check logs of rabbitmq-server
PS software\incubator\communication\installation> docker logs rabbitmq-server
# Run script to test server (assumes you have correct environment)
cd [RepoRoot]\software\
[Activate virtual environment]
PS software> python -m incubator.communication.installation.test_server
# Stop and remove the server
PS software> docker-compose down -v
[MACOS: docker compose down -v]
The script should produce:
Sending message...
Message sent.
Retrieving message. Received message is {'text': '321'}
More information about the Dockerfile: https://hub.docker.com/_/rabbitmq
Management of local RabbitMQ:
Run the following if this is the first time you're starting the InfluxDB server:
influxdb
├── config.yaml
├── docker-compose.yml
├── Dockerfile
├── influxdb.zip
├── README.md
├── test_server.py
└── influxdb
    ├── influxd.bolt
    └── engine
        ├── data
        └── wal
To start the InfluxDB server, run from the influxdbserver folder:
PS software\digital_twin\data_access\influxdbserver> docker-compose up --detach --build
PS software\digital_twin\data_access\influxdbserver> cd [RepoRoot]\software
[Activate virtual environment]
PS software> python -m digital_twin.data_access.influxdbserver.test_server
docker-compose down -v
More information: https://docs.influxdata.com/influxdb/v2.0/get-started/
docker exec -it influxdb-server /bin/bash
This has been done once, and there's no need to repeat it. But it is left here in case we lose the file influxdb.zip.
user: incubator
pass: incubator
organization: incubator
bucket: incubator
If, while running test_server in the "Start InfluxDB server" step, you get an error resembling the following:
> write_api.write(bucket, org, point)
(Pdb) config["influxdb"]
ConfigTree([('url', 'http://localhost:8086'), ('token', '-g7q1xIvZqY8BA82zC7uMmJS1zeTj61SQjDCY40DkY6IpPBpvna2YoQPdSeENiekgVLMd91xA95smSkhhbtO7Q=='), ('org', 'incubator'), ('bucket', 'incubator')])
influxdb_client.rest.ApiException: (401)
Reason: Unauthorized
HTTP response headers: HTTPHeaderDict({'Content-Type': 'application/json; charset=utf-8', 'X-Platform-Error-Code': 'unauthorized', 'Date': 'Wed, 31 Aug 2022 09:35:17 GMT', 'Content-Length': '55'})
HTTP response body: {"code":"unauthorized","message":"unauthorized access"}
-> write_api.write(bucket, org, point)
Then the cause is that the token used in startup.conf needs to be updated. To fix this, open the InfluxDB web management page, go to InfluxDB->Tokens, and generate a new token. Then update startup.conf with the new token.
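After updating the token, a quick sanity check is to write a single test point with the influxdb_client package, mirroring the write_api.write call in the traceback above. The URL, org, and bucket values below are the ones shown in that traceback; adjust them to match your startup.conf.

```python
# Sanity-check sketch: verify that a newly generated InfluxDB token can write data.
# The url/org/bucket values mirror the configuration shown in the traceback above;
# paste your own token before running.
from influxdb_client import InfluxDBClient, Point
from influxdb_client.client.write_api import SYNCHRONOUS

URL = "http://localhost:8086"
TOKEN = "<paste the newly generated token here>"
ORG = "incubator"
BUCKET = "incubator"

with InfluxDBClient(url=URL, token=TOKEN, org=ORG) as client:
    write_api = client.write_api(write_options=SYNCHRONOUS)
    write_api.write(bucket=BUCKET, org=ORG, record=Point("token_check").field("ok", 1))
    print("Write succeeded; the token works.")
```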
Original issue described in #23.
We make extensive use of README.md files. Please read them and keep them up to date.
General guidelines:
Get-ChildItem -Include *.md -Recurse | Foreach {markdown-link-check --config .\markdown_link_check_config.json $_.fullname}
The software directory contains all things code related.