To get into a dev environment locally (on Linux):

```bash
sh <(curl -L https://nixos.org/nix/install) --daemon
```

Enable flakes by adding the following line to `~/.config/nix/nix.conf` (or `/etc/nix/nix.conf`):

```
experimental-features = nix-command flakes
```

Run `nix develop` to enter the dev shell.
Run `runner.py` to start the local USB-to-CAN listener on the host machine.
Usage notes:

To update the pinned hytech CAN library:

```bash
nix flake lock --update-input ht_can_pkg_flake
```

By default, the flake pins a fixed version of the hytech CAN library, so it must be updated manually. Downstream users of this flake can also override this pin by specifying it in their own flake inputs if need be.
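A downstream flake can point the pin at its own copy of the CAN library via a `follows` declaration; a minimal sketch (the URLs below are placeholders, not the real flake locations):

```nix
{
  inputs = {
    # placeholder URLs -- substitute the real flake locations
    ht_can_pkg_flake.url = "github:your-org/ht_can";
    py_data_acq.url = "github:your-org/py_data_acq";
    # make py_data_acq use the CAN library pinned above instead of its own lock
    py_data_acq.inputs.ht_can_pkg_flake.follows = "ht_can_pkg_flake";
  };
}
```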
- [x] write a test script that creates a cantools-constructed hytech CAN msg and sends it over a virtual CAN line
- [x] make the deserialization task for unpacking received CAN data in the data acq service script
- [x] create a nixos module for py_data_acq
- [x] add the ability to start / stop / control data recording via grpc calls for the mcap writer task (pending nix-proto adjustment)
- [x] make a user script / interface for the grpc calls for ease of interaction with the service (pending nix-proto adjustment)
- [x] actually get current data from the car into protobuf-encoded CAN messages in an integration test
- [x] get nix-proto working with dbc input from a url for creation of the python lib
- [x] get py_data_acq working in the dev shell with the nix-proto generated python lib for proto msg packing
- [x] make a service script that creates an instance of the mcap writer and the foxglove websocket
- [x] come up with a good way of associating the dbc file with the protobuf file
```mermaid
flowchart TD
    sym[PCAN symbol editor generation of `.sym` file] --> CI
    subgraph user input
        sym
    end
    CI[remote CI generation and release of dbc / proto] --> pio[local built platformio util CAN lib]
    CI --> np[local built nix proto gen lib]
    CI --> bin[remote schema binary generation using ci devshell]
    bin --> fg[foxglove webserver service]
    np --> mc[mcap writer / CAN msg to protobuf service]
    CI --> cantools[cantools dbc load]
```
input:
output:
```mermaid
flowchart TD
    CAN[RPI CAN] --> py_async_q[encoded CAN data]
    py_async_q --> des[DBC based CAN parser]
    des --> pb_pack[protobuf packet creation]
    pb_pack --> data_q1[webserver protobuf packet queue]
    pb_pack --> data_q2[MCAP file writer protobuf packet queue]
    subgraph websocket thread
        data_q1 --> enc[serialize into protobuf packet]
        enc --> py_foxglove[foxglove server websocket]
    end
    subgraph file writer thread
        data_q2 --> py_mcap[MCAP file writer]
    end
```
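The fan-out in this flowchart — one decode stage feeding both the webserver queue and the MCAP writer queue — can be sketched as below. This is a hedged illustration, not the actual service code: dicts stand in for the real cantools decode and protobuf packing steps, and asyncio tasks stand in for the threads named in the diagram.

```python
# Sketch: one producer decodes frames and fans packets out to two queues,
# mirroring the webserver / MCAP-writer split in the flowchart.
import asyncio

async def decode_and_fanout(raw_q, ws_q, mcap_q):
    while True:
        frame = await raw_q.get()
        if frame is None:          # shutdown sentinel
            await ws_q.put(None)
            await mcap_q.put(None)
            return
        # Stand-in for DBC parse + protobuf packet creation.
        packet = {"id": frame["id"], "signals": frame["data"]}
        await ws_q.put(packet)     # webserver protobuf packet queue
        await mcap_q.put(packet)   # MCAP file writer protobuf packet queue

async def consumer(q, out):
    while True:
        pkt = await q.get()
        if pkt is None:
            return
        out.append(pkt)            # stand-in for send / write

async def main():
    raw_q, ws_q, mcap_q = asyncio.Queue(), asyncio.Queue(), asyncio.Queue()
    ws_out, mcap_out = [], []
    tasks = [
        asyncio.create_task(decode_and_fanout(raw_q, ws_q, mcap_q)),
        asyncio.create_task(consumer(ws_q, ws_out)),
        asyncio.create_task(consumer(mcap_q, mcap_out)),
    ]
    await raw_q.put({"id": 0x400, "data": b"\x01\x02"})
    await raw_q.put(None)
    await asyncio.gather(*tasks)
    return ws_out, mcap_out

ws_out, mcap_out = asyncio.run(main())
```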
Filter `journalctl` output based on service: `journalctl -u nginx.service`
It looks like the PCAN symbol editor and its `.sym` files are a good tool and format for defining the CAN network.
I will simply match the CAN frame name to the protobuf message name, and match each signal name (with spaces converted to underscores) to the corresponding field name in the proto. The protobuf message will be packed with the parsed and converted data from cantools.

I want each CAN ID to have its own protobuf message; perhaps I will also include the CAN ID as a fixed part of the protobuf message when generating the proto file.

I know that I will be using cantools to create the DBC file, so I might as well extend that creation script to create the proto at the same time. Additionally, I know that I will be using Tim's auto-magic nix-proto for creation of the Python auto-gen code.
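The naming convention above can be sketched like this. A `SimpleNamespace` stands in for the generated protobuf class (with real nix-proto output, the message class would instead be looked up by name on the generated module, e.g. via `getattr`); the message and signal names are placeholders:

```python
# Sketch: map cantools-decoded signals onto protobuf-style fields,
# converting spaces in signal names to underscores.
from types import SimpleNamespace

def pack_decoded_signals(msg_name: str, decoded: dict):
    # Stand-in for instantiating the generated protobuf class named msg_name.
    pb_msg = SimpleNamespace()
    for sig_name, value in decoded.items():
        # spaces in .sym/.dbc signal names become underscores in proto fields
        setattr(pb_msg, sig_name.replace(" ", "_"), value)
    return pb_msg

pkt = pack_decoded_signals("MC1_TEMPS", {"motor temp": 42.0,
                                         "inverter temp": 55.5})
```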
Kvaser U100 pinout:
Get files from the car:

```bash
rsync -azP nixos@192.168.40.1:/home/nixos/recordings ~/hytech_mcaps
```