eclipse-sdv-blueprints / fleet-management

A close to real-life showcase for truck fleet management where trucks run an SDV software stack so that logistics fleet operators can manage apps, data and services for a diverse set of vehicles.
Apache License 2.0

Running CSV provider using fms-demo-compose.yaml #16

Open Hariprasath-TE opened 11 months ago

Hariprasath-TE commented 11 months ago

Using the CSV-Provider

With the CSV-provider one can replay signals from a CSV file to an instance of the Kuksa.val data broker. More details are available in the upstream repository. To execute the CSV-provider with the Docker Compose setup, add the argument `--profile csv`:

docker compose -f ./fms-demo-compose.yaml --profile direct --profile csv up --detach

The above command requires the file "fms-demo-compose.yaml", which I could not find in the git repo. Moreover, when I tried to feed data from signals.csv using the provider.py file from the upstream repository (the kuksa.val.feeders repo), I could not see the data in Grafana or in InfluxDB.

However, I can see the data sent via the Kuksa Databroker CLI. From there on, it is not clear how to proceed. Could anybody help me with clear steps and the proper commands to set everything up, through to visualization of the data?

Any advice on what I am missing and possible solutions to proceed with would be appreciated.

sophokles73 commented 10 months ago

I am not sure what you are trying to achieve. Have you followed the instructions given in README.md?

Hariprasath-TE commented 10 months ago

I want to push the data from the CSV provider to InfluxDB and Grafana. To that end I have done the following steps:

1. Started all services with `docker compose -f ./fms-blueprint-compose.yaml up --detach`.
2. Sent the data from signals.csv to the Kuksa Databroker.

Now, how can I send that data from the Kuksa Databroker to InfluxDB and Grafana through the FMS Forwarder?

sophokles73 commented 10 months ago

> Then i have sent the data from signal.csv to kuksa databroker.

You do not need to do that manually, the Docker Compose file already starts a CSV-Provider container which publishes the data from the csv-provider/signalsFmsRecording.csv file to the Databroker. The FMS Forwarder component then retrieves the data and writes it to InfluxDB.
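For context, the FMS Forwarder writes the datapoints it retrieves from the Databroker to InfluxDB as time-series records. The following is a minimal sketch, not the actual forwarder code (which is written in Rust): it only illustrates the general shape of an InfluxDB line-protocol record for a single VSS signal. The measurement name `vehicle_signal` and the `vin` tag are assumptions for illustration, not necessarily the schema the forwarder actually uses.

```javascript
// Hedged sketch: format one VSS datapoint as an InfluxDB line-protocol record.
// The measurement ("vehicle_signal") and tag ("vin") names are illustrative
// assumptions, not necessarily what the FMS Forwarder actually writes.
function toLineProtocol(vin, signal, value, timestampNs) {
  // line protocol: measurement,tag=value field=value timestamp
  return `vehicle_signal,vin=${vin} ${signal}=${value} ${timestampNs}`;
}

console.log(toLineProtocol("VIN123", "Vehicle.Speed", 45.5, "1700000000000000000"));
// → vehicle_signal,vin=VIN123 Vehicle.Speed=45.5 1700000000000000000
```

If records of roughly this shape do not show up in the InfluxDB bucket, checking the FMS Forwarder container's logs is usually the quickest way to find out why.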

Hariprasath-TE commented 10 months ago

Yes, it does publish the data to the Databroker. But I am not sure whether the FMS Forwarder is forwarding the data to InfluxDB, because when I checked InfluxDB, no data resembling the fed data showed up.

1. Is there any reason for that?
2. Could you briefly describe a working solution, please?

sophokles73 commented 10 months ago

> Because when I checked influxdb, no data similar to the fed data is showing up

What data are you talking about here? The data from csv-provider/signalsFmsRecording.csv? Or are you trying to feed in data from a csv file that you have created yourself?

Hariprasath-TE commented 10 months ago

How can I feed a dataset that I have created myself? Can you please explain, @sophokles73?

sophokles73 commented 10 months ago

In fms-blueprint-compose.yaml in the section

csv-provider:
    image: "ghcr.io/eclipse/kuksa.val.feeders/csv-provider:main"
    container_name: "csv-provider"
    cap_drop: *default-drops
    networks:
    - "fms-vehicle"
    depends_on:
      databroker:
        condition: service_started
    volumes:
    - "./csv-provider/signalsFmsRecording.csv:/dist/signals.csv"
    environment:
      PROVIDER_INFINITE: 1
      PROVIDER_LOG_LEVEL: "INFO"
      KUKSA_DATA_BROKER_ADDR: "databroker"
      KUKSA_DATA_BROKER_PORT: "55556"

just replace the reference to ./csv-provider/signalsFmsRecording.csv with the path to your own file. Reading the Docker Compose documentation would also help you better understand how things work together ...
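Alternatively, instead of editing the main file, a Compose override file can remap the volume. This is a sketch assuming your recording lives at ./my-signals.csv next to the compose file (Compose merges volume entries that target the same container path, so this replaces the bundled recording):

```yaml
# my-csv.yaml — hypothetical override file mounting your own CSV recording.
# The host path ./my-signals.csv is an assumption; the container path
# /dist/signals.csv matches the one used in fms-blueprint-compose.yaml.
services:
  csv-provider:
    volumes:
    - "./my-signals.csv:/dist/signals.csv"
```

It would then be applied with `docker compose -f ./fms-blueprint-compose.yaml -f ./my-csv.yaml up --detach`. Note that your CSV file needs to follow the format expected by the upstream csv-provider (see the kuksa.val.feeders repository for the exact column layout).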

Hariprasath-TE commented 10 months ago

I sent the data from the in-vehicle components to the cloud-side InfluxDB as described in the documentation. Now I have tried to connect a C2E Hono instance instead of the Hono sandbox to send the data to Ditto, but I could not get it to work. Can you suggest a way to send the in-vehicle data to Ditto?

sophokles73 commented 10 months ago

I am not a Ditto expert, but I guess you will need to create a digital twin in Ditto for the vehicle and then create a Ditto Connection to Hono's Kafka broker for the Ditto tenant that the vehicle twin belongs to. The data published by the vehicle is a protocol buffer, so you will need to add a Ditto mapping script to the Connection which parses the protocol buffer and transforms it into Ditto's (JSON based) data format ...
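A hedged sketch of what such an incoming payload-mapping script could look like. `mapToDittoProtocolMsg` and `Ditto.buildDittoProtocolMsg` are the entry point and helper that Ditto provides inside a Connection's JavaScript mapper; everything else here — the `decodeVehicleStatus` function, the namespace, and the feature path — is a purely hypothetical placeholder. A real script has to actually decode the blueprint's protobuf schema (e.g. with a bundled protobuf parser).

```javascript
// Stub of the helper object Ditto injects into mapping scripts, included only
// so this sketch runs standalone; remove it inside a real Ditto Connection.
const Ditto = {
  buildDittoProtocolMsg: (namespace, name, group, channel, criterion, action, path, headers, value) => ({
    topic: `${namespace}/${name}/${group}/${channel}/${criterion}/${action}`,
    path, headers, value
  })
};

// Hypothetical placeholder: a real script must decode the protobuf payload
// from bytePayload according to the blueprint's schema; here we just pretend.
function decodeVehicleStatus(bytePayload) {
  return { vin: "VIN123", speed: 42 };
}

// Entry point Ditto calls for every incoming message on the Connection.
function mapToDittoProtocolMsg(headers, textPayload, bytePayload, contentType) {
  const status = decodeVehicleStatus(bytePayload);
  return Ditto.buildDittoProtocolMsg(
    "org.eclipse.sdv",                      // twin namespace (assumption)
    status.vin,                             // thing name from the VIN (assumption)
    "things", "twin", "commands", "modify", // modify the twin's state
    "/features/speed/properties/value",     // feature path (assumption)
    headers,
    status.speed
  );
}
```

The Ditto documentation on payload mapping describes the exact function signatures and the helpers available inside the sandboxed script environment.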

sophokles73 commented 9 months ago

@Hariprasath-TE can this be closed?

Hariprasath-TE commented 9 months ago

My issue is actually still pending. Please keep it open for a little while longer, in case I need any more clarifications. I will let you know when it can be closed. Thank you @sophokles73