Hariprasath-TE opened 11 months ago
I am not sure about what you want to achieve. Have you followed the instructions given in README.md?
I want to push the data from the CSV-provider to InfluxDB and Grafana. For that I have done the steps below:
1. I started all services by running `docker compose -f ./fms-blueprint-compose.yaml up --detach`.
2. Then I sent the data from signal.csv to the Kuksa Databroker.

Now how can I send that data from the Kuksa Databroker to InfluxDB and Grafana through the FMS Forwarder?
Then I sent the data from signal.csv to the Kuksa Databroker.
You do not need to do that manually; the Docker Compose file already starts a CSV-Provider container which publishes the data from the `csv-provider/signalsFmsRecording.csv` file to the Databroker. The FMS Forwarder component then retrieves the data and writes it to InfluxDB.
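To check whether this chain is actually working, looking at the FMS Forwarder's container logs is a reasonable first step. The command below is illustrative; the service name `fms-forwarder` is an assumption based on the Compose setup and may be named differently in your file:

```shell
# Inspect the forwarder's logs for write errors or connection problems
# (service name "fms-forwarder" is an assumption; check your compose file)
docker compose -f ./fms-blueprint-compose.yaml logs fms-forwarder
```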
Yes, it does publish the data to the Databroker, but I am not sure if the FMS Forwarder is forwarding the data to InfluxDB, because when I checked InfluxDB, no data similar to the fed data is showing up. Any reason for that?
Because when I checked influxdb, no data similar to the fed data is showing up
What data are you talking about here? The data from `csv-provider/signalsFmsRecording.csv`? Or are you trying to feed in data from a CSV file that you have created yourself?
How can I feed a dataset that I have created myself? Can you please explain, @sophokles73?
In `fms-blueprint-compose.yaml`, in the section

```yaml
csv-provider:
  image: "ghcr.io/eclipse/kuksa.val.feeders/csv-provider:main"
  container_name: "csv-provider"
  cap_drop: *default-drops
  networks:
    - "fms-vehicle"
  depends_on:
    databroker:
      condition: service_started
  volumes:
    - "./csv-provider/signalsFmsRecording.csv:/dist/signals.csv"
  environment:
    PROVIDER_INFINITE: 1
    PROVIDER_LOG_LEVEL: "INFO"
    KUKSA_DATA_BROKER_ADDR: "databroker"
    KUKSA_DATA_BROKER_PORT: "55556"
```

just replace the reference to `./csv-provider/signalsFmsRecording.csv` with the path to your own file. Reading the Docker Compose documentation would also be very helpful to better understand how things work together ...
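For example, assuming your own recording lives at `./my-signals.csv` (a hypothetical path), only the host side of the volume mapping changes; the container-side path `/dist/signals.csv` must stay the same, because that is where the CSV-provider expects to find the file:

```yaml
  volumes:
    # host path (yours) : container path (unchanged)
    - "./my-signals.csv:/dist/signals.csv"
```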
I sent the data from the in-vehicle side to the cloud-side InfluxDB as mentioned in the documentation, and now I tried to connect C2E Hono instead of using the Hono sandbox to send the data to Ditto, but I couldn't. Can you suggest a way to send in-vehicle data to Ditto?
I am not a Ditto expert, but I guess you will need to create a digital twin in Ditto for the vehicle and then create a Ditto Connection to Hono's Kafka broker for the Ditto tenant that the vehicle twin belongs to. The data published by the vehicle is a protocol buffer, so you will need to add a Ditto mapping script to the Connection which parses the protocol buffer and transforms it into Ditto's (JSON-based) data format ...
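To make the shape of that transformation concrete, here is a minimal sketch in Python. Note that real Ditto payload mappers run as JavaScript inside the Connection, and the actual FMS payload is a protocol buffer that would need decoding first; the feature name `vehicle`, the signal name `speed`, and the thing identifiers are all hypothetical:

```python
import json


def to_ditto_protocol_msg(thing_namespace, thing_name, signals):
    """Wrap already-decoded vehicle signals in a Ditto protocol envelope.

    In the real setup the input would come from parsing the FMS protobuf;
    here a plain dict stands in for the decoded message.
    """
    return {
        # Ditto protocol topic for modifying the twin
        "topic": f"{thing_namespace}/{thing_name}/things/twin/commands/modify",
        "headers": {"content-type": "application/json"},
        # Write the values into a (hypothetical) "vehicle" feature
        "path": "/features/vehicle/properties",
        "value": signals,
    }


decoded = {"speed": 42.0}  # stand-in for the decoded protobuf content
print(json.dumps(to_ditto_protocol_msg("org.example", "my-vehicle", decoded)))
```

The JavaScript mapper in the Connection would follow the same pattern: decode the bytes, then emit a Ditto protocol message with a `topic`, `path`, and JSON `value`.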
@Hariprasath-TE can this be closed?
I mean, my issue is actually still pending. Please keep it open for a little while, as a request from me, for any more clarifications. I will let you know when it can be closed. Thank you @sophokles73
Using the CSV-Provider
With the CSV-provider one can replay signals from a CSV file to an instance of the Kuksa.val Databroker. More details are available in the upstream repository. To execute the CSV-provider with the Docker Compose setup, add the argument `--profile csv`:
docker compose -f ./fms-demo-compose.yaml --profile direct --profile csv up --detach
The above command requires the `fms-demo-compose.yaml` file, which I could not find in the Git repo. Moreover, when I tried to feed data from signals.csv using the provider.py file from the upstream repository, i.e., the kuksa.val.feeders repo, I could not see the data in Grafana or in InfluxDB.
However, I can get the data from the Kuksa Databroker CLI. From there on, it is not clear. Could anybody help me with clear steps and proper commands to set it up all the way to the end (visualization of the data)?
Any advice or information as to what I am missing and possible ways to proceed would be appreciated.