Closed: abossenbroek closed this issue 5 years ago
Hello, @abossenbroek.
Thank you for your attention.
As far as I know, InfluxDB uses a 204 status code as the result of a successful insert (link).
So, am I right that you get a 204 without the data actually being inserted?
Thanks for pointing out the doc, @fsanaulla, I wasn't aware of that. You are right: the example I provided does not lead to any data being inserted into the database. Remarkably, it works if I create a Dataset in Scala without reading in a CSV.
A few notes:
- can you provide a file sample?
See CSV section in my question
- check your timestamp format, because by default InfluxDB expects timestamps in nanosecond precision.
I am aware of that; I took the data, wrote a bulk load myself, and that works. I also tried:
val data: Dataset[MeteringEntry] = Seq(
MeteringEntry("meaningOfLife", Option("zorg"), 442.401184,1451602860000000000L),
MeteringEntry("meaningOfLife", Option("zorg"), 442.083191,1451602920000000000L),
MeteringEntry("meaningOfLife", Option("zorg"), 442.161682,1451602980000000000L),
MeteringEntry("meaningOfLife", Option("zorg"), 442.598999,1451603040000000000L),
MeteringEntry("meaningOfLife", Option("zorg"), 442.052185,1451603100000000000L),
MeteringEntry("meaningOfLife", Option("zorg"), 442.718689,1451603160000000000L),
MeteringEntry("meaningOfLife", Option("zorg"), 442.338806,1451603220000000000L),
MeteringEntry("meaningOfLife", Option("zorg"), 442.322815,1451603280000000000L),
MeteringEntry("meaningOfLife", Option("zorg"), 442.588318,1451603340000000000L),
MeteringEntry("meaningOfLife", Option("zorg"), 442.041687,1451603400000000000L)).toDS
and this works properly. Somehow just the CSV doesn't work.
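On the nanosecond-precision point above: if the CSV stores epoch seconds or milliseconds, they need scaling before the write. A minimal sanity check in plain Scala (the starting units here are an assumption, not taken from the actual file):

```scala
object TimestampCheck {
  // InfluxDB's default write precision is nanoseconds, so epoch values
  // recorded in seconds or milliseconds must be scaled up before writing.
  def secondsToNanos(s: Long): Long = s * 1000000000L
  def millisToNanos(ms: Long): Long = ms * 1000000L

  def main(args: Array[String]): Unit = {
    // 2016-01-01T00:21:00Z expressed in epoch seconds
    val seconds = 1451602860L
    println(secondsToNanos(seconds))        // 1451602860000000000
    println(millisToNanos(seconds * 1000L)) // 1451602860000000000
  }
}
```

Both calls produce the same nanosecond value used in the hand-written Dataset above, which is one quick way to rule out a precision mismatch.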
- did you try Structured Streaming for CSV? A connector for it already exists in this project.
Not sure how to do this. Any pointers would be useful, thanks.
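For reference, reading the same CSV directory as a stream would look roughly like the sketch below. The schema, column names, paths, and the console sink are all assumptions; the chronicler-spark streaming connector's actual API should be checked against the project's docs before swapping it in for the placeholder sink.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.types._

object CsvStreamSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("csv-to-influx").getOrCreate()

    // File-based streaming sources require an explicit schema.
    // These column names are hypothetical -- match them to the real CSV header.
    val schema = StructType(Seq(
      StructField("sensorID", StringType),
      StructField("tag", StringType),
      StructField("value", DoubleType),
      StructField("timestamp", LongType)
    ))

    val stream = spark.readStream
      .schema(schema)
      .option("header", "true")
      .csv("/tmp/csv-input/") // directory watched for newly arriving CSV files

    // Console sink as a placeholder; the chronicler-spark streaming
    // connector would replace this once its API is confirmed.
    val query = stream.writeStream
      .format("console")
      .option("checkpointLocation", "/tmp/csv-checkpoint")
      .start()

    query.awaitTermination()
  }
}
```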
Interestingly, adding .repartition(2)
somehow solves the issue. Test with:
data.repartition(2).filter(row => row.sensorID == sensorID).saveToInfluxDB(dbName = "galaxy", measName = sensorID,
onFailure = ex => {
throw new Exception(ex)
})
@abossenbroek 0.2.9
should be out in a few hours. I think it should help you. Tell me if it does.
Reopen if the error still occurs.
I am trying to load data from a CSV into InfluxDB using chronicler-spark. The program completes without errors and InfluxDB returns a 204 status, but no data appears in the database. An example program that shows this behaviour on my laptop can be found below. It may be related to a bug in influxdb.
sbt
Writing to InfluxDB
/tmp/toy_data.csv
docker-compose.yml
Logs
When running with
sbt run
I see the following in the Spark logs, and the following in the Docker logs
Diagnosis
To understand the issue better, I set a breakpoint at
com.github.fsanaulla.chronicler.urlhttp.io.models.UrlWriter.scala:45
and stepped through the method. I can confirm the entity is properly formatted, and the same entity works when inserted manually through the Chronograf GUI.
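For anyone comparing against the debugger output: one properly formatted line-protocol point for the rows above would look like the string built below. The tag key and field key names here are assumptions based on the MeteringEntry values shown earlier, not the library's actual serialization code.

```scala
object LineProtocolSketch {
  // Build one InfluxDB line-protocol point:
  //   measurement,tagKey=tagValue fieldKey=fieldValue timestampNanos
  def toLine(measurement: String,
             tag: Option[String],
             value: Double,
             tsNanos: Long): String = {
    // Tags are optional in line protocol; omit the comma section entirely
    // when no tag is present.
    val tagPart = tag.map(t => s",sensor=$t").getOrElse("")
    s"$measurement$tagPart value=$value $tsNanos"
  }

  def main(args: Array[String]): Unit = {
    println(toLine("meaningOfLife", Some("zorg"), 442.401184, 1451602860000000000L))
    // meaningOfLife,sensor=zorg value=442.401184 1451602860000000000
  }
}
```

If the entity seen at the breakpoint matches this shape (measurement, optional tags, space, fields, space, nanosecond timestamp), the formatting side can be ruled out and attention shifts to how the batch is dispatched.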