-
### Search before asking
- [X] I searched in the [issues](https://github.com/apache/paimon/issues) and found nothing similar.
### Paimon version
0.9.0
### Compute Engine
Flink 1.18.1
…
-
Since XML is a largely outdated technology and is not usable with many recent technologies (for example, streaming data via Kafka effectively requires JSON in order to use common features like a schema registry), it would b…
-
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.KeyedStream;…
-
It has been on my mind for a couple of years that the existing Parser and Serializer interfaces are a mess.
The current standard Parser interface predates multigraph support (from before `Dataset` an…
-
Hi,
First, sorry for my bad English.
I hope somebody can help me.
I'm trying the demo program JsonStreamingParser.ino.
My JSON looks like:
`char json[] = "{\"observations\":[{\"stationID\"…
-
Great project; we want to use it in our production environment, calling the Dify chat API as below:
`http://192.168.32.241:8088/`
`Request POST /chat-messages
curl -X POST 'http://192.168.32.241:8088/v1/chat-m…
-
Streaming seems to halt after the first chunk when using the 'gemini' provider (the 'claude' provider works great).
-
### Priority
P2-High
### OS type
Ubuntu
### Hardware type
AI-PC
### Installation method
- [ ] Pull docker images from hub.docker.com
- [ ] Build docker images from source
###…
-
**Environment setup:** AWS EMR Serverless, version 6.9.0.
A PySpark ETL job with multiple streaming queries; each streaming query writes to an Iceberg table and a Redshift table in micro-batches, the trigger…
-
Is there (or will there be) an option to validate a very large JSON file (up to 5GB) in chunks, e.g. via streaming, so that the whole JSON file never has to be held in memory?
This would be awesom…
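To illustrate the idea (independent of this library's actual API), here is a minimal Python sketch: read the file in fixed-size chunks and keep only a constant amount of parser state, so a multi-gigabyte document is never held in memory. It only checks structural well-formedness (balanced brackets and terminated strings), not the full JSON grammar, and the helper names `validate_structure` and `iter_chunks` are hypothetical.

```python
def validate_structure(chunks):
    """Chunk-wise structural check of a JSON document.

    Verifies that braces/brackets balance and strings terminate while
    holding only one chunk in memory at a time. A sketch of streaming
    validation, not a full JSON validator (it does not check commas,
    colons, or literal spelling).
    """
    stack = []          # open brackets seen so far
    in_string = False   # currently inside a string literal?
    escaped = False     # previous character was a backslash?
    closer_to_opener = {'}': '{', ']': '['}
    for chunk in chunks:
        for ch in chunk:
            if in_string:
                # String state must survive chunk boundaries.
                if escaped:
                    escaped = False
                elif ch == '\\':
                    escaped = True
                elif ch == '"':
                    in_string = False
            elif ch == '"':
                in_string = True
            elif ch in '{[':
                stack.append(ch)
            elif ch in '}]':
                if not stack or stack.pop() != closer_to_opener[ch]:
                    return False  # mismatched or extra closer
    # Valid only if every bracket closed and no string left open.
    return not stack and not in_string


def iter_chunks(f, size=64 * 1024):
    """Yield fixed-size chunks from a file object, never the whole file."""
    while True:
        chunk = f.read(size)
        if not chunk:
            return
        yield chunk
```

For example, `validate_structure(iter_chunks(open("big.json")))` would stream a 5 GB file in 64 KB pieces; a full streaming validator could extend the same single-pass, constant-state pattern to the rest of the grammar.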