Open pooriaPoorsarvi opened 1 year ago
Hey friend, did you manage to solve this?
What stack are you using? I am using Python FastAPI to handle data (something like 800 MB with polygons) and things work really well. I have a CSV with 5 million points and it takes about 2 seconds to render.
Is your feature request related to a problem? Please describe. We are using Kepler, but some of our data (as CSV) can be more than 500 MB, and realistically we can expect more than 1 GB. Data that large can't be stored in a JavaScript variable, especially not as a string (https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/length).
Describe the solution you'd like It would be great if the processors (https://docs.kepler.gl/docs/api-reference/processors/processors#processgeojson:~:text=%3E%20formatted%20fields-,processCsvData,-Process%20csv%20data) could accept Blobs, especially for CSV files.
Describe alternatives you've considered We could split the data up, but that would leave one source represented as several separate datasets, which is not ideal or usable when working with multiple data sources.