kcbenny opened this issue 2 years ago
@kcbenny thanks for reaching out.
A volume of 200k polygon objects will be very difficult to render without serious memory issues or performance degradation. I would recommend pre-filtering this data dynamically, and/or aggregating it hierarchically. For example, if you're trying to show very geographically fragmented data, group it first into, say, countries, and only show the detail for a single country at a time when it is clicked.
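The drill-down approach above could be sketched like this. The pure helper below is testable on its own; the globe.gl wiring (`.polygonsData()`, `.onPolygonClick()`) is shown as comments since it only runs in a browser. The `properties.country` field name is an assumption about your GeoJSON.

```javascript
// Hypothetical sketch: show polygons for only one country at a time.
// `features` is assumed to be the feature array of a parsed GeoJSON
// FeatureCollection, each with a `properties.country` field (an assumption).
function featuresForCountry(features, countryName) {
  return features.filter(f => f.properties.country === countryName);
}

// Wiring into globe.gl (browser-only, for illustration):
// myGlobe
//   .polygonsData(featuresForCountry(allFeatures, 'Japan'))
//   .onPolygonClick(p =>
//     myGlobe.polygonsData(
//       featuresForCountry(allFeatures, p.properties.country)));
```

This way the renderer only ever holds one country's worth of polygons, instead of the full 200k set.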
Thanks for the reply. My goal is to color those polygons based on temperature. Here is my enquiry:
I hope you can guide me toward solving this problem. Thank you.
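For the temperature-based coloring itself, one possible sketch is a small temperature-to-color function fed into globe.gl's `.polygonCapColor()` accessor. The breakpoints (0 °C / 25 °C), the colors, and the `properties.temperature` field name are all assumptions for illustration.

```javascript
// Hypothetical sketch: map a temperature (°C) to a fill color.
// Breakpoints and colors are arbitrary assumptions.
function tempToColor(tempC) {
  if (tempC < 0)  return 'rgba(0, 100, 255, 0.7)'; // cold: blue
  if (tempC < 25) return 'rgba(0, 200, 100, 0.7)'; // mild: green
  return 'rgba(255, 80, 0, 0.7)';                  // hot: red
}

// Usage with globe.gl (browser-only, for illustration):
// myGlobe.polygonCapColor(d => tempToColor(d.properties.temperature));
```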
I tried to fetch a GeoJSON file with around 200,000 records. I already removed some unused properties from the GeoJSON, but the browser still crashes from memory exhaustion when I pass the data to polygonsData(). When I load 20,000 records instead, the globe spins extremely slowly. I'd like to know whether anything can be done to improve the globe.gl polygonsData logic, since internally it uses a forEach to generate the polygons, which makes loading slow. Thank you.
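One way to shrink the dataset before it ever reaches polygonsData() is to aggregate records spatially. A minimal sketch, assuming each feature carries `properties.lat`, `properties.lng`, and `properties.temperature` (hypothetical field names): bin the features into a coarse lat/lng grid and keep one representative per cell with the averaged temperature.

```javascript
// Hypothetical sketch: collapse many features into one per grid cell,
// averaging the temperature. Field names are assumptions.
function binByGrid(features, cellDeg) {
  const cells = new Map();
  for (const f of features) {
    // Grid cell key derived from the feature's coordinates.
    const key = Math.floor(f.properties.lat / cellDeg) + ':' +
                Math.floor(f.properties.lng / cellDeg);
    const cell = cells.get(key) || { count: 0, tempSum: 0, sample: f };
    cell.count += 1;
    cell.tempSum += f.properties.temperature;
    cells.set(key, cell);
  }
  // Emit one representative feature per cell with the mean temperature.
  return [...cells.values()].map(c => ({
    ...c.sample,
    properties: {
      ...c.sample.properties,
      temperature: c.tempSum / c.count,
    },
  }));
}

// Browser-only wiring, for illustration:
// myGlobe.polygonsData(binByGrid(allFeatures, 1)); // 1° grid cells
```

A 1° grid caps the result at well under 65k features worldwide, which is far closer to what the renderer can handle than 200k arbitrary polygons.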