Closed emrdgrmnci closed 8 months ago
Hi @emrdgrmnci, that's correct: at the moment, SwiftUI doesn't support clustering. In addition, UIKit performs clustering on the main thread, which becomes a bottleneck when working with many annotations. The current release, 1.1.0, doesn't support SwiftUI yet, and I'm working on SwiftUI support. The current main branch contains working integrations for both UIKit and SwiftUI, and you can study them in SwiftUI-ExampleApp. At the moment I'm writing documentation, and I plan to release version 2.0 either this week or next.
It's really good news to hear that SwiftUI support will be there soon 💪🏼
In the meantime, I'm already trying to implement clustering in my project based on your SwiftUI example, but my experiment has left me confused on quite a few points.
I'm using only the iOS 17 Map() in my app, and I'm really struggling to use the QuadTree, ClusterManager, and the other required parts of your files.
Could you take a brief look?
https://github.com/emrdgrmnci/Plug-Me-17
struct LocationRequestIsOnView: View {
    @Binding var searchResults: [MKMapItem]
    @Binding var selectedTabBarButton: String
    @Binding var visibleRegion: MKCoordinateRegion?
    @Binding var route: MKRoute?
    @Binding var isShowingBottomSheet: Bool
    // State referenced below (declarations were omitted in the original snippet)
    @State private var cameraPosition: MapCameraPosition = .automatic
    @State private var selectedResult: MKMapItem?
    @State private var showSettingsView = false

    var mapView: some View {
        Map(position: $cameraPosition, selection: $selectedResult) {
            if !showSettingsView {
                ForEach(searchResults, id: \.self) {
                    Marker(item: $0)
                }
                .annotationTitles(.hidden)
                UserAnnotation()
                if let route {
                    MapPolyline(route)
                        .stroke(.blue, lineWidth: 5)
                }
            }
        }
    }
}
Thanks in advance 🙏🏼
Thanks for a good example. I'll think about how to make seamless integration with vanilla SwiftUI. Currently, the library works regardless of framework (UIKit or SwiftUI).
For example, your view should contain the data to display. As I understand it, this is searchResults. If you want to cluster this data, pass it to ClusterManager.
First, you need to satisfy the protocol requirements (instead of UUID, you can use your own ID type):
struct YourOriginAnnotation: Identifiable, Hashable, CoordinateIdentifiable {
    let id: UUID
    var coordinate: CLLocationCoordinate2D
}
Next, you need to instantiate the cluster manager and add your annotations.
let clusterManager = ClusterManager<YourOriginAnnotation>()
clusterManager.add(yourAnnotations) // your array of YourOriginAnnotation
Next, you need to reload clustering for the current map. You need to pass your map view's size (you can get it with GeometryReader, or with func readSize(onChange: @escaping (CGSize) -> Void) from ClusterMapSwiftUI) and the map region (for example, you can get it from func onMapCameraChange(frequency: MapCameraUpdateFrequency = .onEnd, _ action: @escaping (MapCameraUpdateContext) -> Void)):
await clusterManager.reload(mapViewSize: mapViewSize, coordinateRegion: region)
clusterManager.visibleAnnotations.forEach { annotation in
    switch annotation {
    case .annotation(let originalAnnotation):
        annotations.append(originalAnnotation)
    case .cluster(let clusterAnnotation):
        clusters.append(clusterAnnotation)
    }
}
As a result, you get an enum with two cases: a non-clustered annotation and a new annotation type, the cluster. You can get a cluster's member annotations from its memberAnnotations property (similar to how it works in UIKit).
And finally, you need to integrate this data into your map:
@Observable final class DataSource {
    var annotations = [YourOriginAnnotation]()
    var clusters = [ClusterManager<YourOriginAnnotation>.ClusterAnnotation]()
}

struct ContentView: View {
    @State private var dataSource = DataSource()

    var body: some View {
        Map(
            content: {
                ForEach(dataSource.annotations) { result in
                    // assumes your annotation type also stores a title
                    Marker(result.title, systemImage: "mappin", coordinate: result.coordinate)
                }
                ForEach(dataSource.clusters) { result in
                    Marker("\(result.memberAnnotations.count)", systemImage: "mappin.square", coordinate: result.coordinate)
                }
            }
        )
        .readSize { newSize in
            // store the measured map size for the next reload
        }
        .onMapCameraChange(frequency: .onEnd, { context in
            // store context.region and trigger a reload
        })
    }
}
One crucial point is to call reload whenever you need to recalculate clustering, for example when the camera position changes.
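For instance, a minimal sketch of that wiring (assuming the DataSource above also owns the ClusterManager and receives the measured map size, which the snippets above don't show):

```swift
import SwiftUI
import MapKit
import ClusterMap

// Sketch only: `reloadClusters` is a hypothetical helper, and the stored
// `clusterManager` property is an assumption about DataSource's shape.
extension DataSource {
    func reloadClusters(for region: MKCoordinateRegion, mapSize: CGSize) async {
        await clusterManager.reload(mapViewSize: mapSize, coordinateRegion: region)
        annotations.removeAll() // avoid accumulating duplicates across reloads
        clusters.removeAll()
        for item in clusterManager.visibleAnnotations {
            switch item {
            case .annotation(let original): annotations.append(original)
            case .cluster(let cluster): clusters.append(cluster)
            }
        }
    }
}

// Then drive it from the camera hook so clustering recalculates after each move:
// Map { ... }
//     .onMapCameraChange(frequency: .onEnd) { context in
//         Task { await dataSource.reloadClusters(for: context.region, mapSize: measuredSize) }
//     }
```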
It's complicated. I'll think about how it will be possible to simplify the process of working with the library in SwiftUI
Hi @vospennikov, sorry for bothering you again.
I'm having trouble converting my 'searchResults' objects so I can use them with 'ClusterManager'.
var searchResults = [MKMapItem]()
private func reloadAnnotations() async {
    await clusterManager.reload(mapViewSize: mapSize, coordinateRegion: _region)
    annotations.removeAll() // clear previous results so they don't accumulate
    clusters.removeAll()
    clusterManager.visibleAnnotations.forEach { annotation in
        switch annotation {
        case let .annotation(originalAnnotation):
            annotations.append(originalAnnotation)
        case .cluster(let clusterAnnotation):
            clusters.append(clusterAnnotation)
        }
    }
}
func searchForEVChargingPoints(for query: String, region: MKCoordinateRegion, completion: @escaping ([MKMapItem]?) -> Void) {
    let searchRequest = MKLocalSearch.Request()
    searchRequest.naturalLanguageQuery = query
    searchRequest.region = region
    let localSearch = MKLocalSearch(request: searchRequest)
    localSearch.start { (response, error) in
        if let response = response {
            completion(response.mapItems)
            self.searchResults = response.mapItems // 🤯🤯🤯🤯
        } else {
            completion(nil)
        }
    }
}
func loadEVChargingPoints(for query: String) {
    searchForEVChargingPoints(for: query, region: _region) { [weak self] mapItems in
        guard let self = self else { return }
        guard let mapItems = mapItems else { return }
        let newAnnotations = mapItems.map {
            MapAnnotation(
                id: UUID(uuidString: $0.placemark.region?.identifier ?? "") ?? UUID(),
                title: $0.placemark.title ?? "",
                coordinate: $0.placemark.coordinate
            )
        }
        self.clusterManager.removeAll()
        self.clusterManager.add(newAnnotations) // 🤯🤯🤯🤯
        Task {
            await self.reloadAnnotations()
        }
    }
}
var mapView: some View {
    Map(content: {
        if !showSettingsView {
            ForEach(searchResults, id: \.self) { result in // 🤔🤔🤔🤔🤔
                Marker(result.placemark.title ?? "", systemImage: "mappin", coordinate: result.placemark.coordinate)
            }
            ForEach(dataSource.clusters) { result in
                Marker("\(result.memberAnnotations.count)", systemImage: "mappin.square", coordinate: result.coordinate)
            }
            UserAnnotation()
            if let route {
                MapPolyline(route)
                    .stroke(.blue, lineWidth: 5)
            }
        }
    })
    .readSize(onChange: { newValue in
        dataSource.mapSize = newValue
    })
    .onMapCameraChange(frequency: .onEnd, { context in
        visibleRegion = context.region
        dataSource.regionSubject.send(context.region)
    })
}
I simplified the example applications in my current working branch and added one with MKLocalSearchCompleter integration: https://github.com/vospennikov/ClusterMap/tree/feature/concurrency/Example/Example-SwiftUI/App/MapKitIntegration It's still not ready for production, but I hope it will help you.
Really appreciated again. It's really perfect and way simpler now. But I'm still having trouble because of my view structure. I had to divide my views into smaller ones, and now that I'm passing the important data around, I'm not seeing any annotations. In addition, the async/await and Task logic is quite confusing to me. I have to study those things.
[SOLVED] 😮‍💨 It was entirely about the @State and @Binding mechanism. I was focused on the clustering code, but the state wiring was the real problem, which is why I couldn't see the clustering result 👊🏼
I briefly reviewed your app, and everything is working great. If you have any questions, feel free to send me a message. It'll help me improve the API library and documentation.
The concept of the library is as follows:
When working with the UIKit Map, we add annotations to the map, which becomes our source of truth.
When working with the SwiftUI Map, we keep the data ourselves (e.g., in @State). Our own object is thus the source of truth.
The library changes the data structure: whereas previously it was simply an array or a dictionary, it is now a tree. The elements of the tree can be either an original annotation or a cluster that contains the original annotations folded into it.
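Conceptually, each element of that tree is one of two cases. A simplified, illustrative sketch (these are not the library's exact type names):

```swift
// Illustrative only: the real library nests a similar enum inside
// ClusterManager. Each tree element is either a lone point or a cluster.
enum MapElement<Annotation> {
    case annotation(Annotation)          // an unclustered original point
    case cluster(members: [Annotation])  // several points folded together
}
```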
If you are working with iOS 17, you can place the library in an object marked with the @Observable macro and add it to your hierarchy via the .environment modifier. Your map will call reload when needed (e.g., when the camera position changes), and you can add or remove annotations from any other views.
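A minimal sketch of that iOS 17 wiring (the ClusterData name and its properties are assumptions for illustration, not library API):

```swift
import SwiftUI
import MapKit

// Hypothetical container that would own the cluster manager and its output.
@Observable
final class ClusterData {
    var annotations: [YourOriginAnnotation] = []
}

@main
struct DemoApp: App {
    @State private var clusterData = ClusterData()

    var body: some Scene {
        WindowGroup {
            ClusteredMapView()
                .environment(clusterData) // inject once at the root
        }
    }
}

struct ClusteredMapView: View {
    @Environment(ClusterData.self) private var clusterData

    var body: some View {
        Map {
            ForEach(clusterData.annotations) { item in
                Marker("Point", coordinate: item.coordinate)
            }
        }
        // Any other view can read the same environment object
        // and add or remove annotations there.
    }
}
```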
Asynchronous operation is a critical difference from the native UIKit implementation. Apple adds data on the main thread, which leads to lag when adding significant quantities of data. Furthermore, UIKit decides when to recalculate clustering and does so on the main thread, which causes lag when scrolling or zooming. Why does Apple execute this task on the main thread? Native clustering requires a view and its size to compute the grid size for clustering. Since the library doesn't require rendering the view, you have to provide the dimensions of this grid yourself. You can do this via the configuration:
let customConfig = Configuration(
    cellSizeForZoomLevel: { zoom in
        switch zoom {
        case 13...15: CGSize(width: 64, height: 64)
        case 16...18: CGSize(width: 32, height: 32)
        case 19...: CGSize(width: 16, height: 16)
        default: CGSize(width: 88, height: 88)
        }
    }
)
let manager = ClusterManager(configuration: customConfig)
Apple's implementation is much more straightforward and convenient, but the performance issue could be critical to app quality.
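The zoom-to-cell-size mapping above is a pure function, so its effect is easy to reason about in isolation: larger cells at low zoom fold more points into each cluster (fewer clusters, less memory), while smaller cells at high zoom split them apart. A standalone mirror of the same rule, just for illustration:

```swift
import Foundation

// Standalone mirror of the configuration above: maps a map zoom level
// to the grid cell size used for clustering. Not library code.
func cellSize(forZoom zoom: Int) -> CGSize {
    switch zoom {
    case 13...15: return CGSize(width: 64, height: 64)
    case 16...18: return CGSize(width: 32, height: 32)
    case 19...:   return CGSize(width: 16, height: 16)
    default:      return CGSize(width: 88, height: 88)
    }
}
```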
Before this configuration I was using the default one and memory usage was around ~500 MB; after applying the custom configuration you suggested, it's around 232 MB. That's super smooth, man! 💪🏼
Hi! Regarding the custom config you listed above to improve performance:
let customConfig = Configuration(
    cellSizeForZoomLevel: { zoom in
        switch zoom {
        case 13...15: CGSize(width: 64, height: 64)
        case 16...18: CGSize(width: 32, height: 32)
        case 19...: CGSize(width: 16, height: 16)
        default: CGSize(width: 88, height: 88)
        }
    }
)
Is there an equivalent for SwiftUI yet? I'm getting memory issues in my SwiftUI implementation with < 2000 points.
Hi @vospennikov, I've been looking for a SwiftUI Map clustering workaround for the past 12 days, and I've just found this repo. Could you provide documentation describing how we can use ClusterMapSwiftUI in our apps?
As you know, SwiftUI lacks support for location annotation clustering in MapKit, which is an essential feature for building robust and user-friendly mapping applications. I believe that incorporating annotation clustering in SwiftUI would provide significant value to both developers and end-users.
Enabling annotation clustering in SwiftUI would provide consistency across the iOS development ecosystem. Currently, developers have to resort to UIKit for clustering annotations, which can be cumbersome when building SwiftUI-based applications.
I am confident that this addition would be well-received by the iOS developer community.
Thank you for your attention to this suggestion.
Best regards,
Emre Degirmenci