# HDAugmentedReality

Augmented Reality component for iOS, written in Swift.

## Description

HDAugmentedReality is designed to be used in areas with a large concentration of static POIs, where the primary goal is visibility of all POIs. This is achieved by stacking POIs vertically: farther POIs, which would normally be obscured by nearer ones, are placed higher. Altitudes of POIs are disregarded.


## Features

## What is next?

## Important

## Plist
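
Add NSCameraUsageDescription and NSLocationWhenInUseUsageDescription entries to your app's Info.plist; iOS requires these usage descriptions because the component uses the camera and the user's location.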

## CocoaPods

```ruby
target "TargetName" do
  pod 'HDAugmentedReality', '~> 3.0'
end
```


## Basic usage
Look at the demo project for a complete example.

Import
```swift
import HDAugmentedReality
```

Create annotations.

```swift
// ARAnnotation's initializer is failable, so filter out any nils with compactMap.
let annotation1 = ARAnnotation(identifier: "bakery", title: "Bakery", location: CLLocation(latitude: 45.13432, longitude: 18.62095))
let annotation2 = ARAnnotation(identifier: "supermarket", title: "Supermarket", location: CLLocation(latitude: 45.84638, longitude: 18.84610))
let annotation3 = ARAnnotation(identifier: "home", title: "Home", location: CLLocation(latitude: 45.23432, longitude: 18.65436))
let dummyAnnotations = [annotation1, annotation2, annotation3].compactMap { $0 }
```

Create ARViewController and configure ARPresenter.

```swift
// Creating ARViewController. You can use ARViewController(nibName:bundle:) if you have a custom xib.
let arViewController = ARViewController()

// Presenter - handles visual presentation of annotations.
let presenter = arViewController.presenter!
presenter.presenterTransform = ARPresenterStackTransform()

arViewController.dataSource = self
arViewController.setAnnotations(dummyAnnotations)
self.present(arViewController, animated: true, completion: nil)
```

Implement ARDataSource and provide an annotation view. This method is called for each annotation.

```swift
func ar(_ arViewController: ARViewController, viewForAnnotation: ARAnnotation) -> ARAnnotationView
{
    // Annotation views should be lightweight; try to avoid xibs and Auto Layout altogether.
    let annotationView = TestAnnotationView()
    annotationView.frame = CGRect(x: 0, y: 0, width: 150, height: 50)
    return annotationView
}
```

## Customization

### Annotation customization

ARAnnotation holds data about your POI. You can subclass ARAnnotation and add your own properties if needed (look at TestAnnotation in the demo project).
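
A minimal sketch of such a subclass (the class and property names here are illustrative, not part of the library):

```swift
import HDAugmentedReality

// Hypothetical subclass carrying extra POI data alongside the built-in
// identifier/title/location properties.
class ShopAnnotation: ARAnnotation {
    var category: String = ""
    var isOpenNow: Bool = false
}
```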

### AnnotationView customization/subclass

ARAnnotationView is just an empty view; you should subclass it and add your UI (labels, background etc.). Try to avoid xibs and constraints since they impact performance (keep in mind that you can have hundreds of these on screen). For an example, take a look at TestAnnotationView in the demo project.

ARAnnotationView has an annotation property that holds the annotation you created, so if you subclassed ARAnnotation, instances of that subclass are what your annotation views receive.

Subclassing:
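
A minimal sketch, assuming the UI is built once when the view is added to a superview and that the annotation property may be nil (the label setup below is illustrative; see TestAnnotationView in the demo project for a real example):

```swift
import UIKit
import HDAugmentedReality

class ExampleAnnotationView: ARAnnotationView {
    private let titleLabel = UILabel()

    override func didMoveToSuperview() {
        super.didMoveToSuperview()
        guard self.titleLabel.superview == nil else { return }

        // Build the UI with plain frames - no xibs or Auto Layout, for performance.
        self.backgroundColor = UIColor.black.withAlphaComponent(0.6)
        self.titleLabel.frame = self.bounds.insetBy(dx: 8, dy: 4)
        self.titleLabel.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        self.titleLabel.textColor = .white
        self.addSubview(self.titleLabel)

        // The annotation property holds the ARAnnotation (or your subclass) assigned to this view.
        self.titleLabel.text = self.annotation?.title
    }
}
```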

### ARViewController customization

Custom XIB
You can copy ARViewController.xib to your project, rename and edit it however you like, and pass the xib name to ARViewController(nibName: "MyARViewController", bundle: nil).

Adjust vertical offset by distance.

```swift
let presenter = arViewController.presenter!
presenter.distanceOffsetMode = .manual
presenter.distanceOffsetMultiplier = 0.05   // Pixels per meter
presenter.distanceOffsetMinThreshold = 500  // Don't raise annotations that are nearer than this
```

Limit number of annotations shown by count and distance.

```swift
presenter.maxDistance = 5000               // Don't show annotations if they are farther than this
presenter.maxVisibleAnnotations = 100      // Max number of annotations on the screen
```

Adjust location tracking precision and heading/pitch movement.

```swift
let trackingManager = arViewController.trackingManager
trackingManager.userDistanceFilter = 15     // How often distances and azimuths are recalculated (in meters)
trackingManager.reloadDistanceFilter = 50   // How often new annotations are fetched (in meters)
trackingManager.filterFactor = 0.4          // Smooths out the movement of annotation views
trackingManager.minimumTimeBetweenLocationUpdates = 2   // Minimum time between location updates
```

### Radar

You can add a radar with MKMapView and an indicator ring. Please note that adding the radar will significantly increase battery consumption, especially if radar.indicatorRingType is .precise.

```swift
let radar = RadarMapView()
radar.startMode = .centerUser(span: MKCoordinateSpan(latitudeDelta: 0.01, longitudeDelta: 0.01))
radar.trackingMode = .centerUserWhenNearBorder(span: nil)
radar.indicatorRingType = .segmented(segmentColor: nil, userSegmentColor: nil)
radar.maxDistance = 5000    // Limit because it drains battery with many annotations (>200), especially if indicatorRingType is .precise
arViewController.addAccessory(radar, leading: 15, trailing: nil, top: nil, bottom: 15 + safeArea.bottom / 4, width: nil, height: 150)
```

The ring around the map indicates the direction of out-of-bounds annotations. On the left is the segmented indicator ring and on the right is the precise indicator ring.


### Custom accessories

You can make your own accessories and use them with ARViewController; RadarMapView is an example of such an accessory. To make an accessory you must implement the ARAccessory protocol (a single method) and call ARViewController.addAccessory(...), or attach it from a xib using accessoriesOutlet.

### Custom presenter

If you don't like how annotation views are shown or positioned on screen, you can make your own ARPresenter subclass and set it on the ARViewController.presenter property.

## Known issues

Axes: https://developer.apple.com/documentation/coremotion/getting_processed_device-motion_data/understanding_reference_frames_and_device_attitude

## Components

ARTrackingManager: Class used internally by ARViewController for tracking and filtering location/heading/pitch etc. ARViewController takes all of this information and stores it in the ARViewController.arStatus object, which is then passed to ARPresenter. This class is not intended for subclassing.

ARPresenter: Handles creation of annotation views and lays them out on the screen. Before anything is done, it first filters annotations by distance and count for improved performance. This class is also responsible for vertical stacking of the annotation views. It can be subclassed if custom positioning is needed, e.g. if you want to position annotations relative to their altitudes you would subclass ARPresenter and override xPositionForAnnotationView and yPositionForAnnotationView.

ARViewController: Glues everything together. Presents the camera with ARPresenter above it. Takes all needed input from ARTrackingManager and passes it to ARPresenter.

ARAnnotation: Serves as the source of information (location, title etc.) about a single annotation. Annotation objects do not provide the visual representation of the annotation. It is analogous to MKAnnotation. It can be subclassed if additional information for some annotation is needed.

ARAnnotationView: Responsible for presenting annotations visually. Analogous to MKAnnotationView. It is usually subclassed to provide a custom look.

ARStatus: Structure that holds all information about the screen (FOV) and device (location/heading/pitch), and everything else needed to lay out annotation views on the screen.

## Version history

## License

HDAugmentedReality is released under the MIT license. See LICENSE for details.