CameraSDK-iOS

iOS SDK to control Insta360 cameras.

The following sections explain how to control Insta360 cameras. The cameras support two communication protocols:

  1. SCMP (Spherical Camera Messaging Protocol)

    All the native interfaces in the framework are based on the SCMP (Spherical Camera Messaging Protocol) developed by Insta360.

  2. OSC (Open Spherical Camera)

    The camera supports the control commands of Open Spherical Camera API level 2, except the preview stream. Familiarity with the official Open Spherical Camera API - Commands documentation is a prerequisite for OSC development.

We suggest using the same protocol to control the camera throughout development. If you want to control the camera through the OSC protocol, skip ahead to the OSC section of this document.

Table of Contents

Integration

The frameworks below are distributed as binary dependencies (Cartfile-style specifications; the actual URLs are provided by Insta360 upon authorization):

binary "#INSCoreMedia:By applying for authorization from Insta360#" == 1.25.30
binary "#INSCameraSDK-osc:By applying for authorization from Insta360#" == 3.0.5

Setup

  1. Embed the INSCameraSDK and INSCoreMedia frameworks to your project target.

  2. If you need to connect the camera via a wired (Lightning) connection, add the following item to your Info.plist; a hedged example follows this list. Otherwise, skip this step.
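
A sketch of the Info.plist entry, assuming the standard external-accessory mechanism: UISupportedExternalAccessoryProtocols is the system key for MFi accessories, but the protocol string shown is an assumption to verify against your SDK bundle.

<!-- Info.plist (protocol string is an assumption) -->
<key>UISupportedExternalAccessoryProtocols</key>
<array>
    <string>com.insta360.camera</string>
</array>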

Connection

You can get the Wi-Fi name and password over Bluetooth through INSCameraSDK; refer to the code in the sample:

/// Global Bluetooth management object
let bluetoothManager = INSBluetoothManager()
/// Currently connected device
var connectedDevice: INSBluetoothDevice?

/// Scan the camera you want to connect
- (void)scanCamerasWithHandler:(void (^)(INSBluetoothDevice * _Nonnull device,
                                         NSNumber * _Nonnull RSSI,
                                         NSDictionary<NSString *, id> * _Nonnull advertisementData))handler;
/// Connect the scanned camera
- (id)connectDevice:(INSBluetoothDevice *)device
         completion:(void(^)(NSError  * _Nullable error))completion;

///sample
self?.bluetoothManager.scanCameras(handler: { device, rssi, advertisementData in
    if let connected = self?.connectedDevice {
        self?.bluetoothManager.disconnectDevice(connected)
    }
    self?.bluetoothManager.connect(device, completion: { err in
        print(device.peripheral.identifier)
        if err == nil {
            self?.connectedDevice = device
        }
    })
})

///get wifi info
guard let peripheral = self?.connectedDevice else { return }
let commandManager = self?.bluetoothManager.command(by: peripheral)
let optionTypes = [
    NSNumber(value: INSCameraOptionsType.serialNumber.rawValue),
    NSNumber(value: INSCameraOptionsType.wifiInfo.rawValue),
    NSNumber(value: INSCameraOptionsType.wifiChannelList.rawValue),
    NSNumber(value: INSCameraOptionsType.cameraType.rawValue)
]
commandManager?.getOptionsWithTypes(optionTypes, completion: { (err, options, successTypes) in
    if let err = err {
        self?.showAlert("\((err as NSError).code)", err.localizedDescription)
    } else {
        self?.showAlert(options?.wifiInfo?.ssid ?? "", options?.wifiInfo?.password ?? "")
    }
})

If you connect the camera via Wi-Fi, set the host to http://192.168.42.1; if you connect via the Lightning interface, set the host to http://localhost:9099.

We recommend using the following methods to convert between resource URIs and HTTP URLs:

/// convert (photo or video) resource uri to http url via http tunnel and Wi-Fi socket
extern NSURL *INSHTTPURLForResourceURI(NSString *uri);

/// convert local http url to (photo or video) resource uri
extern NSString *INSResourceURIFromHTTPURL(NSURL *url);
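
For example (a hedged sketch; the resource path shown is hypothetical, but URIs of this shape are returned in photoInfo.uri and videoInfo.uri):

// hypothetical resource URI of the shape the camera returns
NSString *uri = @"/DCIM/Camera01/IMG_19700101_000000_00_001.insp";
NSURL *httpURL = INSHTTPURLForResourceURI(uri);
// -> http://192.168.42.1/... over Wi-Fi, or http://localhost:9099/... over the Lightning tunnel
NSString *roundTrip = INSResourceURIFromHTTPURL(httpURL); // back to the original URI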

SCMP(Spherical Camera Messaging Protocol)

Add the following code in your AppDelegate, or wherever your app is ready to work with Insta360 cameras over a wired connection (the camera connected through the Lightning interface).

If you connect the camera via Wi-Fi instead, call [[INSCameraManager socketManager] setup] once at the point where you want to start the socket connection.

The connection is asynchronous. You need to monitor the connection status and operate the camera when the connection status is INSCameraStateConnected.

What's more, call [[INSCameraManager sharedManager] shutdown] when your app no longer needs to listen for Insta360 cameras. See connection monitoring below.

#import <INSCameraSDK/INSCameraSDK.h>

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    // connect camera via wired (Lightning) connection
    [[INSCameraManager usbManager] setup];
    return YES;
}

Status

You can monitor the connection status of the camera as follows.
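
First, register as a KVO observer of cameraState on the managers you use (this mirrors the registration in the Preview sample later in this document):

[[INSCameraManager usbManager] addObserver:self
                                forKeyPath:@"cameraState"
                                   options:NSKeyValueObservingOptionNew
                                   context:nil];
[[INSCameraManager socketManager] addObserver:self
                                   forKeyPath:@"cameraState"
                                      options:NSKeyValueObservingOptionNew
                                      context:nil];

Then handle state changes in the KVO callback: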

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary<NSKeyValueChangeKey,id> *)change context:(void *)context {
    if (![object isKindOfClass:[INSCameraManager class]]) {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
        return ;
    }
    INSCameraManager *manager = (INSCameraManager *)object;
    INSCameraState state = [change[NSKeyValueChangeNewKey] unsignedIntegerValue];
    switch (state) {
    case INSCameraStateFound: {
        NSLog(@"Found");
        break;
    }
    case INSCameraStateConnected: {
        NSLog(@"Connected");
        // heartbeats are only needed for the Wi-Fi (socket) connection
        if (manager == [INSCameraManager socketManager]) {
            [self startSendingHeartbeats];
        }
        break;
    }
    case INSCameraStateConnectFailed: {
        NSLog(@"Failed");
        [self stopSendingHeartbeats];
        break;
    }
    default:
        NSLog(@"Not Connect");
        [self stopSendingHeartbeats];
        break;
    }
}

Heartbeat

When you connect your camera via Wi-Fi, you need to send a heartbeat to the camera at 2 Hz (every 500 ms).

// Objective-C
[[INSCameraManager socketManager].commandManager sendHeartbeatsWithOptions:nil];
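
A minimal sketch of the startSendingHeartbeats / stopSendingHeartbeats helpers referenced in the Status sample above; the heartbeatTimer property is an assumption you declare yourself:

// Objective-C (sketch; `heartbeatTimer` is an assumed NSTimer property)
- (void)startSendingHeartbeats {
    [self stopSendingHeartbeats];
    // 2 Hz, i.e. every 500 ms, as required for the Wi-Fi connection
    self.heartbeatTimer = [NSTimer scheduledTimerWithTimeInterval:0.5 repeats:YES block:^(NSTimer * _Nonnull timer) {
        [[INSCameraManager socketManager].commandManager sendHeartbeatsWithOptions:nil];
    }];
}

- (void)stopSendingHeartbeats {
    [self.heartbeatTimer invalidate];
    self.heartbeatTimer = nil;
}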

Commands

You can learn all the commands supported by the SDK from INSCameraCommands.h, and INSCameraCommandOptions.h shows the structures those commands use. All options that you can get from the camera are listed in INSCameraOptionsType.

NSNotification+INSCamera.h lists the camera notifications your application can observe.

Here is a sample showing how to get storage & battery status through getOptionsWithTypes:completion:

NSArray *optionTypes = @[@(INSCameraOptionsTypeStorageState),@(INSCameraOptionsTypeBatteryStatus)];
[[INSCameraManager sharedManager].commandManager getOptionsWithTypes:optionTypes completion:^(NSError * _Nullable error, INSCameraOptions * _Nullable options, NSArray<NSNumber *> * _Nullable successTypes) {
    if (!options) {
        NSLog(@"fetch options error: %@",error.description);
        return ;
    }
    NSLog(@"storage status: %@",options.storageStatus);
    NSLog(@"battery status: %@",options.batteryStatus);
}];
typedef NS_ENUM(NSUInteger, INSCameraCardState) {

    INSCameraCardStateNormal = 0,

    INSCameraCardStateNoCard = 1,

    INSCameraCardStateNoSpace = 2,

    INSCameraCardStateInvalidFormat = 3,

    INSCameraCardStateWriteProtectCard = 4,

    INSCameraCardStateUnknownError = 5,
};

@interface INSCameraStorageStatus : NSObject

@property (nonatomic) INSCameraCardState cardState;

@property (nonatomic) int64_t freeSpace;

@property (nonatomic) int64_t totalSpace;

@end
typedef NS_ENUM(NSUInteger, INSCameraPowerType) {
    INSCameraPowerTypeBattery = 0,
    INSCameraPowerTypeAdapter = 1,
};

@interface INSCameraBatteryStatus : NSObject

@property (nonatomic) INSCameraPowerType powerType;

@property (nonatomic) NSInteger batteryLevel;

@property (nonatomic) NSInteger batteryScale;

@end
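
For display purposes, a battery percentage can be derived inside the getOptionsWithTypes:completion: callback above (a hedged sketch; it assumes batteryScale is the full-scale value):

INSCameraBatteryStatus *battery = options.batteryStatus;
if (battery.batteryScale > 0) {
    double percent = 100.0 * battery.batteryLevel / battery.batteryScale;
    NSLog(@"power: %@, battery: %.0f%%",
          (battery.powerType == INSCameraPowerTypeAdapter) ? @"adapter" : @"battery",
          percent);
}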

Take Picture

Call takePictureWithOptions:completion:; the photo will be saved to the SD card without transferring data to the iOS app.

INSTakePictureOptions *options = [[INSTakePictureOptions alloc] init];
options.mode = INSPhotoModeAeb;
options.AEBEVBias = @[@(0), @(-2), @(-1), @(1), @(2)];
options.generateManually = YES;
/// X4: set inerProccess to true; other models: set it to false
options.inerProccess = true;
[[INSCameraManager sharedManager].commandManager takePictureWithOptions:options completion:^(NSError * _Nullable error, INSCameraPhotoInfo * _Nullable photoInfo) {
    NSLog(@"take hdr picture: %@, %@",photoInfo.uri,photoInfo.hdrUris);
}];
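
The returned photoInfo.uri is a resource URI. As a hedged sketch, the file can then be fetched over HTTP with the URL helper shown in the Connection section:

NSURL *photoURL = INSHTTPURLForResourceURI(photoInfo.uri);
[[[NSURLSession sharedSession] dataTaskWithURL:photoURL
                             completionHandler:^(NSData *data, NSURLResponse *response, NSError *err) {
    // `data` is the .insp file, including its INSExtraInfo trailer (see the Media section)
}] resume];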

Video Capture

Call setOptions:forTypes:completion: to change the shooting mode.

/// change to photo mode
INSCameraOptions *options = [[INSCameraOptions alloc] init];
options.photoSubMode = INSPhotoSubModeSingle;
[[INSCameraManager sharedManager].commandManager setOptions:options forTypes:@[@(INSCameraOptionsTypePhotoSubMode)] completion:^(NSError * _Nullable error, NSArray<NSNumber *> * _Nullable successTypes) {

}];
/// change to video mode
INSCameraOptions *options = [[INSCameraOptions alloc] init];
options.videoSubMode = INSVideoSubModeNormal;
[[INSCameraManager sharedManager].commandManager setOptions:options forTypes:@[@(INSCameraOptionsTypeVideoSubMode)] completion:^(NSError * _Nullable error, NSArray<NSNumber *> * _Nullable successTypes) {

}];

- When shooting with X4, you need to switch to the corresponding shooting mode first.

Call `startCaptureWithOptions:completion` to start recording, and call `stopCaptureWithOptions:completion` to stop the recording.

// Objective-C
// start capture
INSCaptureOptions *options = [[INSCaptureOptions alloc] init];
[[INSCameraManager sharedManager].commandManager startCaptureWithOptions:options completion:^(NSError * _Nullable error) {
    if (error) {
        NSLog(@"start capture error: %@",error);
    }
}];

// stop capture
INSCaptureOptions *options = [[INSCaptureOptions alloc] init];
[[INSCameraManager sharedManager].commandManager stopCaptureWithOptions:options completion:^(NSError * _Nullable error, INSCameraVideoInfo * _Nullable videoInfo) {
    NSLog(@"video url: %@",videoInfo.uri);
}];

Set Photography Options

The INSCameraSDK also provides APIs to change photography options, such as EV, white balance, exposure program, ISO and shutter.

Since the shutter speeds of stills and video may differ, setting stillExposure to a manual program will not affect the live stream; you need to call setPhotographyOptions again to set the live stream's videoExposure. Note that the shutter speed of videoExposure should not be longer than 1.0/framerate.

// live stream (X4 does not require this setup)
INSCameraExposureOptions *videoExposureOptions = [[INSCameraExposureOptions alloc] init];
videoExposureOptions.program = INSCameraExposureProgramManual;
videoExposureOptions.iso = 200;
videoExposureOptions.shutterSpeed = CMTimeMake(1, 30);

INSPhotographyOptions *options = [[INSPhotographyOptions alloc] init];
options.videoExposure = videoExposureOptions;

NSArray *types = @[@(INSPhotographyOptionsTypeVideoExposureOptions)];
[[INSCameraManager sharedManager].commandManager
 setPhotographyOptions:options forFunctionMode:INSCameraFunctionModeLiveStream
 types:types completion:^(NSError * _Nullable error, NSArray<NSNumber *> * _Nullable successTypes) {
    NSLog(@"Set Photogtaphy Options %@",error);
}];
// take normal picture
INSCameraExposureOptions *stillExposureOptions = [[INSCameraExposureOptions alloc] init];
stillExposureOptions.program = INSCameraExposureProgramManual;
stillExposureOptions.iso = 200;
stillExposureOptions.shutterSpeed = CMTimeMake(5, 1);

INSPhotographyOptions *options = [[INSPhotographyOptions alloc] init];
options.stillExposure = stillExposureOptions;

NSArray *types = @[@(INSPhotographyOptionsTypeStillExposureOptions)];
[[INSCameraManager sharedManager].commandManager
 setPhotographyOptions:options forFunctionMode:INSCameraFunctionModeNormalImage
 types:types completion:^(NSError * _Nullable error, NSArray<NSNumber *> * _Nullable successTypes) {
    NSLog(@"Set Photogtaphy Options %@",error);
}];
// On-device PureShot processing is supported only on X4; it defaults to on once PureShot is set.
INSPhotographyOptions *options = [[INSPhotographyOptions alloc] init];
options.rawCaptureType = INSCameraRawCaptureTypePureshot;
NSArray *types = @[@(INSPhotographyOptionsTypeRawCaptureType)]; // type constant name inferred from the pattern above
[[INSCameraManager sharedManager].commandManager
 setPhotographyOptions:options forFunctionMode:INSCameraFunctionModeNormalImage
 types:types completion:^(NSError * _Nullable error, NSArray<NSNumber *> * _Nullable successTypes) {
    NSLog(@"Set Photogtaphy Options %@",error);
}];
// On-device stitching is supported only on X4.
INSCameraOptions *options = [[INSCameraOptions alloc] init];
options.enableInternalSplicing = true;
NSArray *types = @[@(INSCameraOptionsTypeInternalSplicing)];
[[INSCameraManager sharedManager].commandManager setOptions:options forTypes:types completion:^(NSError * _Nullable error, NSArray<NSNumber *> * _Nullable successTypes) {

}];

Multi Photography Options

When the capture mode is switched to `Wide angle`, you need to use `INSMultiPhotographyOptions` to modify the following capture parameters.

typedef NS_ENUM(uint16_t, INSMultiPhotographyOptionsType) {
    /// Default
    INSMultiPhotographyOptionsTypeUnknown = 0,

    /// Capture resolution, readwrite. @available ONE X2
    INSMultiPhotographyOptionsTypeResolution = 1,

    /// Indicates whether the video is internal flowstate, readwrite. @available ONE X2
    INSMultiPhotographyOptionsTypeInternalFlowstate = 2,

    /// Indicates whether the captured file is a portrait file or a landscape file, readwrite. @available ONE X2
    INSMultiPhotographyOptionsTypeDimensionType = 3,

    /// Fov type, readwrite. @available ONE X2
    INSMultiPhotographyOptionsTypeFovType = 4,
};

Timelapse Options

Call setTimelapseOptions:forMode:completion: to configure the timelapse options, and call startCaptureTimelapseWithOptions:completion: to start capturing a timelapse video; a start-capture sketch follows the sample below.

Note: the timelapseOptions in INSStartCaptureTimelapseOptions is only used to calculate the timeout; it is not sent to the camera. If you want to change the timelapse configuration, call setTimelapseOptions:forMode:completion: first.

INSTimelapseOptions *options = [[INSTimelapseOptions alloc] init];
options.duration = #total record duration that you expect#;
options.lapseTime = #the time interval for capturing each picture#;

[[INSCameraManager sharedManager].commandManager
 setTimelapseOptions:options
 forMode:weakSelf.mode
 completion:^(NSError * _Nullable error) {
    if (error) {
        NSLog(@"error: %@",error.localizedDescription);
    } else {
        // success
    }
}];
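
A sketch of starting the capture once the options are set, assuming an error-only completion for startCaptureTimelapseWithOptions:completion:; per the note above, timelapseOptions here is only used to compute the command timeout:

INSStartCaptureTimelapseOptions *startOptions = [[INSStartCaptureTimelapseOptions alloc] init];
startOptions.timelapseOptions = options; // used only to calculate the timeout
[[INSCameraManager sharedManager].commandManager
 startCaptureTimelapseWithOptions:startOptions
 completion:^(NSError * _Nullable error) {
    NSLog(@"start timelapse error: %@", error);
}];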

List files

The file list is retrieved through the following interfaces:

[[INSCameraManager sharedManager].commandManager 
 fetchPhotoListWithCompletion:^(NSError * _Nullable error, NSArray<INSCameraPhotoInfo *> * _Nullable photoInfoList) {
    NSLog(@"files: %@",photoInfoList);
}];
[[INSCameraManager sharedManager].commandManager 
 fetchVideoListWithCompletion:^(NSError * _Nullable error, NSArray<INSCameraVideoInfo *> * _Nullable videoInfoList) {
    NSLog(@"files: %@",videoInfoList);
}];
[[INSCameraManager sharedManager].commandManager
 fetchRawPhotoListWithCompletion:^(NSError * _Nullable error, NSArray<INSCameraPhotoInfo *> * _Nullable photoInfoList) {
    NSLog(@"files: %@",photoInfoList);
}];

Working with audio & video stream

The audio and video stream is based on SCMP (Spherical Camera Messaging Protocol). If you need to preview the camera in real time, make sure that INSCameraManager.cameraState is INSCameraStateConnected. See the SCMP (Spherical Camera Messaging Protocol) connection section.

Control center - INSCameraMediaSession

INSCameraMediaSession is the central class for working with the audio & video streams of an Insta360 camera. It has these functions:

  1. Configure the input (the camera) by setting expectedAudioSampleRate, expectedVideoResolution and gyroPlayMode.
  2. Control the camera's input streams: turn them on by calling startRunningWithCompletion:, and off by calling stopRunningWithCompletion:.
  3. Parse and decode the media data, and stitch the video.
  4. Distribute outputs to INSCameraMediaPluggable instances such as INSCameraFlatPanoOutput.
  5. While the session is running, you can change the input configurations and plug or unplug pluggables; apply the changes by calling commitChangesWithCompletion:.

Preview

#import <UIKit/UIKit.h>
#import <INSCameraSDK/INSCameraSDK.h>

@interface ViewController () <INSCameraPreviewPlayerDelegate>

@property (nonatomic, strong) INSCameraMediaSession *mediaSession;

@property (nonatomic, strong) INSCameraPreviewPlayer *previewPlayer;

@property (nonatomic, strong) INSCameraStorageStatus *storageState;

@property (nonatomic, assign) INSVideoEncode videoEncode;

@end

@implementation ViewController

- (instancetype)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil {
    self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil];
    if (self) {
        _videoEncode = INSVideoEncodeH264;
        _mediaSession = [[INSCameraMediaSession alloc] init];
    }
    return self;
}

- (instancetype)initWithCoder:(NSCoder *)aDecoder {
    return [super initWithCoder:aDecoder];
}

- (void)dealloc {
    [_mediaSession stopRunningWithCompletion:^(NSError * _Nullable error) {
        NSLog(@"stop media session with err: %@", error);
    }];

    [[INSCameraManager usbManager] removeObserver:self forKeyPath:@"cameraState"];
    [[INSCameraManager socketManager] removeObserver:self forKeyPath:@"cameraState"];
}

- (void)viewDidLoad {
    [super viewDidLoad];

    [[INSCameraManager usbManager] addObserver:self forKeyPath:@"cameraState" options:NSKeyValueObservingOptionNew context:nil];
    [[INSCameraManager socketManager] addObserver:self forKeyPath:@"cameraState" options:NSKeyValueObservingOptionNew context:nil];

    [self setupRenderView];
}

- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];

    if ([INSCameraManager sharedManager].currentCamera) {
        __weak typeof(self)weakSelf = self;
        [self fetchOptionsWithCompletion:^{
            [weakSelf updateConfiguration];
            [weakSelf runMediaSession];
        }];
    }
}

- (void)updateConfiguration {
    // main stream resolution
    _mediaSession.expectedVideoResolution = INSVideoResolution1920x960x30;

    // secondary stream resolution
    _mediaSession.expectedVideoResolutionSecondary = INSVideoResolution960x480x30;

    // use main stream or secondary stream to preview
    _mediaSession.previewStreamType = INSPreviewStreamTypeSecondary;

    // audio sample rate
    _mediaSession.expectedAudioSampleRate = INSAudioSampleRate48000Hz;

    // preview stream encode
    _mediaSession.videoStreamEncode = INSVideoEncodeH264;

    // gyroscope correction mode
    // If you are in panoramic preview, use `INSGyroPlayModeDefault`.
    // If you are in wide angle preview, use `INSGyroPlayModeFootageMotionSmooth`.
    _mediaSession.gyroPlayMode = INSGyroPlayModeDefault;
}

- (void)setupRenderView {
    CGFloat height = CGRectGetHeight(self.view.bounds) * 0.333;
    CGRect frame = CGRectMake(0, CGRectGetHeight(self.view.bounds) - height, CGRectGetWidth(self.view.bounds), height);
    _previewPlayer = [[INSCameraPreviewPlayer alloc] initWithFrame:frame
                                                        renderType:INSRenderTypeSphericalPanoRender];
    [_previewPlayer playWithGyroTimestampAdjust:30.f];
    _previewPlayer.delegate = self;
    [self.view addSubview:_previewPlayer.renderView];

    [_mediaSession plug:self.previewPlayer];

    // adjust field of view parameters
    NSString *offset = [INSCameraManager sharedManager].currentCamera.settings.mediaOffset;
    if (offset) {
        NSInteger rawValue = [[INSLensOffset alloc] initWithOffset:offset].lensType;
        if (rawValue == INSLensTypeOneR577Wide || rawValue == INSLensTypeOneR283Wide) {
            _previewPlayer.renderView.enablePanGesture = NO;
            _previewPlayer.renderView.enablePinchGesture = NO;

            _previewPlayer.renderView.render.camera.xFov = 37;
            _previewPlayer.renderView.render.camera.distance = 700;
        }
    }
}

- (void)fetchOptionsWithCompletion:(nullable void (^)(void))completion {
    __weak typeof(self)weakSelf = self;
    NSArray *optionTypes = @[@(INSCameraOptionsTypeStorageState),@(INSCameraOptionsTypeVideoEncode)];
    [[INSCameraManager sharedManager].commandManager getOptionsWithTypes:optionTypes completion:^(NSError * _Nullable error, INSCameraOptions * _Nullable options, NSArray<NSNumber *> * _Nullable successTypes) {
        if (!options) {
            [weakSelf showAlertWith:@"Get options" message:error.description];
            if (completion) { completion(); }
            return ;
        }
        weakSelf.storageState = options.storageStatus;
        weakSelf.videoEncode = options.videoEncode;
        if (completion) { completion(); }
    }];
}

- (void)runMediaSession {
    if ([INSCameraManager sharedManager].cameraState != INSCameraStateConnected) {
        return ;
    }

    __weak typeof(self)weakSelf = self;
    if (_mediaSession.running) {
        self.view.userInteractionEnabled = NO;
        [_mediaSession commitChangesWithCompletion:^(NSError * _Nullable error) {
            NSLog(@"commitChanges media session with error: %@",error);
            weakSelf.view.userInteractionEnabled = YES;
            if (error) {
                [weakSelf showAlertWith:@"commitChanges media failed" message:error.description];
            }
        }];
    }
    else {
        self.view.userInteractionEnabled = NO;
        [_mediaSession startRunningWithCompletion:^(NSError * _Nullable error) {
            NSLog(@"start running media session with error: %@",error);
            weakSelf.view.userInteractionEnabled = YES;
            if (error) {
                [weakSelf showAlertWith:@"start media failed" message:error.description];
                [weakSelf.previewPlayer playWithSmoothBuffer:NO];
            }
        }];
    }
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary<NSKeyValueChangeKey,id> *)change context:(void *)context {
    if ([keyPath isEqualToString:@"cameraState"]) {
        INSCameraState state = [change[NSKeyValueChangeNewKey] unsignedIntegerValue];
        switch (state) {
            case INSCameraStateFound:
                break;
            case INSCameraStateConnected:
                [self runMediaSession];
                break;
            default:
                [_mediaSession stopRunningWithCompletion:nil];
                break;
        }
    }
}

#pragma mark INSCameraPreviewPlayerDelegate
- (NSString *)offsetToPlay:(INSCameraPreviewPlayer *)player {
    NSString *mediaOffset = [INSCameraManager sharedManager].currentCamera.settings.mediaOffset;
    if (([[INSCameraManager sharedManager].currentCamera.name isEqualToString:kInsta360CameraNameOneX]
         || [[INSCameraManager sharedManager].currentCamera.name isEqualToString:kInsta360CameraNameOneR]
         || [[INSCameraManager sharedManager].currentCamera.name isEqualToString:kInsta360CameraNameOneX2])
        && [INSLensOffset isValidOffset:mediaOffset]) {
        return [INSOffsetCalculator convertOffset:mediaOffset toType:INSOffsetConvertTypeOneX3040_2_2880];
    }

    return mediaOffset;
}

@end

Further preview configuration

/*!
 *  For ONE and Nano S: the expected video resolution. If you want to change the value while mediaSession is running, you need to invoke commitChangesWithCompletion:
 *  For ONE X: set the resolution for both the main and secondary streams, and use `INSPreviewStreamType` to choose which one is used as the preview stream.
 */
@property (nonatomic) INSVideoResolution expectedVideoResolution;

/*!
 *  The expected video resolution, if you want to change the value when mediaSession is running, you need to invoke commitChangesWithCompletion:
 */
@property (nonatomic) INSVideoResolution expectedVideoResolutionSecondary;

/*!
 *  For ONE X, use this to choose whether the main or secondary stream should be used as the preview stream.
 *  INSPreviewStreamTypeMain : preview with main stream
 *  INSPreviewStreamTypeSecondary : preview with secondary stream
 */
@property (nonatomic) INSPreviewStreamType previewStreamType;

/*!
 *  For VR180 content when gyroPlayMode == RemoveYawRotations, this should be set to INSPreviewStreamRotationRaw180.
 */
@property (nonatomic) INSPreviewStreamRotation previewStreamRotation;

/*!
 * The mode used to calibrate the video, default is INSGyroPlayModeDefault.
 */
@property (nonatomic) INSGyroPlayMode gyroPlayMode;

/*!
 *  The encoding format of the real-time video stream, default is INSVideoEncodeH264.
 */
@property (nonatomic) INSVideoEncode videoStreamEncode;

Switching capture lens

The following cameras can switch the capture lens:

These cameras can capture the following three types of files:

The following methods must be called in the order shown in the sample code.

// Before that, you need to call 'startRunningWithCompletion:' through 'INSCameraMediaSession' to start the preview stream

NSArray *modes = @[@"Panorama", @"Insta Pano", @"Wide angle"];
NSString *currentMode = @"Panorama";

if ([currentMode isEqualToString:@"Panorama"]) {
    [self.mediaSession stopRunningWithCompletion:^(NSError * _Nullable error) {
        if (error != nil) {
            NSLog(@"stop media session with err: %@", error.description);
            return;
        }

        INSCameraOptions *options = [[INSCameraOptions alloc] init];
        options.focusSensor = INSSensorDeviceAll;
        options.expectOutputType = INSCameraExpectOutputTypeDefault;

        NSArray *types = @[@(INSCameraOptionsTypeFocusSensor), @(INSCameraOptionsTypeExpectOutputType)];

        [[INSCameraManager sharedManager].commandManager setOptions:options forTypes:types completion:^(NSError * _Nullable error, NSArray<NSNumber *> * _Nullable successTypes) {
            if (error != nil) {
                [self showAlertWith:@"" message:error.localizedDescription];
                return;
            }

            [[INSCameraManager sharedManager].commandManager setActiveSensorWithDevice:INSSensorDeviceAll completion:^(NSError * _Nullable error, NSString * _Nullable mediaOffset) {
                if (error) {
                    [self showAlertWith:@"Failed" message:error.description];
                } else {
                    NSString *message = [NSString stringWithFormat:@"current offset: %@", mediaOffset];
                    [self showAlertWith:@"Success" message:message];
                    [self.previewPlayer.renderView clearCurrentPlayImage];
                    [self runMediaSession];
                }
            }];
        }];

    }];
} else if ([currentMode isEqualToString:@"Insta Pano"]) {
    [self.mediaSession stopRunningWithCompletion:^(NSError * _Nullable error) {
        if (error != nil) {
            NSLog(@"stop media session with err: %@", error.description);
            return;
        }

        INSCameraOptions *options = [[INSCameraOptions alloc] init];
        options.focusSensor = INSSensorDeviceRear;
        options.expectOutputType = INSCameraExpectOutputTypeInstaPano;

        NSArray *types = @[@(INSCameraOptionsTypeFocusSensor), @(INSCameraOptionsTypeExpectOutputType)];

        [[INSCameraManager sharedManager].commandManager setOptions:options forTypes:types completion:^(NSError * _Nullable error, NSArray<NSNumber *> * _Nullable successTypes) {
            if (error != nil) {
                [self showAlertWith:@"" message:error.localizedDescription];
                return;
            }

            [[INSCameraManager sharedManager].commandManager setActiveSensorWithDevice:INSSensorDeviceAll completion:^(NSError * _Nullable error, NSString * _Nullable mediaOffset) {
                if (error) {
                    [self showAlertWith:@"Failed" message:error.description];
                } else {
                    NSString *message = [NSString stringWithFormat:@"current offset: %@", mediaOffset];
                    [self showAlertWith:@"Success" message:message];
                    [self.previewPlayer.renderView clearCurrentPlayImage];
                    [self runMediaSession];
                }
            }];
        }];

    }];

} else if ([currentMode isEqualToString:@"Wide angle"]) {
    [self.mediaSession stopRunningWithCompletion:^(NSError * _Nullable error) {
        if (error != nil) {
            NSLog(@"stop media session with err: %@", error.description);
            return;
        }

        INSCameraOptions *options = [[INSCameraOptions alloc] init];
        options.focusSensor = INSSensorDeviceRear;
        options.expectOutputType = INSCameraExpectOutputTypeDefault;

        NSArray *types = @[@(INSCameraOptionsTypeFocusSensor), @(INSCameraOptionsTypeExpectOutputType)];

        [[INSCameraManager sharedManager].commandManager setOptions:options forTypes:types completion:^(NSError * _Nullable error, NSArray<NSNumber *> * _Nullable successTypes) {
            if (error != nil) {
                [self showAlertWith:@"" message:error.localizedDescription];
                return;
            }

            [[INSCameraManager sharedManager].commandManager setActiveSensorWithDevice:INSSensorDeviceRear completion:^(NSError * _Nullable error, NSString * _Nullable mediaOffset) {
                if (error) {
                    [self showAlertWith:@"Failed" message:error.description];
                } else {
                    NSString *message = [NSString stringWithFormat:@"current offset: %@", mediaOffset];
                    [self showAlertWith:@"Success" message:message];
                    [self.previewPlayer.renderView clearCurrentPlayImage];
                    [self runMediaSession];
                }
            }];
        }];

    }];
}

Stitched outputs

An INSCameraFlatPanoOutput instance produces flat panorama video together with the camera's audio.

An INSCameraScreenOutput instance produces screen-captured video together with the camera's audio.

INSVideoResolution resolution = INSVideoResolution720x360x30;
_flatPanoOutput = [[INSCameraFlatPanoOutput alloc] initWithOutputWidth:resolution.width
                                                          outputHeight:resolution.height];
[_flatPanoOutput setDelegate:self onDispatchQueue:nil];

/**
 *  set outputPixelFormat to kCVPixelFormatType_32BGRA
 *  if you want to receive the video in BGRA instead of NV12 format:
 *  _flatPanoOutput.outputPixelFormat = kCVPixelFormatType_32BGRA;
 */
[_mediaSession plug:_flatPanoOutput];

RTMP live stream

INSFlatRTMPStreamer handles RTMP live streaming.

- (void)startLive {
    NSInteger bitrate = 10 * 1024 * 1024;
    NSURL *url = #live url#;
    INSFlatRTMPStreamer *streamer =
    [[INSFlatRTMPStreamer alloc] initWithURL:url width:3840 height:1920 fps:30 bitrate:bitrate];
    streamer.delegate = self;
    // note: keep a strong reference (e.g. a property) if the streamer must outlive this method

    [self runMediaSession];
    [streamer startLive];
}

runMediaSession here is identical to the implementation shown in the Preview section above.

Media

There is a special data segment, called INSExtraInfo, in every video or photo captured by an Insta360 camera. The INSExtraInfo contains the corresponding file's thumbnail, extra metadata, gyroscope data, etc. For more information, please check INSExtraInfo.h.

In general, we suggest that you obtain the above information from the file whose (VIN Channel)(Stream Num) part of the file name is '00'. For example: IMG_19700101_000000_00_001.insp.

If you are working on a wide-angle file and the file is a selfie file (see below for how to tell whether a file is a selfie file), you should instead use IMG_19700101_000000_10_001.insp, whose (VIN Channel) is '1'.
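
As a hedged illustration of the naming rule above (a hypothetical helper, not an SDK API), the (VIN Channel)(Stream Num) digits can be read directly from the file name:

// e.g. IMG_19700101_000000_10_001.insp -> vinAndStream == @"10"
NSString *name = @"IMG_19700101_000000_10_001.insp";
NSArray<NSString *> *parts = [[name stringByDeletingPathExtension] componentsSeparatedByString:@"_"];
NSString *vinAndStream = parts[parts.count - 2];
BOOL isSelfieChannel = [vinAndStream hasPrefix:@"1"]; // VIN channel '1' per the note above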

INSExtraInfo

INSExtraInfo contains the corresponding file's thumbnail, extra metadata, gyroscope data, etc. You can get this information through INSImageInfoParser / INSVideoInfoParser.

// photo
NSURL *url = #source url#
INSImageInfoParser *parser = [[INSImageInfoParser alloc] initWithURL:url];
if ([parser open]) {
    BOOL isLensSelfies = parser.extraInfo.metadata.lensSelfies;
}

// video
NSURL *url = #source url#
INSVideoInfoParser *parser = [[INSVideoInfoParser alloc] initWithURL:url];
if ([parser openFast]) {
    BOOL isLensSelfies = parser.extraInfo.metadata.lensSelfies;
}

Thumbnail

Photos

You can get the pre-stored thumbnail data in the file through INSImageInfoParser.

NSURL *url = #source url#
INSImageInfoParser *parser = [[INSImageInfoParser alloc] initWithURL:url];
if ([parser open]) {
    NSData *data = parser.extraInfo.thumbnail;
    UIImage *thumbnail = [[UIImage alloc] initWithData:data];
}

If you are working on a panoramic file, you also need to stitch it; see the Stitch section.

Videos

You can get the pre-stored thumbnail data in the file through INSVideoInfoParser.

NSURL *url = #source url#
INSVideoInfoParser *parser = [[INSVideoInfoParser alloc] initWithURL:url];

UIImage *thumbnail;
if ([parser openFast]) {
    INSThumbnailRender *render = [[INSThumbnailRender alloc] init];

    NSData *data = parser.extraInfo.thumbnail;
    CVPixelBufferRef buffer = [render copyPixelBufferWithDecodeH264Data:data];

    // if and only if the video resolution is 5.7k, the thumbnails are divided into thumbnail and ext_thumbnail
    if (parser.extraInfo.metadata.dimension.width == INSVideoResolution2880x2880x30.width
        && parser.extraInfo.metadata.dimension.height == INSVideoResolution2880x2880x30.height) {
        NSData *extData = parser.extraInfo.ext_thumbnail;
        CVPixelBufferRef extBuffer = [render copyPixelBufferWithDecodeH264Data: extData];
        thumbnail = [UIImage imageWithPixelBuffer:buffer rightPixelBuffer:extBuffer];
    } else {
        thumbnail = [UIImage imageWithPixelBuffer:buffer];
    }
}

If you are working on a panoramic file, you also need to stitch it; see the Stitch section.

Stitch

The following parameters are needed to correct and stitch a picture:

Here is a sample that shows how to get these parameters:

#pragma mark Photos

NSURL *url = #source url#
INSImageInfoParser *parser = [[INSImageInfoParser alloc] initWithURL:url];
if ([parser open]) {
    // offset
    NSString *offset = parser.extraInfo.metadata.offset;

    // resolution
    CGSize resolution = parser.extraInfo.metadata.dimension;

    // gyroscope data
    INSExtraGyroData *gyroData = parser.extraInfo.gyroData;
}
#pragma mark Videos

NSURL *url = #source url#
INSVideoInfoParser *parser = [[INSVideoInfoParser alloc] initWithURL:url];
if ([parser openFast]) {
    // offset
    NSString *offset = parser.extraInfo.metadata.offset;

    // resolution
    CGSize resolution = parser.extraInfo.metadata.dimension;

    // gyroscope data
    INSExtraGyroData *gyroData = parser.extraInfo.gyroData;
}

Photos

Use INSFlatPanoOffscreenRender to get a flat pano image. (Note: the offset parameter must be nonnull.)

// if it is the original image, we recommend setting outputSize to `parser.extraInfo.metadata.dimension`
CGSize outputSize = #output size#
UIImage *origin = #photo thumbnail to be stitched#

NSURL *url = #source url#
INSImageInfoParser *parser = [[INSImageInfoParser alloc] initWithURL:url];
if ([parser open]) {
    INSFlatPanoOffscreenRender *render = [[INSFlatPanoOffscreenRender alloc] initWithRenderWidth:outputSize.width height:outputSize.height];
    render.eulerAdjust = parser.extraInfo.metadata.euler;
    render.offset = parser.extraInfo.metadata.offset;

    render.gyroStabilityOrientation = GLKQuaternionIdentity;
    if (parser.extraInfo.gyroData) {
        INSGyroPBPlayer *gyroPlayer = [[INSGyroPBPlayer alloc] initWithPBGyroData:parser.extraInfo.gyroData];
        GLKQuaternion orientation = [gyroPlayer getImageOrientationWithRenderType:INSRenderTypeFlatPanoRender];
        render.gyroStabilityOrientation = orientation;
    }

    [render setRenderImage:origin];
    UIImage *output = [render renderToImage];
}

Videos

Use INSFlatPanoOffscreenRender to get a flat pano image. (Note: the offset parameter must be nonnull.)

// if it is the original image, we recommend setting outputSize to `parser.extraInfo.metadata.dimension`
CGSize outputSize = #output size#
CVPixelBufferRef buffer = #video thumbnail to be stitched#

// if and only if the video resolution is 5.7k, the thumbnails are divided into thumbnail and ext_thumbnail
CVPixelBufferRef extBuffer = #video ext thumbnail to be stitched#

NSURL *url = #source url#
INSVideoInfoParser *parser = [[INSVideoInfoParser alloc] initWithURL:url];
if ([parser openFast]) {
    INSFlatPanoOffscreenRender *render = [[INSFlatPanoOffscreenRender alloc] initWithRenderWidth:outputSize.width height:outputSize.height];
    render.eulerAdjust = parser.extraInfo.metadata.euler;
    render.offset = parser.extraInfo.metadata.offset;

    render.gyroStabilityOrientation = GLKQuaternionIdentity;
    if (parser.extraInfo.gyroData) {
        INSGyroPBPlayer *gyroPlayer = [[INSGyroPBPlayer alloc] initWithPBGyroData:parser.extraInfo.gyroData];
        GLKQuaternion orientation = [gyroPlayer getImageOrientationWithRenderType:INSRenderTypeFlatPanoRender];
        render.gyroStabilityOrientation = orientation;
    }

    if (buffer && extBuffer) {
        [render setRenderPixelBuffer:buffer right:extBuffer timestamp:parser.extraInfo.metadata.thumbnailGyroTimestamp];
    } else if (buffer) {
        [render setRenderPixelBuffer:buffer timestamp:parser.extraInfo.metadata.thumbnailGyroTimestamp];
    }
    UIImage *output = [render renderToImage];
}

Generate HDR image

Use INSHDRTask to generate an HDR image. HDR synthesis takes a long time, about 5-10 seconds.

The URLs passed into INSHDROptions form an ordered array, in the order [ 0ev, -ev, +ev ]. For photos taken by ONE X, sorting the file names in ascending order yields [ 0ev, -ev, +ev ] by default.

NSArray *urls = @[
    INSHTTPURLForResourceURI(@"0ev"),
    INSHTTPURLForResourceURI(@"-ev"),
    INSHTTPURLForResourceURI(@"+ev"),
];

INSHDROptions *options = [[INSHDROptions alloc] init];
options.urls = urls;
options.seamlessType = INSSeamlessTypeOpticalFlow;

INSHDRTask *task = [[INSHDRTask alloc] initWithCommandManager:[INSCameraManager sharedManager].commandManager];
[task processWithOptions:options completion:^(NSError * _Nullable error, NSData * _Nullable photoData) {
    if (error) {
        NSLog(@"%@ failed with error: %@",sender.title, error);
        return ;
    }

    // do anything with the stitched image here, for example, display it
    if (photoData) {
        UIImage *image = [[UIImage alloc] initWithData:photoData];
        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
    }
}];

You can choose between the following two libraries for HDR synthesis; the preferred one is INSHDRLibInsImgProc.

typedef NS_ENUM(NSUInteger, INSHDRLib) {
    /// using `OpenCV` to generate hdr image
    INSHDRLibOpenCV,

    /// using `InsImgProLib` to generate hdr image
    INSHDRLibInsImgProc,
};

You can choose between the following two algorithms for HDR synthesis; the preferred one is INSSeamlessTypeOpticalFlow.

typedef NS_ENUM(NSUInteger, INSSeamlessType) {
    /// default type
    INSSeamlessTypeTemplate,

    /// using Optical flow
    INSSeamlessTypeOpticalFlow,
};

Gyroscope data - INSMediaGyro

// summary gyro values (INSMediaGyro) from the metadata
NSURL *url = #source url#
INSImageInfoParser *parser = [[INSImageInfoParser alloc] initWithURL:url];
if ([parser open]) {
    NSLog(@"%@",parser.extraInfo.metadata.gyro);
}

// full gyroscope data stream
NSURL *url = #source url#
INSImageInfoParser *parser = [[INSImageInfoParser alloc] initWithURL:url];
if ([parser open]) {
    NSLog(@"%@",parser.extraInfo.gyroData);
}

Media gyro adjust

You can use INSFlatGyroAdjustOffscreenRender to correct the stitched image:

// if it is the original image, we suggest using `parser.extraInfo.metadata.dimension`
CGSize outputSize = #output size#
UIImage *origin = #photo thumbnail that has been stitched#

NSURL *url = #source url#
INSImageInfoParser *parser = [[INSImageInfoParser alloc] initWithURL:url];
if ([parser open]) {
    INSFlatGyroAdjustOffscreenRender *render = [[INSFlatGyroAdjustOffscreenRender alloc] initWithRenderWidth:outputSize.width height:outputSize.height];
    render.eulerAdjust = parser.extraInfo.metadata.euler;
    render.offset = parser.extraInfo.metadata.offset;

    render.gyroStabilityOrientation = GLKQuaternionIdentity;
    if (parser.extraInfo.gyroData) {
        INSGyroPBPlayer *gyroPlayer = [[INSGyroPBPlayer alloc] initWithPBGyroData:parser.extraInfo.gyroData];
        GLKQuaternion orientation = [gyroPlayer getImageOrientationWithRenderType:INSRenderTypeFlatPanoRender];
        render.gyroStabilityOrientation = orientation;
    }

    [render setRenderImage:origin];
    UIImage *output = [render renderToImage];
}

EXIF & XMP

You can read and modify the EXIF and XMP information of an image through INSImageMetadataProcessor.

The EXIF and XMP information is lost after correction and stitching. If you still need it, temporarily store the EXIF and XMP of the original file and write it back to the file after correction and stitching; a sketch of this workflow follows the sample below.

UIImage *origin = #origin image#;
// Reading metadata & xmp
INSImageMetadataProcessor *processor =
    [[INSImageMetadataProcessor alloc] initWithUIImage:origin ouputType:INSImageDataTypeJPEG compression:1.0];
NSLog(@"you can get exif from jpeg image: %@",processor.exif);
NSLog(@"you can get xmp from jpeg image: %@",processor.xmp);

// modify the source create time of jpeg image
processor.xmp.sourceImageCreateTime = [NSDateFormatter localizedStringFromDate:[NSDate date]
                                                                     dateStyle:NSDateFormatterFullStyle
                                                                     timeStyle:NSDateFormatterFullStyle];

// modify the exif of jpeg image
processor.exif = [[INSImageExif alloc] initDefaultWithWidth:origin.size.width
                                                     height:origin.size.height cameraType:@"Insta360 xxx"];

// retrieve jpeg data containing exif and xmp
NSData *result = [processor getImageData];
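
The temporary-store-and-write-back workflow described above, as a hedged sketch built only from the calls shown in this section (the stitched image is assumed to come from the Stitch section):

// 1. stash EXIF & XMP from the original image
INSImageMetadataProcessor *src =
    [[INSImageMetadataProcessor alloc] initWithUIImage:origin ouputType:INSImageDataTypeJPEG compression:1.0];
id savedExif = src.exif;
id savedXmp = src.xmp;

// 2. correct and stitch `origin` into a new image (see the Stitch section)
UIImage *stitched = #stitched output image#;

// 3. write the stashed metadata back into the stitched image
INSImageMetadataProcessor *dst =
    [[INSImageMetadataProcessor alloc] initWithUIImage:stitched ouputType:INSImageDataTypeJPEG compression:1.0];
dst.exif = savedExif;
dst.xmp = savedXmp;
NSData *jpegWithMetadata = [dst getImageData];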

Playback

Use INSRenderView to display pano files, and INSPreviewer2 to play videos.

- (void)setupRenderView {
    /**
     *  if you are working with a double fish eyes file, you should set renderType to `INSRenderTypeSphericalPanoRender`
     *  if you are working with a file which has been stitched, you should set renderType to `INSRenderTypeSphericalRender`
     */
    INSRenderType renderType = #renderType#;
    INSRenderView *renderView = [[INSRenderView alloc] initWithFrame:self.view.bounds renderType: renderType];
    [self.view addSubview:renderView];
    self.renderView = renderView;
}

- (void)setupPreviewerWithRenderView:(INSRenderView *)renderView {
    INSPreviewer2 *previewer = [[INSPreviewer2 alloc] init];
    previewer.displayDelegate = renderView;
    self.previewer = previewer;
}

// pano image playback 
- (void)playImageWithData:(NSData *)data {
    NSString *offset = nil;
    switch (_renderView.render.type) {
        // double fish eyes image
        case INSRenderTypeSphericalPanoRender: {
            INSImageInfoParser *parser = [[INSImageInfoParser alloc] initWithData: data];
            if ([parser open]) {
                offset = parser.offset;
            }
            break;
        }
        // flat pano image which has been stitched
        case INSRenderTypeSphericalRender:
            // do nothing
            break;
        default:
            break;
    }

    UIImage *image = [[UIImage alloc] initWithData:data];
    [_renderView playImage:image offset:offset];
}

// pano video playback
- (void)playVideoWithURLs:(NSArray<NSURL *> *)urls {
    NSTimeInterval duration = 0;
    CGFloat framerate = 0;
    NSString *offset = nil;
    NSInteger mediaFileSize = 0;

    INSVideoInfoParser *parser = [[INSVideoInfoParser alloc] initWithURLs:urls];
    if ([parser openFast]) {
        offset = parser.offset;
        duration = parser.duration;
        framerate = parser.framerate;
        mediaFileSize = parser.mediaFileSize;
    }

    // (The actual framerate of the video) / (Expected framerate for playback)
    CGFloat factor = framerate / 30;
    NSInteger durationMs = duration * 1000;
    INSTimeScale *timeScale = [[INSTimeScale alloc] initWithFactor:factor startTimeMs:0 endTimeMs:durationMs];

    INSFileClip *videoClip =
    [[INSFileClip alloc] initWithURLs:urls
                          startTimeMs:0
                            endTimeMs:durationMs
                   totalSrcDurationMs:durationMs
                           timeScales:@[timeScale]
                             hasAudio:YES
                        mediaFileSize:mediaFileSize
              videoTrackCount:parser.extraInfo.metadata.videoTrackCount
           reverseVideoTrackOrder:parser.extraInfo.metadata.reverseVideoTrackOrder];
    [_previewer setVideoSource:@[videoClip] bgmSource:nil videoSilent:NO];

    // you can set the playback begin time. default is 0.
    [_previewer prepareAsync:0];
    [_renderView playVideoWithOffset:offset];
}

Internal parameters

Use INSOffsetParser to get the internal parameters (INSOffsetParameter):

NSURL *url = #source url#
INSImageInfoParser *parser = [[INSImageInfoParser alloc] initWithURL:url];
if ([parser open]) {
    INSExtraInfo *extraInfo = parser.extraInfo;

    INSOffsetParser *offsetParser =
    [[INSOffsetParser alloc] initWithOffset:extraInfo.metadata.offset
                                      width:extraInfo.metadata.dimension.width
                                     height:extraInfo.metadata.dimension.height];
    for (INSOffsetParameter *param in offsetParser.parameters) {
        NSLog(@"Internal parameters: %@", param);
    }
}

OSC

The camera already supports the /osc/info and /osc/state commands. You can use these commands to get basic information about the camera and the features it supports.

Commands

Execute commands via the Open Spherical Camera API endpoint /osc/commands/execute.

You need to poll /osc/commands/status yourself to get the camera's execution status for the current command. The polling interval can be adjusted to your needs. Here is a sample that shows how to execute a camera.takePicture command.

#import <Foundation/Foundation.h>

NSDictionary *headers = @{ @"Content-Type": @"application/json",
                           @"X-XSRF-Protected": @"1",
                           @"Accept": @"application/json" };
NSDictionary *parameters = @{ @"name": @"camera.takePicture" };

NSData *postData = [NSJSONSerialization dataWithJSONObject:parameters options:0 error:nil];

NSURL *url = [NSURL URLWithString:@"#CameraHost#/osc/commands/execute"];
NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:url
                                                       cachePolicy:NSURLRequestUseProtocolCachePolicy
                                                   timeoutInterval:10.0];
[request setHTTPMethod:@"POST"];
[request setAllHTTPHeaderFields:headers];
[request setHTTPBody:postData];

NSURLSession *session = [NSURLSession sharedSession];
[[session dataTaskWithRequest:request
            completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
    if (error) {
        NSLog(@"%@", error);
    } else {
        NSHTTPURLResponse *httpResponse = (NSHTTPURLResponse *) response;
        NSLog(@"%@", httpResponse);
    }
}] resume];

Options

Take picture & Record

List files

You can use camera.listFiles to get the file list; a hedged request sketch follows.
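
A sketch following the camera.takePicture request pattern above; the parameter names (fileType / entryCount / maxThumbSize) come from the public Open Spherical Camera API level 2 specification:

NSDictionary *parameters = @{ @"name": @"camera.listFiles",
                              @"parameters": @{ @"fileType": @"all",
                                                @"entryCount": @10,
                                                @"maxThumbSize": @640 } };
// serialize `parameters` and POST to #CameraHost#/osc/commands/execute
// exactly as in the camera.takePicture sample, then poll /osc/commands/status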