Object Fusion Tracker iOS Tutorial

1. Overview
2. iOS Development
2.1 Create Instance
2.2 Start / Stop Tracker
2.3 Use Tracking Information
2.4 Set Map
2.5 Add / Replace Map
3. References
3.1 API Reference
3.2 Sample


1. Overview

Start developing the Object Fusion Tracker of the MAXST AR SDK on the iOS platform. Refer to Object Tracker Introduction for detailed information.

Generate a 3D map file (3dmap) of the target for the Object Tracker with the Visual SLAM Tool. The Object Tracker loads the map file and renders a 3D object on it. Refer to the Visual SLAM Learning Guide to create a more precise map while scanning 3D space.
Refer to Tracker Coordinate System to better understand the 3D coordinate system used by the Object Fusion Tracker.

After the target is recognized and an initial pose is acquired through the MAXST SDK, ARKit takes over tracking.

※ To use ARKit, you must enter the actual size of the target. (See 2.2 Start / Stop Tracker.)

Prerequisites
Object Tracker Introduction
Visual SLAM Tool
Tracker Coordinate System
Visual SLAM Learning Guide


2. iOS Development

Start developing in Xcode using Swift. Refer to Requirements & Supports to find out which devices are supported.

The AR SDK has to be properly integrated with an iOS UIViewController. Refer to the Life Cycle document for details.
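
For example, the pause / resume calls shown later in this tutorial can be tied to the view controller life cycle. This is a minimal sketch, assuming the resumeAR() and pauseAR() methods from section 2.2; their placement here is an illustration, not the definitive integration:

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    // Resume the camera and tracker when the view becomes visible again.
    resumeAR()
}

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)
    // Pause the tracker when the view goes away, releasing the camera.
    pauseAR()
}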


2.1 Create Instance

ObjectFusionTrackerViewController.swift

var trackingManager: MasTrackerManager = MasTrackerManager()
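
A minimal sketch of where this declaration lives, assuming the view controller class name used throughout this tutorial:

class ObjectFusionTrackerViewController: UIViewController {
    // Tracker manager instance used by every snippet in this tutorial.
    var trackingManager: MasTrackerManager = MasTrackerManager()
    // The sample also keeps a cameraDevice property (used in section 2.3)
    // and the Metal rendering objects here.
    ...
}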

2.2 Start / Stop Tracker

trackingManager.isFusionSupported()
This function checks whether the device in use supports Fusion.
The return value is a Bool: true means the device is supported; false means it is not.

trackingManager.getFusionTrackingState()
This function returns the current Fusion tracking state.
The return value is an Int: -1 means tracking is not working properly, and 1 means it is working properly.
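
For example, the two calls can be combined so that Fusion is only started on supported devices and its state is polled per frame. A minimal sketch using only the functions described above:

// Check once before starting the tracker.
if trackingManager.isFusionSupported() {
    trackingManager.start(.TRACKER_TYPE_OBJECT_FUSION)
} else {
    // Fusion is unavailable on this device; inform the user or fall back.
    print("Fusion is not supported on this device")
}

// Poll each frame, e.g. from the render loop.
let fusionState = trackingManager.getFusionTrackingState()
if fusionState != 1 {
    // -1: tracking is not working properly yet (e.g. still initializing).
}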

To start / stop the tracker after loading the map, refer to the following code.

ObjectFusionTrackerViewController.swift

func startEngine() {
    ...
    if trackingManager.isFusionSupported() {
        // Start the object fusion tracker.
        trackingManager.start(.TRACKER_TYPE_OBJECT_FUSION)
        // Register the 3dmap file bundled with the app.
        let objectTrackerMapPath: String = Bundle.main.path(forResource: "obj_190712_7", ofType: "3dmap", inDirectory: "data/SDKSample")!
        trackingManager.addTrackerData(objectTrackerMapPath)
        // Set the actual size (in meters) of the map target.
        trackingManager.addTrackerData("{\"object_fusion\":\"set_length\",\"object_name\":\"obj_190712_7\", \"length\":0.53}")
        trackingManager.loadTrackerData()
    }
}

@objc func resumeAR() {
    ...
    trackingManager.start(.TRACKER_TYPE_OBJECT_FUSION)
}

@objc func pauseAR() {
    trackingManager.stopTracker()
    ...
}

On the first call to addTrackerData(), pass the path of the 3dmap file as the parameter.
The second call to addTrackerData() sets the actual size of the 3dmap target. (Unit: m)
object_name is the file name of the 3dmap.
length is the actual distance between the two anchors you marked last when creating the 3dmap. (Refer to Visual SLAM Tool.)
You must enter the actual size of the target. If you do not enter the correct size, the content will not be augmented properly.
The functions must be called in the following order: start(), addTrackerData(), loadTrackerData().
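
Since the size command is a raw JSON string, a small helper can build it from the map name and the measured length. This helper is hypothetical and not part of the MAXST SDK; only the JSON keys come from the snippet above:

// Hypothetical helper, not part of the MAXST SDK: builds the JSON command
// that sets the real-world length (in meters) for a given 3dmap.
func makeSetLengthCommand(objectName: String, lengthInMeters: Double) -> String {
    return "{\"object_fusion\":\"set_length\",\"object_name\":\"\(objectName)\", \"length\":\(lengthInMeters)}"
}

// Equivalent to the second addTrackerData() call above:
trackingManager.addTrackerData(makeSetLengthCommand(objectName: "obj_190712_7", lengthInMeters: 0.53))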


2.3 Use Tracking Information

To use the tracking information, refer to the following code.

ObjectFusionTrackerViewController.swift

func draw(in view: MTKView) {
    ...
    
    // Update and fetch the latest tracking state for this frame.
    let trackingState: MasTrackingState = trackingManager.updateTrackingState()
    let result: MasTrackingResult = trackingState.getTrackingResult()
    
    // Draw the camera image as the background.
    let backgroundImage: MasTrackedImage = trackingState.getImage()
    let backgroundProjectionMatrix: matrix_float4x4 = cameraDevice.getBackgroundPlaneProjectionMatrix()
    
    let projectionMatrix: matrix_float4x4 = cameraDevice.getProjectionMatrix()
    
    if let cameraQuad = backgroundCameraQuad {
        cameraQuad.setProjectionMatrix(projectionMatrix: backgroundProjectionMatrix)
        cameraQuad.draw(commandEncoder: commandEncoder, image: backgroundImage)
    }
    
    // Draw a textured cube on each tracked target.
    let trackingCount: Int32 = result.getCount()
    
    for i in stride(from: 0, to: trackingCount, by: 1) {
        let trackable: MasTrackable = result.getTrackable(i)
        let poseMatrix: matrix_float4x4 = trackable.getPose()
        
        textureCube.setProjectionMatrix(projectionMatrix: projectionMatrix)
        textureCube.setPoseMatrix(poseMatrix: poseMatrix)
        textureCube.setTranslation(x: 0.0, y: 0.0, z: -0.15)
        textureCube.setScale(x: 0.3, y: 0.3, z: 0.3)
        textureCube.draw(commandEncoder: commandEncoder)
    }
    ...
}
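
When Fusion is in use, getFusionTrackingState() (section 2.2) can gate the augmentation so content is only drawn while tracking works properly. A minimal sketch of that check inside draw(in:); placing it this way is an assumption, not part of the sample:

// Inside draw(in:), after updating the tracking state: keep rendering the
// camera background, but only draw augmented content while Fusion reports
// that tracking is working properly (1).
if trackingManager.getFusionTrackingState() == 1 {
    for i in stride(from: 0, to: result.getCount(), by: 1) {
        let trackable: MasTrackable = result.getTrackable(i)
        textureCube.setPoseMatrix(poseMatrix: trackable.getPose())
        textureCube.draw(commandEncoder: commandEncoder)
    }
}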

2.4 Set Map

Register the map file by calling addTrackerData(), then call loadTrackerData() so the target can be tracked. To set a map, refer to the following code.

ObjectFusionTrackerViewController.swift

func startEngine() {
    ...
    trackingManager.addTrackerData(objectTrackerMapPath)
    trackingManager.loadTrackerData()
}

2.5 Add / Replace Map

※ You can add multiple 3dmaps to recognize one of several objects. We recommend adding up to three maps.

  1. Create a map file. (Refer to Visual SLAM Tool.)

  2. Copy the received map file to the desired path.

  3. Set a map.

  4. To replace an existing map, call addTrackerData() and loadTrackerData() after calling trackingManager.removeTrackerData(), as sketched below.
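
A minimal sketch of step 4; the wrapper function and its path parameter are illustrative, and only the three trackingManager calls come from this tutorial:

// Illustrative wrapper; not part of the MAXST SDK.
func replaceMap(with newMapPath: String) {
    // Remove the previously registered map data first.
    trackingManager.removeTrackerData()
    // Register and load the new map file.
    trackingManager.addTrackerData(newMapPath)
    trackingManager.loadTrackerData()
}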


3. References

These are additional references for developing the Object Fusion Tracker.


3.1 API Reference

The following documents explain the classes and functions used to run the Object Fusion Tracker.

MaxstAR Class

TrackerManager Class

CameraDevice Class


3.2 Sample

For information regarding building and running the Object Fusion Tracker sample, refer to the Sample document.
