
Visual SLAM

Related documentation
Map Manager
Tracker Coordinate System
Visual SLAM Learning Guide

Visual SLAM (Simultaneous Localization and Mapping) creates a map by scanning 3D space.

Please refer to the Visual SLAM Learning Guide to create a more precise map while scanning 3D space.

Create Instances
Start / Stop Tracker
Start / Stop Map Creation & Map Saving
Draw Object by SceneKit

Create Instances

>SlamGameViewController.swift

var cameraDevice:MasCameraDevice = MasCameraDevice()
var sensorDevice:MasSensorDevice = MasSensorDevice()
var trackingManager:MasTrackerManager = MasTrackerManager()

Start / Stop Tracker

To start / stop the tracker, refer to the following code.

>SlamGameViewController.swift

@objc func resumeAR()
{
    ...
    trackingManager.start(.TRACKER_TYPE_SLAM)
}

@objc func pauseAR()
{
    trackingManager.stopTracker()
    ...
}
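In practice, resumeAR() and pauseAR() are usually tied to the application lifecycle so that tracking stops whenever the app leaves the foreground. A minimal, platform-neutral sketch of that wiring (the notification names and the LifecycleObserver class below are illustrative placeholders, not SDK types; on iOS you would observe UIApplication.didBecomeActiveNotification and UIApplication.willResignActiveNotification, and the handlers would call trackingManager as shown above):

```swift
import Foundation

// Placeholder notification names; on iOS use
// UIApplication.didBecomeActiveNotification / willResignActiveNotification.
let didBecomeActive = Notification.Name("didBecomeActive")
let willResignActive = Notification.Name("willResignActive")

final class LifecycleObserver {
    private(set) var resumeCount = 0
    private(set) var pauseCount = 0

    init() {
        let center = NotificationCenter.default
        // Resume tracking when the app becomes active...
        center.addObserver(forName: didBecomeActive, object: nil, queue: nil) { [weak self] _ in
            self?.resumeAR()
        }
        // ...and stop it when the app is about to resign active.
        center.addObserver(forName: willResignActive, object: nil, queue: nil) { [weak self] _ in
            self?.pauseAR()
        }
    }

    func resumeAR() {
        resumeCount += 1
        // Here: trackingManager.start(.TRACKER_TYPE_SLAM)
    }

    func pauseAR() {
        pauseCount += 1
        // Here: trackingManager.stopTracker()
    }
}
```

A real view controller would also remove the observers in deinit; the counters here exist only to make the wiring observable.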

Start / Stop Map Creation & Map Saving

  • To start map generation, refer to the following code.
trackingManager.findSurface()
  • To stop map generation, refer to the following code.
trackingManager.quitFindingSurface()
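Since findSurface() and quitFindingSurface() form a start/stop pair, a sample app typically drives them from a single toggle (for example, a "Start / Stop" button). A minimal sketch, where SurfaceTracker and MapCreationToggle are illustrative stand-ins rather than SDK types:

```swift
import Foundation

// Illustrative protocol standing in for the two MasTrackerManager calls;
// not part of the MAXST SDK.
protocol SurfaceTracker {
    func findSurface()
    func quitFindingSurface()
}

// Hypothetical helper that tracks the current state so the start and
// stop calls are never issued twice in a row.
final class MapCreationToggle {
    private let tracker: SurfaceTracker
    private(set) var isCreatingMap = false

    init(tracker: SurfaceTracker) {
        self.tracker = tracker
    }

    func toggle() {
        if isCreatingMap {
            tracker.quitFindingSurface()
        } else {
            tracker.findSurface()
        }
        isCreatingMap.toggle()
    }
}
```

In the sample app, trackingManager would conform to this role and the button handler would simply call toggle().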

Draw Object by SceneKit

You must use the Metal conversion matrix ('metalConverter').

>SlamGameViewController.swift

func renderer(_ renderer: SCNSceneRenderer, willRenderScene scene: SCNScene, atTime time: TimeInterval) {
        
    // Create the background quad once, using the renderer's Metal device.
    if backgroundCameraQuad == nil {
        backgroundCameraQuad = BackgroundCameraQuad(device: renderer.device, pixelFormat: renderer.colorPixelFormat)
    }
    
    let commandEncoder:MTLRenderCommandEncoder = renderer.currentRenderCommandEncoder!
    
    // Update tracking and fetch the latest result.
    let trackingState:MasTrackingState = trackingManager.updateTrackingState()
    let result:MasTrackingResult = trackingState.getTrackingResult()
    
    let surfaceMesh:MasSurfaceMesh! = trackingManager.getSurfaceMesh()
    
    if surfaceMesh != nil {
        // getInitializingProgress() returns a percentage, so scale it to 0.0 - 1.0.
        let value:Float = surfaceMesh.getInitializingProgress() * 0.01
        DispatchQueue.main.async(execute: {
            self.progressBar.setProgress(value, animated: false)
        })
    }
    
    let backgroundImage:MasTrackedImage = trackingState.getImage()
    
    // Convert the background projection matrix for Metal's clip space.
    var backgroundProjectionMatrix:matrix_float4x4 = cameraDevice.getBackgroundPlaneProjectionMatrix()
    
    backgroundProjectionMatrix = backgroundProjectionMatrix * metalConverter
    
    let projectionMatrix:matrix_float4x4 = cameraDevice.getProjectionMatrix()
    
    // Draw the camera image as the scene background.
    if let cameraQuad = backgroundCameraQuad {
        if(backgroundImage.getData() != nil) {
            cameraQuad.setProjectionMatrix(projectionMatrix: backgroundProjectionMatrix)
            cameraQuad.draw(commandEncoder: commandEncoder, image: backgroundImage)
        }
    }
    
    let trackingCount:Int32 = result.getCount()
    
    if trackingCount > 0 {
        // Apply the tracked pose to the content node.
        for i in stride(from: 0, to: trackingCount, by: 1) {
            let trackable:MasTrackable = result.getTrackable(i)

            let projectionSCNMatrix4 = SCNMatrix4.init(projectionMatrix)
            let poseSCNMatrix4 = SCNMatrix4.init(trackable.getPose())
            cameraNode.camera?.projectionTransform = projectionSCNMatrix4
            monsterNode.transform = SCNMatrix4Mult(monsterTransform, poseSCNMatrix4)
        }
    }
    else {
        // Hide the content when tracking is lost.
        monsterNode.scale = SCNVector3Make(0.0, 0.0, 0.0)
    }
}
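The 'metalConverter' multiplication above compensates for the difference in clip spaces: an OpenGL-style projection maps depth to [-1, 1], while Metal's normalized device z range is [0, 1]. A common conversion leaves x, y, and w unchanged and remaps z' = 0.5·z + 0.5·w. A minimal, SDK-independent sketch of that idea (the matrix layout and multiplication order below follow a row-major, column-vector convention and may differ from the SDK's internal convention):

```swift
import Foundation

// 4x4 matrices stored row-major as [[Double]]; vectors as [x, y, z, w].
func multiply(_ m: [[Double]], _ v: [Double]) -> [Double] {
    (0..<4).map { r in (0..<4).reduce(0.0) { $0 + m[r][$1] * v[$1] } }
}

// Remaps clip-space z from OpenGL's [-1, 1] to Metal's [0, 1]:
// z' = 0.5 * z + 0.5 * w; x, y, and w pass through unchanged.
let glToMetal: [[Double]] = [
    [1, 0, 0,   0],
    [0, 1, 0,   0],
    [0, 0, 0.5, 0.5],
    [0, 0, 0,   1],
]
```

Multiplying the OpenGL near-plane point (0, 0, -1, 1) yields z = 0, and the far-plane point (0, 0, 1, 1) yields z = 1, matching Metal's expected depth range.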

For Visual SLAM, the landscape-left screen orientation is recommended.