Object Fusion Tracker

Related documentation
Visual SLAM Tool
Tracker Coordinate System in Unity
Visual SLAM Learning Guide

The Object Fusion Tracker loads a map file and renders a 3D object on it. After the target is recognized and its initial pose is acquired through the MAXST SDK, ARCore/ARKit is used for tracking.
※ To use ARCore/ARKit, you must enter the actual size of the target. (See Start / Stop Tracker)

The biggest difference from the existing Object Tracker is how tracking is performed. The existing Object Tracker tracks the target through the frames coming from the RGB camera, so tracking is lost when the target leaves the camera frame or offers too few feature points. The Object Fusion Tracker, on the other hand, tracks through ARCore/ARKit, which learns the environment around the target, so tracking is maintained even when the target leaves the camera frame or has few feature points.

Please refer to the Visual SLAM Learning Guide to create a more precise map while scanning the 3D space.

Make Object Fusion Tracker Scene
Set Map
Add / Replace Map
Start / Stop Tracker
Use Tracking Information
Change Tracking Mode

Make Object Fusion Tracker Scene

  1. Install the MAXST AR SDK for Unity.

  2. Create the new scene.

  3. Delete the Main Camera that exists by default and add 'Assets > MaxstAR > Prefabs > ARCamera, ObjectTrackable' to the scene.

    objectPrefab

    ※ If you build, you must add a License Key to ARCamera.

  4. Create an empty object and add 'Assets > MaxstARSamples > Scripts > ObjectFusionTrackerSample' as a component.

    objectSample

  5. Place the map file created with the Visual SLAM scene in Assets > StreamingAssets > MaxstAR, and drag the map file to the Inspector of ObjectTrackable to set it. After entering the measured distance between the last two pins, press the Load button to set the map in the map viewer described later.
    ※ If the map file is not placed under the StreamingAssets folder, it will not be recognized.

    ObjectFusionMapSetting

  6. After the map file is loaded, you can see the pins overlaid on the trained object in the Scene View and Game View. You can check some of the pins inside the red rectangles in the figure below.

    ![after_load.PNG](https://mdpkcstorage.blob.core.windows.net/testpublic/slug/rf/20190227/e076e3e6694d45eeb8a34ba29757a40e.PNG)

  7. In MapViewer's Inspector, the 'Show Mesh' option is checked by default. Choose a keyframe where you can work easily by moving the slider bar.

    inspector.PNG

    ※ MapViewer

    • If you set a map file in ObjectTrackable, MapViewer is automatically created as a child of ObjectTrackable.

    • Keyframe Id: When creating a map, the camera looks at the target from various directions. A Keyframe is created for each direction that satisfies a certain condition. If you change the Keyframe Id, the Mesh / Image of that Keyframe is displayed in the Game View. By checking the Cube against each Keyframe, you can position it more precisely.

    • Show Mesh: If checked, the image displayed in the Game View changes to Mesh.

    • Transparent: If checked, Mesh / Image becomes transparent.

  8. Place a virtual object as a child of ObjectTrackable. 'maxst_cube' is placed by default.

  9. The position of a pin indicates where a virtual object should be located, as marked while creating the object map with the Visual SLAM Tool. Place virtual objects by referring to the pin positions.

    arrangement.PNG

  10. Connect a webcam to the PC and press the 'play' button. Point the webcam at the target object and check whether your virtual objects are located where you want them.

    augment.PNG

Set Map

Register the map file by calling AddTrackerData() and then call LoadTrackerData() so that the target can be tracked. To set a map, refer to the following code.

>ObjectFusionTrackerSample.cs

private void AddTrackerData()
{
    foreach (var trackable in objectTrackablesMap)
    {
        if (trackable.Value.TrackerDataFileName.Length == 0)
        {
            continue;
        }

        if (trackable.Value.StorageType == StorageType.AbsolutePath)
        {
            TrackerManager.GetInstance().AddTrackerData(trackable.Value.TrackerDataFileName);
        }
        else
        {
            if (Application.platform == RuntimePlatform.Android)
            {
                TrackerManager.GetInstance().AddTrackerData(trackable.Value.TrackerDataFileName, true);
            }
            else
            {
                TrackerManager.GetInstance().AddTrackerData(Application.streamingAssetsPath + "/" + trackable.Value.TrackerDataFileName);
            }
        }
    }

    TrackerManager.GetInstance().LoadTrackerData();
}

Add / Replace Map

  1. Create a map file by referring to the Visual SLAM Tool.

  2. Copy the received map file to the desired path.

  3. Set a map.

  4. If you already have a registered map file, call AddTrackerData() and LoadTrackerData() after calling TrackerManager.GetInstance().RemoveTrackerData(), as shown in the sketch below.
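
The following is a minimal sketch of replacing a previously loaded map. It assumes the new map file has already been copied under StreamingAssets; ReplaceTrackerData and newMapFileName are illustrative names, not part of the SDK.

private void ReplaceTrackerData(string newMapFileName)
{
    // Remove the currently registered map first.
    TrackerManager.GetInstance().RemoveTrackerData();

    // Register the new map file, following the same pattern as AddTrackerData() above.
    if (Application.platform == RuntimePlatform.Android)
    {
        TrackerManager.GetInstance().AddTrackerData(newMapFileName, true);
    }
    else
    {
        TrackerManager.GetInstance().AddTrackerData(Application.streamingAssetsPath + "/" + newMapFileName);
    }

    // Load the newly registered map.
    TrackerManager.GetInstance().LoadTrackerData();
}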

Start / Stop Tracker

TrackerManager.GetInstance().IsFusionSupported()
This function checks whether your device supports Fusion.
The return value is a bool. If true, the device in use is supported; if false, it is not.

TrackerManager.GetInstance().GetFusionTrackingState()
This function returns the tracking state of the current Fusion.
The return value is an int: -1 means tracking is not working properly, and 1 means it is working properly.
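
As a minimal sketch of how these two functions might be used (only the two calls above come from the SDK; the surrounding structure is illustrative):

void Start()
{
    // Only start the Fusion tracker on devices that support ARCore/ARKit Fusion.
    if (!TrackerManager.GetInstance().IsFusionSupported())
    {
        Debug.Log("Fusion is not supported on this device.");
        return;
    }
    ...
}

void Update()
{
    // -1: Fusion tracking is not working properly, 1: it is working properly.
    if (TrackerManager.GetInstance().GetFusionTrackingState() != 1)
    {
        Debug.Log("Fusion tracking is not ready yet.");
    }
    ...
}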

To start / stop Tracker after loading the map, refer to the following code.

>ObjectFusionTrackerSample.cs

void Start()
{
    ...
    CameraDevice.GetInstance().SetARCoreTexture();
    TrackerManager.GetInstance().StartTracker(TrackerManager.TRACKER_TYPE_OBJECT);
    TrackerManager.GetInstance().AddTrackerData("ObjectTarget/obj_1010_7.3dmap", true);
    TrackerManager.GetInstance().AddTrackerData("{\"object_fusion\":\"set_length\",\"object_name\":\"obj_1010_7\", \"length\":0.12}", true);
    TrackerManager.GetInstance().LoadTrackerData();
    ...
}

void OnApplicationPause(bool pause)
{
    ...
    TrackerManager.GetInstance().StopTracker();
    ...
}

void OnDestroy()
{
    TrackerManager.GetInstance().StopTracker();
    TrackerManager.GetInstance().DestroyTracker();
    ...
}

On the first call to AddTrackerData(), the path of the 3dmap must be passed as the parameter.
The second call to AddTrackerData() is used to set the actual size of the 3dmap. (Unit: m)
object_name is the file name of the 3dmap.
length is the actual distance between the two anchors you marked last when creating the 3dmap. (Please refer to the Visual SLAM Tool.)
You must enter the actual size of the target. If you do not enter the correct actual size, the content will not be augmented properly.
These must be called in the following order: StartTracker(), AddTrackerData(), LoadTrackerData().
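
If you need to build the size-setting string for a different map at runtime, you can format it from the file name and the measured length. MakeSetLengthJson below is a hypothetical helper, not part of the SDK:

// Hypothetical helper: builds the "set_length" JSON string for a given 3dmap.
// objectName is the 3dmap file name (e.g. "obj_1010_7"), lengthInMeters is the
// measured distance between the last two anchors (unit: m).
private string MakeSetLengthJson(string objectName, float lengthInMeters)
{
    // InvariantCulture keeps the decimal separator as '.' regardless of device locale.
    return "{\"object_fusion\":\"set_length\",\"object_name\":\"" + objectName +
           "\", \"length\":" + lengthInMeters.ToString(System.Globalization.CultureInfo.InvariantCulture) + "}";
}

// Usage, matching the sample above:
// TrackerManager.GetInstance().AddTrackerData(MakeSetLengthJson("obj_1010_7", 0.12f), true);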

Use Tracking Information

To use the tracking information, refer to the following code.

>ObjectFusionTrackerSample.cs

void Update()
{
    ...
    TrackingState state = TrackerManager.GetInstance().UpdateTrackingState();
    TrackingResult trackingResult = state.GetTrackingResult(); 

    for (int i = 0; i < trackingResult.GetCount(); i++)
    {
        Trackable trackable = trackingResult.GetTrackable(i);

        if (!objectTrackablesMap.ContainsKey(trackable.GetName()))
        {
            continue;
        }

        objectTrackablesMap[trackable.GetName()].OnTrackSuccess(trackable.GetId(), trackable.GetName(), trackable.GetPose());
    }
}
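
If you want to use the pose directly instead of passing it to OnTrackSuccess, the matrix can be decomposed into a position and a rotation. This is a sketch that assumes GetPose() returns a Unity Matrix4x4, as the sample's usage suggests; ApplyPose is an illustrative helper, not part of the SDK.

// Sketch: applying a tracked pose matrix to an arbitrary GameObject.
// 'pose' is the Matrix4x4 obtained from trackable.GetPose() in the loop above.
private void ApplyPose(Transform target, Matrix4x4 pose)
{
    target.position = pose.GetColumn(3);                                              // translation
    target.rotation = Quaternion.LookRotation(pose.GetColumn(2), pose.GetColumn(1));  // rotation from basis vectors
}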