Object Fusion Tracker

Related documentation
Visual SLAM Tool
Tracker Coordinate System in Unity
Visual SLAM Learning Guide

The Object Fusion Tracker loads a map file and renders a 3D object on it. After the target is recognized and the initial pose is acquired through the MAXST SDK, tracking is handed over to ARCore/ARKit.
※ To use ARCore/ARKit, you must enter the actual size of the target. (See Start / Stop Tracker.)

The biggest difference from the existing Object Tracker is how tracking is performed. The existing Object Tracker tracks using frames from the RGB camera, so tracking is lost if the target leaves the camera frame or offers too few feature points. The Object Fusion Tracker, on the other hand, tracks through ARCore/ARKit, which learns the environment around the target; tracking therefore continues even when the target leaves the camera frame or has few feature points.

Please refer to the Visual SLAM Learning Guide to create a more precise map while scanning 3D space.

Make Object Fusion Tracker Scene
Set Map
Add / Replace Map
Start / Stop Tracker
Use Tracking Information
Change Tracking Mode

Make Object Fusion Tracker Scene

  1. Install the MAXST AR SDK for Unity.

  2. Create a new scene.

  3. Delete the Main Camera that exists by default and add 'Assets > MaxstAR > Prefabs > ARCamera, ObjectTrackable' to the scene.

    objectPrefab

    ※ If you build an application, you must add a License Key to ARCamera.

  4. Create an empty object and add 'Assets > MaxstARSamples > Scripts > ObjectFusionTrackerSample' as a component.

    objectSample

  5. Place the map file created with the Visual SLAM scene in 'Assets > StreamingAssets > MaxstAR', then drag the map file to the Inspector of ObjectTrackable to set it. After entering the measured distance between the last two pins, press the Load button to set the map in the map viewer described later.
    ※ If the map file is not placed under the StreamingAssets folder, it is not recognized.

    ObjectFusionMapSetting

  6. After the map file is loaded, you can see pins overlaid on the trained object in the Scene View and Game View. Some of the pins are marked with red rectangles in the figure below.

    ![after_load.PNG](https://mdpkcstorage.blob.core.windows.net/testpublic/slug/rf/20190227/e076e3e6694d45eeb8a34ba29757a40e.PNG)

  7. In the MapViewer's Inspector, the 'Show Mesh' option is checked by default. Choose a keyframe you can work with easily by moving the slider bar.

    inspector.PNG

    ※ MapViewer

    • If you set a map file in ObjectTrackable, MapViewer is automatically created as a child of ObjectTrackable.

    • Keyframe Id: When creating a map, the camera looks at the target from various directions, and a Keyframe is created for each direction that satisfies a certain condition. If you change the Keyframe Id, the Mesh / Image of that Keyframe is displayed in the Game View. By checking the Cube's position in each Keyframe, you can place the Cube more precisely.

    • Show Mesh: If checked, the image displayed in the Game View changes to Mesh.

    • Transparent: If checked, Mesh / Image becomes transparent.

  8. Place a virtual object as a child of ObjectTrackable. 'maxst_cube' is placed as default.

  9. Right-click in the Hierarchy pane and select 'UI > Legacy > Text' to create a Text object, then type in the phrase "Move Your Device." Select the Object Fusion Tracker Sample object and drag the Text onto the Guide View field in its Inspector window.

  10. A pin's position marks where to locate a virtual object, as chosen while creating the object map with the Visual SLAM Tool. Place virtual objects with reference to the pin positions.

    arrangement.PNG

  11. Build and run the app. ARKit/ARCore require an initial spatial-recognition process: move the camera until the "Move Your Device" text disappears. Then point the camera at the target object, and the virtual content will be augmented where you placed it.

  12. ObjectFusionTracker does not support testing in the Unity Editor environment.

    ※ You can download the target Object from the Object Target link.

Set Map

Call AddTrackerData() to register the map file and LoadTrackerData() to load it; the space can then be tracked. To set a map, refer to the following code.

>ObjectFusionTrackerSample.cs

private void AddTrackerData()
{
    foreach (var trackable in objectTrackablesMap)
    {
        if (trackable.Value.TrackerDataFileName.Length == 0)
        {
            continue;
        }

        if (trackable.Value.StorageType == StorageType.AbsolutePath)
        {
            TrackerManager.GetInstance().AddTrackerData(trackable.Value.TrackerDataFileName);
        }
        else
        {
            if (Application.platform == RuntimePlatform.Android)
            {
                TrackerManager.GetInstance().AddTrackerData(trackable.Value.TrackerDataFileName, true);
            }
            else
            {
                TrackerManager.GetInstance().AddTrackerData(Application.streamingAssetsPath + "/" + trackable.Value.TrackerDataFileName);
            }
        }
    }

    TrackerManager.GetInstance().LoadTrackerData();
}

Note - Map Compatibility
Starting with Visual SLAM Tool 5.1.0, the file format for 3dmaps has changed. 3dmaps created with Visual SLAM Tool 5.1.0 or later are supported by AR SDK 6.2.x or later. If you want to continue using AR SDK 6.0.x or earlier, please use the Android version of Visual SLAM Tool 5.0.8.
Also, you can only add 3dmaps from the same version to Object Fusion Tracker.


Set Package

If you are using a map containing multiple objects, packaging it into a 3dpkg file greatly improves map loading speed and recognition speed.
Create a 3dpkg with the MapPackager, call AddTrackerData() to register the package file, and call LoadTrackerData() to make the target objects ready for tracking. See the following code for how to set up an object package.

private IEnumerator AddTrackerData()
{
    string packagePath = ... // The file must exist under StreamingAssets.

    yield return new WaitForEndOfFrame();

    if (Application.platform == RuntimePlatform.Android)
    {
        List<string> fileList = new List<string>();
        yield return StartCoroutine(MaxstARUtil.ExtractAssets(packagePath, fileList));
        TrackerManager.GetInstance().AddTrackerData(fileList[0], false);
    }
    else
    {
        TrackerManager.GetInstance().AddTrackerData(Application.streamingAssetsPath + "/" + packagePath);
    }

    TrackerManager.GetInstance().LoadTrackerData();
}

※ Object Package(3dpkg) is available from AR SDK 6.1.0.
※ Only 3dmaps created with Visual SLAM Tool 5.1.0 or later can be packaged as 3dpkg.
※ Only one 3dpkg can be added to Object Fusion Tracker.

Add / Replace Map

  1. Create a map file, referring to the Visual SLAM Tool.

  2. Copy the received map file to the desired path.

  3. Set a map.

  4. If you have an existing map file, call AddTrackerData() and LoadTrackerData() after calling TrackerManager.GetInstance().RemoveTrackerData().
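
The replacement flow above can be sketched as follows. `ReplaceMap` is a hypothetical helper name; the calls mirror the AddTrackerData()/LoadTrackerData() usage shown in Set Map, with a StreamingAssets-relative file name assumed.

```csharp
// Sketch of step 4: swap an already-loaded map for a new one.
// ReplaceMap is an illustrative helper, not part of the MAXST API.
private void ReplaceMap(string newMapFileName)
{
    // Remove the previously registered map data first.
    TrackerManager.GetInstance().RemoveTrackerData();

    // Register the new map (StreamingAssets-relative path on Android,
    // full path on other platforms), then load it.
    if (Application.platform == RuntimePlatform.Android)
    {
        TrackerManager.GetInstance().AddTrackerData(newMapFileName, true);
    }
    else
    {
        TrackerManager.GetInstance().AddTrackerData(
            Application.streamingAssetsPath + "/" + newMapFileName);
    }
    TrackerManager.GetInstance().LoadTrackerData();
}
```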

Start / Stop Tracker

TrackerManager.GetInstance().IsFusionSupported()
This function checks whether your device supports Fusion.
The return value is a bool: true if the device in use is supported, false if it is not.

TrackerManager.GetInstance().GetFusionTrackingState()
This function returns the current Fusion tracking state.
The return value is an int: -1 means tracking is not working properly, and 1 means it is working properly.
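
As a usage example, the two functions above might be combined like this. This is a minimal sketch: `FusionStateCheck` is a hypothetical class name, and the log message and guide-text comment are illustrative.

```csharp
using UnityEngine;

// Sketch: gate tracker startup on Fusion support, then poll the
// Fusion tracking state each frame (API names as documented above).
public class FusionStateCheck : MonoBehaviour
{
    void Start()
    {
        if (!TrackerManager.GetInstance().IsFusionSupported())
        {
            Debug.Log("This device does not support Fusion.");
            enabled = false; // skip Update() on unsupported devices
        }
    }

    void Update()
    {
        // -1: tracking is not working properly, 1: working properly
        if (TrackerManager.GetInstance().GetFusionTrackingState() != 1)
        {
            // e.g. keep showing the "Move Your Device" guide text
        }
    }
}
```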

To start / stop Tracker after loading the map, refer to the following code.

>ObjectFusionTrackerSample.cs

void Start()
{
    ...
    CameraDevice.GetInstance().SetARCoreTexture();
    TrackerManager.GetInstance().StartTracker(TrackerManager.TRACKER_TYPE_OBJECT);
    TrackerManager.GetInstance().AddTrackerData("ObjectTarget/obj_1010_7.3dmap", true);
    TrackerManager.GetInstance().AddTrackerData("{\"object_fusion\":\"set_length\",\"object_name\":\"obj_1010_7\", \"length\":0.12}", true);
    TrackerManager.GetInstance().LoadTrackerData();
    ...
}

void OnApplicationPause(bool pause)
{
    ...
    TrackerManager.GetInstance().StopTracker();
    ...
}

void OnDestroy()
{
    TrackerManager.GetInstance().StopTracker();
    TrackerManager.GetInstance().DestroyTracker();
    ...
}

In the first call to AddTrackerData(), the path of the 3dmap must be passed as the parameter.
The second call to AddTrackerData() sets the actual size of the 3dmap. (Unit: m)
object_name is the file name of the 3dmap.
length is the actual distance between the two anchors you marked last when creating the 3dmap. (Please refer to the Visual SLAM Tool.)
You must enter the actual size of the target. If you do not enter the correct size, the content will not be augmented properly.
The calls must be run in the following order: StartTracker(), AddTrackerData(), LoadTrackerData().

Use Tracking Information

To use the Tracking information, refer to the following code.

>ObjectFusionTrackerSample.cs

void Update()
{
    ...
    TrackingState state = TrackerManager.GetInstance().UpdateTrackingState();
    TrackingResult trackingResult = state.GetTrackingResult(); 

    for (int i = 0; i < trackingResult.GetCount(); i++)
    {
        Trackable trackable = trackingResult.GetTrackable(i);

        if (!objectTrackablesMap.ContainsKey(trackable.GetName()))
        {
            continue; // skip unknown trackables instead of aborting the loop
        }

        objectTrackablesMap[trackable.GetName()].OnTrackSuccess(trackable.GetId(), trackable.GetName(), trackable.GetPose());
    }
}