AR Pointer Tracker
| Related Documents |
| --- |
| Tracker Coordinate System |
1. Definition
2. Video See-Through
2.1. ARPoint Activity Setting
2.2. Entire Code for ARPointTracker
1. Definition
ARPoint lets the user place augmented reality (AR) content at specific locations on objects viewed through the camera.
2. Video See-Through
ARPoint only supports Video See-Through.
2.1. ARPoint Activity Setting
- Create an Android Activity in ARPointTrackerActivity.java and add a Renderer. Retrieve the GLSurfaceView from the layout defined in the resource XML and configure it as shown below.
```java
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_vivar_tracker);

    arPointTargetRenderer = new ARPointTrackerRenderer(this);

    glSurfaceView = (GLSurfaceView) findViewById(R.id.gl_surface_view);
    glSurfaceView.setEGLContextClientVersion(2);
    glSurfaceView.setRenderer(arPointTargetRenderer);
    glSurfaceView.setOnTouchListener(this);

    findViewById(R.id.btn_find).setOnClickListener(this);

    arPointTargetRenderer.listener = resultListener;
}
```
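- The resultListener assigned above is not shown in this section. Below is a minimal sketch, assuming the renderer exposes a one-method listener whose sendData(String) receives each tracked anchor's name (the ResultListener type and its body here are hypothetical; only the sendData call appears in the sample code):

```java
// Hypothetical listener type; ARPointTrackerRenderer calls sendData() with
// each trackable's name during onDrawFrame (see Section 2.1).
public interface ResultListener {
    void sendData(String name);
}

// Log each reported trackable name on the UI thread.
private final ResultListener resultListener =
        name -> runOnUiThread(() -> Log.i("ARPointTrackerActivity", "tracked: " + name));
```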
- Start and stop the MaxstAR engine in the lifecycle callbacks of ARPointTrackerActivity.java.
```java
@Override
protected void onResume() {
    super.onResume();
    glSurfaceView.onResume();

    // Open the camera and start the ARPoint tracker.
    CameraDevice.getInstance().start(0, 1280, 720);
    TrackerManager.getInstance().startTracker(0x1000);
    MaxstAR.onResume();
}

@Override
protected void onPause() {
    super.onPause();
    glSurfaceView.onPause();

    // Stop the tracker and release the camera.
    CameraDevice.getInstance().stop();
    TrackerManager.getInstance().stopTracker();
    MaxstAR.onPause();
}
```
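- In CameraDevice.getInstance().start(0, 1280, 720) above, the first argument selects the camera (0 is presumably the default rear camera) and the remaining two request a 1280x720 preview resolution; adjust these to match your device.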
- Code to detect the surrounding environment in real time via a button in ARPointTrackerActivity.java.
```java
@Override
public void onClick(View view) {
    switch (view.getId()) {
        case R.id.btn_find:
            TrackerManager.getInstance().findSurface();
            break;
    }
}
```
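- findSurface() starts detecting the environment in front of the camera; once a surface is found, its base pose is reported through the "default" trackable handled in onDrawFrame below.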
- To use the tracking information, add the following code to the onDrawFrame function of ARPointTrackerRenderer.java. It computes the position of each trained Anchor from the pose tracked by the camera (the base pose delivered via the "default" trackable) and applies it to texturedCube.
```java
@Override
public void onDrawFrame(GL10 unused) {
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
    GLES20.glViewport(0, 0, surfaceWidth, surfaceHeight);

    TrackingState state = TrackerManager.getInstance().updateTrackingState();
    TrackingResult trackingResult = state.getTrackingResult();
    TrackedImage image = state.getImage();

    float[] projectionMatrix = CameraDevice.getInstance().getProjectionMatrix();
    float[] backgroundPlaneInfo = CameraDevice.getInstance().getBackgroundPlaneInfo();

    // Draw the camera image as the background, then the detected feature points.
    backgroundRenderHelper.drawBackground(image, projectionMatrix, backgroundPlaneInfo);

    featurePointRenderer.setProjectionMatrix(projectionMatrix);
    GuideInfo gi = TrackerManager.getInstance().getGuideInformation();
    featurePointRenderer.draw(gi, trackingResult);

    int trackingCount = trackingResult.getCount();
    if (trackingCount == 0) {
        return;
    }

    GLES20.glEnable(GLES20.GL_DEPTH_TEST);

    float[] basePose = new float[16];
    for (int i = 0; i < trackingCount; i++) {
        Trackable trackable = trackingResult.getTrackable(i);
        listener.sendData(trackable.getName());

        if (trackable.getName().compareTo("status") == 0) {
            // Skip the "status" trackable; it carries no drawable pose.
            continue;
        } else if (trackable.getName().compareTo("default") == 0) {
            // The "default" trackable provides the base pose of the tracked surface.
            System.arraycopy(trackable.getPoseMatrix(), 0, basePose, 0, 16);
            System.arraycopy(trackable.getPoseMatrix(), 0, lastPose, 0, 16);
            continue;
        }

        // Combine the base pose with the anchor's pose and draw the cube there.
        float[] pose = new float[16];
        System.arraycopy(trackable.getPoseMatrix(), 0, pose, 0, 16);
        float[] eachPose = new float[16];
        multiplyPose(eachPose, basePose, pose);

        texturedCube.setProjectionMatrix(projectionMatrix);
        texturedCube.setTransform(eachPose);
        texturedCube.draw();
    }
}
```
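- multiplyPose() is referenced above but not shown in this section. A minimal sketch, assuming the poses are 4x4 column-major matrices in the convention used by android.opengl.Matrix:

```java
import android.opengl.Matrix;

// Hypothetical implementation: concatenates the base (surface) pose with the
// anchor's pose, i.e. result = basePose * pose.
private void multiplyPose(float[] result, float[] basePose, float[] pose) {
    Matrix.multiplyMM(result, 0, basePose, 0, pose, 0);
}
```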
- Method to register the Anchor to be tracked, using the touch coordinates on the screen.
```java
@Override
public boolean onTouch(View v, final MotionEvent event) {
    float x = event.getX();
    float y = event.getY();

    switch (event.getAction()) {
        case MotionEvent.ACTION_DOWN: {
            Log.i("ARPointTrackerActivity", "touch action down");
            // Save the current pose; it becomes the search_pose for the new anchor.
            System.arraycopy(arPointTargetRenderer.lastPose, 0, lastPose, 0, 16);
            break;
        }
        case MotionEvent.ACTION_UP: {
            Log.i("ARPointTrackerActivity", "touch action up");

            // Convert the touch point to normalized coordinates.
            Point3f ncoord = arPointTargetRenderer.screenToLocal(x, y);
            anchorName = getAnchorName(true);

            // Serialize the saved pose as a comma-separated string.
            String strPose = "";
            for (int i = 0; i < 15; i++) {
                strPose += lastPose[i] + ",";
            }
            strPose += lastPose[15];

            String addCommand = "{\"vivar\":\"add_prev_anchor_normalized\",\"name\":\"" + anchorName
                    + "\",\"x_pos\":" + ncoord.x + ",\"y_pos\":" + ncoord.y
                    + ",\"search_pose\":\"" + strPose + "\",\"z_offset\":0.2}";
            TrackerManager.getInstance().addTrackerData(addCommand, false);
            Log.i("ARPointTrackerActivity", "add_prev_anchor_normalized command : " + addCommand);
            break;
        }
    }
    return true;
}
```
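- getAnchorName() is used above but not shown here. A minimal sketch, assuming it only needs to produce a unique name per anchor (the counter and name prefix are hypothetical):

```java
private int anchorCount = 0;

// Hypothetical helper: returns a fresh anchor name, optionally advancing the
// counter so each touch registers a distinct anchor.
private String getAnchorName(boolean increment) {
    if (increment) {
        anchorCount++;
    }
    return "anchor_" + anchorCount;
}
```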
- Code to normalize the touch location based on the screen size, used in the onTouch function above.
```java
public Point3f screenToLocal(float screenX, float screenY) {
    // Calculate normalized screen coordinates.
    float sw = surfaceWidth;
    float sh = surfaceHeight;
    float cw = camWidth;
    float ch = camHeight;

    // Work in landscape orientation: swap axes when the surface is portrait.
    float nw = (sw > sh) ? sw : sh;
    float nh = (sw > sh) ? sh : sw;
    float px = (sw > sh) ? screenX : screenY;
    float py = (sw > sh) ? screenY : nh - screenX;

    float rw = nw / cw;
    float rh = nh / ch;
    float margin_nh = 0.f;
    float margin_nw = 0.f;

    // Compensate for the margin between the camera and surface aspect ratios.
    if (rw > rh) {
        // cut height
        float vh = rw * ch;
        margin_nh = (vh - nh) * 0.5f;
        py += margin_nh;
        Log.i(TAG, "margin h = " + margin_nh);
    } else {
        // cut width
        float vw = rh * cw;
        margin_nw = (vw - nw) * 0.5f;
        px += margin_nw;
        Log.i(TAG, "margin w = " + margin_nw);
    }

    // Center the coordinates and scale both axes by the (landscape) width.
    float nx = (px - (nw * 0.5f + margin_nw)) / nw;
    float ny = (py - (nh * 0.5f + margin_nh)) / nw;
    return new Point3f(nx, ny, 1.0f);
}
```
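- For example, with a 1920x1080 surface and a 1280x720 camera preview, rw and rh are both 1.5, so no margin is applied: a touch at the screen center maps to (0, 0) and a touch at the top-right corner to (0.5, -0.28125), with z fixed at 1.0.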
- The command string that adds an Anchor at the touched location, used in the onTouch function in ARPointTrackerActivity.java.
```java
String addCommand = "{\"vivar\":\"add_prev_anchor_normalized\",\"name\":\"" + anchorName
        + "\",\"x_pos\":" + ncoord.x + ",\"y_pos\":" + ncoord.y
        + ",\"search_pose\":\"" + strPose + "\",\"z_offset\":0.2}";
```
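- In this command, name is the anchor's identifier, x_pos/y_pos are the normalized touch coordinates returned by screenToLocal, search_pose is the comma-separated 16-element pose saved at touch-down, and z_offset presumably offsets the anchor away from the surface.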
2.2. Entire Code for ARPointTracker
Below is the entire code for ARPointTrackerActivity.java and ARPointTrackerRenderer.java.