ARCore with Kotlin

By applaudo
2020-08-13

An introduction to Augmented Reality using ARCore and Kotlin.

Hello guys! This is my first time around here; until now, I was just someone hunting for good gists and libraries for my projects. I hope this article will be useful for you as an introduction to AR using ARCore and Kotlin.

Now that I have your attention: I want to start with two examples whose code I will explain in this post, but first I'll help you find some key knowledge and give you my interpretation of each point.

First of all, we will follow a short travel guide through the setup and the code.

But first…

What is ARCore, truly?

According to the official site, ARCore is an SDK that provides native APIs for all of the essential AR features, like motion tracking, environmental understanding, and light estimation.

So, let’s break down the important things:

Motion Tracking

Motion tracking is used to estimate where the phone is located. It tracks the device's movement and uses a process called concurrent odometry and mapping to estimate where the phone is relative to the world around it. It combines visual information from the camera with inertial measurements from the device to estimate its orientation and position.
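To make this concrete, here is a tiny sketch (my own, not from the example repo) of how you can read the pose that motion tracking estimates for the camera, assuming you already have a Frame from an active ARCore session:

```kotlin
import com.google.ar.core.Frame

// Sketch: read the camera pose that motion tracking estimates for the current frame
fun logCameraPose(frame: Frame) {
    val pose = frame.camera.pose
    // Translation of the device, in meters, relative to the world ARCore is tracking
    println("Camera at x=${pose.tx()}, y=${pose.ty()}, z=${pose.tz()}")
}
```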

Environmental understanding

ARCore looks for clusters of feature points that appear to lie on common horizontal and vertical surfaces, such as tables and walls, and makes these surfaces available to your app as planes.
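As a small illustration (again my own sketch, assuming an active ARCore Session), you can ask the session for the planes it is currently tracking:

```kotlin
import com.google.ar.core.Plane
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

// Sketch: list the horizontal and vertical surfaces ARCore currently exposes as planes
fun logTrackedPlanes(session: Session) {
    for (plane in session.getAllTrackables(Plane::class.java)) {
        if (plane.trackingState == TrackingState.TRACKING) {
            println("Plane ${plane.type}, approx. area ${plane.extentX * plane.extentZ} m²")
        }
    }
}
```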

Light estimation 

ARCore can detect information about the lighting of its environment and provide you with the average intensity and color correction of a given camera image.
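Here is a minimal sketch (my own, assuming a Frame from a session configured for ambient intensity light estimation) of how that information can be read:

```kotlin
import com.google.ar.core.Frame
import com.google.ar.core.LightEstimate

// Sketch: read the average pixel intensity and color correction for the current frame
fun logLightEstimate(frame: Frame) {
    val estimate = frame.lightEstimate
    if (estimate.state == LightEstimate.State.VALID) {
        val colorCorrection = FloatArray(4)
        estimate.getColorCorrection(colorCorrection, 0) // RGB scale factors plus average intensity
        println("Intensity: ${estimate.pixelIntensity}, correction: ${colorCorrection.joinToString()}")
    }
}
```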

These are the most important key points, but if you need more information, the official documentation covers some other things that I think are important to know.

And that’s it. Easy right? Without further ado, let’s begin!

What are we gonna need?

Before we fully begin, I want to say thank you for taking the time to look at my code. I'd love to have some feedback at the end of the post or wherever you would like to leave it.

Up next is my survival guide for the Gradle file, the Manifest, and a basic example of ARCore. I'll post the important parts or key lines, and I'll share the repo with the full example.

1. Gradle and Implementation

a) It's required to enable the Sceneform Gradle plugin.

b) Enable Java 8 language support, which Sceneform requires.

c) Add the ARCore Sceneform UX dependency. As I said at the beginning, ARCore provides several APIs for different goals. For example, we are going to use only anchors and trackables to render models at a point with x, y, z coordinates. However, if you want to read more about other APIs, here is the official site about all of this. A minimal Gradle sketch follows below.
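Here is a minimal sketch of the app-level build.gradle covering these three points; the Sceneform version (1.15.0) is an assumption and may differ from the one used in the example repo:

```groovy
// app/build.gradle (sketch; version numbers are assumptions)
apply plugin: 'com.android.application'
apply plugin: 'kotlin-android'
// a) Sceneform Gradle plugin (needs classpath 'com.google.ar.sceneform:plugin:1.15.0'
//    in the project-level build.gradle)
apply plugin: 'com.google.ar.sceneform.plugin'

android {
    // b) Sceneform requires Java 8 language features
    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
}

dependencies {
    // c) ARCore Sceneform UX: brings in ArFragment, TransformableNode, ModelRenderable, etc.
    implementation 'com.google.ar.sceneform.ux:sceneform-ux:1.15.0'
}
```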

2. Permission and feature filters

a) Camera permission and uses-feature for the Play Store filter.

We need to request the camera permission and restrict the app to devices with AR support by adding the uses-feature element. Here is more information about this filter.

b) Meta-data for Google ARCore:

We need to add this meta-data entry so the app declares that ARCore (Google Play Services for AR) is required.

And here we have the final result for the Manifest:
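Here is roughly what the relevant parts can look like; the package name and activity are placeholders:

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.arcoredemo"> <!-- placeholder package name -->

    <!-- a) Camera permission, required by ARCore -->
    <uses-permission android:name="android.permission.CAMERA" />
    <!-- a) Play Store filter: only devices that support ARCore can install the app -->
    <uses-feature android:name="android.hardware.camera.ar" android:required="true" />

    <application
        android:label="@string/app_name"
        android:theme="@style/AppTheme">

        <!-- b) Declare that ARCore (Google Play Services for AR) is required -->
        <meta-data android:name="com.google.ar.core" android:value="required" />

        <activity android:name=".MainActivity">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />
                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>
    </application>
</manifest>
```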

Before we continue, we will need to install an Android Studio plugin (Google Sceneform Tools) that is necessary to import the OBJ assets for this example.

Unfortunately, the plugin is a beta tool, so Android Studio sometimes crashes with some assets, but here are 2 assets that worked for me. We need the OBJ files to generate the *.sfa and *.sfb files (here is more about assets).

a. Big Lamp Post

b. Couch

3. Now we have our Android Studio and project ready to import the assets and start with some code

What I did was add the OBJ files as sample data to my app and then import each file with the Sceneform plugin.

This will generate the *.sfa and *.sfb files for each model.

It will also add the path to the generated assets to our Gradle file. Here is additional info about importing assets.
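For reference, this is roughly the kind of entry the plugin appends to the app-level build.gradle for each imported model (file names here match the couch asset and are otherwise examples):

```groovy
// Added by the Sceneform import tool for each model (paths are examples)
sceneform.asset('sampledata/models/couch.obj', // source OBJ
        'default',                             // material
        'sampledata/models/couch.sfa',         // generated, human-readable description
        'src/main/assets/couch.sfb')           // generated binary loaded at runtime
```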

4. Layout

For the layout, we will use the Sceneform fragment (ArFragment).

It will also include a gallery to handle multiple models, as shown in the example video.
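A minimal sketch of the activity layout; the fragment id sceneform_fragment is an assumption reused in the Main Activity below (the gallery from the video is left out to keep it short):

```xml
<!-- activity_main.xml (sketch) -->
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <!-- Sceneform fragment that renders the camera feed and the AR scene -->
    <fragment
        android:id="@+id/sceneform_fragment"
        android:name="com.google.ar.sceneform.ux.ArFragment"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />
</FrameLayout>
```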

5. Create and add a node to the scene to place objects in the real world

a) This function is used to create a node and attach it to the scene as a child (see the sketch after the parameter list). We will also need some parameters:

fragment: The Sceneform fragment

anchor: An anchor as param

renderableObject: An object generated with the renderable builder class
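A minimal sketch of what this function can look like with the Sceneform UX classes from step 1; the exact implementation in the example repo may differ:

```kotlin
import com.google.ar.core.Anchor
import com.google.ar.sceneform.AnchorNode
import com.google.ar.sceneform.rendering.Renderable
import com.google.ar.sceneform.ux.ArFragment
import com.google.ar.sceneform.ux.TransformableNode

private fun addNodeToScene(fragment: ArFragment, anchor: Anchor, renderableObject: Renderable) {
    // The AnchorNode pins the model to the real-world position of the anchor
    val anchorNode = AnchorNode(anchor)
    // A TransformableNode lets the user move, rotate, and scale the model with gestures
    val node = TransformableNode(fragment.transformationSystem).apply {
        renderable = renderableObject
        setParent(anchorNode)
    }
    // Attach the anchor node (with the model as its child) to the scene and select it
    fragment.arSceneView.scene.addChild(anchorNode)
    node.select()
}
```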

b) Function to attach a model to an anchor. This uses a builder from the Sceneform API to create a renderable object that is then attached to the scene as a child node through addNodeToScene (see the sketch after the parameter list).

fragment: The sceneform Fragment

anchor: An anchor as param

modelUri: URI with the *.sfb file path
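Here is a possible version of that function, again as a sketch; the name placeObject is mine, and the error handling is just one way to surface loading failures:

```kotlin
import android.net.Uri
import android.widget.Toast
import com.google.ar.core.Anchor
import com.google.ar.sceneform.rendering.ModelRenderable
import com.google.ar.sceneform.ux.ArFragment

private fun placeObject(fragment: ArFragment, anchor: Anchor, modelUri: Uri) {
    // Build a ModelRenderable from the *.sfb asset asynchronously
    ModelRenderable.builder()
        .setSource(fragment.requireContext(), modelUri)
        .build()
        .thenAccept { renderable ->
            // Once the model is loaded, attach it to the tapped anchor
            addNodeToScene(fragment, anchor, renderable)
        }
        .exceptionally { throwable ->
            Toast.makeText(fragment.requireContext(), "Unable to load model: ${throwable.message}", Toast.LENGTH_SHORT).show()
            null
        }
}
```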

6. OnCreate for the Main Activity

As I mentioned before, this is a starting guide or story. It's only a first view of what we can do using ARCore APIs with Kotlin.

This is the onCreate function for the main activity, and it is where we wire up everything for the ArFragment. With this flow, we can add an OBJ model to a plane that is facing upward. For that, we will validate the type of plane and add a listener for taps on the screen.

a) Init the ArFragment

b) Init the model URI with the model OBJ that we will use

c) Add a listener to the ArFragment to handle taps. Thanks to this, we can check where in 3D space the user touched: which plane was hit and the exact x, y, z coordinates of the touched point.

The parameters will be:

hitResult contains the info about where the ray hits a trackable object, in this case a Plane, so we will use the position of the hitResult to create an anchor.

And we can validate whether the plane we touched is facing upward.

If it's not, we simply return from the tap listener without placing anything.

We have the final result for the Main Activity like this:
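Putting it together, here is a sketch of the Main Activity; the layout id, the couch.sfb file name, and the helper functions from step 5 are assumptions carried over from the previous sections:

```kotlin
import android.net.Uri
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity
import com.google.ar.core.HitResult
import com.google.ar.core.Plane
import com.google.ar.sceneform.ux.ArFragment

class MainActivity : AppCompatActivity() {

    private lateinit var arFragment: ArFragment

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)

        // a) Init the ArFragment declared in the layout
        arFragment = supportFragmentManager.findFragmentById(R.id.sceneform_fragment) as ArFragment

        // b) Init the model URI with the *.sfb generated from the OBJ
        val modelUri = Uri.parse("couch.sfb")

        // c) Handle taps on detected planes
        arFragment.setOnTapArPlaneListener { hitResult: HitResult, plane: Plane, _ ->
            // Only place the model on planes that are facing upward
            if (plane.type != Plane.Type.HORIZONTAL_UPWARD_FACING) {
                return@setOnTapArPlaneListener
            }
            // Create an anchor where the ray hit the plane and attach the model there
            placeObject(arFragment, hitResult.createAnchor(), modelUri)
        }
    }
}
```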

As you can see, this is a normal ArFragment used to handle taps on the screen, to check whether a surface has a plane, and to validate whether that plane is facing upward or not. We have a lot more things to do. For example, I have also included a guide on how to use Augmented Images and an image database, so that a 2D image can trigger a 3D model rendered over it.
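As a small taste of what that involves (not covered in this post, and purely a sketch of the ARCore session configuration; the image name and bitmap are placeholders), an augmented image database can be attached to the session config like this:

```kotlin
import android.graphics.Bitmap
import com.google.ar.core.AugmentedImageDatabase
import com.google.ar.core.Config
import com.google.ar.core.Session

// Sketch: register a 2D reference image so ARCore can detect it and let us anchor a 3D model to it
fun configureAugmentedImages(session: Session, referenceImage: Bitmap) {
    val config = Config(session)
    val imageDatabase = AugmentedImageDatabase(session)
    imageDatabase.addImage("reference_image", referenceImage) // name is a placeholder
    config.augmentedImageDatabase = imageDatabase
    session.configure(config)
}
```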

I hope this has been helpful to you and if you have any feedback for my post or my code I’ll really appreciate your time.

About the author

Jose Arteaga

A Computer Science Engineering graduate from Universidad Don Bosco, Jose has over 8 years of experience in Software Development. He's currently an Android Developer at Applaudo Studios.