When Apple announced ARKit with the release of iOS 11, there was a lot of hype around it. You could place AR objects in the real world and look at them through your camera, and you could build apps that use ARKit for fun or for learning.
In this article, we’ll create an AR planetarium step by step. Something fun and interesting at the same time.
Step-by-step AR planetarium
Let’s start creating a new project. You can name it as you wish, I’m calling it ARPlanetarium. This project is going to use Storyboards.
Now that we have the starter project, let’s use the default ViewController that was created on the Main Storyboard.
Now we move to the Main Storyboard and add an ARSceneView (ARSCNView) to the empty ViewController. We can find the ARSceneView by pressing the plus button to open the list of view objects that UIKit provides.
When you find the ARSceneView, drag and drop it to the ViewController.
Now we have the ARSceneView on the ViewController. We have to add constraints so it covers the whole ViewController’s view, because we want the AR content to be shown in full screen.
In Apple’s words “The ARSCNView class provides the easiest way to create augmented reality experiences that blend virtual 3D content with a device camera view of the real world.“
For more information about the ARSCNView, you can read Apple’s documentation. The documentation about ARKit is very solid:
We have our scene where we can see all the AR content we want, but first, we need to set up the privacy settings in the Info.plist file. We need to add the “Privacy – Camera Usage Description” key. For the value, I’ll set “We need access to your camera to show the planetarium”.
You can use whatever message you think is appropriate.
Now if we run our app, we’ll see the alert requesting permission to use our device’s camera. We press accept and Ka-Chow! … There is nothing. The ARSceneView is rendering our device’s camera feed, but there is no AR content yet.
Let’s add some AR Objects to make it look great.
Create an Outlet that connects the ARSceneView with the ViewController. We also have to import ARKit to start adding AR Objects.
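The original snippet isn’t shown here, but a minimal sketch of the ViewController at this point could look like this (the outlet name `sceneView` is my choice):

```swift
import UIKit
import ARKit

class ViewController: UIViewController {

    // Connected to the ARSCNView we added in the Storyboard
    @IBOutlet weak var sceneView: ARSCNView!
}
```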
Now we need to set up the AR World Tracking configuration (ARWorldTrackingConfiguration)
According to Apple’s documentation ARWorldTrackingConfiguration object is:
“A configuration that monitors the iOS device’s position and orientation while enabling you to augment the environment that’s in front of the user.”
“World tracking provides 6 degrees of freedom tracking of the device. By finding feature points in the scene, world tracking enables performing hit-tests against the frame. Tracking can no longer be resumed once the session is paused.”
Now let’s add some debug options to verify that everything we add is working properly.
We’ll add the showWorldOrigin debug option so we can see the position of any AR object we add to the scene in an X, Y, Z space.
We will also turn on the feature points’ visibility.
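A sketch of the session setup with both debug options, assuming we run the session in viewWillAppear (as Xcode’s ARKit template does):

```swift
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    // Track the device's position and orientation in the real world
    let configuration = ARWorldTrackingConfiguration()

    // Show the world origin axes and the feature points ARKit detects
    sceneView.debugOptions = [ARSCNDebugOptions.showWorldOrigin,
                              ARSCNDebugOptions.showFeaturePoints]

    sceneView.session.run(configuration)
}
```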
What are feature points?
According to Apple, a feature point is:
“These points represent notable features detected in the camera image. Their positions in 3D world coordinate space are extrapolated as part of the image analysis that ARKit performs in order to accurately track the device’s position, orientation, and movement.“
Thanks to the feature points, an AR object keeps its position in the real world even if it is not being shown on the screen and even if we move long distances. (Hansel and Gretel would have loved this)
If we run our app we’ll see the feature points (yellow points) tracking the real world. But wait! Where is the world origin? The world origin is where your phone was when you ran the app. Move 1 step back and you’ll see the world’s origin.
Let’s add our first Celestial Object.
Before we begin, I want to clarify that the objects we are adding are Scene nodes (SCNNode).
Scene nodes represent a point in space: an object with a position, orientation, and scale. By itself, a node has no visible content; it’s up to us to attach any visual layer to make it look the way we want.
For more documentation about Scene Nodes, you can visit Apple’s documentation: SCNNode
First, create a function called “createPlanets”
Inside that function add the following code:
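The original snippet isn’t shown here; a minimal version would be:

```swift
func createPlanets() {
    // An empty node: a point in space with no visible content yet
    let sun = SCNNode()
}
```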
Now we have the node that corresponds to our Sun. It is a point in space so it is not visible by itself, but we can make it visible by assigning a geometry to our node. Since we want to add planets, we need our nodes to be spheres. Add the next code after the Sun node definition.
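A sketch of that line, keeping in mind that SceneKit units are meters:

```swift
// A sphere with a 30 cm radius
sun.geometry = SCNSphere(radius: 0.3)
```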
Now our node has a sphere geometry with a radius of 30 centimeters, giving our Sun a 60-centimeter diameter. When we add a geometry, a white texture is applied as the node’s default material.
Our node has geometry but it still needs a position in space on X, Y, Z coordinates.
Add the next code at the end of the function.
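A sketch of that line:

```swift
// 0 on X and Y, 1 meter in front of the origin on the Z axis
sun.position = SCNVector3(0, 0, -1)
```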
This means that the Sun node’s position will be 0 on the X and Y coordinates, but -1 meter from the origin on the Z coordinate.
You may be asking where the origin position, in other words the (0,0,0), for the Sun node is. Every node that is rendered in the AR world needs a parent node, and its X, Y, Z coordinates are relative to that parent node, which acts as its world origin.
For example, if the parent node of our Sun is 1 meter away from our camera (0, 0, -1), our Sun takes that position as its (0,0,0), and its relative position puts it 1 meter away from that parent node. As a result, the Sun would be 2 meters away from our camera when rendering starts.
We will set the Sun node as the child of the root node of the sceneView. When the AR Session begins, a root node is created. This root node represents the starting position of our device when our AR Session begins. This means the Sun will appear 1 meter away from our device when we start rendering.
Add the following code at the end of the createPlanets function.
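A sketch of that line:

```swift
// The root node represents the device's starting position
sceneView.scene.rootNode.addChildNode(sun)
```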
In the end, our function should look like this:
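A sketch of the full function, based on the steps above:

```swift
func createPlanets() {
    // A point in space with a visible 60 cm sphere attached
    let sun = SCNNode()
    sun.geometry = SCNSphere(radius: 0.3)

    // 1 meter in front of the world origin
    sun.position = SCNVector3(0, 0, -1)

    sceneView.scene.rootNode.addChildNode(sun)
}
```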
Override the viewWillAppear function and call the createPlanets function.
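A sketch of viewWillAppear at this point, including the session setup from earlier:

```swift
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    let configuration = ARWorldTrackingConfiguration()
    sceneView.debugOptions = [ARSCNDebugOptions.showWorldOrigin,
                              ARSCNDebugOptions.showFeaturePoints]
    sceneView.session.run(configuration)

    createPlanets()
}
```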
Now run the app.
Now we will see a white sphere 1 meter away from our camera. The camera always faces the negative Z direction when the AR session begins rendering, which is why we set 0 for the X and Y coordinates but -1 for the Z coordinate.
Let’s make our sphere look more like a Sun.
Go to the Solar System textures website, move to the Sun, Moon, and Stars section and download the Sun’s 2K assets.
Now move the Sun’s asset into the Xcode assets folder. Since we will download assets for the whole Solar System, I’ll create a folder per planet so I can find them easily.
Rename the Sun’s texture to “sunDiffuse”.
Now add the next line at the end of the createPlanets function.
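A sketch of that line:

```swift
sun.geometry?.firstMaterial?.diffuse.contents = UIImage(named: "sunDiffuse")
```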
Run the app.
There it is! A sun floating in the room.
We changed the diffuse material, which represents the color and light reflection of our node. The image textures provided on the website were created to be used as AR materials, which is why they fit perfectly on our node.
You can read more about SceneKit’s diffuse material property on Apple’s documentation: diffuse.
Using UIImage(named:) works, but it can be done better: we are going to create an enum that handles our images. Let’s create a new Swift file in the project root.
I’ll call my file PlanetImages.
Write the following code in the Swift file.
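A sketch of the enum; the `image` convenience property is my own assumption about how it is used later:

```swift
import UIKit

// Raw values match the image names in the asset catalog
enum PlanetImages: String {
    case sunDiffuse

    // Convenience accessor; returns nil if the asset is missing
    var image: UIImage? {
        UIImage(named: rawValue)
    }
}
```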
You can see this is a String enum and the cases have the same name as the images in the XCAssets.
Now we can change the code where we set the diffuse image to the Sun node.
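Using the hypothetical `image` property from the enum sketch, the line becomes:

```swift
sun.geometry?.firstMaterial?.diffuse.contents = PlanetImages.sunDiffuse.image
```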
Now our “createPlanets” function should look like this:
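A sketch of the function at this point:

```swift
func createPlanets() {
    let sun = SCNNode()
    sun.geometry = SCNSphere(radius: 0.3)
    sun.position = SCNVector3(0, 0, -1)
    sun.geometry?.firstMaterial?.diffuse.contents = PlanetImages.sunDiffuse.image

    sceneView.scene.rootNode.addChildNode(sun)
}
```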
If we keep going like this, we’ll repeat a lot of code, so let’s move the planet creation into the following function.
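A sketch of the helper, with parameter names assumed from how it’s described below (optional specular, normal, and emission defaulting to nil):

```swift
func createPlanet(geometry: SCNGeometry,
                  position: SCNVector3,
                  diffuse: PlanetImages,
                  specular: PlanetImages? = nil,
                  normal: PlanetImages? = nil,
                  emission: PlanetImages? = nil) -> SCNNode {
    // SceneKit's convenience initializer takes the geometry directly
    let node = SCNNode(geometry: geometry)
    node.position = position

    node.geometry?.firstMaterial?.diffuse.contents = diffuse.image
    node.geometry?.firstMaterial?.specular.contents = specular?.image
    node.geometry?.firstMaterial?.normal.contents = normal?.image
    node.geometry?.firstMaterial?.emission.contents = emission?.image

    return node
}
```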
Now we have a function that creates SCNNodes for us, with any geometry, relative position in space, and the textures we wish. You’ll notice that we are now using another initializer SceneKit provides for SCNNode that takes a geometry parameter, so we can initialize our nodes in one line.
Now let’s change the code in our createPlanets function.
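With the helper in place, createPlanets could be reduced to:

```swift
func createPlanets() {
    let sun = createPlanet(geometry: SCNSphere(radius: 0.3),
                           position: SCNVector3(0, 0, -1),
                           diffuse: .sunDiffuse)
    sceneView.scene.rootNode.addChildNode(sun)
}
```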
The Solar System Scope website only provides the diffuse image for the Sun, so we are not using the optional parameters specular, normal, and emission, which default to nil.
Adding the Earth
Add the following code at the end of the createPlanets function.
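A sketch of the Earth code; the `.earthDiffuse` case is a hypothetical addition to the PlanetImages enum, matching the downloaded asset’s name:

```swift
let earth = createPlanet(geometry: SCNSphere(radius: 0.2),
                         position: SCNVector3(1, 0, 0),
                         diffuse: .earthDiffuse)
// The Sun becomes the Earth's (0,0,0)
sun.addChildNode(earth)
```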
We have created another sphere node with a radius of 20 centimeters, and its position will be 1 meter away from the Sun on the X coordinate. Because we set the Earth as a child of the Sun node, the Sun is the (0,0,0) for the Earth node, so the Earth moves 1 meter on the X coordinate taking the Sun’s position as its origin.
Now run the app and you’ll see the Sun floating in front of you. Move your camera 1 meter to the right and you’ll see the earth.
The planet Earth looks good, but it can look better. Let’s add all the other materials to our node.
Change the call of the create node function for the Earth.
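A sketch of the updated call; the extra enum cases are hypothetical names matching the Earth textures downloaded from the website:

```swift
let earth = createPlanet(geometry: SCNSphere(radius: 0.2),
                         position: SCNVector3(1, 0, 0),
                         diffuse: .earthDiffuse,
                         specular: .earthSpecular,
                         normal: .earthNormal,
                         emission: .earthEmission)
sun.addChildNode(earth)
```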
We are using all the textures that were created for different types of color and light reflection.
Run the app and you’ll notice the planet Earth looks great! Now it has clouds and an atmosphere. Try adding the textures one by one and running the app to see what changes.
It looks great, but our planet is not reflecting light because we haven’t enabled the default lighting. Add the following code at the end of the viewDidLoad function.
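A sketch of that change:

```swift
override func viewDidLoad() {
    super.viewDidLoad()

    // Simulate a light source so our materials have something to reflect
    sceneView.autoenablesDefaultLighting = true
}
```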
Now run the app and look at that planet Earth. So perfect.
You see, even though the AR content is shown in the real world, those objects do not reflect real-world light (that would be awesome), so we need to simulate illumination so our objects reflect it and look great.
Let’s animate them
Add the following function.
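A sketch of the animation function; the name `rotate(node:duration:)` is my assumption. Note that `.toRadians` doesn’t compile yet — that’s the error mentioned next:

```swift
func rotate(node: SCNNode, duration: TimeInterval) {
    // A single 360° turn around the Y axis
    let rotation = SCNAction.rotateBy(x: 0,
                                      y: CGFloat(360.toRadians),
                                      z: 0,
                                      duration: duration)
    // Repeat the one-shot action forever
    node.runAction(SCNAction.repeatForever(rotation))
}
```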
You’ll notice an error about the .toRadians expression, we’ll fix it by adding an Int extension that converts Int degrees to radians.
Add the following code at the end of the viewController.
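A sketch of the extension:

```swift
extension Int {
    // Converts a value in degrees to radians
    var toRadians: Double {
        Double(self) * .pi / 180
    }
}
```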
We can animate nodes using SCNActions, reusable animations that change the structure of our nodes. In this case, we will make our nodes rotate around the Y axis, in other words, make a 360-degree rotation. An SCNAction runs once per call, which is why we use the SCNAction.repeatForever function, passing the animation we want to repeat as a parameter.
The SCNAction receives a TimeInterval variable that represents the time that the node will take to complete the action.
Now add the following code at the end of the createPlanets function.
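A sketch of that line, assuming the `rotate(node:duration:)` helper from above:

```swift
rotate(node: sun, duration: 15)
```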
With this, the sun will complete a 360º rotation in 15 seconds over the Y coordinate.
Now run the app.
Wow, the Sun is rotating! But something is wrong: the Earth is moving along with the Sun at the same speed, and we only wanted to rotate the Sun; we haven’t set an action for the Earth yet.
Here is the reason for that:
When you add a node, you set its coordinates to give it a position in space. These coordinates are relative to the parent node as the (0,0,0). If the parent node rotates, its coordinate axes rotate too: the X axis is no longer pointing to the right but in another direction, so the child nodes move to keep the SCNVector3 position you set.
It is like a merry-go-round. If every holder had a name and you were told to be always in front of the one with your name, you’ll have to move every time the merry-go-round rotates. (Just an example)
The solution for this is adding an invisible node that will be the parent node of the Earth node. The position of this invisible node will be the same as our Sun node, so we don’t have to change the SCNVector3 coordinates of our Earth.
First, let’s create a constant that represents our Sun’s position at the beginning of the createPlanets function.
Then replace the line where we set the Sun as the Earth’s parent, so that the Earth is added to this new parent node instead.
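A sketch of both changes, assuming the names used earlier:

```swift
// At the beginning of createPlanets
let sunPosition = SCNVector3(0, 0, -1)

// An invisible node at the Sun's position
let earthParent = SCNNode()
earthParent.position = sunPosition
sceneView.scene.rootNode.addChildNode(earthParent)

// The Earth keeps its coordinates, but no longer inherits the Sun's rotation
earthParent.addChildNode(earth)
```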
The Earth parent node will be in the same position as our Sun node and will be the Earth node origin. Nothing has changed for the Earth node in terms of position.
Now run the app.
Yes! The Earth stays still while the Sun is rotating.
Now let’s rotate our Earth.
Add the following code at the end of the function.
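A sketch of that line:

```swift
rotate(node: earth, duration: 8)
```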
Run the app.
Now our Earth is rotating, making a full rotation every 8 seconds. Looks good!
Now let’s make the Earth translate with the speed we want.
Add the following code at the end of the function.
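A sketch of that line:

```swift
// Rotating the invisible parent makes the Earth orbit the Sun
rotate(node: earthParent, duration: 5)
```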
We are using 5 seconds for the translation animation because I want the example to be quick so you can continue with the next steps; we can adjust the speed once everything is finished.
Run the app.
When the Earth’s parent node rotates, the Earth follows its parent’s changing coordinates, making it translate. Yes, just like the problem we had when the Sun was the Earth’s parent, but this time the Earth’s speed does not depend on the Sun’s speed.
Just to clarify, this is the size comparison of our Solar System.
We can make our Sun look bigger and the Earth smaller, it all depends on how much space you have to appreciate the planets.
I’ll reduce the radius size of the Earth to 5 centimeters. I’ll assign the same size to Mercury, Venus, and Mars, while Jupiter, Saturn, Uranus, and Neptune will have a 10 centimeters radius. This requires a lot of space in the room so use the sizes you think are good for you or just add the planets you want to see.
For Mercury, we have to repeat the steps above and create a function that creates parent nodes.
Now add the following helper function below the createPlanets function.
Now make the sunPosition constant from the beginning of the createPlanets function a global constant.
Now we can separate the creation of every planet. Now add the following function.
Don’t forget to call it at the end of the createPlanets function.
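A sketch of the parent-node helper and the Mercury function together; the function names and the `.mercuryDiffuse` case are my assumptions:

```swift
// Creates an invisible node at the Sun's position to drive a planet's orbit
func createParentNode() -> SCNNode {
    let parentNode = SCNNode()
    parentNode.position = sunPosition
    sceneView.scene.rootNode.addChildNode(parentNode)
    return parentNode
}

func createMercury() {
    let mercuryParent = createParentNode()
    let mercury = createPlanet(geometry: SCNSphere(radius: 0.05),
                               position: SCNVector3(0.45, 0, 0),
                               diffuse: .mercuryDiffuse)
    mercuryParent.addChildNode(mercury)

    rotate(node: mercury, duration: 5)        // rotation on its own axis
    rotate(node: mercuryParent, duration: 2)  // translation around the Sun
}
```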
Now run the app.
Mercury is 45 centimeters away from the Sun, it completes a rotation every 5 seconds and completes a translation every 2 seconds.
Now we can move the code that creates our Earth and our Sun to a function.
Now our “createPlanets” function looks clean.
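A sketch of how the Sun and Earth code might look once extracted into their own functions (the Earth radius is the reduced 5 centimeters from earlier):

```swift
func createSun() {
    let sun = createPlanet(geometry: SCNSphere(radius: 0.3),
                           position: sunPosition,
                           diffuse: .sunDiffuse)
    sceneView.scene.rootNode.addChildNode(sun)
    rotate(node: sun, duration: 15)
}

func createEarth() {
    let earthParent = createParentNode()
    let earth = createPlanet(geometry: SCNSphere(radius: 0.05),
                             position: SCNVector3(1, 0, 0),
                             diffuse: .earthDiffuse,
                             specular: .earthSpecular,
                             normal: .earthNormal,
                             emission: .earthEmission)
    earthParent.addChildNode(earth)

    rotate(node: earth, duration: 8)
    rotate(node: earthParent, duration: 5)
}
```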
The steps won’t change, except that we also have the specular image for Venus’s atmosphere.
Remember to call the function from the createPlanets function.
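A sketch of a Venus function following the same pattern; the distance, durations, and enum cases are placeholder assumptions:

```swift
func createVenus() {
    let venusParent = createParentNode()
    let venus = createPlanet(geometry: SCNSphere(radius: 0.05),
                             position: SCNVector3(0.7, 0, 0),
                             diffuse: .venusDiffuse,
                             specular: .venusSpecular)
    venusParent.addChildNode(venus)

    rotate(node: venus, duration: 6)
    rotate(node: venusParent, duration: 4)
}
```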
Run the app.
Now we have three planets moving in our Solar System. Let’s add the rest.
Mars to Neptune
From Mars to Neptune we only have diffuse images, so I’m going to skip the explanation for those planets since we already know how to show them.
The position of every planet will be 25 centimeters farther than the previous planet. The Earth is 1 meter away from the Sun, so Mars will be 1.25 meters away from the Sun, and so on.
At the end the createPlanets function should look like this:
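A sketch of the final orchestrating function, assuming one creation function per planet following the pattern above:

```swift
func createPlanets() {
    createSun()
    createMercury()
    createVenus()
    createEarth()
    createMars()
    createJupiter()
    createSaturn()
    createUranus()
    createNeptune()
}
```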
Now we have all the planets. You may notice I reduced the speed, now we can appreciate the AR models.
Saturn is not complete without its ring. The website that provides the textures has a Saturn ring texture.
Add the following code in the createSaturn function just after the Saturn node declaration.
Now your createSaturn function should look like this:
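A sketch of the function; Saturn’s distance follows the 25-centimeter spacing, while the ring radii, tilt angle, and durations are placeholder assumptions:

```swift
func createSaturn() {
    let saturnParent = createParentNode()
    let saturn = createPlanet(geometry: SCNSphere(radius: 0.1),
                              position: SCNVector3(1.75, 0, 0),
                              diffuse: .saturnDiffuse)
    saturnParent.addChildNode(saturn)

    // Saturn's ring: a flat tube centered on the planet
    let ring = SCNNode(geometry: SCNTube(innerRadius: 0.12,
                                         outerRadius: 0.2,
                                         height: 0.005))
    ring.geometry?.firstMaterial?.diffuse.contents = PlanetImages.saturnRing.image
    ring.position = SCNVector3(0, 0, 0)
    saturn.addChildNode(ring)

    // Tilt the ring so we can see it without moving above or below it
    ring.eulerAngles = SCNVector3(Float(45.toRadians), 0, 0)

    rotate(node: saturn, duration: 20)
    rotate(node: saturnParent, duration: 60)
}
```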
We add the node that represents Saturn’s ring. The geometry we need is SCNTube, which renders a tube (a cylinder with a hole through its center). SCNTube requires two radii: an inner radius for the hole and an outer radius for the size of the rendered object. SCNTube also requires a height, so we use 0.005 meters (5 millimeters) to make it look flat.
The position of the ring is (0, 0, 0) and we set Saturn as the ring’s parent. This means that the center of the ring is Saturn’s position in space.
For the last step, we rotate the ring so we can notice it easily. If we leave the ring at 0 degrees, you would have to move your camera below or above the ring to appreciate it.
Now run the app.
You may realize that I reduced Saturn’s translation so we can appreciate it in the video.
Let’s add the Moon to finish this tutorial.
Add the following at the end of the createEarth function
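A sketch of the Moon code inside createEarth; the radius, distance, duration, and the `.moonDiffuse` case are placeholder assumptions:

```swift
let moonParent = SCNNode()
moonParent.position = earth.position
earthParent.addChildNode(moonParent)

let moon = createPlanet(geometry: SCNSphere(radius: 0.02),
                        position: SCNVector3(0.12, 0, 0),
                        diffuse: .moonDiffuse)
moonParent.addChildNode(moon)

// Rotating the moonParent makes the Moon orbit the Earth
rotate(node: moonParent, duration: 3)
```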
We create an empty node called “moonParent” that will be the Moon’s parent node. The moonParent’s position will be the same as the Earth node’s. The moonParent also needs a parent node, so we set the earthParent node as its parent.
When the earthParent rotates, the moonParent translates along with the Earth, carrying the Moon with it. And when the moonParent, which sits at the Earth’s center, rotates, the Moon translates around the Earth.
This is how the Earth looks with a Moon.
Run the app to see the complete Solar System.
I reduced the size of all nodes so we can appreciate it in a small space. You can play with the radius and positions if you want to see a planet in high resolution. Just download the repo and play.
Here is the source code for this tutorial: ARKit Planetarium
ARKit provides us with the tools we need to create amazing things. It is widely used in the gaming and entertainment sectors, but it is also changing how businesses offer their products to provide a better customer experience, like showing how furniture will look in your room before you buy it.
Augmented Reality and Machine Learning are both aimed at the future, and they can be combined to create new ways of providing solutions to users. (Blog Spoiler Alert)
That’s all folks, see you next time for more Augmented Reality!
About the author
Oscar graduated in Informatics Engineering from Universidad de El Salvador and has over 4 years of experience in mobile development. He’s currently an iOS Developer at Applaudo Studios.