Creating Mobile AR experiences with ARCore
Posted By : Daljeet Singh | 05-May-2020
The Emerging Mobile AR Market
Augmented Reality (AR) has become one of the most prominent technologies for meeting the rapidly growing demand for immersive, interactive experiences of the real world on a virtual device. AR works by superimposing digital media such as images, videos, text and sound onto a user's view of the physical world.
It is therefore important for mobile application developers to embrace the AR revolution and start designing their applications around it to provide Augmented Reality App Development Services. Some of the modern SDKs and tools available to mobile developers for building AR-rich applications are:
- ARKit
- ARCore
- Vuforia
- Unity
While Vuforia and Unity have steep learning curves from the perspective of an iOS/Android mobile developer, the ARKit SDK launched by Apple only supports iOS devices for now. The ARCore SDK introduced by Google provides native support for both Android and iOS devices, making it a viable choice for building AR experiences inside native mobile apps.
ARCore
According to the official Google docs, the ARCore SDK utilizes three core capabilities to render a virtual world on top of the real world as seen through a user's phone camera:
- Motion Tracking: This enables the device to understand and track its position relative to the world, which is achieved through a process known as Concurrent Odometry and Mapping (COM). Motion tracking uses the phone's camera to recognize visually distinct points, known as feature points, and tracks how these points move over time.
- Environmental Understanding: This enables the phone to detect the location and size of all types of surfaces, be they vertical, horizontal or angled. ARCore observes clusters of feature points that lie on vertical or horizontal surfaces and makes these surfaces available to our application in the form of planes. It can also determine a plane's bounds and relay that information to our application.
- Light Estimation: This enables the phone to detect the current lighting conditions of the environment, giving us the average intensity and color correction of a given camera image. This information can be used to light our virtual objects under the same lighting conditions as the real-world environment, thereby amplifying the sense of realism. A short sketch after this list shows how these capabilities surface in the ARCore API.
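As a minimal sketch, the snippet below shows how an app might query these three capabilities through the ARCore Java API on each frame. The session setup and render loop are assumed to exist elsewhere, and inspectFrame() is a hypothetical helper name used only for illustration:

import com.google.ar.core.Camera;
import com.google.ar.core.Frame;
import com.google.ar.core.LightEstimate;
import com.google.ar.core.Plane;
import com.google.ar.core.Session;
import com.google.ar.core.TrackingState;
import com.google.ar.core.exceptions.CameraNotAvailableException;

void inspectFrame(Session session) throws CameraNotAvailableException {
    Frame frame = session.update(); // grab the latest camera frame

    // Motion tracking: the camera pose is only meaningful while TRACKING.
    Camera camera = frame.getCamera();
    boolean isTracking = camera.getTrackingState() == TrackingState.TRACKING;

    // Environmental understanding: planes detected from feature point clusters.
    for (Plane plane : session.getAllTrackables(Plane.class)) {
        if (plane.getTrackingState() == TrackingState.TRACKING) {
            float extentX = plane.getExtentX(); // approximate plane bounds, in meters
            float extentZ = plane.getExtentZ();
        }
    }

    // Light estimation: average intensity plus per-channel color correction.
    LightEstimate lightEstimate = frame.getLightEstimate();
    if (lightEstimate.getState() == LightEstimate.State.VALID) {
        float pixelIntensity = lightEstimate.getPixelIntensity();
        float[] colorCorrection = new float[4];
        lightEstimate.getColorCorrection(colorCorrection, 0);
    }
}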
For the purpose of this walkthrough, we will add ARCore to an Android app with the help of Android Studio.
Prerequisites
- Android Studio 3.1 or higher
- Android SDK Platform Version 7.0 or higher
- Android App Development knowledge
Enabling ARCore
- Add AR Required or AR Optional entries to the manifest: Apps whose AR functionality is optional, and only activated on ARCore-supported devices, are known as AR Optional apps. AR Optional apps can run on devices that do not support ARCore, and the Play Store does not automatically install ARCore alongside them. On the other hand, applications that cannot be installed on devices without ARCore support are known as AR Required apps. The Play Store automatically installs ARCore when a user installs an AR Required application; however, the application must still perform runtime checks in case the user later uninstalls ARCore (the runtime check is sketched after this list). The AR Required and AR Optional entries are specified in the application's Android Manifest file:
<!-- For AR Optional apps -->
<uses-sdk android:minSdkVersion="14" />
<uses-permission android:name="android.permission.CAMERA" />
<application>
    <!-- App supports, but does not require ARCore ("AR Optional"). -->
    <meta-data android:name="com.google.ar.core" android:value="optional" />
</application>

<!-- For AR Required apps -->
<!-- "AR Required" apps must declare minSdkVersion >= 24 -->
<uses-sdk android:minSdkVersion="24" />
<uses-permission android:name="android.permission.CAMERA" />
<!-- Indicates that the app requires ARCore ("AR Required") -->
<uses-feature android:name="android.hardware.camera.ar" />
<application>
    <!-- Indicates that the app requires ARCore ("AR Required") -->
    <meta-data android:name="com.google.ar.core" android:value="required" />
</application>
- Add a build dependency: Add the ARCore library as a dependency in the app's build.gradle file:
dependencies {
    implementation 'com.google.ar:core:1.7.0'
}
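For an AR Optional app, the runtime check mentioned above can be performed with ArCoreApk before creating a session. Below is a minimal sketch, assuming it is called from an Activity's onResume(); the field and method names here are hypothetical:

import android.app.Activity;
import com.google.ar.core.ArCoreApk;
import com.google.ar.core.exceptions.UnavailableDeviceNotCompatibleException;
import com.google.ar.core.exceptions.UnavailableUserDeclinedInstallationException;

// Tracks whether we have already asked the user to install ARCore.
private boolean userRequestedInstall = true;

void maybeEnableAr(Activity activity) {
    try {
        switch (ArCoreApk.getInstance().requestInstall(activity, userRequestedInstall)) {
            case INSTALLED:
                // ARCore is installed and up to date; it is safe to create a Session.
                break;
            case INSTALL_REQUESTED:
                // The Play Store install/update flow was launched; check again on
                // the next onResume() without prompting the user a second time.
                userRequestedInstall = false;
                break;
        }
    } catch (UnavailableDeviceNotCompatibleException e) {
        // This device does not support AR; keep AR features disabled.
    } catch (UnavailableUserDeclinedInstallationException e) {
        // The user declined the installation; keep AR features disabled.
    }
}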
Add Sceneform to your app
Sceneform makes it easy to render 3D scenes in AR apps without needing to learn OpenGL. It includes a high-level scene graph API along with a physically based renderer provided by Google's Filament engine. In order to use Sceneform with ARCore in your app, you need to perform the following steps:
- Import the Sceneform plugin: The Sceneform plugin lets us view, build and import 3D assets in the Sceneform SDK for AR-based applications in Android Studio. To install the plugin, navigate to the Plugins section via File -> Settings -> Plugins and search for Google Sceneform Tools (Beta) under the Browse Repositories option.
- Add the following lines to the app-level build.gradle:
defaultConfig {
    // Sceneform requires minSdkVersion >= 24.
    minSdkVersion 24
}

// Sceneform libraries use language constructs from Java 8.
// Add these compile options if targeting minSdkVersion < 26.
compileOptions {
    sourceCompatibility JavaVersion.VERSION_1_8
    targetCompatibility JavaVersion.VERSION_1_8
}

dependencies {
    // Provides ArFragment, and other Sceneform UX resources:
    implementation "com.google.ar.sceneform.ux:sceneform-ux:1.7.0"
}
Creating a Scene View
The ArFragment provided by the Sceneform SDK simplifies setting up a scene view, as it automatically handles ARCore session management after performing the necessary runtime checks. These checks include verifying that a compatible version of ARCore is installed (prompting the user to install or update it if necessary) and checking that the app has been granted permission to access the device's camera.
Once these checks pass, the ArFragment creates an ArSceneView, accessible via the getArSceneView() method. The ArSceneView renders the session's camera images onto its surface, while also displaying a built-in Sceneform animation that shows users how to move the phone in order to activate the AR experience. The ArSceneView also highlights detected Planes using the default PlaneRenderer.
To add an ArFragment to your activity, add the following code inside the activity's layout file:
<fragment android:name="com.google.ar.sceneform.ux.ArFragment"
    android:id="@+id/ar_fragment"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />
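Once the fragment is declared in the layout, you can grab a reference to it in the activity. A minimal sketch, assuming an AndroidX project with hypothetical MainActivity and activity_main names:

import android.os.Bundle;
import androidx.appcompat.app.AppCompatActivity;
import com.google.ar.sceneform.ux.ArFragment;

public class MainActivity extends AppCompatActivity {
    private ArFragment arFragment;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        // Look up the ArFragment declared in the activity's layout file.
        arFragment = (ArFragment) getSupportFragmentManager()
                .findFragmentById(R.id.ar_fragment);
    }
}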
Building a Renderable
A Renderable is a 3D model that can be placed anywhere in the scene, and is usually composed of textures, meshes and materials. A renderable can be produced by creating a 3D model and importing it into Android Studio with the help of the Sceneform Tools plugin, which automatically updates the app's build.gradle to apply the Sceneform plugin and append a sceneform.asset() entry for the imported model, as shown below:
apply plugin: 'com.google.ar.sceneform.plugin'
sceneform.asset('sampledata/models/fox_face.fbx',
'sampledata/models/fox_face_material.mat',
'sampledata/models/fox_face.sfa',
'src/main/res/raw/fox_face')
The imported model is then used to build a ModelRenderable inside your desired activity:
ModelRenderable.builder()
.setSource(this, R.raw.fox_face)
.build()
.thenAccept(renderable -> modelRenderable = renderable);
// Here fox_face is the .sfb file generated for our imported model
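Because build() returns a CompletableFuture, a failed model load would otherwise go unnoticed. A common addition, sketched here with a hypothetical log tag, is an exceptionally() handler:

ModelRenderable.builder()
    .setSource(this, R.raw.fox_face)
    .build()
    .thenAccept(renderable -> modelRenderable = renderable)
    .exceptionally(throwable -> {
        // Log the failure so a broken or missing asset doesn't fail silently.
        Log.e("ModelLoading", "Unable to load renderable", throwable);
        return null;
    });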
Creating the Scene
The ArSceneView has a Scene attached to it, which is essentially a tree-like data structure containing Nodes. Nodes represent the virtual objects to be rendered in the SceneView, and each node carries all the information Sceneform needs to render it, such as its renderable object, orientation and position. Here is a code snippet that attaches the modelRenderable created above to the Scene's root node:
Node node = new Node();
node.setParent(arFragment.getArSceneView().getScene());
node.setRenderable(modelRenderable);
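Attaching the node directly to the scene places the model at the world origin. In practice, you would usually anchor it to a detected plane instead; here is a minimal tap-to-place sketch using ArFragment's plane-tap listener, where modelRenderable is the field populated earlier:

arFragment.setOnTapArPlaneListener((hitResult, plane, motionEvent) -> {
    if (modelRenderable == null) {
        return; // the model hasn't finished loading yet
    }
    // Create an anchor at the tapped point so the model sticks to the plane.
    AnchorNode anchorNode = new AnchorNode(hitResult.createAnchor());
    anchorNode.setParent(arFragment.getArSceneView().getScene());

    // A TransformableNode lets the user drag, rotate and scale the model.
    TransformableNode modelNode = new TransformableNode(arFragment.getTransformationSystem());
    modelNode.setParent(anchorNode);
    modelNode.setRenderable(modelRenderable);
    modelNode.select();
});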
This blog discusses a very rudimentary application of ARCore in tandem with Sceneform to render a 3D object on the screen. ARCore can be used to build far more advanced applications, such as detecting a user's face and adding layers to it (using the Augmented Faces API), rendering an AR experience upon detecting a specific 2D image (Augmented Images API), and creating collaborative AR experiences among multiple users (via Cloud Anchors). You can find more details about the ARCore SDK here.
About Author
Daljeet Singh
Daljeet has experience developing android applications across a range of domains such as Cryptocurrency, Travel & Hotel Booking, Video Streaming and e-commerce. In his free time, he can be found playing/watching a game of football or reading up on either