Creating Immersive AR Experiences with RealityKit

Posted By: Saurabh Singh | 15-Nov-2024

What We Will Build

We'll create an AR scene where the materials on a sphere and a cube change in real time as the user moves around. By tracking the user's position and orientation relative to the objects, we can generate a dynamic, engaging AR experience.

Key Concepts

  • RealityKit: Apple's high-performance framework for rendering 3D AR content.
  • ARKit: Manages the AR session, including camera and position tracking in the environment.
  • User Position Tracking: Lets us swap textures and materials depending on the user's position relative to the objects.

Step 1: Setting Up the Basic AR Scene with RealityKit

To get started, we'll create a RealityKit scene with ARKit to build an AR session.

  1. Import RealityKit and ARKit in your project.
  2. Create an ARView, the main view for displaying RealityKit content.

import RealityKit
import ARKit

let arView = ARView(frame: .zero)
arView.automaticallyConfigureSession = true

This basic AR setup creates a full-screen view that automatically manages the AR session.
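To actually show this view on screen in a UIKit app, the ARView can be embedded in a view controller. A minimal sketch (the class name and layout choices here are illustrative, not from the original post):

```swift
import UIKit
import RealityKit

final class ARExperienceViewController: UIViewController {
    // Keep a strong reference so the view (and its AR session) stays alive.
    let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.automaticallyConfigureSession = true
        arView.frame = view.bounds
        arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(arView)
    }
}
```

The same ARView could also be wrapped in a SwiftUI `UIViewRepresentable` if the rest of the app uses SwiftUI.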

Step 2: Adding Objects to the Scene

Next, we'll add a sphere and a cube to our RealityKit scene.

// Create a sphere and a cube entity
let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1))
let cube = ModelEntity(mesh: .generateBox(size: [0.1, 0.1, 0.1]))

// Position them in the AR scene, half a meter in front of the world origin
sphere.position = SIMD3<Float>(0, 0, -0.5)
cube.position = SIMD3<Float>(0.3, 0, -0.5)

// Add the entities to the ARView via an anchor
let anchor = AnchorEntity()
anchor.addChild(sphere)
anchor.addChild(cube)
arView.scene.anchors.append(anchor)

Here, we position the Sphere and Cube slightly apart so the user can see both objects clearly.
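The entities above hang off a world-origin AnchorEntity. If you would rather pin them to a detected surface, RealityKit also supports plane-based anchoring targets; a hedged variant (the plane classification and minimum bounds are one reasonable choice, not from the original post):

```swift
// Anchor to a detected horizontal plane at least 20 cm on each side.
let planeAnchor = AnchorEntity(.plane(.horizontal,
                                      classification: .any,
                                      minimumBounds: [0.2, 0.2]))
planeAnchor.addChild(sphere)
planeAnchor.addChild(cube)
arView.scene.anchors.append(planeAnchor)
```

With a plane anchor, the entities' positions become offsets from the detected surface rather than from the world origin.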

Step 3: Tracking the User's Position

To make the experience dynamic, we have to track where the user is relative to the objects. We'll use ARKit's camera transform to get the user's position and orientation.

func getUserPosition() -> SIMD3<Float>? {
    guard let cameraTransform = arView.session.currentFrame?.camera.transform else { return nil }
    // The camera's position is the translation column of its 4x4 transform.
    let translation = cameraTransform.columns.3
    return SIMD3<Float>(translation.x, translation.y, translation.z)
}

This extracts the user's current position in AR space; it can be called periodically to keep the user's location up to date.

Step 4: Changing Materials Based on User Position

Now we'll add the logic that updates the materials of the sphere and the cube based on where the user stands relative to them. We compute the angle between the user's position and the sphere's position.

func updateMaterialBasedOnUserPosition() {
    guard let userPosition = getUserPosition() else { return }

    // Angle (in degrees) from the sphere to the user, measured in the X-Z plane.
    let directionToUser = userPosition - sphere.position
    let angle = atan2(directionToUser.x, directionToUser.z)
    let degreeAngle = angle * 180 / .pi

    var imageName: String
    switch degreeAngle {
    case 45..<135:
        imageName = "Texture1"
    case -135..<(-45):
        imageName = "Texture2"
    case -45..<45:
        imageName = "Texture3"
    default:
        imageName = "Texture4"
    }

    applyTexture(named: imageName, to: sphere)
    applyTexture(named: imageName, to: cube)
}

This function reads the user's position and swaps the texture on both the sphere and the cube according to which side of the sphere the user is standing on.

Step 5: Applying Textures to Models

To apply the textures, we'll create a function that loads the texture and updates the material on our objects.

 

func applyTexture(named imageName: String, to entity: ModelEntity) {
    if let texture = try? TextureResource.load(named: imageName) {
        var material = SimpleMaterial()
        material.color = .init(tint: .white.withAlphaComponent(0.7), texture: .init(texture))
        material.metallic = 0.8
        material.roughness = 0.2
        entity.model?.materials = [material]
    }
}

The applyTexture function loads the texture from the assets and applies it to the entity, creating a reflective, slightly metallic material with the new texture.
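One caveat: because the update logic runs continuously, applyTexture can end up reloading the same texture over and over, and TextureResource.load is not free. A simple cache avoids the repeated loads (a minimal sketch; the textureCache dictionary and cachedTexture helper are illustrative additions, not part of the original code):

```swift
// Illustrative cache so each texture is loaded from the bundle only once.
var textureCache: [String: TextureResource] = [:]

func cachedTexture(named imageName: String) -> TextureResource? {
    if let cached = textureCache[imageName] { return cached }
    guard let loaded = try? TextureResource.load(named: imageName) else { return nil }
    textureCache[imageName] = loaded
    return loaded
}
```

applyTexture could then call cachedTexture(named:) instead of TextureResource.load(named:). Skipping the material update entirely when the chosen texture hasn't changed since the last frame would save even more work.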

Step 6: Continuous Position-Based Updates

Finally, we subscribe to the SceneEvents.Update event so we keep polling the user's position and updating the materials in real time.

// Requires `import Combine` and a stored property such as:
// var cancellables = Set<AnyCancellable>()
arView.scene.subscribe(to: SceneEvents.Update.self) { _ in
    self.updateMaterialBasedOnUserPosition()
}.store(in: &cancellables)

The user's position is now tracked continuously, so new materials are applied as the user moves around, making the AR experience interactive.
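SceneEvents.Update fires every frame, typically 60 times a second, which is more often than this effect needs. A hedged refinement throttles the material updates using the event's deltaTime (the lastUpdateTime property and the 0.25-second interval are illustrative choices, not from the original post; a Set<AnyCancellable> named cancellables is assumed):

```swift
var lastUpdateTime: TimeInterval = 0

arView.scene.subscribe(to: SceneEvents.Update.self) { event in
    // Only re-evaluate the user's position a few times per second.
    lastUpdateTime += event.deltaTime
    guard lastUpdateTime > 0.25 else { return }
    lastUpdateTime = 0
    self.updateMaterialBasedOnUserPosition()
}.store(in: &cancellables)
```

Combined with the texture cache from Step 5, this keeps the per-frame cost of the effect negligible.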

Conclusion

We built an augmented reality scene with RealityKit and ARKit in which the objects' materials change dynamically depending on the user's movement and position. This kind of interaction makes AR experiences more fun and immersive, and it generalizes to more complex applications such as educational tools and interactive showcases.

Did this put new fun into working with RealityKit and ARKit? Though it's a quick approach, it paves the way for building sophisticated, interactive AR environments, from playful effects to lifelike visuals driven by real-world interaction.


About Author

Saurabh Singh

I’m Saurabh Singh, a passionate Unity developer with extensive experience in creating immersive and engaging applications. Specializing in interactive 3D environments and real-time simulations, I focus on delivering high-quality solutions tailored to client needs. My expertise spans game development, VR/AR experiences, and custom Unity integrations. Committed to innovation and excellence, I strive to push the boundaries of interactive technology and bring creative visions to life.
