Building Spatial Experiences with RealityKit

This video walks through building the Hello World application, focusing on how RealityKit works together with SwiftUI.

RealityKit + SwiftUI

Entities and Components

  • An entity is a container object rendered by RealityKit's 3D engine
  • Components each enable a specific behavior for an entity
  • Example
    • Earth / Satellite (a minimal component setup is sketched in code after this list)
      • Model -> material -> textures + shaders -> how the surface mesh responds to light
      • Transform -> places the entity in 3D space: position, orientation, scale
        • Note: RealityKit's coordinate system flips the y axis compared to SwiftUI (y points up in RealityKit, down in SwiftUI), so gesture values need to be converted
      • Collision and Input Target
        • Needed for an entity to receive gesture input
      • Hover effect
        • Needed to highlight the entity as a target for interaction
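
A minimal sketch of that component setup in code, assuming a plain sphere stands in for the Earth model (the mesh, material, and placement values are placeholders):

```swift
import RealityKit

// Placeholder stand-in for the Earth model: a simple sphere.
let earth = ModelEntity(
    mesh: .generateSphere(radius: 0.1),
    materials: [SimpleMaterial(color: .blue, isMetallic: false)]
)

// Transform: position, orientation, and scale place the entity in 3D space (meters).
earth.transform.translation = SIMD3<Float>(0, 1.0, -1.5)

// Collision + InputTarget: required before the entity can receive gestures.
earth.generateCollisionShapes(recursive: true)
earth.components.set(InputTargetComponent())

// HoverEffect: highlights the entity when it is targeted for interaction.
earth.components.set(HoverEffectComponent())
```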

RealityView (a SwiftUI view that contains RealityKit entities)
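
A minimal RealityView sketch, assuming the Reality Composer Pro package from the visionOS app template (RealityKitContent / realityKitContentBundle) and a placeholder asset named "Earth":

```swift
import SwiftUI
import RealityKit
import RealityKitContent  // assumed Reality Composer Pro package name (template default)

struct GlobeView: View {
    var body: some View {
        RealityView { content in
            // Load the placeholder "Earth" entity from the Reality Composer Pro
            // bundle and add it to the RealityView's content.
            if let earth = try? await Entity(named: "Earth", in: realityKitContentBundle) {
                content.add(earth)
            }
        }
    }
}
```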

Input, Animation, and Audio

  • Input
    • Gesture events target an entity that has Collision and Input Target components (a drag-gesture sketch follows this list)
      • Add the Collision and Input Target components via Reality Composer Pro
        • Create a new USD file and reference the desired USD asset
        • Add the Collision and Input Target components
        • Update the collision shape to match the model
      • Add a Hover Effect to identify the entity as draggable
    • For more, see Meet Reality Composer Pro
  • Animation
    • From-to-by
    • Orbit
    • Sampling
  • Audio
    • Spatial Audio -> fixed to the emitting entity
    • Ambient Audio -> fixed to the environment
    • Channel Audio -> background music
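
A minimal drag-gesture sketch, assuming the entity is created in code (rather than loaded from Reality Composer Pro) and given the Collision, Input Target, and Hover Effect components directly; the convert call handles the SwiftUI-to-RealityKit coordinate difference noted above:

```swift
import SwiftUI
import RealityKit

struct DraggableGlobe: View {
    // Placeholder globe entity; in the real app this would come from Reality Composer Pro.
    @State private var globe = ModelEntity(
        mesh: .generateSphere(radius: 0.1),
        materials: [SimpleMaterial(color: .blue, isMetallic: false)]
    )

    var body: some View {
        RealityView { content in
            globe.generateCollisionShapes(recursive: true)
            globe.components.set(InputTargetComponent())
            globe.components.set(HoverEffectComponent())
            content.add(globe)
        }
        .gesture(
            DragGesture()
                .targetedToEntity(globe)
                .onChanged { value in
                    // Gesture locations arrive in SwiftUI space; convert them into
                    // the entity's parent space (this also flips the y axis).
                    value.entity.position = value.convert(
                        value.location3D,
                        from: .local,
                        to: value.entity.parent!
                    )
                }
        )
    }
}
```

A from-to-by animation sketch: rotate an entity half a turn around the y axis. The five-second duration and target transform are illustrative values:

```swift
import RealityKit

func spin(_ entity: Entity) {
    // Target transform: current transform rotated 180 degrees around the y axis.
    var spunTransform = entity.transform
    spunTransform.rotation = simd_quatf(angle: .pi, axis: [0, 1, 0])

    // From-to-by animation bound to the entity's transform.
    let animation = FromToByAnimation(
        to: spunTransform,
        duration: 5,
        bindTarget: .transform
    )
    if let resource = try? AnimationResource.generate(with: animation) {
        entity.playAnimation(resource)
    }
}
```

A spatial audio sketch, reusing the earth entity from the sketch above; "Globe.wav" is a placeholder file name in the app bundle:

```swift
// Sound is emitted from the entity's position in the scene.
earth.components.set(SpatialAudioComponent(gain: -5))
if let audio = try? AudioFileResource.load(named: "Globe.wav") {
    earth.playAudio(audio)
}
```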

Custom Systems
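
A minimal custom component + system sketch, following the ECS pattern; the names (RotationComponent, RotationSystem) and the rotation behavior are illustrative, not from the session:

```swift
import RealityKit

// Custom component: marks an entity as rotating and stores its speed.
struct RotationComponent: Component {
    var speed: Float = 1.0   // radians per second
}

// Custom system: each frame, rotate every entity that has a RotationComponent.
struct RotationSystem: System {
    static let query = EntityQuery(where: .has(RotationComponent.self))

    init(scene: Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.entities(matching: Self.query, updatingSystemWhen: .rendering) {
            guard let rotation = entity.components[RotationComponent.self] else { continue }
            entity.orientation *= simd_quatf(
                angle: rotation.speed * Float(context.deltaTime),
                axis: [0, 1, 0]
            )
        }
    }
}

// Register both once at app launch, e.g. in the App initializer:
// RotationComponent.registerComponent()
// RotationSystem.registerSystem()
```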
