Windows: The gateway to 3D content
At the heart of visionOS are windows, which are SwiftUI scenes much like those on macOS, capable of housing traditional views and 3D content. Each app can feature one or more windows, offering a versatile platform for developers to present their content. The innovation doesn’t stop at traditional 2D views: windows in visionOS also support 3D content, extending the canvas for creators to paint their visions.
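As a minimal sketch, a visionOS app declares its main window with a `WindowGroup`, just as on other SwiftUI platforms (the `GalleryApp` and `GalleryView` names here are hypothetical):

```swift
import SwiftUI

@main
struct GalleryApp: App {
    var body: some Scene {
        // A WindowGroup hosts a traditional 2D SwiftUI view;
        // on visionOS that view can also embed 3D content.
        WindowGroup {
            GalleryView()
        }
    }
}

struct GalleryView: View {
    var body: some View {
        Text("Hello, visionOS")
            .font(.largeTitle)
            .padding()
    }
}
```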
Volumes: Elevating 3D content
Volumes, introduced through WindowGroup + .windowStyle(.volumetric), represent another type of scene specifically designed for showcasing 3D content. Unlike windows, volumes are optimized for 3D, making them ideal for presenting complex models like a heart or the Earth. This distinction allows developers to choose the right container for their content, ensuring the best possible user experience.
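A volume is declared the same way as a window, with the volumetric window style applied. In this sketch, the scene identifier `"globe"` and the model name `"Globe"` are assumptions; `defaultSize` lets you size the volume in physical units:

```swift
import SwiftUI
import RealityKit

struct GlobeApp: App {
    var body: some Scene {
        // A WindowGroup styled as a volume, ideal for 3D models.
        WindowGroup(id: "globe") {
            Model3D(named: "Globe")   // assumed USDZ asset in the app bundle
        }
        .windowStyle(.volumetric)
        .defaultSize(width: 0.6, height: 0.6, depth: 0.6, in: .meters)
    }
}
```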
Spaces: The final frontier of immersion
Spaces in visionOS introduce a new dimension of app interaction. With ImmersiveSpace and .immersionStyle(selection:in:), developers can create shared or full spaces, defining how their apps integrate with the user’s environment. The Shared Space mimics a desktop environment where apps coexist, while a Full Space gives your app the whole stage for a more controlled, immersive experience, and unlocks additional ARKit capabilities such as hand tracking and scene understanding.
To make the most of SwiftUI, ARKit, and RealityKit, we encourage you to pair ImmersiveSpace with the new RealityView, since the two were designed to work together.
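The pairing can be sketched like this. The scene identifier `"solar"` and the model name `"Earth"` are assumptions; the `.immersionStyle` modifier declares which styles the space supports:

```swift
import SwiftUI
import RealityKit

struct SolarSystemApp: App {
    @State private var immersionStyle: ImmersionStyle = .mixed

    var body: some Scene {
        // An immersive space whose content is built with RealityView.
        ImmersiveSpace(id: "solar") {
            RealityView { content in
                // Build the initial 3D scene once, asynchronously.
                if let planet = try? await Entity(named: "Earth") {
                    content.add(planet)
                }
            }
        }
        // Allow switching between mixed and full immersion.
        .immersionStyle(selection: $immersionStyle, in: .mixed, .full)
    }
}
```

At runtime, a view inside one of your windows would open this space with the `openImmersiveSpace` environment action, passing the same identifier.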
The Core of Apple Vision Pro: USDZ and Reality Composer Pro
USDZ files
USDZ files, co-developed by Apple and Pixar, serve as the cornerstone for placing 3D objects in augmented reality environments.
Image sources: Understand USD fundamentals, Apple
We can create these files with Reality Composer Pro, a pivotal tool that offers a graphical interface for composing, editing, and previewing 3D content directly alongside Xcode, enhancing the creation process with features like particle emitters and audio authoring.
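When you add a Reality Composer Pro project to an Xcode app, it is exposed as a Swift package, and its content loads by name from that package's bundle. A hedged sketch, assuming the default package name `RealityKitContent` and the default root entity name `"Scene"`:

```swift
import RealityKit
import RealityKitContent   // Swift package generated by Reality Composer Pro

// Load the scene authored in Reality Composer Pro from its bundle.
// "Scene" is the default name for a new project's root entity.
let scene = try await Entity(named: "Scene", in: realityKitContentBundle)
```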
To dive deeper into the USD ecosystem and Reality Composer Pro, we recommend these sessions from WWDC22 and WWDC23, which cover the tools, rendering techniques, and broader landscape surrounding USDZ files. These resources are invaluable for developers looking to master spatial computing on Apple Vision Pro:
Enhancing Interactivity with RealityKit and ARKit
Interactions in spatial computing go beyond traditional input methods, incorporating taps, drags, and hand tracking. RealityKit and ARKit form the backbone of this interactive framework, allowing for the augmentation of app windows with 3D content and the creation of fully immersive environments.
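As one hedged example of this, a tap gesture can be targeted at entities inside a RealityView. The model name `"Heart"` is an assumption; note that an entity must carry input-target and collision components before it can receive input:

```swift
import SwiftUI
import RealityKit

struct TappableModelView: View {
    var body: some View {
        RealityView { content in
            if let model = try? await Entity(named: "Heart") {
                // Required for the entity to receive spatial input.
                model.components.set(InputTargetComponent())
                model.generateCollisionShapes(recursive: true)
                content.add(model)
            }
        }
        // A tap gesture targeted at any entity in this view.
        .gesture(
            TapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    // Scale the tapped entity up slightly.
                    value.entity.scale *= 1.2
                }
        )
    }
}
```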
RealityKit: A closer look
RealityKit simplifies the integration of AR features, supporting accurate lighting, shadows, and animations. It operates on the entity-component-system (ECS) model, where entities are augmented with components to achieve desired behaviors and appearances. Custom components and built-in animations enrich the spatial experience, further bridged to SwiftUI through RealityView attachments.
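The ECS model can be sketched with a custom component and a system that acts on it; all names here (`SpinComponent`, `SpinSystem`) are hypothetical:

```swift
import RealityKit

// A custom component: plain data attached to entities.
struct SpinComponent: Component {
    var speed: Float = 1.0   // radians per second
}

// A system that updates every entity carrying a SpinComponent.
struct SpinSystem: System {
    static let query = EntityQuery(where: .has(SpinComponent.self))

    init(scene: Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.entities(matching: Self.query,
                                       updatingSystemWhen: .rendering) {
            guard let spin = entity.components[SpinComponent.self] else { continue }
            let angle = spin.speed * Float(context.deltaTime)
            entity.transform.rotation *= simd_quatf(angle: angle, axis: [0, 1, 0])
        }
    }
}

// Register once (e.g. at app launch), then attach to an entity:
// SpinComponent.registerComponent()
// SpinSystem.registerSystem()
// entity.components.set(SpinComponent(speed: .pi / 4))
```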
Here is a code snippet of the RealityView structure, with some comments explaining the purpose of each code block.