About This Page

This page covers VR and AR game and application development — from hardware fundamentals to production-ready XR experiences. For engine setup see Unity, Unreal Engine, Godot. For rendering concepts see Advanced Graphics. For game design see Game Design.

History

  • How: VR began with military flight simulators (1960s), became a consumer curiosity with Nintendo Virtual Boy (1995), then truly arrived with Oculus Rift DK1 (2012) and consumer headsets (2016). AR went mainstream with Pokemon GO (2016) and Apple Vision Pro (2024).
  • Who: Key players — Oculus/Meta (Quest), Valve (Index/SteamVR), Sony (PSVR), Apple (Vision Pro), Microsoft (HoloLens), Google (ARCore), Apple (ARKit), Khronos Group (OpenXR).
  • Why: VR creates the highest possible immersion — presence. AR overlays digital information on the real world. Both are transforming gaming, training, medicine, architecture, and social interaction.

XR Timeline

timeline
    title VR/AR Evolution
    1968 : The Sword of Damocles
         : First head-mounted display
         : Ivan Sutherland MIT
    1995 : Nintendo Virtual Boy
         : First consumer VR attempt
         : Commercial failure
    2012 : Oculus Rift DK1
         : Kickstarter success
         : Modern VR era begins
    2016 : Consumer VR Launch
         : HTC Vive, Oculus Rift CV1, PSVR
         : Pokemon GO AR mainstream
    2019 : Oculus Quest 1
         : Standalone VR, no PC required
         : VR becomes accessible
    2019 : OpenXR 1.0 Standard
         : Cross-platform XR API
         : Industry unification begins
    2023 : Meta Quest 3
         : Mixed reality passthrough
         : Affordable high-quality VR
    2024 : Apple Vision Pro
         : Spatial computing era
         : $3499 premium device

Introduction

XR Knowledge Map

mindmap
  root((XR Development))
    Hardware
      Headsets
      Controllers
      Tracking
      Display Tech
    SDKs and APIs
      OpenXR
      SteamVR
      Meta SDK
      ARCore
      ARKit
    Interaction Design
      Locomotion
      Hand Tracking
      Gaze Input
      Haptics
    Rendering
      Stereo Rendering
      Foveated Rendering
      Reprojection
      Performance
    Platforms
      Meta Quest
      PC VR
      PSVR2
      iOS AR
      Android AR

VR vs AR vs MR

| Type | Real World | Digital World | Device | Example |
|---|---|---|---|---|
| VR | Blocked out | Fully immersive | Headset | Beat Saber, Half-Life: Alyx |
| AR | Visible (camera) | Overlaid | Phone / glasses | Pokemon GO, IKEA Place |
| MR | Visible (passthrough) | Interacts with real | Headset | HoloLens, Quest 3 |
| Spatial Computing | Seamless blend | Context-aware | Vision Pro | Apple Vision Pro apps |

VR Hardware Fundamentals

Headset Types

| Type | Description | Examples | Pros | Cons |
|---|---|---|---|---|
| Standalone | Self-contained, no PC | Meta Quest 3, Quest 2 | Wireless, affordable, easy setup | Limited GPU power |
| PC VR (tethered) | Connected to a gaming PC | Valve Index, Pimax | Maximum quality | Cable, expensive PC needed |
| PC VR (wireless) | PC power, wireless link | Quest 3 + Air Link | Best of both | Latency, compression |
| Console VR | Connected to a console | PSVR2 | Plug-and-play | Platform locked |
| Standalone premium | High-end standalone | Apple Vision Pro | Best standalone quality | Very expensive |

Display Technology

| Spec | Minimum | Good | Excellent | Notes |
|---|---|---|---|---|
| Resolution per eye | 1080×1200 | 1832×1920 | 2064×2208+ | Higher = less screen-door effect |
| Refresh rate | 72 Hz | 90 Hz | 120 Hz | Higher = less motion sickness |
| FOV (field of view) | 90° | 110° | 120°+ | Wider = more immersive |
| IPD adjustment | Fixed | Software | Hardware | Interpupillary distance |
| Panel type | LCD | Fast LCD | OLED/MicroOLED | OLED = better blacks, contrast |

Tracking Systems

graph TD
    subgraph Inside["📷 Inside-Out Tracking (Modern Standard)"]
        I1["Cameras on headset\nlook outward"]
        I2["SLAM algorithm\nmaps environment"]
        I3["6DOF tracking\nno external hardware"]
        I1 --> I2 --> I3
    end
    subgraph Outside["📡 Outside-In Tracking (Legacy)"]
        O1["External base stations\n(Valve Lighthouse)"]
        O2["Sensors on headset\ndetect laser sweeps"]
        O3["Very precise tracking\nrequires room setup"]
        O1 --> O2 --> O3
    end
| DOF | Description | Example |
|---|---|---|
| 3DOF | Rotation only (pitch, yaw, roll) — no position | Cardboard, Gear VR |
| 6DOF | Rotation + position (x, y, z) — full movement | Quest 3, Index, PSVR2 |

VR Headset Comparison (2024)

| Headset | Type | Resolution/Eye | Refresh | Price | Best For |
|---|---|---|---|---|---|
| Meta Quest 2 | Standalone | 1832×1920 | 90 Hz | ~$250 | Budget, beginners |
| Meta Quest 3 | Standalone MR | 2064×2208 | 120 Hz | $500 | Best value |
| Valve Index | PC VR | 1440×1600 | 144 Hz | $999 | PC VR enthusiasts |
| PSVR2 | Console VR | 2000×2040 | 120 Hz | $550 | PlayStation gamers |
| Apple Vision Pro | Standalone | 3660×3142 | 100 Hz | $3499 | Spatial computing |
| Pimax Crystal | PC VR | 2880×2880 | 160 Hz | $1599 | Sim enthusiasts |

OpenXR

OpenXR Architecture

graph TD
    App["Your XR Application"]
    OpenXR["OpenXR API\n(Khronos Standard)"]
    subgraph Runtimes["OpenXR Runtimes"]
        Meta["Meta OpenXR Runtime\n(Quest devices)"]
        Valve["SteamVR Runtime\n(PC VR)"]
        WMR["Windows MR Runtime\n(HoloLens, WMR)"]
        PSVR["Sony Runtime\n(PSVR2)"]
    end
    subgraph Hardware["Hardware"]
        Q3["Meta Quest 3"]
        Index["Valve Index"]
        HL["HoloLens 2"]
        PS["PSVR2"]
    end
    App --> OpenXR
    OpenXR --> Meta --> Q3
    OpenXR --> Valve --> Index
    OpenXR --> WMR --> HL
    OpenXR --> PSVR --> PS

OpenXR Core Concepts

| Concept | Description |
|---|---|
| Instance | OpenXR context — created once at startup |
| System | Represents the XR hardware (headset + controllers) |
| Session | Active XR session — manages rendering and input |
| Space | Coordinate reference frame (local, stage, view) |
| Action | Abstract input (trigger press, thumbstick axis) |
| Action Set | Group of related actions |
| Swapchain | Images rendered to and presented on the headset |
| Frame | One rendered frame — predict, wait, begin, render, end |

OpenXR C++ Initialization

#include <openxr/openxr.h>
#include <cstring>  // strcpy
 
XrInstance instance = XR_NULL_HANDLE;
XrSession  session  = XR_NULL_HANDLE;
XrSystemId systemId = XR_NULL_SYSTEM_ID;
 
// 1. Create Instance
XrInstanceCreateInfo instanceInfo{XR_TYPE_INSTANCE_CREATE_INFO};
instanceInfo.applicationInfo.apiVersion = XR_CURRENT_API_VERSION;
strcpy(instanceInfo.applicationInfo.applicationName, "My VR Game");
strcpy(instanceInfo.applicationInfo.engineName, "My Engine");
 
const char* extensions[] = {
    XR_KHR_VULKAN_ENABLE2_EXTENSION_NAME,  // or OpenGL
    XR_EXT_HAND_TRACKING_EXTENSION_NAME,   // hand tracking
};
instanceInfo.enabledExtensionCount = 2;
instanceInfo.enabledExtensionNames = extensions;
 
xrCreateInstance(&instanceInfo, &instance);
 
// 2. Get System (headset)
XrSystemGetInfo systemInfo{XR_TYPE_SYSTEM_GET_INFO};
systemInfo.formFactor = XR_FORM_FACTOR_HEAD_MOUNTED_DISPLAY;
xrGetSystem(instance, &systemInfo, &systemId);
 
// 3. Create Session
XrSessionCreateInfo sessionInfo{XR_TYPE_SESSION_CREATE_INFO};
sessionInfo.systemId = systemId;
sessionInfo.next = nullptr;  // must point at the graphics binding struct (e.g. XrGraphicsBindingVulkan2KHR)
xrCreateSession(instance, &sessionInfo, &session);
 
// 4. Create Reference Space (stage = room-scale)
XrReferenceSpaceCreateInfo spaceInfo{XR_TYPE_REFERENCE_SPACE_CREATE_INFO};
spaceInfo.referenceSpaceType = XR_REFERENCE_SPACE_TYPE_STAGE;
spaceInfo.poseInReferenceSpace = {{0,0,0,1}, {0,0,0}}; // identity pose
 
XrSpace stageSpace = XR_NULL_HANDLE;
xrCreateReferenceSpace(session, &spaceInfo, &stageSpace);
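 
// 5. Frame loop (minimal sketch; error handling and projection-layer
//    setup are omitted). Each frame: wait for the runtime's predicted
//    display time, begin the frame, render, submit with xrEndFrame.
XrFrameState frameState{XR_TYPE_FRAME_STATE};
XrFrameWaitInfo frameWaitInfo{XR_TYPE_FRAME_WAIT_INFO};
xrWaitFrame(session, &frameWaitInfo, &frameState);
 
XrFrameBeginInfo frameBeginInfo{XR_TYPE_FRAME_BEGIN_INFO};
xrBeginFrame(session, &frameBeginInfo);
 
// ... locate views at frameState.predictedDisplayTime and render both
//     eyes into the swapchain images here ...
 
XrFrameEndInfo frameEndInfo{XR_TYPE_FRAME_END_INFO};
frameEndInfo.displayTime = frameState.predictedDisplayTime;
frameEndInfo.environmentBlendMode = XR_ENVIRONMENT_BLEND_MODE_OPAQUE;
frameEndInfo.layerCount = 0;  // composition layers omitted in this sketch
xrEndFrame(session, &frameEndInfo);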

OpenXR Input System

// OpenXR uses an abstract action system
// Actions are bound to physical inputs via interaction profiles
 
XrActionSet actionSet = XR_NULL_HANDLE;
XrAction triggerAction = XR_NULL_HANDLE, thumbstickAction = XR_NULL_HANDLE;
 
// Create action set
XrActionSetCreateInfo setInfo{XR_TYPE_ACTION_SET_CREATE_INFO};
strcpy(setInfo.actionSetName, "gameplay");
strcpy(setInfo.localizedActionSetName, "Gameplay");
xrCreateActionSet(instance, &setInfo, &actionSet);
 
// Create trigger action (boolean)
XrActionCreateInfo triggerInfo{XR_TYPE_ACTION_CREATE_INFO};
triggerInfo.actionType = XR_ACTION_TYPE_BOOLEAN_INPUT;
strcpy(triggerInfo.actionName, "shoot");
strcpy(triggerInfo.localizedActionName, "Shoot");
xrCreateAction(actionSet, &triggerInfo, &triggerAction);
 
// Create thumbstick action (2D vector)
XrActionCreateInfo stickInfo{XR_TYPE_ACTION_CREATE_INFO};
stickInfo.actionType = XR_ACTION_TYPE_VECTOR2F_INPUT;
strcpy(stickInfo.actionName, "move");
strcpy(stickInfo.localizedActionName, "Move");  // required — an empty localized name fails validation
xrCreateAction(actionSet, &stickInfo, &thumbstickAction);
 
// Suggest bindings for Meta Touch controllers
XrActionSuggestedBinding bindings[] = {
    {triggerAction,    xrStringToPath(instance, "/user/hand/right/input/trigger/value")},
    {thumbstickAction, xrStringToPath(instance, "/user/hand/left/input/thumbstick")},
};
 
XrInteractionProfileSuggestedBinding suggested{
    XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDING};
suggested.interactionProfile =
    xrStringToPath(instance, "/interaction_profiles/oculus/touch_controller");
suggested.suggestedBindings    = bindings;
suggested.countSuggestedBindings = 2;
xrSuggestInteractionProfileBindings(instance, &suggested);
 
// Attach the action set to the session (once, after suggesting bindings)
XrSessionActionSetsAttachInfo attachInfo{XR_TYPE_SESSION_ACTION_SETS_ATTACH_INFO};
attachInfo.countActionSets = 1;
attachInfo.actionSets = &actionSet;
xrAttachSessionActionSets(session, &attachInfo);
 
// Per frame — sync active action sets, then read action state
XrActiveActionSet activeSet{actionSet, XR_NULL_PATH};
XrActionsSyncInfo syncInfo{XR_TYPE_ACTIONS_SYNC_INFO};
syncInfo.countActiveActionSets = 1;
syncInfo.activeActionSets = &activeSet;
xrSyncActions(session, &syncInfo);
 
XrActionStateBoolean triggerState{XR_TYPE_ACTION_STATE_BOOLEAN};
XrActionStateGetInfo getInfo{XR_TYPE_ACTION_STATE_GET_INFO};
getInfo.action = triggerAction;
xrGetActionStateBoolean(session, &getInfo, &triggerState);
 
if (triggerState.isActive && triggerState.currentState) {
    // Trigger is pressed — shoot!
}

SteamVR

SteamVR in Unity (OpenXR)

// Unity XR Interaction Toolkit — works with SteamVR via OpenXR
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;
using UnityEngine.XR;
 
public class VRPlayerController : MonoBehaviour
{
    [SerializeField] XRNode leftHandNode  = XRNode.LeftHand;
    [SerializeField] XRNode rightHandNode = XRNode.RightHand;
    [SerializeField] float  moveSpeed = 3f;
    [SerializeField] CharacterController characterController;
 
    void Update() {
        // Get thumbstick input from left controller
        InputDevice leftDevice = InputDevices.GetDeviceAtXRNode(leftHandNode);
        Vector2 moveInput;
        if (leftDevice.TryGetFeatureValue(CommonUsages.primary2DAxis, out moveInput)) {
            // Move relative to HMD forward direction
            Transform hmd = Camera.main.transform;
            Vector3 forward = Vector3.ProjectOnPlane(hmd.forward, Vector3.up).normalized;
            Vector3 right   = Vector3.ProjectOnPlane(hmd.right,   Vector3.up).normalized;
            Vector3 move    = (forward * moveInput.y + right * moveInput.x) * moveSpeed;
            characterController.Move(move * Time.deltaTime);
        }
 
        // Get trigger from right controller
        InputDevice rightDevice = InputDevices.GetDeviceAtXRNode(rightHandNode);
        float triggerValue;
        if (rightDevice.TryGetFeatureValue(CommonUsages.trigger, out triggerValue)) {
            if (triggerValue > 0.8f) {
                Shoot();
                // Haptic feedback on the firing hand (channel, amplitude, duration)
                rightDevice.SendHapticImpulse(0, 0.5f, 0.1f);
            }
        }
    }
 
    void Shoot() { /* fire weapon */ }
}

XR Interaction Toolkit (Unity)

// XR Interaction Toolkit — high-level VR interaction
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;
 
// XRGrabInteractable — make any object grabbable
// Add component to GameObject in Inspector:
// XRGrabInteractable + Rigidbody + Collider
 
public class VRWeapon : XRGrabInteractable
{
    [SerializeField] GameObject bulletPrefab;
    [SerializeField] Transform  muzzle;
 
    protected override void OnSelectEntered(SelectEnterEventArgs args) {
        base.OnSelectEntered(args);
        // Called when player grabs the weapon
        Debug.Log("Weapon grabbed by: " + args.interactorObject.transform.name);
    }
 
    protected override void OnActivated(ActivateEventArgs args) {
        base.OnActivated(args);
        // Called when trigger is pressed while holding
        Fire();
    }
 
    void Fire() {
        var bullet = Instantiate(bulletPrefab, muzzle.position, muzzle.rotation);
        bullet.GetComponent<Rigidbody>().AddForce(muzzle.forward * 1000f);
 
        // Haptic feedback on firing hand
        if (interactorsSelecting.Count > 0) {
            var interactor = interactorsSelecting[0] as XRBaseControllerInteractor;
            interactor?.SendHapticImpulse(0.7f, 0.1f);
        }
    }
}

Meta Quest SDK

Meta Quest Features

| Feature | Description | SDK |
|---|---|---|
| Hand Tracking | Controller-free hand input | Meta XR SDK |
| Passthrough | See the real world through headset cameras | Meta XR SDK |
| Scene Understanding | Detect walls, floors, furniture | Meta XR SDK |
| Spatial Anchors | Persistent anchors in the real world | Meta XR SDK |
| Multiplayer | Voice chat, social features | Meta Platform SDK |
| App Lab | Distribute without full store review | Meta Developer |
| Mixed Reality | Blend virtual and real worlds | Meta XR SDK |
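
A minimal sketch of creating and persisting a spatial anchor with the Meta XR Core SDK's OVRSpatialAnchor component. The save call (SaveAnchorAsync here) has been renamed across SDK releases, so treat the exact method name as an assumption; anchorPrefab is a placeholder.

using UnityEngine;
 
public class AnchorPlacer : MonoBehaviour
{
    [SerializeField] GameObject anchorPrefab; // any content to pin in the room
 
    public void PlaceAnchor(Vector3 position, Quaternion rotation) {
        var go = Instantiate(anchorPrefab, position, rotation);
        // Adding OVRSpatialAnchor starts async anchor creation at the object's pose
        var anchor = go.AddComponent<OVRSpatialAnchor>();
        StartCoroutine(SaveWhenReady(anchor));
    }
 
    System.Collections.IEnumerator SaveWhenReady(OVRSpatialAnchor anchor) {
        // Wait until the runtime has created and localized the anchor
        while (!anchor.Created) yield return null;
        // Persist so the anchor survives app restarts
        anchor.SaveAnchorAsync();
    }
}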

Hand Tracking (Unity)

using Oculus.Interaction;
using Oculus.Interaction.Input;
 
public class HandTrackingExample : MonoBehaviour
{
    [SerializeField] Hand leftHand;
    [SerializeField] Hand rightHand;
 
    void Update() {
        // Check if hand tracking is active
        if (!leftHand.IsTrackedDataValid) return;
 
        // Get joint pose (e.g., index fingertip)
        Pose indexTip;
        if (leftHand.GetJointPose(HandJointId.HandIndexTip, out indexTip)) {
            // indexTip.position = world position of fingertip
            // indexTip.rotation = orientation of fingertip
            CheckPinch(indexTip.position);
        }
 
        // Detect pinch gesture
        float pinchStrength = leftHand.GetFingerPinchStrength(HandFinger.Index);
        if (pinchStrength > 0.9f) {
            OnPinch();
        }
    }
 
    void CheckPinch(Vector3 fingertipPos) { }
    void OnPinch() { Debug.Log("Pinch detected!"); }
}

Passthrough & Mixed Reality (Unity)

using UnityEngine;
// Note: OVRPassthroughLayer is defined in the global namespace of the Meta XR SDK
 
public class PassthroughManager : MonoBehaviour
{
    [SerializeField] OVRPassthroughLayer passthroughLayer;
 
    void Start() {
        // Enable passthrough (see real world)
        passthroughLayer.enabled = true;
 
        // Set camera background to transparent
        Camera.main.clearFlags = CameraClearFlags.SolidColor;
        Camera.main.backgroundColor = Color.clear;
    }
 
    public void TogglePassthrough(bool enabled) {
        passthroughLayer.enabled = enabled;
        // When disabled — fully virtual environment
        // When enabled — mixed reality
    }
 
    // Set passthrough opacity (0 = fully virtual, 1 = fully real)
    public void SetOpacity(float opacity) {
        passthroughLayer.textureOpacity = opacity;
    }
}

Meta Quest in Godot

# Godot 4 — Meta Quest via OpenXR plugin
# Install: Godot OpenXR Vendors plugin
 
extends Node3D
 
@onready var xr_interface = XRServer.find_interface("OpenXR")
@onready var left_controller  = $XROrigin3D/LeftController
@onready var right_controller = $XROrigin3D/RightController
 
func _ready() -> void:
    if xr_interface and xr_interface.is_initialized():
        DisplayServer.window_set_vsync_mode(DisplayServer.VSYNC_DISABLED)
        get_viewport().use_xr = true
    else:
        push_error("OpenXR not initialized")
 
func _process(_delta: float) -> void:
    # Read the trigger value straight from the XR controller node
    var trigger: float = right_controller.get_float("trigger")
    if trigger > 0.8:
        shoot()
 
    # Controller pose is simply the node's transform
    var right_pos := right_controller.global_position
    var right_rot := right_controller.global_rotation
 
func shoot() -> void:
    # Haptic pulse on the controller (action, frequency, amplitude, duration, delay)
    right_controller.trigger_haptic_pulse("haptic", 0.0, 0.5, 0.1, 0.0)

ARCore (Android AR)

ARCore Core Features

graph TD
    subgraph Tracking["📍 World Tracking"]
        MT["Motion Tracking\n6DOF phone position\nusing camera + IMU"]
        EP["Environmental Understanding\nDetect horizontal/vertical planes\nFloors, tables, walls"]
        LI["Light Estimation\nEstimate real-world lighting\nfor realistic shadows"]
    end
    subgraph Anchors["⚓ Anchors"]
        LA["Local Anchors\nPersist in session"]
        CA["Cloud Anchors\nShared across devices\nMultiplayer AR"]
        GA["Geospatial Anchors\nGPS + Street View\nOutdoor AR"]
    end
    subgraph Content["🎮 AR Content"]
        Plane["Plane Detection\nPlace objects on surfaces"]
        Depth["Depth API\nOcclusion — objects behind real surfaces"]
        Face["Face Tracking\nAR face filters"]
        Image["Image Tracking\nTrack printed markers"]
    end
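
Anchors in code: a minimal AR Foundation sketch of the local-anchor flow. In AR Foundation 5+, adding an ARAnchor component to a posed GameObject registers it with the session; the class and method names here are illustrative.

using UnityEngine;
using UnityEngine.XR.ARFoundation;
 
public class LocalAnchorExample : MonoBehaviour
{
    // Create an anchor at a raycast hit pose; content parented under it
    // stays locked to the real world as ARCore refines its map
    public ARAnchor CreateAnchor(Pose hitPose, GameObject content) {
        var anchorObject = new GameObject("Anchor");
        anchorObject.transform.SetPositionAndRotation(hitPose.position, hitPose.rotation);
        var anchor = anchorObject.AddComponent<ARAnchor>();  // registers with the AR session
        content.transform.SetParent(anchorObject.transform, worldPositionStays: true);
        return anchor;
    }
}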

ARCore in Unity (AR Foundation)

// AR Foundation — Unity's cross-platform AR framework
// Works with ARCore (Android) and ARKit (iOS) via same API
 
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;
using System.Collections.Generic;
 
public class ARPlacementManager : MonoBehaviour
{
    [SerializeField] ARRaycastManager   raycastManager;
    [SerializeField] ARPlaneManager      planeManager;
    [SerializeField] GameObject          objectToPlace;
 
    List<ARRaycastHit> hits = new List<ARRaycastHit>();
    GameObject         placedObject;
 
    void Update() {
        // Tap to place object on detected plane
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;
 
        // Raycast against detected AR planes
        if (raycastManager.Raycast(touch.position, hits,
            TrackableType.PlaneWithinPolygon)) {
 
            Pose hitPose = hits[0].pose;
 
            if (placedObject == null) {
                // First placement
                placedObject = Instantiate(objectToPlace,
                    hitPose.position, hitPose.rotation);
            } else {
                // Move existing object
                placedObject.transform.SetPositionAndRotation(
                    hitPose.position, hitPose.rotation);
            }
        }
    }
 
    // Toggle plane visualization
    public void TogglePlaneVisualization(bool visible) {
        foreach (var plane in planeManager.trackables) {
            plane.gameObject.SetActive(visible);
        }
    }
}

ARCore Depth API (Occlusion)

// Depth API — virtual objects occluded by real-world surfaces
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;  // EnvironmentDepthMode, HumanSegmentationDepthMode
 
public class ARDepthOcclusion : MonoBehaviour
{
    [SerializeField] AROcclusionManager occlusionManager;
 
    void Start() {
        // Enable environment depth occlusion
        occlusionManager.requestedEnvironmentDepthMode =
            EnvironmentDepthMode.Best;
 
        // Enable human segmentation (separate people from background)
        occlusionManager.requestedHumanDepthMode =
            HumanSegmentationDepthMode.Best;
    }
}
// With depth enabled, virtual objects automatically hide behind
// real-world surfaces — a chair leg blocks a virtual ball, etc.

ARKit (iOS AR)

ARKit vs ARCore

| Feature | ARKit (iOS) | ARCore (Android) |
|---|---|---|
| Plane detection | Excellent | Good |
| LiDAR depth | Yes (Pro devices) | Some devices |
| Face tracking | Yes (TrueDepth camera) | Yes (front camera) |
| Image tracking | Yes | Yes |
| Object detection | Yes | Limited |
| People occlusion | Yes | Yes (Depth API) |
| Geospatial | Yes (Location Anchors) | Yes (Geospatial API) |
| Collaborative sessions | Yes | Cloud Anchors |
| Quality consistency | Very high | Varies by device |

ARKit in Unity (AR Foundation)

// AR Foundation works identically for ARKit and ARCore
// Same code — different platform subsystem underneath
 
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARKit;
 
public class ARKitFeatures : MonoBehaviour
{
    [SerializeField] ARSession       arSession;
    [SerializeField] ARFaceManager   faceManager;
    [SerializeField] ARAnchorManager anchorManager;
 
    void Start() {
        // Face tracking — AR face filters
        faceManager.facesChanged += OnFacesChanged;
    }
 
    void OnFacesChanged(ARFacesChangedEventArgs args) {
        foreach (var face in args.added) {
            // face.vertices — mesh of detected face
            // face.blendShapeCoefficients — 52 blend shapes
            // (smile, blink, jaw open, etc.)
            Debug.Log("Face detected at: " + face.transform.position);
        }
    }
 
    // Coaching overlay — guides the user to scan for planes (ARKit-specific)
    void EnableCoachingOverlay() {
        var sessionSubsystem = (ARKitSessionSubsystem)arSession.subsystem;
        if (sessionSubsystem.coachingOverlaySupported) {
            sessionSubsystem.requestedCoachingGoal =
                ARCoachingGoal.HorizontalPlane;
        }
    }
}

ARKit in Swift (Native)

import ARKit
import RealityKit
 
class ARViewController: UIViewController, ARSessionDelegate {
 
    @IBOutlet var arView: ARView!
 
    override func viewDidLoad() {
        super.viewDidLoad()
 
        // Configure AR session
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal, .vertical]
        config.environmentTexturing = .automatic
 
        // Enable LiDAR scene reconstruction (Pro devices)
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            config.sceneReconstruction = .mesh
        }
 
        arView.session.run(config)
        arView.session.delegate = self
 
        // Add tap gesture for placement
        let tap = UITapGestureRecognizer(target: self,
            action: #selector(handleTap))
        arView.addGestureRecognizer(tap)
    }
 
    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let location = gesture.location(in: arView)
 
        // Raycast against detected planes
        let results = arView.raycast(from: location,
            allowing: .estimatedPlane, alignment: .horizontal)
 
        if let result = results.first {
            // Place 3D model at tap location
            let anchor = AnchorEntity(world: result.worldTransform)
            let box = ModelEntity(mesh: .generateBox(size: 0.1),
                materials: [SimpleMaterial(color: .blue, isMetallic: true)])
            anchor.addChild(box)
            arView.scene.addAnchor(anchor)
        }
    }
}

VR Interaction Design

Locomotion Systems

graph TD
    subgraph Comfort["😌 High Comfort"]
        TP["Teleportation\nPoint and teleport\nNo motion sickness\nBreaks immersion slightly"]
        ARM["Arm Swinger\nSwing arms to walk\nModerate comfort\nMore natural"]
    end
    subgraph Medium["😐 Medium Comfort"]
        SM["Smooth Locomotion\nThumbstick movement\nMost immersive\nCan cause sickness"]
        DASH["Dash / Blink\nInstant short movement\nGood compromise"]
    end
    subgraph Physical["🏃 Physical Movement"]
        RW["Room-Scale Walking\nPhysically walk in room\nBest comfort\nLimited by room size"]
        TREAD["Treadmill\nOmni-directional treadmill\nBest immersion\nExpensive hardware"]
    end
| Locomotion | Comfort | Immersion | Best For |
|---|---|---|---|
| Teleportation | ⭐⭐⭐⭐⭐ | ⭐⭐⭐ | Casual, accessibility |
| Smooth locomotion | ⭐⭐ | ⭐⭐⭐⭐⭐ | Action games, experienced users |
| Arm swinger | ⭐⭐⭐⭐ | ⭐⭐⭐⭐ | Adventure, exploration |
| Room-scale | ⭐⭐⭐⭐⭐ | ⭐⭐⭐⭐⭐ | Puzzle, horror, social |
| Dash/blink | ⭐⭐⭐⭐ | ⭐⭐⭐⭐ | Action, platformer |
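
For the highest-comfort option above, a bare-bones point-and-teleport can look like the sketch below. In Unity you would normally use the XR Interaction Toolkit's built-in TeleportationProvider instead; rigRoot, rayOrigin, and the two On* hooks here are placeholder names.

using UnityEngine;
 
public class SimpleTeleport : MonoBehaviour
{
    [SerializeField] Transform rigRoot;    // XR Origin root
    [SerializeField] Transform rayOrigin;  // controller transform
    [SerializeField] LayerMask teleportMask;
 
    Vector3? target;
 
    public void OnAimHeld() {
        // While aiming, track the candidate destination on walkable geometry
        if (Physics.Raycast(rayOrigin.position, rayOrigin.forward,
                out RaycastHit hit, 15f, teleportMask))
            target = hit.point;
        else
            target = null;
    }
 
    public void OnAimReleased() {
        // Instant position change with no interpolation, which is exactly
        // what keeps teleportation comfortable (no perceived acceleration)
        if (target.HasValue) rigRoot.position = target.Value;
        target = null;
    }
}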

Comfort & Motion Sickness Prevention

| Cause | Prevention |
|---|---|
| Smooth locomotion | Add a vignette (tunnel vision) during movement |
| Low frame rate | Maintain 90 FPS minimum — never drop below 72 |
| Latency | Keep motion-to-photon latency < 20 ms |
| Artificial head bob | Never add camera bob — only real head movement |
| Acceleration | Use instant velocity changes — no gradual acceleration |
| Conflicting motion | Don’t move the camera independently of player input |
| Small FOV | Don’t artificially restrict FOV |
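
The first row above, vignetting during artificial movement, is straightforward to implement. A minimal sketch, assuming a fullscreen overlay material with an _Aperture float property (both of which are placeholders for whatever overlay shader you use):

using UnityEngine;
 
public class ComfortVignette : MonoBehaviour
{
    [SerializeField] Material vignetteMaterial;   // fullscreen overlay shader
    [SerializeField] float movingAperture = 0.6f; // smaller = stronger tunnel
    [SerializeField] float fadeSpeed = 8f;
 
    float aperture = 1f;
 
    // Call once per frame with "is the player moving via thumbstick?"
    public void Tick(bool isArtificiallyMoving) {
        // Ease the vignette in during movement and out at rest; the fade
        // should be fast so it never lags behind the motion itself
        float targetAperture = isArtificiallyMoving ? movingAperture : 1f;
        aperture = Mathf.MoveTowards(aperture, targetAperture, fadeSpeed * Time.deltaTime);
        vignetteMaterial.SetFloat("_Aperture", aperture);
    }
}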

VR UI Design

| UI Type | Description | Use Case |
|---|---|---|
| World-space UI | Panels floating in the 3D world | Menus, inventory, HUD |
| Diegetic UI | UI on in-game objects (watch, tablet) | Immersive games |
| Gaze-based | Look at a button to activate | Accessibility, no controller |
| Hand-attached | UI follows controller/hand | Quick-access menus |
| Laser pointer | Ray from controller selects UI | Standard VR interaction |
VR UI distance guidelines:
  Too close:   < 0.5 m — eye strain, hard to focus
  Comfortable: 1–3 m   — ideal reading distance
  Far:         > 5 m   — hard to read small text
  Sky/world:   > 10 m  — environmental elements only

Minimum button size: 5 cm × 5 cm in world space
Minimum text size: 14 pt at 1 m distance
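
As a concrete use of these distances, here is a sketch that parks a world-space panel in the 1–3 m comfort band in front of the headset. The head field is a placeholder reference to the HMD camera transform.

using UnityEngine;
 
public class ComfortablePanel : MonoBehaviour
{
    [SerializeField] Transform head;        // HMD camera transform
    [SerializeField] float distance = 1.5f; // within the 1–3 m sweet spot
 
    public void Recenter() {
        // Position the panel on the horizontal forward axis so the user
        // doesn't have to tilt their head up or down to read it
        Vector3 forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        transform.position = head.position + forward * distance;
        transform.rotation = Quaternion.LookRotation(forward, Vector3.up);
    }
}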

Hand Tracking Interaction

graph TD
    subgraph Gestures["✋ Hand Gestures"]
        Pinch["Pinch\nIndex + thumb\nSelect, grab"]
        Point["Point\nIndex extended\nAim, indicate"]
        Grab["Grab\nAll fingers closed\nHold objects"]
        Open["Open palm\nAll fingers extended\nMenu, stop"]
    end
    subgraph Direct["🤏 Direct Manipulation"]
        Touch["Touch\nFinger touches object\nPress buttons"]
        Poke["Poke\nFingertip interaction\nKeyboard, piano"]
        TwoHand["Two-handed\nScale, rotate objects\nManipulation"]
    end

VR Rendering & Performance

VR Rendering Pipeline

graph LR
    subgraph Stereo["👁️ Stereo Rendering"]
        L["Left Eye\nRender"] 
        R["Right Eye\nRender"]
    end
    subgraph Techniques["Rendering Techniques"]
        MR["Multi-View Rendering\nRender both eyes\nin one pass (GPU)"]
        SR["Single Pass Instanced\nInstanced draw calls\nfor both eyes"]
    end
    subgraph Reprojection["🔄 Reprojection"]
        ATW["Asynchronous TimeWarp\nReproject last frame\nif new frame late"]
        ASW["Asynchronous SpaceWarp\nSynthesize frames\nat half rate"]
    end
    Stereo --> Techniques
    Techniques --> Reprojection

Foveated Rendering

| Type | Description | Hardware Required |
|---|---|---|
| Fixed foveated | Always render the center at full resolution | Any VR headset |
| Eye-tracked foveated | Follows the gaze direction | Eye tracking (Quest Pro, PSVR2) |
| Variable rate shading | GPU-level resolution control | Modern GPUs |
// Unity — Fixed Foveated Rendering (Meta Quest, Oculus XR Plugin)
using Unity.XR.Oculus;
 
void Start() {
    // Levels 0–3; 3 renders the periphery at the lowest resolution.
    // Dynamic mode lets the runtime lower the level when GPU load allows.
    Utils.foveatedRenderingLevel = 3;
    Utils.useDynamicFoveatedRendering = true;
}

VR Performance Targets

| Platform | Target FPS | Frame Budget | Resolution/Eye |
|---|---|---|---|
| Meta Quest 2 | 72–90 Hz | 11–14 ms | 1832×1920 |
| Meta Quest 3 | 90–120 Hz | 8–11 ms | 2064×2208 |
| PC VR (SteamVR) | 90–144 Hz | 7–11 ms | Varies |
| PSVR2 | 90–120 Hz | 8–11 ms | 2000×2040 |
| Apple Vision Pro | 100 Hz | 10 ms | 3660×3142 |
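
The frame budget column is just 1000 ms divided by the refresh rate (90 Hz → ~11.1 ms). A minimal watchdog sketch for spotting budget overruns during development:

using UnityEngine;
 
public class FrameBudgetWatchdog : MonoBehaviour
{
    [SerializeField] float targetHz = 90f;
 
    void Update() {
        float budgetMs = 1000f / targetHz;
        float frameMs = Time.unscaledDeltaTime * 1000f;
        // Dropped frames mean reprojection at best, motion sickness at worst
        if (frameMs > budgetMs)
            Debug.LogWarning($"Frame over budget: {frameMs:F1} ms > {budgetMs:F1} ms");
    }
}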

VR Optimization Checklist

  • Hold the platform's frame-rate target at all times; profile on device, not in the editor
  • Use single-pass instanced / multiview stereo rendering instead of two full render passes
  • Enable fixed (or eye-tracked) foveated rendering where the hardware supports it
  • Cut draw calls: batching, GPU instancing, texture atlases
  • Bake lighting and avoid realtime shadows on standalone (mobile-class) GPUs
  • Prefer MSAA over post-process anti-aliasing; it is sharper in VR and cheap on tiled GPUs
  • Avoid full-screen post effects on standalone headsets
  • Leave GPU headroom so reprojection (ATW/ASW) stays a safety net, not a crutch

Engine VR Setup

Unity XR Setup

Unity XR Setup Steps:
1. Window → Package Manager → Install:
   - XR Plugin Management
   - OpenXR Plugin
   - XR Interaction Toolkit
   - AR Foundation (for AR)
   - Meta XR SDK (for Quest features)

2. Edit → Project Settings → XR Plug-in Management:
   - Android: Enable OpenXR
   - iOS: Enable ARKit
   - PC: Enable OpenXR

3. OpenXR Feature Groups:
   - Enable: Meta Quest Support
   - Enable: Hand Tracking Subsystem
   - Enable: Eye Gaze Interaction

4. Scene Setup:
   - Add XR Origin (Camera Rig)
   - Add XR Interaction Manager
   - Add Left/Right Controller objects
   - Configure Input Action Assets

Godot XR Setup

# Godot 4 XR Setup
# 1. Install OpenXR plugin from Asset Library
# 2. Project Settings → XR → Enable OpenXR
 
extends Node3D
 
func _ready() -> void:
    var xr_interface = XRServer.find_interface("OpenXR")
    if xr_interface and xr_interface.initialize():
        print("OpenXR initialized")
        get_viewport().use_xr = true
        DisplayServer.window_set_vsync_mode(
            DisplayServer.VSYNC_DISABLED)
    else:
        push_error("OpenXR failed to initialize")
 
# Scene structure for VR:
# XROrigin3D (root)
#   ├── XRCamera3D (head)
#   ├── XRController3D (left hand)
#   │     └── MeshInstance3D (controller model)
#   └── XRController3D (right hand)
#         └── MeshInstance3D (controller model)

Unreal Engine VR Setup

Unreal Engine VR Setup:
1. Edit → Plugins → Enable:
   - OpenXR
   - OpenXR Hand Tracking
   - Meta XR (for Quest)

2. Project Settings → Engine → Input:
   - Add VR motion controller bindings

3. Use VR Template as starting point:
   - File → New Project → VR Template
   - Includes: locomotion, grabbing, UI interaction

4. VR Pawn setup:
   - VRPawn Blueprint
   - Camera component (head)
   - MotionControllerComponent (left/right)
   - GrabComponent for interaction

Learn More

Official Documentation

Free Learning Resources

Tools