This page covers VR and AR game and application development — from hardware fundamentals to production-ready XR experiences.
For engine setup see Unity, Unreal Engine, Godot.
For rendering concepts see Advanced Graphics. For game design see Game Design.
History
How: VR began with military flight simulators and Ivan Sutherland's head-mounted display experiments (1960s), became a consumer curiosity with the Nintendo Virtual Boy (1995), then truly arrived with the Oculus Rift DK1 Kickstarter (2012) and the first wave of consumer headsets (2016). AR went mainstream with Pokemon GO (2016) and entered the premium spatial-computing space with Apple Vision Pro (2024).
Who: Key players include Oculus/Meta (Quest), Valve (Index/SteamVR), Sony (PSVR), Apple (Vision Pro, ARKit), Microsoft (HoloLens), Google (ARCore), and the Khronos Group (OpenXR).
Why: VR creates the strongest sense of immersion, known as presence. AR overlays digital information on the real world. Both are transforming gaming, training, medicine, architecture, and social interaction.
XR Timeline
timeline
title VR/AR Evolution
1968 : The Sword of Damocles
: First head-mounted display
: Ivan Sutherland, Harvard
1995 : Nintendo Virtual Boy
: First consumer VR attempt
: Commercial failure
2012 : Oculus Rift DK1
: Kickstarter success
: Modern VR era begins
2016 : Consumer VR Launch
: HTC Vive, Oculus Rift CV1, PSVR
: Pokemon GO AR mainstream
2019 : Oculus Quest 1
: Standalone VR, no PC required
: VR becomes accessible
2019 : OpenXR 1.0 Standard
: Cross-platform XR API
: Industry unification begins
2023 : Meta Quest 3
: Mixed reality passthrough
: Affordable high-quality VR
2024 : Apple Vision Pro
: Spatial computing era
: $3499 premium device
Introduction
XR Knowledge Map
mindmap
root((XR Development))
Hardware
Headsets
Controllers
Tracking
Display Tech
SDKs and APIs
OpenXR
SteamVR
Meta SDK
ARCore
ARKit
Interaction Design
Locomotion
Hand Tracking
Gaze Input
Haptics
Rendering
Stereo Rendering
Foveated Rendering
Reprojection
Performance
Platforms
Meta Quest
PC VR
PSVR2
iOS AR
Android AR
VR vs AR vs MR
| Type | Real World | Digital World | Device | Example |
|------|------------|---------------|--------|---------|
| VR | Blocked out | Fully immersive | Headset | Beat Saber, Half-Life Alyx |
| AR | Visible (camera) | Overlaid | Phone / glasses | Pokemon GO, IKEA Place |
| MR | Visible (passthrough) | Interacts with real | Headset | HoloLens, Quest 3 |
| Spatial Computing | Seamless blend | Context-aware | Vision Pro | Apple Vision Pro apps |
VR Hardware Fundamentals
Headset Types
| Type | Description | Examples | Pros | Cons |
|------|-------------|----------|------|------|
| Standalone | Self-contained, no PC | Meta Quest 3, Quest 2 | Wireless, affordable, easy setup | Limited GPU power |
| PC VR (tethered) | Connected to a gaming PC | Valve Index, Pimax | Maximum quality | Cable, expensive PC needed |
| PC VR (wireless) | PC power, wireless link | Quest 3 + Air Link | Best of both | Latency, compression |
| Console VR | Connected to a console | PSVR2 | Plug-and-play | Platform locked |
| Standalone premium | High-end standalone | Apple Vision Pro | Best standalone quality | Very expensive |
Display Technology
| Spec | Minimum | Good | Excellent | Notes |
|------|---------|------|-----------|-------|
| Resolution per eye | 1080×1200 | 1832×1920 | 2064×2208+ | Higher = less screen-door effect |
| Refresh rate | 72 Hz | 90 Hz | 120 Hz | Higher = less motion sickness |
| FOV (field of view) | 90° | 110° | 120°+ | Wider = more immersive |
| IPD adjustment | Fixed | Software | Hardware | Interpupillary distance |
| Panel type | LCD | Fast-LCD | OLED/MicroOLED | OLED = better blacks and contrast |
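Resolution and FOV combine into one practical sharpness metric: pixels per degree (PPD), roughly horizontal pixels per eye divided by horizontal FOV. A minimal sketch of the calculation (the numbers are illustrative, not official specs; real optics vary across the lens):
// Pixels per degree (PPD): a rough sharpness metric.
// 20/20 human vision resolves about 60 PPD; current headsets sit well below that.
public static class DisplayMath
{
    public static float PixelsPerDegree(float horizontalPixelsPerEye, float horizontalFovDegrees)
    {
        return horizontalPixelsPerEye / horizontalFovDegrees;
    }
}
// Example: 2064 px over ~110° horizontal FOV ≈ 18.8 PPD (illustrative; lens distortion ignored)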
Tracking Systems
graph TD
subgraph Inside["📷 Inside-Out Tracking (Modern Standard)"]
I1["Cameras on headset\nlook outward"]
I2["SLAM algorithm\nmaps environment"]
I3["6DOF tracking\nno external hardware"]
I1 --> I2 --> I3
end
subgraph Outside["📡 Outside-In Tracking (Legacy)"]
O1["External base stations\n(Valve Lighthouse)"]
O2["Sensors on headset\ndetect laser sweeps"]
O3["Very precise tracking\nrequires room setup"]
O1 --> O2 --> O3
end
// Low-level Unity XR input (works with SteamVR and Quest via OpenXR)
using UnityEngine;
using UnityEngine.XR;

public class VRPlayerController : MonoBehaviour
{
    [SerializeField] XRNode leftHandNode = XRNode.LeftHand;
    [SerializeField] XRNode rightHandNode = XRNode.RightHand;
    [SerializeField] float moveSpeed = 3f;
    [SerializeField] CharacterController characterController;

    void Update()
    {
        // Get thumbstick input from the left controller
        InputDevice leftDevice = InputDevices.GetDeviceAtXRNode(leftHandNode);
        if (leftDevice.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 moveInput))
        {
            // Move relative to the HMD's forward direction
            Transform hmd = Camera.main.transform;
            Vector3 forward = Vector3.ProjectOnPlane(hmd.forward, Vector3.up).normalized;
            Vector3 right = Vector3.ProjectOnPlane(hmd.right, Vector3.up).normalized;
            Vector3 move = (forward * moveInput.y + right * moveInput.x) * moveSpeed;
            characterController.Move(move * Time.deltaTime);
        }

        // Read the trigger on the right controller
        InputDevice rightDevice = InputDevices.GetDeviceAtXRNode(rightHandNode);
        if (rightDevice.TryGetFeatureValue(CommonUsages.trigger, out float triggerValue))
        {
            if (triggerValue > 0.8f)
            {
                Shoot();
                // Haptic feedback on fire: channel, amplitude, duration
                rightDevice.SendHapticImpulse(0, 0.5f, 0.1f);
            }
        }
    }

    void Shoot() { /* fire weapon */ }
}
XR Interaction Toolkit (Unity)
// XR Interaction Toolkit: high-level VR interaction.
// XRGrabInteractable makes any object grabbable.
// Add components to the GameObject in the Inspector:
// XRGrabInteractable + Rigidbody + Collider
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class VRWeapon : XRGrabInteractable
{
    [SerializeField] GameObject bulletPrefab;
    [SerializeField] Transform muzzle;

    protected override void OnSelectEntered(SelectEnterEventArgs args)
    {
        base.OnSelectEntered(args);
        // Called when the player grabs the weapon
        Debug.Log("Weapon grabbed by: " + args.interactorObject.transform.name);
    }

    protected override void OnActivated(ActivateEventArgs args)
    {
        base.OnActivated(args);
        // Called when the trigger is pressed while holding
        Fire();
    }

    void Fire()
    {
        var bullet = Instantiate(bulletPrefab, muzzle.position, muzzle.rotation);
        bullet.GetComponent<Rigidbody>().AddForce(muzzle.forward * 1000f);

        // Haptic feedback on the firing hand
        if (interactorsSelecting.Count > 0)
        {
            var interactor = interactorsSelecting[0] as XRBaseControllerInteractor;
            interactor?.SendHapticImpulse(0.7f, 0.1f);
        }
    }
}
Meta Quest SDK
Meta Quest Features
| Feature | Description | SDK |
|---------|-------------|-----|
| Hand Tracking | Controller-free hand input | Meta XR SDK |
| Passthrough | See the real world through headset cameras | Meta XR SDK |
| Scene Understanding | Detect walls, floors, furniture | Meta XR SDK |
| Spatial Anchors | Persistent anchors in the real world | Meta XR SDK |
| Multiplayer | Voice chat, social features | Meta Platform SDK |
| App Lab | Distribute apps without full store review | Meta Developer |
| Mixed Reality | Blend virtual and real worlds | Meta XR SDK |
Hand Tracking (Unity)
using UnityEngine;
using Oculus.Interaction;
using Oculus.Interaction.Input;

public class HandTrackingExample : MonoBehaviour
{
    [SerializeField] Hand leftHand;
    [SerializeField] Hand rightHand;

    void Update()
    {
        // Check if hand tracking is active
        if (!leftHand.IsTrackedDataValid) return;

        // Get a joint pose (e.g., the index fingertip)
        if (leftHand.GetJointPose(HandJointId.HandIndexTip, out Pose indexTip))
        {
            // indexTip.position = world position of the fingertip
            // indexTip.rotation = orientation of the fingertip
            CheckPinch(indexTip.position);
        }

        // Detect a pinch gesture
        float pinchStrength = leftHand.GetFingerPinchStrength(HandFinger.Index);
        if (pinchStrength > 0.9f)
        {
            OnPinch();
        }
    }

    void CheckPinch(Vector3 fingertipPos) { }
    void OnPinch() { Debug.Log("Pinch detected!"); }
}
Passthrough & Mixed Reality (Unity)
using UnityEngine;

public class PassthroughManager : MonoBehaviour
{
    [SerializeField] OVRPassthroughLayer passthroughLayer;

    void Start()
    {
        // Enable passthrough (see the real world)
        passthroughLayer.enabled = true;

        // Set the camera background to transparent
        Camera.main.clearFlags = CameraClearFlags.SolidColor;
        Camera.main.backgroundColor = Color.clear;
    }

    public void TogglePassthrough(bool enabled)
    {
        // Disabled = fully virtual environment; enabled = mixed reality
        passthroughLayer.enabled = enabled;
    }

    // Set passthrough opacity (0 = fully virtual, 1 = fully real)
    public void SetOpacity(float opacity)
    {
        passthroughLayer.textureOpacity = opacity;
    }
}
Meta Quest in Godot
# Godot 4: Meta Quest via OpenXR
# Install the Godot OpenXR Vendors plugin
extends Node3D

@onready var xr_interface: XRInterface = XRServer.find_interface("OpenXR")
@onready var left_controller: XRController3D = $XROrigin3D/LeftController
@onready var right_controller: XRController3D = $XROrigin3D/RightController

func _ready() -> void:
    if xr_interface and xr_interface.is_initialized():
        DisplayServer.window_set_vsync_mode(DisplayServer.VSYNC_DISABLED)
        get_viewport().use_xr = true
    else:
        push_error("OpenXR not initialized")

func _process(_delta: float) -> void:
    # Read trigger input from the default OpenXR action map
    var trigger := right_controller.get_float("trigger")
    if trigger > 0.8:
        shoot()
    # The controller pose is simply the node's transform
    var right_pos := right_controller.global_position
    var right_rot := right_controller.global_rotation

func shoot() -> void:
    # Haptic feedback: action, frequency, amplitude, duration, delay
    right_controller.trigger_haptic_pulse("haptic", 0.0, 0.5, 0.1, 0.0)
ARCore (Android AR)
ARCore Core Features
graph TD
subgraph Tracking["📍 World Tracking"]
MT["Motion Tracking\n6DOF phone position\nusing camera + IMU"]
EP["Environmental Understanding\nDetect horizontal/vertical planes\nFloors, tables, walls"]
LI["Light Estimation\nEstimate real-world lighting\nfor realistic shadows"]
end
subgraph Anchors["⚓ Anchors"]
LA["Local Anchors\nPersist in session"]
CA["Cloud Anchors\nShared across devices\nMultiplayer AR"]
GA["Geospatial Anchors\nGPS + Street View\nOutdoor AR"]
end
subgraph Content["🎮 AR Content"]
Plane["Plane Detection\nPlace objects on surfaces"]
Depth["Depth API\nOcclusion — objects behind real surfaces"]
Face["Face Tracking\nAR face filters"]
Image["Image Tracking\nTrack printed markers"]
end
ARCore in Unity (AR Foundation)
// AR Foundation: Unity's cross-platform AR framework.
// Works with ARCore (Android) and ARKit (iOS) via the same API.
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;
using System.Collections.Generic;

public class ARPlacementManager : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager;
    [SerializeField] ARPlaneManager planeManager;
    [SerializeField] GameObject objectToPlace;

    List<ARRaycastHit> hits = new List<ARRaycastHit>();
    GameObject placedObject;

    void Update()
    {
        // Tap to place an object on a detected plane
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast against detected AR planes
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = hits[0].pose;
            if (placedObject == null)
            {
                // First placement
                placedObject = Instantiate(objectToPlace, hitPose.position, hitPose.rotation);
            }
            else
            {
                // Move the existing object
                placedObject.transform.SetPositionAndRotation(hitPose.position, hitPose.rotation);
            }
        }
    }

    // Toggle plane visualization
    public void TogglePlaneVisualization(bool visible)
    {
        foreach (var plane in planeManager.trackables)
        {
            plane.gameObject.SetActive(visible);
        }
    }
}
ARCore Depth API (Occlusion)
// Depth API: virtual objects occluded by real-world surfaces.
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class ARDepthOcclusion : MonoBehaviour
{
    [SerializeField] AROcclusionManager occlusionManager;

    void Start()
    {
        // Enable environment depth occlusion
        occlusionManager.requestedEnvironmentDepthMode = EnvironmentDepthMode.Best;

        // Enable human segmentation (separate people from the background)
        occlusionManager.requestedHumanDepthMode = HumanSegmentationDepthMode.Best;
    }
}
// With depth enabled, virtual objects automatically hide behind
// real-world surfaces: a chair leg blocks a virtual ball, etc.
ARKit (iOS AR)
ARKit vs ARCore
| Feature | ARKit (iOS) | ARCore (Android) |
|---------|-------------|------------------|
| Plane detection | Excellent | Good |
| LiDAR depth | Yes (Pro devices) | Some devices |
| Face tracking | Yes (TrueDepth camera) | Yes (front camera) |
| Image tracking | Yes | Yes |
| Object detection | Yes | Limited |
| People occlusion | Yes | Yes (Depth API) |
| Geospatial | Yes (location anchors) | Yes (Geospatial API) |
| Collaborative sessions | Yes | Cloud Anchors |
| Quality consistency | Very high | Varies by device |
ARKit in Unity (AR Foundation)
// AR Foundation works identically for ARKit and ARCore:
// same code, different platform subsystem underneath.
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARKit;

public class ARKitFeatures : MonoBehaviour
{
    [SerializeField] ARFaceManager faceManager;
    [SerializeField] ARAnchorManager anchorManager;
    [SerializeField] ARSession arSession;

    void Start()
    {
        // Face tracking: AR face filters
        faceManager.facesChanged += OnFacesChanged;
    }

    void OnFacesChanged(ARFacesChangedEventArgs args)
    {
        foreach (var face in args.added)
        {
            // face.vertices = mesh of the detected face;
            // ARKit also exposes 52 blend shape coefficients
            // (smile, blink, jaw open, etc.) via ARKitFaceSubsystem
            Debug.Log("Face detected at: " + face.transform.position);
        }
    }

    // Coaching overlay guides the user to scan the environment
    // (available alongside LiDAR scene reconstruction on iPhone 12 Pro+)
    void EnableCoaching()
    {
        var sessionSubsystem = (ARKitSessionSubsystem)arSession.subsystem;
        if (sessionSubsystem.coachingOverlaySupported)
        {
            sessionSubsystem.requestedCoachingGoal = ARCoachingGoal.HorizontalPlane;
        }
    }
}
ARKit in Swift (Native)
import UIKit
import ARKit
import RealityKit

class ARViewController: UIViewController, ARSessionDelegate {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        // Configure the AR session
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal, .vertical]
        config.environmentTexturing = .automatic

        // Enable LiDAR scene reconstruction (Pro devices)
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            config.sceneReconstruction = .mesh
        }

        arView.session.run(config)
        arView.session.delegate = self

        // Add a tap gesture for placement
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap))
        arView.addGestureRecognizer(tap)
    }

    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let location = gesture.location(in: arView)

        // Raycast against detected planes
        let results = arView.raycast(from: location,
                                     allowing: .estimatedPlane,
                                     alignment: .horizontal)
        if let result = results.first {
            // Place a 3D model at the tap location
            let anchor = AnchorEntity(world: result.worldTransform)
            let box = ModelEntity(mesh: .generateBox(size: 0.1),
                                  materials: [SimpleMaterial(color: .blue, isMetallic: true)])
            anchor.addChild(box)
            arView.scene.addAnchor(anchor)
        }
    }
}
VR Interaction Design
Locomotion Systems
graph TD
subgraph Comfort["😌 High Comfort"]
TP["Teleportation\nPoint and teleport\nNo motion sickness\nBreaks immersion slightly"]
ARM["Arm Swinger\nSwing arms to walk\nModerate comfort\nMore natural"]
end
subgraph Medium["😐 Medium Comfort"]
SM["Smooth Locomotion\nThumbstick movement\nMost immersive\nCan cause sickness"]
DASH["Dash / Blink\nInstant short movement\nGood compromise"]
end
subgraph Physical["🏃 Physical Movement"]
RW["Room-Scale Walking\nPhysically walk in room\nBest comfort\nLimited by room size"]
TREAD["Treadmill\nOmni-directional treadmill\nBest immersion\nExpensive hardware"]
end
| Locomotion | Comfort | Immersion | Best For |
|------------|---------|-----------|----------|
| Teleportation | ⭐⭐⭐⭐⭐ | ⭐⭐⭐ | Casual, accessibility |
| Smooth locomotion | ⭐⭐ | ⭐⭐⭐⭐⭐ | Action games, experienced users |
| Arm swinger | ⭐⭐⭐⭐ | ⭐⭐⭐⭐ | Adventure, exploration |
| Room-scale | ⭐⭐⭐⭐⭐ | ⭐⭐⭐⭐⭐ | Puzzle, horror, social |
| Dash/blink | ⭐⭐⭐⭐ | ⭐⭐⭐⭐ | Action, platformer |
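For concreteness, here is a minimal teleportation sketch. It assumes a CharacterController on the rig and an aim ray supplied by a controller raycast; the surface-normal check and height offset are illustrative choices, not part of any official API:
// Minimal teleport locomotion sketch (assumed setup: CharacterController on the rig,
// aim ray supplied by a laser-pointer raycast).
using UnityEngine;

public class TeleportLocomotion : MonoBehaviour
{
    [SerializeField] CharacterController characterController;
    [SerializeField] float maxTeleportDistance = 10f;

    // Call with the controller's aim ray when the teleport button is released
    public void TryTeleport(Ray aimRay)
    {
        if (Physics.Raycast(aimRay, out RaycastHit hit, maxTeleportDistance))
        {
            // Only allow teleporting onto roughly horizontal surfaces
            if (Vector3.Dot(hit.normal, Vector3.up) > 0.9f)
            {
                // Disable the controller while moving it directly
                characterController.enabled = false;
                characterController.transform.position =
                    hit.point + Vector3.up * characterController.height * 0.5f;
                characterController.enabled = true;
            }
        }
    }
}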
Comfort & Motion Sickness Prevention
| Cause | Prevention |
|-------|------------|
| Smooth locomotion | Vignette/tunnel vision during movement |
| Low frame rate | Target 90 FPS minimum; never drop below 72 |
| Latency | Keep motion-to-photon latency under 20 ms |
| Artificial head bob | Never add camera bob; only real head movement should move the camera |
| Acceleration | Use instant velocity changes instead of gradual acceleration |
| Conflicting motion | Don't move the camera independently of player input |
| Small FOV | Don't restrict FOV permanently; use comfort vignettes during movement only |
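A sketch of the vignette mitigation from the table above: fade a tunnel-vision overlay in while the player moves artificially and out when they stop. The vignetteMaterial and its _Intensity shader property are assumptions; any screen-space overlay with an adjustable strength works:
using UnityEngine;

public class ComfortVignette : MonoBehaviour
{
    [SerializeField] Material vignetteMaterial;   // assumed overlay material with an _Intensity property
    [SerializeField] float fadeSpeed = 8f;
    [SerializeField] float maxIntensity = 0.7f;

    float currentIntensity;

    // Call every frame with the player's artificial (thumbstick) speed
    public void UpdateVignette(float artificialSpeed)
    {
        float target = artificialSpeed > 0.1f ? maxIntensity : 0f;
        currentIntensity = Mathf.MoveTowards(currentIntensity, target, fadeSpeed * Time.deltaTime);
        vignetteMaterial.SetFloat("_Intensity", currentIntensity);
    }
}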
VR UI Design
| UI Type | Description | Use Case |
|---------|-------------|----------|
| World-space UI | Panels floating in the 3D world | Menus, inventory, HUD |
| Diegetic UI | UI on in-game objects (watch, tablet) | Immersive games |
| Gaze-based | Look at a button to activate it | Accessibility, no controller |
| Hand-attached | UI follows the controller/hand | Quick-access menus |
| Laser pointer | Ray from the controller selects UI | Standard VR interaction |
VR UI Distance Guidelines:
- Too close (< 0.5 m): eye strain, hard to focus
- Comfortable (1–3 m): ideal reading distance
- Far (> 5 m): hard to read small text
- Sky/world (> 10 m): environmental elements only
- Minimum button size: 5 cm × 5 cm in world space
- Minimum text size: 14 pt at 1 m viewing distance
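A sketch applying these guidelines: position a world-space panel about 2 m in front of the user at eye height and keep it facing the head. The head and menuPanel references are placeholders for your HMD camera and world-space canvas:
using UnityEngine;

public class WorldSpaceMenu : MonoBehaviour
{
    [SerializeField] Transform head;        // HMD camera transform
    [SerializeField] Transform menuPanel;   // world-space canvas
    [SerializeField] float distance = 2f;   // inside the comfortable 1–3 m band

    public void ShowMenu()
    {
        // Place the panel in front of the user at eye height
        Vector3 forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        menuPanel.position = head.position + forward * distance;

        // Face the user (canvas forward points away from the viewer)
        menuPanel.rotation = Quaternion.LookRotation(menuPanel.position - head.position);
        menuPanel.gameObject.SetActive(true);
    }
}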
Hand Tracking Interaction
graph TD
subgraph Gestures["✋ Hand Gestures"]
Pinch["Pinch\nIndex + thumb\nSelect, grab"]
Point["Point\nIndex extended\nAim, indicate"]
Grab["Grab\nAll fingers closed\nHold objects"]
Open["Open palm\nAll fingers extended\nMenu, stop"]
end
subgraph Direct["🤏 Direct Manipulation"]
Touch["Touch\nFinger touches object\nPress buttons"]
Poke["Poke\nFingertip interaction\nKeyboard, piano"]
TwoHand["Two-handed\nScale, rotate objects\nManipulation"]
end
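As a sketch of the two-handed manipulation above: the object's scale can track the ratio of the current hand separation to the separation at grab time. The hand transforms are assumed to come from your hand-tracking or controller rig:
using UnityEngine;

public class TwoHandedScale : MonoBehaviour
{
    [SerializeField] Transform leftHand;
    [SerializeField] Transform rightHand;
    [SerializeField] Transform grabbedObject;

    float startHandDistance;
    Vector3 startScale;

    // Call once when both hands grab the object
    public void BeginScale()
    {
        startHandDistance = Vector3.Distance(leftHand.position, rightHand.position);
        startScale = grabbedObject.localScale;
    }

    // Call every frame while both hands hold the object
    public void UpdateScale()
    {
        float current = Vector3.Distance(leftHand.position, rightHand.position);
        float ratio = current / Mathf.Max(startHandDistance, 0.001f);
        grabbedObject.localScale = startScale * ratio;
    }
}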
VR Rendering & Performance
VR Rendering Pipeline
graph LR
subgraph Stereo["👁️ Stereo Rendering"]
L["Left Eye\nRender"]
R["Right Eye\nRender"]
end
subgraph Techniques["Rendering Techniques"]
MR["Multi-View Rendering\nRender both eyes\nin one pass (GPU)"]
SR["Single Pass Instanced\nInstanced draw calls\nfor both eyes"]
end
subgraph Reprojection["🔄 Reprojection"]
ATW["Asynchronous TimeWarp\nReproject last frame\nif new frame late"]
ASW["Asynchronous SpaceWarp\nSynthesize frames\nat half rate"]
end
Stereo --> Techniques
Techniques --> Reprojection
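A practical performance lever that pairs with reprojection is dynamic resolution: lower the render scale when frame time exceeds budget and recover it slowly when there is headroom. A sketch using Unity's XRSettings.renderViewportScale (the thresholds and step sizes are illustrative, and provider support varies):
using UnityEngine;
using UnityEngine.XR;

public class AdaptiveResolution : MonoBehaviour
{
    [SerializeField] float targetFrameTimeMs = 11.1f; // 90 Hz budget
    [SerializeField] float minScale = 0.7f;

    void Update()
    {
        float frameTimeMs = Time.unscaledDeltaTime * 1000f;
        float scale = XRSettings.renderViewportScale;

        // Over budget: drop resolution in larger steps; under budget: recover slowly
        if (frameTimeMs > targetFrameTimeMs * 1.1f)
            scale -= 0.05f;
        else if (frameTimeMs < targetFrameTimeMs * 0.9f)
            scale += 0.01f;

        XRSettings.renderViewportScale = Mathf.Clamp(scale, minScale, 1f);
    }
}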