
Optical Architectures for Augmented-, Virtual-, and Mixed-Reality Headsets


Price: ¥18,810 (tax included)


Title: Optical Architectures for Augmented-, Virtual-, and Mixed-Reality Headsets
Author/Editor: Bernard C. Kress
Publisher: SPIE
Publication date: January 2020
Binding: Softcover
Pages: 270
ISBN: 978-1-5106-3433-6

Description

This book is a timely review of the various optical architectures, display technologies, and building blocks behind modern consumer, enterprise, and defense head-mounted displays, including smart glasses, smart eyewear, and virtual-reality, augmented-reality, and mixed-reality headsets. Special attention is paid to the human perception system and to the need for a human-centric optical design process, one that yields the most comfortable headset without compromising the user's experience. Major challenges, from wearability and visual comfort to sensory and display immersion, must be overcome to meet market analysts' expectations, and the book reviews the optical technologies best suited to address these challenges, as well as the latest product implementations.



Contents:

1 Introduction
Word of Caution for the Rigorous Optical Engineer

2 Maturity Levels of the AR/VR/MR/Smart-Glasses Markets

3 The Emergence of MR as the Next Computing Platform
3.1 Today's Mixed-Reality Check

4 Keys to the Ultimate MR Experience
4.1 Wearable, Vestibular, Visual, and Social Comfort
4.2 Display Immersion
4.3 Presence

5 Human Factors
5.1 The Human Visual System
5.1.1 Line of sight and optical axis
5.1.2 Lateral and longitudinal chromatic aberrations
5.1.3 Visual acuity
5.1.4 Stereo acuity and stereo disparity
5.1.5 Eye model
5.1.6 Specifics of the human-vision FOV
5.2 Adapting Display Hardware to the Human Visual System
5.3 Perceived Angular Resolution, FOV, and Color Uniformity

6 Optical Specifications Driving AR/VR Architecture and Technology Choices
6.1 Display System
6.2 Eyebox
6.3 Eye Relief and Vertex Distance
6.4 Reconciling the Eyebox and Eye Relief
6.5 Field of View
6.6 Pupil Swim
6.7 Display Immersion
6.8 Stereo Overlap
6.9 Brightness: Luminance and Illuminance
6.10 Eye Safety Regulations
6.11 Angular Resolution
6.12 Foveated Rendering and Optical Foveation

7 Functional Optical Building Blocks of an MR Headset
7.1 Display Engine
7.1.1 Panel display systems
7.1.2 Increasing the angular resolution in the time domain
7.1.3 Parasitic display effects: screen door, aliasing, motion blur, and Mura effects
7.1.4 Scanning display systems
7.1.5 Diffractive display systems
7.2 Display Illumination Architectures
7.3 Display Engine Optical Architectures
7.4 Combiner Optics and Exit Pupil Expansion

8 Invariants in HMD Optical Systems, and Strategies to Overcome Them
8.1 Mechanical IPD Adjustment
8.2 Pupil Expansion
8.3 Exit Pupil Replication
8.4 Gaze-Contingent Exit Pupil Steering
8.5 Exit Pupil Tiling
8.6 Gaze-Contingent Collimation Lens Movement
8.7 Exit Pupil Switching

9 Roadmap for VR Headset Optics
9.1 Hardware Architecture Migration
9.2 Display Technology Migration
9.3 Optical Technology Migration

10 Digital See-Through VR Headsets

11 Free-Space Combiners
11.1 Flat Half-Tone Combiners
11.2 Single Large Curved-Visor Combiners
11.3 Air Birdbath Combiners
11.4 Cemented Birdbath-Prism Combiners
11.5 See-Around Prism Combiners
11.6 Single Reflector Combiners for Smart Glasses
11.7 Off-Axis Multiple-Reflector Combiners
11.8 Hybrid Optical Element Combiners
11.9 Pupil Expansion Schemes in MEMS-Based Free-Space Combiners
11.10 Summary of Free-Space Combiner Architectures
11.11 Compact, Wide-FOV See-Through Shell Displays

12 Freeform TIR Prism Combiners
12.1 Single-TIR-Bounce Prism Combiners
12.2 Multiple-TIR-Bounce Prism Combiners

13 Manufacturing Techniques for Free-Space Combiner Optics
13.1 Ophthalmic Lens Manufacturing
13.2 Freeform Diamond Turning and Injection Molding
13.3 UV Casting Process
13.4 Additive Manufacturing of Optical Elements
13.5 Surface Figures for Lens Parts Used in AR Imaging

14 Waveguide Combiners
14.1 Curved Waveguide Combiners and Single Exit Pupil
14.2 Continuum from Flat to Curved Waveguides and Extractor Mirrors
14.3 One-Dimensional Eyebox Expansion
14.4 Two-Dimensional Eyebox Expansion
14.5 Display Engine Requirements for 1D or 2D EPE Waveguides
14.6 Choosing the Right Waveguide Coupler Technology
14.6.1 Refractive/reflective coupler elements
14.6.2 Diffractive/holographic coupler elements
14.6.3 Achromatic coupler technologies
14.6.4 Summary of waveguide coupler technologies

15 Design and Modeling of Optical Waveguide Combiners
15.1 Waveguide Coupler Design, Optimization, and Modeling
15.1.1 Coupler/light interaction model
15.1.2 Increasing FOV by using the illumination spectrum
15.1.3 Increasing FOV by optimizing grating coupler parameters
15.1.4 Using dynamic couplers to increase waveguide combiner functionality
15.2 High-Level Waveguide-Combiner Design
15.2.1 Choosing the waveguide coupler layout architecture
15.2.2 Building a uniform eyebox
15.2.3 Spectral spread compensation in diffractive waveguide combiners
15.2.4 Field spread in waveguide combiners
15.2.5 Focus spread in waveguide combiners
15.2.6 Polarization conversion in diffractive waveguide combiners
15.2.7 Propagating full-color images in the waveguide combiner over a maximum FOV
15.2.8 Waveguide-coupler lateral geometries
15.2.9 Reducing the number of plates for full-color display over the maximum allowed FOV

16 Manufacturing Techniques for Waveguide Combiners
16.1 Wafer-Scale Micro- and Nano-Optics Origination
16.1.1 Interference lithography
16.1.2 Multilevel, direct-write, and grayscale optical lithography
16.1.3 Proportional ion beam etching
16.2 Wafer-Scale Optics Mass Replication

17 Smart Contact Lenses and Beyond
17.1 From VR Headsets to Smart Eyewear and Intra-ocular Lenses
17.2 Contact Lens Sensor Architectures
17.3 Contact Lens Display Architectures
17.4 Smart Contact Lens Fabrication Techniques
17.5 Smart Contact Lens Challenges

18 Vergence-Accommodation Conflict Mitigation
18.1 VAC Mismatch in Fixed-Focus Immersive Displays
18.1.1 Focus rivalry and VAC
18.2 Management of VAC for Comfortable 3D Visual Experience
18.2.1 Stereo disparity and the horopter circle
18.3 Arm's-Length Display Interactions
18.4 Focus Tuning through Display or Lens Movement
18.5 Focus Tuning with Micro-Lens Arrays
18.6 Binary Focus Switch
18.7 Varifocal and Multifocal Display Architectures
18.8 Pin Light Arrays for NTE Display
18.9 Retinal Scan Displays for NTE Display
18.10 Light Field Displays
18.11 Digital Holographic Displays for NTE Display

19 Occlusions
19.1 Hologram Occlusion
19.2 Pixel Occlusion, or "Hard-Edge Occlusion"
19.3 Pixelated Dimming, or "Soft-Edge Occlusion"

20 Peripheral Display Architectures

21 Vision Prescription Integration
21.1 Refraction Correction for Audio-Only Smart Glasses
21.2 Refraction Correction in VR Headsets
21.3 Refraction Correction in Monocular Smart Eyewear
21.4 Refraction Correction in Binocular AR Headsets
21.5 Super Vision in See-Through Mode

22 Sensor Fusion in MR Headsets
22.1 Sensors for Spatial Mapping
22.1.1 Stereo cameras
22.1.2 Structured-light sensors
22.1.3 Time-of-flight sensors
22.3 Head Trackers and 6DOF
22.4 Motion-to-Photon Latency and Late-Stage Reprojection
22.5 SLAM and Spatial Anchors
22.6 Eye, Gaze, Pupil, and Vergence Trackers
22.7 Hand-Gesture Sensors
22.8 Other Critical Hardware Requirements