Overview

The human eye contains a crystalline lens; the curvature of this lens can be adjusted to bring objects at varying distances into sharp focus on the retina. This process of refocusing is known as accommodation and is an important depth cue used to navigate the real world. Though seemingly unrelated, presbyopia and virtual and augmented reality (VR/AR) have something in common: a loss of the ability to accommodate. The former, presbyopia, affects nearly 20% of the population worldwide and is caused by the stiffening of the crystalline lens over time. Traditional forms of presbyopia correction use fixed focal elements to approximate the abilities of the once-pliable crystalline lens, inherently trading off acuity, field of view, or stereoacuity in exchange. An ideal “autofocal” solution would instead use focus-tunable lenses to mimic the accommodation response. In this talk from Samsung Forum, Nitish Padmanaban describes a novel autofocal system that uses a depth sensor and eye tracking to automatically update focus-tunable lenses, and that outperforms traditional forms of correction. In VR/AR, on the other hand, the problem of accommodation stems from simple optical designs that provide only a single plane of sharp focus, preventing the user from accommodating naturally. Unsurprisingly, slight modifications to presbyopia corrections can be used to restore accommodation in VR/AR, with varying degrees of success. We explore how monovision, multifocal, and autofocal presbyopia corrections can be used to restore accommodation in VR/AR systems.
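
To make the autofocal idea concrete, the sketch below shows one plausible form of the control loop: track where the user is looking, query the scene depth at that point, convert that distance into the focusing power a presbyopic eye can no longer supply, and drive the focus-tunable lens accordingly. This is a minimal illustration, not the system from the talk; the device interfaces (eye_tracker, depth_sensor, lens) and their methods are hypothetical placeholders.

```python
# Hedged sketch of an autofocal control loop. The eye_tracker, depth_sensor,
# and lens objects (and their methods) are hypothetical stand-ins, not an
# actual API from the system described in the talk.

def add_power_diopters(fixation_dist_m: float, max_add_d: float = 3.0) -> float:
    """Lens power a fully presbyopic (non-accommodating) eye needs to focus
    at the given distance, assuming distance vision is already corrected.
    A target at d meters demands 1/d diopters of accommodation, which the
    tunable lens supplies in place of the stiffened crystalline lens."""
    demand_d = 1.0 / max(fixation_dist_m, 1e-3)  # guard against zero distance
    return min(max(demand_d, 0.0), max_add_d)    # clamp to the lens's range


def autofocal_step(eye_tracker, depth_sensor, lens) -> None:
    """One update: find where the user is looking, how far away that point
    is, and refocus the tunable lens for that depth."""
    gaze_xy = eye_tracker.current_gaze()          # gaze point in sensor coords
    dist_m = depth_sensor.depth_at(gaze_xy)       # scene depth at gaze point
    lens.set_power(add_power_diopters(dist_m))    # drive the tunable lens
```

In practice, a fixation distance can also be estimated by triangulating the two eyes' gaze directions (vergence), and a robust system might fuse that estimate with the depth map; the single-sensor lookup above is simply the shortest path to the idea.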

About the Speaker

Nitish Padmanaban is a fourth-year PhD candidate at Stanford EE, supported by an NSF Graduate Research Fellowship. He is advised by Prof. Gordon Wetzstein as part of the Stanford Computational Imaging Lab. His research centers on optical and computational techniques for virtual and augmented reality, in particular building and evaluating displays that alleviate the vergence–accommodation conflict (VAC), along with related methods of improving visual perception.