New research from Kent State University and Meta Reality Labs has demonstrated large dynamic-focus liquid crystal lenses that could be used to create varifocal VR headsets.
Vergence-Accommodation conflict in brief
In the VR R&D space, one of the hot topics is finding a practical solution to the so-called vergence-accommodation conflict (VAC). All consumer VR headsets on the market today render an image using stereoscopy, which creates 3D imagery that supports the vergence reflex of the pair of eyes (when they converge on objects to form a stereo image), but not the accommodation reflex of each individual eye (when the lens of the eye changes shape to focus light from different depths).
In the real world, these two reflexes always work in tandem, but in virtual reality they become disconnected: the eyes continue to converge where needed, but the lens of each eye remains static because all of the light comes from the same focal distance (the display). Researchers in the field say VAC can cause eye strain, make it difficult to focus on close-up imagery, and can even limit visual immersion.
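The size of the conflict is commonly expressed in diopters (the reciprocal of distance in meters). As an illustrative sketch, not something from the paper itself, the mismatch can be computed as the difference between the vergence demand of the virtual object and the fixed accommodation demand of the headset's optics:

```python
# Illustrative sketch: vergence-accommodation mismatch in diopters.
# In a headset, accommodation distance is fixed by the optics, while
# vergence follows the virtual object's stereoscopic depth.

def mismatch_diopters(virtual_distance_m: float, focal_distance_m: float) -> float:
    """Difference between where the eyes converge and where they must focus."""
    vergence_d = 1.0 / virtual_distance_m      # demanded by the stereo image
    accommodation_d = 1.0 / focal_distance_m   # fixed by the headset optics
    return vergence_d - accommodation_d

# A virtual object at 0.5 m viewed through a headset focused at 2 m:
print(mismatch_diopters(0.5, 2.0))  # 1.5 diopters of conflict
```

Mismatches beyond a fraction of a diopter are generally what researchers associate with the discomfort described above.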
Searching for a solution
There have been many experiments with technologies that could be used in varifocal headsets that properly support both vergence and accommodation, such as holographic displays and multiple focal planes. But it seems no one has yet cracked the code for a practical, cost-effective, and mass-producible solution to VAC.
Another potential solution to VAC is dynamic focus liquid crystal (LC) lenses, which can change their focal length as the voltage applied to them is adjusted. According to a Kent State University graduate student project with funding and participation from Meta Reality Labs, such lenses have already been demonstrated, but mostly at very small sizes, because the switching time (how quickly the focus can be changed) slows considerably as the size increases.
To achieve the size of dynamic focus lens you'd want to fit into a contemporary VR headset – while keeping switching time low enough – the researchers designed a large dynamic focus LC lens with a series of "phase resets", which they liken to the rings of a Fresnel lens. But instead of segmenting the lens to reduce its thickness (as with a Fresnel lens), the phase reset segments are driven separately from one another so that the liquid crystals in each segment can still switch quickly enough to be practical for use in a varifocal headset.
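As a conceptual sketch of the phase-reset idea (illustrative only, with values not taken from the paper): an ideal thin lens imposes a phase profile that grows quadratically with radius, and a Fresnel-style design wraps that phase back to zero at ring boundaries so no single zone has to produce a large phase excursion:

```python
import math

# Conceptual sketch of "phase resets": rather than one continuous phase
# profile (which would require a thick LC cell), the profile is wrapped
# modulo 2*pi at ring boundaries, as in a Fresnel lens. Illustrative only.

def wrapped_phase(r_mm: float, focal_m: float, wavelength_nm: float = 550) -> float:
    """Parabolic thin-lens phase at radius r, wrapped into [0, 2*pi)."""
    r = r_mm * 1e-3                 # radius in meters
    lam = wavelength_nm * 1e-9      # wavelength in meters
    full_phase = math.pi * r**2 / (lam * focal_m)  # ideal continuous profile
    return full_phase % (2 * math.pi)              # phase reset keeps the cell thin
```

Because each segment only needs to produce a small phase excursion, the LC layer in each segment can stay thin, which is what keeps the switching time manageable as the lens grows.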
A large experimental lens
In new research presented at SID Display Week 2022, researchers characterized a 5cm dynamic focus LC lens to measure its capabilities and identify its strengths and weaknesses.
On the "highlights" side, the researchers show that the dynamic focus lens achieves high image quality toward the center of the lens while supporting a dynamic focus range of -0.80 D to +0.80 D and a switching time of less than 500 ms.
For reference, in a 90Hz headset a new image is displayed to the user every 11ms (90 times per second), while a switching time of 500ms equals 2Hz (twice per second). While this is much slower than the headset's frame rate, it may be fast enough in practice, considering how quickly the eye itself can adapt to a new focal depth. Additionally, the researchers claim that switching speed can be improved by stacking multiple lenses.
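The arithmetic behind those figures is straightforward and can be checked directly:

```python
# Quick arithmetic behind the figures above (illustrative).
refresh_hz = 90
frame_time_ms = 1000 / refresh_hz             # ~11.1 ms between frames
switching_time_ms = 500
switching_rate_hz = 1000 / switching_time_ms  # 2 focus changes per second

print(round(frame_time_ms, 1), switching_rate_hz)  # 11.1 2.0
```

So the lens can change focus roughly once every 45 displayed frames, which is why the comparison point is the eye's own accommodation speed rather than the display refresh rate.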
On the "weaknesses" side, the researchers find that the dynamic focus LC lens suffers from reduced image quality as the view approaches the edge of the lens, due to the phase reset segments – an effect similar to the light scattering caused by the ridges of a Fresnel lens. The work also explores a masking technique designed to reduce these artifacts.
Ultimately, the researchers conclude, the experimental dynamic-focus LC lens offers “possibly acceptable [image quality] values […] at a viewing angle of around 30°”, which is quite similar to the drop in image quality of many VR headsets with Fresnel optics today.
To actually build a varifocal headset from this technology, the researchers say the dynamic focus LC lens would be used in conjunction with a traditional lens to achieve the optical pipeline needed in a VR headset. Accurate eye tracking is also necessary so that the system knows where the user is looking and therefore how to properly set the lens focus for that depth.
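A hypothetical sketch of how such a system might tie eye tracking to the lens is shown below. All names here (`LCLens`, `update_varifocal`, and so on) are illustrative assumptions, not an API from the research; the -0.80 D to +0.80 D clamp reflects the dynamic range reported for the experimental lens:

```python
# Hypothetical varifocal control-loop sketch. Names and structure are
# illustrative assumptions; only the +/-0.80 D range comes from the paper.

from dataclasses import dataclass

@dataclass
class LCLens:
    min_power_d: float = -0.80   # dynamic range reported for the lens
    max_power_d: float = 0.80
    power_d: float = 0.0

    def set_focus(self, target_power_d: float) -> None:
        # Clamp the request to the lens's achievable dynamic range.
        self.power_d = max(self.min_power_d, min(self.max_power_d, target_power_d))

def update_varifocal(lens: LCLens, gaze_depth_m: float, base_focal_d: float) -> None:
    """Drive the LC lens so total optical power matches the gazed depth."""
    demanded_d = 1.0 / gaze_depth_m            # accommodation demand in diopters
    lens.set_focus(demanded_d - base_focal_d)  # LC lens supplies the difference
                                               # beyond the traditional lens

lens = LCLens()
update_varifocal(lens, gaze_depth_m=1.0, base_focal_d=0.5)
print(lens.power_d)  # 0.5
```

The key design point the article describes is visible here: the traditional lens provides a fixed base focal power, and the LC lens only needs to supply the (small, eye-tracked) difference.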
The work in this article presents measurement methods and benchmarks showing lens performance that future researchers can use to test their own work or identify improvements that could be made to the demonstrated design.
The full paper has not yet been published, but it was presented by its lead author, Amit Kumar Bhowmick, at SID Display Week 2022, and credits Afsoon Jamali, Douglas Bryant, Sandro Pintz, and Philip J. Bos, spanning Kent State University and Meta Reality Labs.