This 'Metaform' optical device could shrink the size of AR headsets

By bridging nanophotonics and freeform optics, University of Rochester researchers hope to enable next-gen XR wearables.

When we see augmented reality (AR) in sci-fi, it usually takes the form of glasses — or even contact lenses — that are capable of producing rich, immersive experiences.

Real-world AR has certainly come a long way since Google Glass, but there’s still a pretty serious tradeoff between form and function: the more the wearable resembles a normal pair of glasses, the less augmentation it can actually pull off.

There are many reasons for that — after all, we’re talking about putting a context-aware computer on your head — but one issue lies with the optical system, or how the headset puts the AR imagery in front of your eyeballs.

Researchers at the University of Rochester’s Institute of Optics have devised a technology that could bring us one step closer to the dream of AR wearables that look more like Ray-Bans than hard hats.

“No one wants to walk around with a bucket on their head the whole day,” said Stanford University professor Mark Brongersma, who leads the university’s Plasmonics and Photonics group.

The challenge: The ideal AR optical system is a thin, untinted window in front of your eyes, one you barely notice, through which the digital content merges seamlessly with the physical world around you. Straightforward as that might sound, getting light to cooperate is no simple matter, and every current AR device falls short of that ideal in one way or another.

  • Virtual reality (VR) headsets are designed to situate users in complete digital worlds, meaning they can block out all light from the surrounding environment. But AR is meant to integrate content within the physical world — it has to contend with ambient light at all times. And it has to do so while trying to minimize issues like chromatic aberration (color fringing), ghosting (double image), stray light (lens flare), and tinting (overly darkening the lenses).

  • Solutions to the problem of AR optics have often involved bulky hardware or lenses and visors that bulge outward in a “bug-eye” look, which limits who will actually want to wear them.

Thus, designing optics that produce rich digital images from projected light, let in external light in such a way that all of it looks real, and don’t make people look like cyborgs in the process is still among contemporary AR’s peskiest snags.

Enter: The metaform. It’s a new technology that University of Rochester researchers claim can “defy the conventional laws of reflection, gathering the visible light rays entering an AR/VR eyepiece from all directions, and redirecting them directly into the human eye.”
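
That claim is easier to parse with a standard textbook relation from metasurface optics, the generalized law of reflection (Yu et al., Science, 2011). It is offered here only as background, not as the specific design equation of the metaform, whose nanostructure layout is engineered across a curved freeform surface rather than this simple one-dimensional case:

\[
\sin\theta_{r} \;-\; \sin\theta_{i} \;=\; \frac{\lambda_{0}}{2\pi\, n_{i}}\,\frac{d\Phi}{dx}
\]

Here θ_i and θ_r are the angles of incidence and reflection, λ_0 is the free-space wavelength, n_i is the refractive index of the surrounding medium, and Φ(x) is the phase shift the nanostructures impose at each position x along the surface. For an ordinary mirror, dΦ/dx = 0 and light bounces off at the same angle it came in; by patterning the surface so that dΦ/dx is not zero, a designer can redirect rays toward the eye regardless of where they strike the eyepiece.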

The technology is outlined in a new paper in the journal Science Advances. The team describes a method of imprinting a tiny “forest” of nanoscale structures onto the surface of a freeform optic.

  • Freeform optics is a complex, emerging technology. Where traditional optics use rotationally symmetric (and usually spherical) surfaces, as in contact lenses, eyeglasses, or telescopes, freeform optics use surfaces without that symmetry. Think of how a funhouse mirror produces wacky reflections: by curving differently in different directions, a freeform surface can steer and shape light in ways a simple lens or mirror cannot. In practice, freeform optics harness that power for purposes beyond silliness, and can be used to create devices that are lighter, more compact, and more robust than ever; one common way to describe such a surface mathematically is sketched below.
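
For the technically curious, here is a minimal sketch of how freeform surfaces are often specified in optical design, assuming the common XY-polynomial description rather than anything taken from the Rochester paper, which may use a different surface representation:

\[
z(x,y) \;=\; \frac{c\,(x^{2}+y^{2})}{1+\sqrt{1-(1+k)\,c^{2}\,(x^{2}+y^{2})}} \;+\; \sum_{i,j} a_{ij}\, x^{i} y^{j}
\]

Here z is the surface height (sag) at a point (x, y), c is the base curvature, k is the conic constant, and the a_{ij} coefficients are what make the surface “freeform”: asymmetric and mixed terms that have no counterpart in an ordinary spherical lens.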

But you don’t need to be a freeform optics expert; the point is that the metaform allows for richer and more accurate visuals in augmented reality while simultaneously decreasing the size of the lenses.
