Thursday, August 1, 2019

The Second Coming of Consumer Smartglasses




The article The Second Coming of Consumer Smartglasses originally ran in the August 2019 print edition of 2020 Magazine—the largest circulation trade publication for the optician / optometrist industry. Below is an excerpt of the technical inset, More Than A Pretty Face. The full article can be read online at the August 2019 edition of 2020 Magazine.




MORE THAN A PRETTY FACE

The next generation of smartglasses will be more than just an aesthetic upgrade; they will also be a generational leap beyond the Google Glass variety of devices in both functionality and user interface. Glass produces the illusion of an oversized flat screen floating in the wearer’s peripheral vision, with a touch-sensitive bar on the adjacent temple for input. The display is rudimentary, and the user input is rather basic.

Next-generation smartglasses have a discreet display integrated directly into a prescription lens. Some are “stereoscopic,” which is to say the content is three-dimensional, inserted into the wearer’s view of the real world. This is known as augmented reality.

Next-generation smartglasses also have audio input with AI smart assistant integration. Be that Alexa, Cortana, Siri or another, your preferred assistant is at your beck and call.

The most advanced smartglasses have depth sensors. Some sensors map the user’s surroundings, giving the device a three-dimensional understanding of the environment. Other sensors can track hand gestures for input. Enterprise and developer products like Microsoft HoloLens and MagicLeap One are already available to purchase, though not yet in a form factor that is viable for a consumer product… but it is coming.

The first step toward consumer-viable smartglasses will come with discreet display systems. Focals by North already integrate a display into a prescription lens.

NEAR-EYE OPTICS DISPLAY BREAKTHROUGHS

The dominant display technologies at the forefront of this revolution in consumer form-factor smartglasses include “waveguides” and “laser to holographic combiners.”

Waveguides themselves can be divided into holographic waveguides and surface relief waveguides (and surface relief waveguides further into additive- and subtractive-manufactured designs).

WAVEGUIDES

Waveguides are an adaptation of technology first patented in 1970 by researchers at Corning Glass to enable multiple data streams to flow through a fiberoptic cable. While a fiberoptic cable is an extruded cylinder, waveguide technology can also carry light through a flat plane.

While light can flow through a plane, making a planar waveguide useful as a near-eye display requires optical elements to get an image into one end of the waveguide, and then more optics to turn the image out of the waveguide and in front of the eye. These optics are referred to as an “input grating” and an “output grating.”

For these kinds of waveguides to be practical, a complementary technology also had to be developed: a miniaturized micro-display, known as a “light engine,” to shine an image into the input grating end of the optical system.
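For readers who want the underlying physics in a nutshell, the behavior can be sketched with two standard optics relations (added here for illustration; they are not part of the original article): light coupled in by the input grating stays trapped inside the glass so long as it strikes the surfaces beyond the critical angle, and the gratings redirect light according to the diffraction grating equation.

```latex
% Total internal reflection: light bounces along inside a guide of index n_g,
% surrounded by air, whenever it hits the surface beyond the critical angle.
\[
\theta_c = \arcsin\!\left(\frac{1}{n_g}\right)
\qquad \text{for glass, } n_g \approx 1.5 \;\Rightarrow\; \theta_c \approx 42^\circ
\]

% Grating equation: a grating of period \Lambda bends light of wavelength \lambda
% into diffraction order m, steering it into (input grating) or out of
% (output grating) the total-internal-reflection regime.
\[
n_g \sin\theta_m \;=\; \sin\theta_i + m\,\frac{\lambda}{\Lambda}
\]
```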

By the 1990s, engineers, including those at Sony, were experimenting with planar waveguides as a means of placing an image from a micro-display in front of the eye; the earliest holographic waveguide displays also appeared back in the ’90s. In the 2000s, Nokia invented a waveguide display that employed micro-optical elements engraved into the surface of a plane to serve as the input and output gratings. This IP would eventually land in the hands of Microsoft through their Nokia acquisition. These lenses became known as subtractive-manufactured surface relief waveguides.

Soon others, most notably the heavily venture-funded MagicLeap (through their own acquisition of Molecular Imprints), employed a manufacturing process for input and output gratings on a surface relief waveguide using an additive method known as nano-lithography: the optical elements are “printed” at a molecular scale onto the surface of the plane.

Still other companies focused on “holographic” optical elements for the input and output gratings, most notably DigiLens. These holographic waveguides use a technique not dissimilar to the eagle security hologram featured on most credit cards. Holograms like those on a credit card are exposed into a transparent substrate, which is then mounted onto a reflective surface, so that ambient light reflects back through the hologram, in effect backlighting it, to create the image the cardholder sees.

Now imagine, instead of an eagle, a hologram of columns of optical elements that behave like mirrors. These holographic mirrors allow for input and output gratings only a few microns in thickness. Further, DigiLens has perfected a method of manufacturing its waveguides from a liquid-crystal-based polymer, making these holographic optical elements electro-active, also known as switchable Bragg gratings.

LASER TO HOLOGRAPHIC COMBINER

Pico-laser-based near-eye displays were pioneered by Microvision. In this kind of display system, a laser is bounced off a micro-mirror mounted on a dual-axis gimbal. Early versions of these Microvision displays simply used a beam-splitter, otherwise known as a two-way mirror, to combine the view of the real world with the view of virtual content. Over time, a more sophisticated optical combiner was developed, similar to the holographic waveguide: a series of micro-mirror-like holographic elements embedded inside a lens, at which the laser is aimed so that it reflects into the user’s eye. These laser displays, in their current form, have one distinct shortcoming compared to waveguides: a very narrow field of view (that being the width of the user’s view that can be augmented with virtual content). But as a competitor to waveguides they also have a tremendous advantage: lens-crafters like Interglass of Switzerland, and Canadian consumer smartglasses brand North, have shown that these kinds of laser-based displays can be embedded within a traditional prescription lens.
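One back-of-the-envelope way to see why the field of view is constrained (an illustrative relation added here, not a figure from the article): a beam reflected off a scanning mirror is deflected by twice the mirror’s mechanical tilt, so the sweep of the micro-mirror puts a hard cap on how wide an image the laser can paint.

```latex
% A mirror tilted by theta_mech deflects the reflected beam by 2*theta_mech,
% so a micro-mirror swinging +/- theta_mech sweeps an optical cone of roughly:
\[
\mathrm{FOV} \;\lesssim\; 4\,\theta_{\mathrm{mech}}
\qquad \text{e.g. } \theta_{\mathrm{mech}} = \pm 5^\circ \;\Rightarrow\; \mathrm{FOV} \lesssim 20^\circ
\]
```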

North also has a patent to embed a waveguide within a prescription lens. Interglass says it is also working on a waveguide within a prescription lens, and DigiLens has IP around a curved waveguide applied to the surface of a prescription lens. A representative from Interglass has suggested that a holographic waveguide embedded within a prescription lens should be expected in time for the Consumer Electronics Show in January 2020.

The waveguide display also requires a “light engine,” or micro-display, to project into its input grating as the image source. These are also miniaturizing, getting brighter and falling in power consumption.

In future generations, expect smartglasses to combine displays with tunable-focus lenses.

Many researchers, including Nazmul Hasan at the University of Utah, working under Professor Carlos Mastrangelo, have been developing tunable-focus lenses. Used in combination with a depth sensor, the lenses change their focus from near to far based on where the wearer is looking, and can also adjust as the wearer’s prescription changes over time.

In an interview with Smithsonian, Mastrangelo explains, “This means that as the person’s prescription changes, the lenses can also compensate for that, and there is no need to buy another set for quite a long time.”
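To make that gaze-driven behavior concrete, here is a minimal sketch of the control loop such a lens system implies. It is purely illustrative: the depth sensor, gaze tracker and lens interfaces below are hypothetical stand-ins, not any vendor’s actual API, and the diopter math is deliberately simplified.

```python
# Illustrative control loop for a gaze-driven, tunable-focus lens.
# All hardware interfaces here are hypothetical stand-ins, not a real SDK.

import time

BASE_PRESCRIPTION_DIOPTERS = -2.25  # wearer's stored distance prescription (example value)
MAX_NEAR_ADDITION_DIOPTERS = 2.00   # maximum extra plus power for near work (example value)

def accommodation_demand(distance_m: float) -> float:
    """Optical power, in diopters, needed to focus on an object at distance_m."""
    return 1.0 / max(distance_m, 0.25)  # clamp so very close objects don't blow up the math

def run_focus_loop(depth_sensor, gaze_tracker, lens, rate_hz: float = 60.0) -> None:
    """Continuously re-focus the lens based on where the wearer is looking."""
    while True:
        # Hypothetical calls: where is the wearer looking, and how far away is it?
        gaze_x, gaze_y = gaze_tracker.read()
        distance_m = depth_sensor.depth_at(gaze_x, gaze_y)

        # Total lens power = distance prescription + near-focus demand,
        # capped at the wearer's near addition (a simplified model).
        near_add = min(accommodation_demand(distance_m), MAX_NEAR_ADDITION_DIOPTERS)
        lens.set_power(BASE_PRESCRIPTION_DIOPTERS + near_add)

        time.sleep(1.0 / rate_hz)
```

Updating the stored prescription values over time is what would let the same hardware track a changing prescription, as Mastrangelo describes above.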

Perhaps the most intriguing development in the tunable-focus lens space is the partnership between two Israeli companies: Lumus, maker of waveguide-based near-eye display optics, has partnered with tunable-focus lens manufacturer DeepOptics to develop hybrid tunable-focus lenses with an embedded display system.


The article The Second Coming of Consumer Smartglasses originally ran in the August 2019 print edition of 2020 Magazine—the largest circulation trade publication for the optician / optometrist industry. Above is an excerpt of the technical inset, More Than A Pretty Face. The full article can be read online at the August 2019 edition of 2020 Magazine.



GigantiCo by Chris Grayson