Sunday
Sep 15, 2024

Smart Glasses Battle: Meta vs Snap

In this article we will explore what display technology we might expect to see in the coming days from both Meta and Snap. By display, I mean both light engines and optics. I will then place these displays within the context of what I expect to see from their larger smart glasses system, and the resulting user experience. I will, of course, opine.

For several years Meta—and everyone else—has been exploring the best way to achieve smart glasses with a wide field of view (FOV). Many have come to the conclusion that the right direction is a display that divides the FOV among at least two, and possibly three, separate light engines, each illuminating its own optical component. While the Meta patent art above specifically shows an LBS (Laser Beam Scanner) light engine illuminating a holographic combiner, the patent states:

In some implementations, an optical element may comprise a waveguide and/or other components. A waveguide may include one or more of a layered waveguide, a planar partial mirror array waveguide, a diffractive waveguide, a diffractive waveguide including Bragg gratings, a free form surface prism, and/or other waveguides.

…as well as…

[In] some implementations… [the] light source may comprise one or more of… a microLED microdisplay… a liquid crystal display (LCD)… [six other possible light-engines, skipped here by ellipses]… and/or other light sources…

Indeed, the novel proposal is not the particular light engine used, nor the form of the optical components it illuminates, but rather a system for compositing multiple light engines via tiled optical components to expand the field of view (a point made clear by the patent's title).
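To make the tiling idea concrete, here is a minimal back-of-the-envelope sketch. The tile sizes and seam overlap are my own illustrative assumptions, not figures from the patent:

```python
# Illustrative sketch of FOV tiling; tile sizes and seam overlap are
# assumed numbers, not Meta specifications.

def composite_fov(tile_fovs_deg, overlap_deg):
    """Total horizontal FOV from tiles stitched side by side, where each
    adjacent pair of tiles shares a blending overlap at the seam."""
    seams = len(tile_fovs_deg) - 1
    return sum(tile_fovs_deg) - seams * overlap_deg

print(composite_fov([30], 0))          # a single 30-degree engine: 30
print(composite_fov([34, 34, 34], 5))  # three tiled engines: 92 degrees
```

The patent's novelty is in the compositing system, not this arithmetic, but the arithmetic shows why tiling is attractive: the field of view scales with however many light engines you can afford to drive.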

Meta has made acquisitions and built partnerships to achieve these ends.

In December 2022, Meta acquired the Belgian 3D-printed prescription lens manufacturer Luxexcel, after Luxexcel had successfully embedded waveguides into its lenses through separate partnerships with both Lumus Optics and WaveOptics (the latter of which would itself be acquired by Snap… which we’ll get to). One year prior, in December 2021, Meta also acquired ImagineOptix, maker of electroactive holographic waveguides. Key inventors listed on ImagineOptix’s patents now hold Optical Scientist positions at Meta Reality Labs, operating from North Carolina, in a team still led by former ImagineOptix CEO, Erin Clark.

Another interesting development is a key departure from Meta, by one Kelsey Wooley. Wooley was a holdover from Facebook’s Oculus acquisition who had ascended to the position of Engineering Manager over Lithography for Augmented Reality Waveguides before departing to join Eulitha AG as Director of North American Operations. Eulitha is a manufacturer of optical mass-manufacturing equipment. Earlier this year the company debuted its newest line of contactless photolithographic manufacturing systems with a unique breakthrough: the ability to mass-manufacture curved waveguides (a eureka moment).

Last October, Switzerland-based Eulitha opened a new office in Redmond, Washington, that Google Maps estimates is a three-minute drive from Meta Reality Labs’ Origin office (and no more than four minutes from any published Reality Labs Redmond office address).




So to recap: The former ImagineOptix team designs holographic waveguides. Eulitha makes equipment for mass manufacturing such waveguides (on a curve, no less). Eulitha’s North American operations are now run by Meta Reality Labs’ former Engineering Manager over Lithography for Augmented Reality Waveguides, and she now operates from a Eulitha office close enough to hit Meta Reality Labs’ offices with a rock thrown from its parking lot (OK, you’d need a really good arm, or a small trebuchet… but still).

“[When] printing on curved glass… the extremely large depth of focus is one of the benefits of these DTL tools. That it’s able to print over non-planar substrates, topography, and curved substrates very well,” said Kelsey Wooley, in a recent interview. Eulitha’s photolithography systems are able to print gratings onto a curved lens with more than 3 mm of height difference between the edge and the center of a 4-inch lens.

We can see Meta’s display taking shape. Rumor suggests these displays will span 90°, but the technology could theoretically expand even farther… provided you had the light engines to illuminate them.



The death of MicroLED Microdisplays has been greatly exaggerated.

In March of this year, it was revealed that Apple was canceling its MicroLED-based Apple Watch Ultra, as well as canceling its MicroLED microdisplay manufacturing contract with amsOSRAM in the process. The tech press subsequently interpreted this as the death of MicroLED itself, and it propagated as accepted wisdom: MicroLED was dead.

Long live MicroLED.

At that time I was in the very thick of producing an industry report on MicroLED microdisplays, so I was speaking regularly with many in the near-eye display industry, specifically about MicroLED microdisplays for use as light engines in smart glasses. The gap between the press coverage of “the death of MicroLED” and the point of view of those in the industry was tremendous, particularly among waveguide manufacturers, whose attitude was that the naysayers were either using hyperbole for engagement harvesting and site traffic, or were simply foolish (or both).

A deep dive on the Apple / amsOSRAM contract cancellation is beyond the scope of this article, but here are a few cogent points: Yes, as with all emerging technologies, MicroLED microdisplays have real challenges. Many have struggled with the miniaturization of the interconnects—think of them as the “wires” that power each individual diode—which must scale at parity with the diodes themselves. This has bedeviled many microdisplay developers. Hence, it has been typical for companies to partner with others in the semiconductor space for their backplane expertise.

The real struggle is to produce MicroLEDs at volume, and to do so at high enough yield that imperfections resulting in rejected units don’t eat away margins, so that cost per unit can be brought down to something practical for a market-viable consumer product. In other words, struggles quite typical of every emerging core technology.
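To see why yield is the whole ballgame, consider a toy cost model. Every figure below is an assumption of mine for illustration, not industry data:

```python
# Toy cost model: every rejected die's share of the wafer cost is absorbed
# by the good dies. All figures are illustrative assumptions.

def cost_per_good_die(wafer_cost_usd, dies_per_wafer, yield_rate):
    return wafer_cost_usd / (dies_per_wafer * yield_rate)

WAFER_COST = 20_000  # assumed cost of one processed wafer (USD)
DIES = 500           # assumed microdisplay dies per wafer

for y in (0.9, 0.5, 0.1):
    unit = cost_per_good_die(WAFER_COST, DIES, y)
    print(f"yield {y:.0%}: ${unit:,.0f} per usable display")
# yield 90%: $44 | yield 50%: $80 | yield 10%: $400
```

The same wafer is nearly an order of magnitude more expensive per usable display at 10% yield than at 90%, which is why yield, not raw capability, decides whether a display technology is consumer-viable.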

Hardware is hard.

But when it comes to displays, the promise of MicroLED is without peer. The prevailing belief is that, for smart glasses to succeed in the consumer market, they must have a wide field of view, be sufficiently miniaturized to fit the form factor of a consumer-viable eyewear design, and have low enough power consumption to keep the battery small while retaining good uptime between charges. Measured against these market demands, MicroLED is without peer.

At this moment MicroLED microdisplays are simply too expensive to produce at a market-viable price point, and even the expensive mass manufacturing methods available don’t produce acceptable yields for full-color microdisplays… yet.

It is going to take high-volume production to reach the economies of scale that put MicroLED-driven smart glasses at a consumer price point. This is where the Apple Watch came in.

If Apple wants to use MicroLED microdisplays in future smart glasses, it has an ace up its sleeve, or at least an Apple Watch under its cuff. Had Apple produced a MicroLED-microdisplay-based Apple Watch, as it had planned to do with the Apple Watch Ultra, it could have used its existing high-volume wrist-worn wearable to bring economies of scale to MicroLED microdisplay production, lowering production costs for other uses… such as smart glasses.

So while many in the tech sphere positioned the order cancellation purely as a technological failing, I believe there was more to it. Sure, for reasons stated above, there were presumably genuine concerns over whether amsOSRAM could deliver. But even if amsOSRAM had nailed it, there was an additional business concern: Apple likes exclusivity.

Tim Cook comes out of supply chain management, and when there is a market differentiator, Apple likes to lock down supply. For its part, amsOSRAM was using the substantial Apple contract to fund what was billed as the world’s largest MicroLED factory. That would create far more capacity than Apple needed, or would likely commit to paying for exclusively. Apple would in effect be underwriting costs for the whole industry, likely footing the bill to bring down costs for competitors who could then beat Apple to market with MicroLED microdisplay smart glasses from amsOSRAM’s production line… all on Apple’s dime.

I emphasize that I have no Apple insider knowledge on this; it is speculation on my part, but speculation that fits Apple’s modus operandi. Beyond the technical concerns, it appears likely to me that there was a disconnect at amsOSRAM in understanding Apple’s business needs.

More pertinent to our story…

The death of Plessey has also been greatly exaggerated.

In July of last year, Wayne Ma wrote a report in The Information on various leaks regarding Meta’s smart glasses progress. While Ma’s report covered many aspects of Meta’s development, I’m principally concerned here with its coverage of the light engine, where it was suggested that Meta was abandoning Plessey’s MicroLED microdisplay in favor of an LCoS microdisplay (though the article does walk this back a bit at the end). The report itself reads as more accurate than many of the secondhand interpretations. The tech blogosphere ran with the death of Plessey, and that is now the conventional wisdom I hear in online chatter and industry shop-talk whenever I mention the company.

Long live Plessey.

So what is my Plessey scoop? Everything I am going to share here is publicly available. When a company is in crash-and-burn phase, there are typically some, well, obvious tell-tale signs.

Have they had massive layoffs? According to LinkedIn headcount, Plessey has 204 employees and has lost 5% in the past year (given recent tech layoffs, a 5% headcount reduction doesn’t look bad at all).

Are key personnel departing? I can find only two senior management departures in this window: Jun-Youn Kim, Plessey’s former VP of R&D, left a couple of months before The Information’s story broke (departing for a similar position within Samsung’s MicroLED group); and Ariel Meyuhas, a self-described “operational turnaround” expert and Plessey’s former COO, departed just recently, but wasn’t even hired until after The Information’s story ran. I guess Meyuhas’ task was complete.

Has the CEO or CTO been pushed out? Keith Strickland has been, and remains, both CEO and CTO of Plessey.

Plessey has also continued producing bleeding-edge MicroLED microdisplay IP. If the reader will indulge me for a moment—both from an engineering and aesthetic point-of-view—Plessey produces some of the most pleasing patents in the industry.

Just look at that pixel!


Wayne Ma’s story concludes:

People familiar with Meta’s MicroLED efforts say it hasn’t given up on the technology and will continue working on it with Plessey, though it isn’t clear when it will be ready for prime time.

So the story out of The Information was not nearly as apocalyptic for Plessey as the rumor mill it spawned (perhaps because The Information is behind a paywall, most people only knew the rumors).

Earlier in his article, Ma noted:

Meta’s decision to abandon Plessey’s microLED technology means it is reliant on an older technology for its AR glasses. MicroLEDs contain pixels that are microscopic in size and are difficult to produce… By contrast, LCoS was first introduced… in the 1990s. The technology isn’t known for its brightness, which is a major requirement of AR products…

Let’s talk more about light engines…

MicroLED adaptive-illuminated LCoS light engines, to be specific.

MicroLED remains the industry’s long-term solution, but the patent record shows many have been simultaneously exploring a stop-gap hybrid MicroLED / LCoS microdisplay.

A quick explainer: Liquid crystals are non-emissive, which is to say they produce no light of their own. When electricity is applied to liquid crystals, their optical properties can be altered from a highly transparent state to a substantially opaque one, thereby modulating light when paired with a light source. A liquid crystal display typically has a fully lit backlight, and to produce colors it also requires non-emissive color filters in front.
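As a toy model (the numbers are mine, and a real liquid crystal’s response is nonlinear), each liquid crystal pixel behaves like a light valve sitting in front of an always-on backlight:

```python
# Toy model of a liquid crystal pixel as a light valve: it emits nothing
# itself and only attenuates the backlight behind it. A real LC response
# curve is nonlinear; the linear transmittance here is a simplification.

def lc_pixel_nits(backlight_nits, drive_level):
    """drive_level in [0, 1]: 0 = opaque (dark pixel), 1 = fully transparent."""
    return backlight_nits * drive_level

BACKLIGHT = 10_000  # conventional LCD: the backlight is always fully lit
print(lc_pixel_nits(BACKLIGHT, 0.0))  # 0 nits reach the eye, yet the
print(lc_pixel_nits(BACKLIGHT, 1.0))  # backlight burns power either way
```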

Note again that many of the most difficult struggles with MicroLED microdisplays concern the mass manufacturing of full-color displays.

Creating a single-color white (blue-ish “cool white”) MicroLED circumvents these challenges, and such a MicroLED can be used as a backlight for an LCoS microdisplay, unique in that it illuminates individual pixels or sub-pixels only as needed.

Given that smart glasses draw imagery over the real world, anything “transparent” is an unlit pixel in the light engine. An LCoS display using a MicroLED backlight that illuminates on a pixel-by-pixel basis will deliver both better image quality and much lower power consumption… and it can produce RGB at a substantially lower cost than any current pure-MicroLED microdisplay.
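Here is a rough sketch of the power argument, using made-up but plausible numbers: with sparse AR content, a per-pixel backlight spends power only on the pixels that actually show something:

```python
# Sketch of the adaptive-backlight power argument. In AR, most of the
# frame is "transparent" (dark), so a per-pixel MicroLED backlight lights
# only a small fraction of what a full backlight would. Assumed numbers.

WATTS_PER_LIT_PIXEL = 1e-6  # illustrative assumption
W, H = 1280, 720

# A typical sparse AR frame: one small HUD element, everything else dark.
frame = [[1.0 if (x < 200 and y < 100) else 0.0 for x in range(W)]
         for y in range(H)]

all_pixels = W * H
lit_pixels = sum(1 for row in frame for p in row if p > 0)

print(f"full backlight:     {all_pixels * WATTS_PER_LIT_PIXEL:.2f} W")  # 0.92 W
print(f"adaptive backlight: {lit_pixels * WATTS_PER_LIT_PIXEL:.2f} W")  # 0.02 W
print(f"fraction lit:       {lit_pixels / all_pixels:.1%}")             # 2.2%
```

An unlit backlight pixel is also a perfectly black pixel, which is where the image quality claim comes from: no backlight bleed behind “transparent” regions.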

While many have been pursuing R&D in this direction, as of this past January, Avegant and Lumileds were the first to take their partnership to market.

So who has IP, R&D, and/or known displays using MicroLED-illuminated LCoS microdisplays?




Two of those companies are Plessey and Compound Photonics, the former under exclusive supplier contract to Meta, and the latter acquired by Snap.

Everyone who partnered with Plessey was quickly acquired—Compound Photonics (CP) was snapped up by Snap, Jasper was gobbled up by Google (sorry, I had to)—but interestingly, CP had a long history in LCoS microdisplays. Naturally, they were among the earliest to develop MicroLED-illuminated LCoS microdisplay IP.

It’s all very synergistic, it’s not like one cutting off from the other… We’re a microdisplay company—we’re not a MicroLED company, we’re not an LCoS company… we’re going to make the best performing, smallest form factor displays we possibly can…

—Mike Lee, then CEO of Compound Photonics, from SPIE Fireside-Chat 2020

So not only should we expect to see Plessey working with Meta on a MicroLED illuminated LCoS microdisplay, but we should expect the same from Snap’s smart glasses.

In a moment we’ll look at how the use cases and user experiences of Meta’s and Snap’s glasses should differ, but first let’s bring this full circle: Snap has also recently been awarded a patent for a multi-light-engine waveguide (two, in this case), and we’ll pair it with Snap’s 3D-sensing camera IP, as it is central to Snap’s use case.


Meta & Snap Use Cases

With these technologies working in tandem, Snap’s glasses should have both the wide-FOV display and the three-dimensional understanding of their content needed to display the kind of filters Snap is famous for on its Snapchat platform, and Snap’s IP reflects that intent.



From the patent, in reference to the art above…

…if the user wishes to apply augmented reality lenses to the captured image, the augmented reality lenses would be selected based on the objects in the display and applied to the objects in the portion of the real-world image that is shown in the display. For example… a face may be recognized that is captured in the display and an augmented reality lens applied (in this case, the features of a dog). In this example, the lens including the dog features would [also] be applied to the face of the second person as shown…
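In code terms, the flow the patent describes is simple: detect objects in the captured frame, select a lens based on what was recognized, and apply it to every matching object. The sketch below uses hypothetical stand-ins (detect_faces, apply_lens), not any actual Snap API:

```python
# Minimal sketch of the patent's described flow; detect_faces() and the
# lens structure are hypothetical stand-ins, not Snap APIs.

def detect_faces(frame):
    """Stand-in detector: returns (x, y, w, h) bounding boxes of faces."""
    return [(40, 30, 64, 64), (160, 40, 60, 60)]  # two people in view

def apply_lens(box, lens):
    """Composite the selected AR lens over one detected face region."""
    return {"lens": lens, "region": box}

frame = None  # a captured camera frame would go here
overlays = [apply_lens(box, "dog-features") for box in detect_faces(frame)]
print(overlays)  # the same lens is applied to every recognized face
```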

Those familiar may recall that Snap’s 2021 demo was promising, but had a narrow FOV. I expect Snap’s content to be on-brand, polished visuals, now with an expanded, wide field of view, and to show more virtual content directly interacting with people, like their app filters.

I expect the content for Meta’s glasses to build on their previous audio demo, bringing AI into the display. I’m going to close with a preview of Ramblr.AI’s recently released demo reel. Ramblr is a newly launched AI / AR interface for the physical world. It is the latest from Thomas Alt, founder of the AR platform Metaio, which exited to Apple back in 2015. While I anticipate something more polished, interface-wise, from Meta’s demo, I expect similar visual AI use cases (I also expect Ramblr to evolve quickly).








Christopher Grayson is a marketer and an independent market analyst covering near-eye optics and displays, the consumer eye-frames industry, and the smart glasses market. As a writer, he specializes in deciphering technical subjects for an educated but non-engineering audience. Grayson has worked as a marketing director and a PR consultant to various companies in the smart glasses space. He pivoted client-side after a career in the New York advertising industry, working for agencies such as Ogilvy and Grey Advertising, producing award-winning campaigns for Intel, Nikon, and others.

Grayson majored in architecture at Pratt Institute in New York, with prior studies in the contemporary cultural anthropology of humans and technology at Memphis State University.

Contact for consulting projects: chris@chrisgrayson.com