

Hands-on with Microsoft Mesh: I handed someone a whale shark, and it was awesome

There’s a fantastic scene in Kingsman: The Golden Circle in which Eggsy and the other leaders of the secret spy agency (Channing Tatum, too, of course) assemble around a boardroom table — except they’re not really there. Each and every one of them is a hologram, thanks to incredible technology stuffed into a pair of ordinary-looking eyeglasses.


It’s a common trope in cinema, something James Bond and Ethan Hunt have used as well. And it simply isn’t possible with today’s augmented reality technology.

Wasn’t possible, I mean.

Today at the Microsoft Ignite developer conference, the tech innovator unveiled Microsoft Mesh, a cross-technology platform that lets app developers build persistent virtual environments for collaboration, communication, and more.

Mesh pushes the boardroom scene even further, allowing people from around the globe to share holograms. Yes, you can toss a virtual key to someone else. It sounds silly, but it opens up vast new worlds, thanks to computation so complex it’s hard to fathom.

Fun with holograms

To get a feel for the new tech, Microsoft sent me a crate with a very fancy HP Omen gaming laptop, a HoloLens 2 headset, and a mixed-reality headset from HP called the Reverb G2. Those last two devices create two very different virtual environments. The HoloLens is primarily an augmented reality device; it’s essentially a headband with a clear visor like those you might remember from shop class, onto which the device can beam images. Your eyes assemble the world before you and the virtual images into one new reality, one better than the ordinary world you live in.

The mixed-reality headset is a virtual reality device, meant to completely transport you to a new world. Microsoft Mesh works in both, as well as on desktop (PC and Mac) and, shortly, on Android and iPhone devices, Microsoft says.

To test it out, I donned a HoloLens 2 and fired it up. The headset builds a detailed map of your environment, including your hands — no need for controllers here. You launch apps and adjust settings via a menu, accessed by pressing a floating Windows icon mapped onto your wrist.

I launched the test app (called Fenix while it was still in beta) and met the floating avatar of Greg Sullivan, director of mixed reality at Microsoft, around a shimmering work surface, which we could individually place and resize in our physical environments. I smooshed the virtual table between my real bed and doorway, locked it to the ground, and turned my attention to the jellyfish, Earth, and whale shark floating above it.


HoloLens is uniquely intuitive; you grab holograms with your hands, either pinching them or simply snatching them like a fly ball, and rotate, grow, or shrink them exactly as you’d expect to. And sure enough, Sullivan and I were able to pass holograms back and forth. I picked up the shark, which swirled its tail menacingly, and expanded it to get a better look at its pale underside. I passed it across the table to another reporter, who grabbed it and stacked it atop the jellyfish.

Fun with holograms! I encountered only one issue: I tend to close my hands when I lower my arms to my sides, and I kept grabbing holograms by accident. The app crashed once or twice as well, but it’s more a proof of concept than anything else. Consider the concept proven.

Sure, we couldn’t fist bump to save our lives — heck, we were just avatars, after all — but this was collaboration from another dimension.

“It’s like the metaverse,” Sullivan told me. “When someone who’s not in the room with you hands you a hologram, it’s a pretty powerful experience.”

The control panel for Mesh allows you to add and alter the workspace and the holograms in it.

New platform in the making

I experienced Mesh through the Fenix app, but Microsoft is thinking of it as a platform, and says it will release SDKs to allow any developer to add this capability to their apps — hence the announcement at the Ignite developer conference.
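
To make the “platform” framing a bit more concrete, here’s a minimal, purely hypothetical sketch of the kind of shared-state bookkeeping a Mesh-style session implies. This is TypeScript written for illustration only, not Microsoft’s actual SDK, and every name in it is invented. The point is simply that “handing someone a whale shark” boils down to an ownership and position update that gets broadcast to every participant’s renderer.

```typescript
// Conceptual sketch only: NOT the Microsoft Mesh SDK. It models the idea of
// a shared session whose hologram state (owner, position, scale) stays
// consistent for every participant. All names here are hypothetical.

type Vector3 = { x: number; y: number; z: number };

interface Hologram {
  id: string;        // e.g. "whale-shark"
  owner: string;     // participant currently holding it
  position: Vector3; // placement in the shared coordinate space
  scale: number;     // uniform scale applied when a user grows or shrinks it
}

class SharedSession {
  private holograms = new Map<string, Hologram>();
  private listeners: Array<(h: Hologram) => void> = [];

  // Each client subscribes so its renderer can redraw when state changes.
  onUpdate(listener: (h: Hologram) => void) {
    this.listeners.push(listener);
  }

  add(h: Hologram) {
    this.holograms.set(h.id, h);
    this.broadcast(h);
  }

  // "Handing" a hologram is just an ownership and position update that every
  // participant receives, so the object appears to move between avatars.
  handOver(id: string, newOwner: string, position: Vector3) {
    const current = this.holograms.get(id);
    if (!current) return;
    const updated = { ...current, owner: newOwner, position };
    this.holograms.set(id, updated);
    this.broadcast(updated);
  }

  private broadcast(h: Hologram) {
    this.listeners.forEach((fn) => fn(h));
  }
}

// Usage: one reporter passes the whale shark across the virtual table.
const session = new SharedSession();
session.onUpdate((h) => console.log(`${h.id} is now held by ${h.owner}`));
session.add({ id: "whale-shark", owner: "jeremy", position: { x: 0, y: 1, z: 0 }, scale: 1 });
session.handOver("whale-shark", "other-reporter", { x: 0.5, y: 1, z: 0.3 });
```

In a real networked implementation the broadcast would go over the wire and conflicts (two people grabbing the same object) would need resolving, but the basic pattern of synchronized, owned objects is what makes the shared-hologram trick feel seamless.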

The initial demo apes a design review on purpose. It’s a natural opportunity for this type of collaboration, Sullivan said, and any time a company can avoid flying someone to a factory to inspect a prototype, it saves money. Other potential applications include remote support and training, manufacturing, and more.

Architects and engineers could physically walk through a holographic model of a factory floor under construction, seeing how all the pieces of equipment fit together in three dimensions, potentially avoiding costly mistakes. Engineering or medical students learning about electric car engines or human anatomy could gather as avatars around a holographic model and remove parts of the engine or peel back muscles to see what’s underneath.

Microsoft also pledged to release Mesh-powered versions of its own apps in the near future, though a timeline was frustratingly absent. A Mesh-powered Teams environment seems like a no-brainer, after all, and the company already has libraries of 3D objects built into its software, including the Office suite.

Jeremy Kaplan