Snap is going all-in on its augmented reality smart glasses whether consumers want them or not. Right after announcing the latest generation of its Spectacles line, news surfaced that Snap has agreed to acquire WaveOptics, which supplies the AR displays powering its frames, for more than $500 million, the Verge reported Friday.
In an emailed statement to Gizmodo, Snap confirmed the deal is worth over $500 million in cash and stock, with roughly half paid upfront in stock. The remainder will be paid out in cash or stock over the next two years.
WaveOptics primarily develops waveguides, the techy displays inside AR glasses that make superimposing virtual images onto the real world possible, and projectors to direct light at said waveguides. Snap’s fourth-generation Spectacles use WaveOptics’ lenses but aren’t being sold to the general public at this time. Instead, Snap is rolling them out to a select number of AR effect creators first as they fine-tune the tech further.
This deal isn’t exclusive, meaning WaveOptics will continue to supply other companies with its waveguides as it works on custom optical systems with Snap, a Snap spokesperson told the Verge. Still, it’s sure to give Snap a valuable leg up in the nascent but quickly developing market for AR headsets. Competitors like Google and Facebook are reportedly building their own waveguide technology to power their AR frames. Apple scooped up waveguide-maker Akonia in 2018 for its long-rumored Apple Glasses, which could be announced as soon as next year.
Snap CEO Evan Spiegel told CNBC on Friday that the company has been collaborating with WaveOptics for “many years” to develop waveguides for its line of smart glasses.
“These are really sophisticated and complex components,” he said. “This really represents a long-term investment in the future of Spectacles.”
Snap’s been on a bit of a buying spree lately, with WaveOptics being its fourth and easily one of its largest acquisitions so far this year. In March, it bought Fit Analytics, an apparel-sizing analytics firm, for $124 million as part of a larger push into e-commerce. Snap also scooped up StreetCred in January and Pixel8Earth in April for their mapping tools to develop location-related services.
If it’s willing to invest $500 million into the future of Spectacles, Snap must have some faith that this latest version will be a success. Because lord knows that wasn’t the case with the previous three. Snap’s first-generation Spectacles purportedly cost the company $40 million in unsold inventory, with “hundreds of thousands” of pairs left to gather dust in warehouses. Later models fared a bit better but still failed to drum up much fanfare, at least in part because they came with significantly higher price tags.
Maybe the fourth time’s the charm? Snap sure seems to hope so.
Snap continues to forge ahead with its line of techy camera glasses despite previous generations largely proving unprofitable and/or unpopular. But hey, fourth time’s the charm, right? On Thursday, Snap announced its newest Spectacles smart glasses with built-in AR displays, but don’t expect to get your hands on them anytime soon—they’re not for sale.
More on that in a second. Snap CEO Evan Spiegel debuted the company’s first true augmented reality glasses on Thursday, which come with dual 3D waveguide displays capable of overlaying digital AR effects on the world around you. A demo posted to Twitter shows Spiegel playing fetch with a virtual dog and watching virtual butterflies flutter about, with one landing on his outstretched hand. You can snap videos of moments like this and send them to friends with the click of a button, he said.
However, Snap’s not quite ready to roll out its fourth-generation Spectacles to the general public yet. For one, the battery only lasts 30 minutes. So Snap’s only giving its glasses to an undisclosed number of creators “looking to push the limits of immersive AR experiences” after they’re vetted through its online application process. The hope is that a portion of the 200,000 users already creating AR effects with Snapchat’s software tools will experiment with the new tech and generate hype for its eventual launch, Spiegel told the Verge.
As for its specs (pun intended), the Spectacles feature a built-in touchpad, two RGB cameras, and four built-in microphones. They also include several buttons for controls, and users can say “Hey Snapchat” to issue voice commands.
Its dual waveguide displays boast a diagonal field of view of 26.3 degrees and 15-millisecond motion-to-photon latency. That field of view is significantly narrower than those of other AR headsets like the HoloLens 2 and Magic Leap One, which come in at 52 degrees and 50 degrees, respectively. That being said, with displays capable of shining up to 2,000 nits of brightness, the frames seem much better equipped to handle the sunny outdoors than their competitors.
In an interview with the Verge, Spiegel posited that, within a decade or so, AR glasses will be as ubiquitous as smartphones are today.
“I don’t believe the phone is going away,” he told the outlet. “I just think that the next generation of Spectacles can help unlock a new way to use AR hands-free, and the ability to really roam around with your eyes looking up at the horizon, out at the world.”
Before that happens though, Snap will need to figure out how to get consumers interested in its AR smart glasses, especially with tech giants like Facebook and Apple reportedly planning to roll out their own versions soon. Snap’s first-generation Spectacles purportedly cost the company $40 million in unsold inventory, with “hundreds of thousands” of pairs left to gather dust in warehouses. Spiegel didn’t talk prices for its fourth-gen Spectacles on Thursday, but hopefully, Snap will ditch the steep upward trend it’s been on with the line’s predecessors. The original Spectacles went for $150, the Spectacles 2 for $200, and the third-gen for a whopping $380. Sure, AR lenses are cool to play around with, but I think Snap would learn real quick that a $400+ price tag for “cool” isn’t much of a sales pitch.
Warmer weather is coming, the days are getting longer, and we’re all starting to venture outside more—depending on which hemisphere you live in, and what your local coronavirus restrictions look like, of course. If you’re going to be exploring more of the great outdoors in the coming months, these augmented reality apps give you more reasons to leave the house (and make doing so more fun).
Exploring the world is a lot more fun with Spyglass installed on your smartphone. This is essentially a GPS and compass navigation tool, but the way its features are implemented means it’s much more appealing than your standard mapping app. It works offline, it’ll track your pace and altitude, it can log waypoints as you go, and plenty more besides.
As far as the augmented reality features go, you can overlay the compass on top of whatever your camera is seeing in front of you, see the current position of the sun as you move your phone around, and even use the AR capabilities to find your way around using the stars. It’s an app that’s packed with detail and features to enhance any journey.
PeakVisor’s mission is to help you identify mountains that you come across on your travels. You can point it at a range in front of you and find out which summit is which, thanks to the magic of AR. The app should at least make sure that you don’t spend half a day hiking up the wrong trail, and end up at the top of the wrong mountain peak.
If you don’t know what a particular peak is called, you can simply point your phone’s camera at it to find out, and the app gives you elevation readings and information about different ranges as well. There are apparently around a million hills and mountains stored in the PeakVisor database, so it’s unlikely that you’re going to be able to stump it.
If you’re in an unfamiliar part of the world (or even a familiar one), then sometimes you never know what’s around the corner—and that’s where the World Around Me (WAM) app can come in handy. It works like a mapping app in the way it directs you toward places of interest, but you access everything through your phone’s camera and the power of AR.
Whether you’re looking for a landmark, a restaurant, or an ATM, just scan your surroundings with WAM and you’ll be able to see what’s nearby. There are 31 categories of places to look for, and it works really well whether you’re trying to find somewhere specific or you’re just wandering around the neighborhood hoping to discover interesting places.
Everyone’s heard of Snapchat, though you might not have realized the extent of the AR features inside it. You can summon up all kinds of animations, stickers, effects, creatures and characters in augmented reality, then use them to adorn the photos and videos you send. If you want your social media posts to stand out, this is one way to do it.
The AR lenses and effects that are available to you in Snapchat rotate over time, and they’re sometimes dependent on where you are in the world (several are tied to famous landmarks). You can even create your own if you want to. Then there’s everything else Snapchat offers, including looking up your contacts on an interactive map.
Lines of Play is an interesting little experimental app from Google to show off some of the capabilities of ARCore on Android—so it’s not available on iPhones, unfortunately. The app lets you drop in augmented reality blocks on top of the real world and have some fun with them. It’s quite limited in what it does, but it’s a good showcase for the potential of AR tech.
On your next jaunt outside, you can set up a series of colored blocks and have them fall one after another, domino-style. The clever part is the way that the graphics can interact with the real world, whether that’s by the line of blocks disappearing as it goes behind an object, or the cascading movement of the line stopping when it hits a real-world obstacle.
6) Jurassic World Alive (freemium for Android, iOS)
What’s not to like about dinosaurs rampaging around your neighborhood? You don’t have to be a fan of the movie franchise to enjoy Jurassic World Alive, although it certainly helps—you make progress by exploring your local surroundings and discovering new dino DNA, and it’s then up to you to train and develop your dinosaur team inside the app.
You can put your creatures into battles with other players, share your augmented reality creations on social media, or just challenge yourself to develop the best crew of dinosaurs possible. There’s a good blend of elements, with parts that involve going outdoors and parts that don’t, and the game is varied enough to keep you interested over time too.
You don’t have to limit your outdoor adventuring to just the daytime of course, and if you’re out in the evening or at night, then Star Chart will give you an augmented reality guide to the heavens above you. Just point your mobile device at the sky, and you’ll see labels next to stars, constellations, and planets as you move the viewpoint around.
If you want to dig deeper into information about the universe, then Star Chart can help here, too: You’re not just limited to learning about the celestial view in front of you, because Star Chart can take you on a tour across the galaxies. You can even go backwards and forwards in time to see how the constellations have shifted over thousands of years.
You may already be taking Google Maps around with you on your travels, but make sure you’re fully aware of the AR elements you can find inside it. We’re mostly talking about the Live View part of the walking navigation mode—you’ll see a button to launch Live View at the bottom of the screen when you look for walking directions in the app.
You can also enable Live View while you’re actually mid-navigation, just by tapping the icon on the map. As you scan around your location with your phone’s camera, you’ll see AR objects dotted over the view, showing the direction you need to head in, where your next turn is, etc. A recent upgrade means it’ll work in many indoor locations, too.
Zombies, Run! is part exercise app, part AR experiment, and part zombie game, and it somehow pulls off that combination. In this case, the AR isn’t actually overlaid on top of your phone’s camera; it’s coming through your headphones in the form of an audio adventure you’re in the middle of. The goal is to complete missions, collect supplies, and more.
Get your running gear on, load up the game, and get outside. If the thought of undead hordes chasing after you doesn’t improve your average running times around the block, then it’s possible that nothing will. The game works at any speed you like, and you can even use it while walking around, so it’s suitable no matter what your level of fitness.
There’s no lack of speculation around Apple’s rumored AR headset, but the latest is the most sci-fi of them all. According to reliable Apple analyst Ming-Chi Kuo, it’s possible the headset will eschew handheld controllers in favor of eye tracking and iris recognition.
Per AppleInsider, Kuo’s investor note on the topic says the headset will use a “specialized transmitter” to track eye movement and blinking. According to Kuo, the transmitter emits “wavelengths of invisible light,” which are reflected off your eyeball; a receiver then picks up that reflected light, and the changes in light patterns are analyzed to determine where you’re looking.
That data could then be used to better customize a user’s interaction within an AR environment. Another benefit is that it could allow people to control menus by blinking, or perhaps even learn more about an object by staring at it for a certain period of time. It could also reduce the demand on processing power, as anything in your peripheral vision could be rendered at a reduced resolution.
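The peripheral-resolution idea described here is generally known as foveated rendering: render at full detail only where the eye is pointed. Here’s a minimal sketch of the concept; the gaze coordinates, radii, and scale factors are illustrative assumptions, not details from Kuo’s note:

```python
import math

def foveation_scale(px, py, gaze_x, gaze_y, fovea_radius=100.0):
    """Return a render-resolution scale for a pixel: full detail (1.0)
    near the gaze point, progressively coarser toward the periphery."""
    dist = math.hypot(px - gaze_x, py - gaze_y)
    if dist <= fovea_radius:
        return 1.0   # foveal region: render at full resolution
    if dist <= 3 * fovea_radius:
        return 0.5   # near periphery: half resolution
    return 0.25      # far periphery: quarter resolution

# Pixels near where the user is looking get full detail...
assert foveation_scale(510, 300, gaze_x=500, gaze_y=300) == 1.0
# ...while distant peripheral pixels can be rendered much coarser.
assert foveation_scale(1900, 50, gaze_x=500, gaze_y=300) == 0.25
```

In a real renderer the scale would feed into per-region shading rates rather than a per-pixel lookup, but the payoff is the same: most of the frame can be drawn cheaply without the user noticing.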
Where this gets kicked up another notch is iris recognition. While Kuo isn’t sure this is a bona fide feature, he says the “hardware specifications suggest the HMD’s [head-mounted display] eye-tracking system can support this function.” Iris recognition is big; we’ve all seen spy movies where it’s used as a form of biometric identification. This could potentially enable an additional layer of security, making sure no one else can use your device, because these devices will not be cheap. In a more everyday sense, it could also be used for services like Apple Pay.
One of the biggest problems with mixed reality and virtual reality is that there’s no great way to interact with what you’re seeing. Enterprise headsets like Microsoft’s HoloLens 2 and the Google Glass Enterprise Edition 2, as well as previous consumer versions like Focals by North, all relied on some iteration of hand controls or finger loops. They work, but calibration is an issue and the process can be clunky. Eye tracking, if done well, is a potential game-changer, as you don’t have to keep track of another accessory or memorize a set of controls.
This interface problem is well known among companies trying to create AR gadgets for consumers. Apple isn’t the only company looking for a novel solution. Facebook recently revealed that it’s envisioning wrist-based wearables that could let you control AR with your mind. It’s far too early to tell which of these two methods (or potentially one we haven’t even heard of yet) will win out in the end. Previously, Kuo noted that Apple’s mixed reality headset is likely to come in 2022, with smart glasses coming in 2025. Facebook is expected to launch some kind of smart glasses this year, but it’s likely the futuristic methods it’s described are for later down the line. That said, I will definitely take eye-tracking over haptic socks any day.
The Facebook Reality Labs Research team’s biggest challenge is finding ways to interact with augmented reality the way we do with a PC. We have a number of headsets and glasses, but no AR equivalent to a mouse and keyboard.
So instead of trying to make existing devices work in AR, Facebook is looking to create new types of human-computer interfaces (HCIs) that are easy to use, reliable, and still provide some level of privacy. Facebook has said it envisions AI as a critical part of the formula to help provide you with the right tools or commands depending on the situation, which should help reduce friction or possible user confusion.
And while this tech is far from being polished, Facebook already has some ideas about how AR-based HCI devices might work in the future. Instead of relying completely on voice commands, Facebook sees wrist-mounted wearables as a good solution, offering a familiar and comfortable design not completely dissimilar to a standard wristwatch, but with new tech that can support various input methods.
Facebook says that by leveraging electromyography, it can use sensors to convert electrical signals that get sent from your brain to your hands into digital commands. Facebook claims EMG sensors are sensitive enough to detect movements of just one millimeter, with future devices potentially even being able to sense someone’s intentions without any actual physical movement. In essence, Facebook is looking to provide direct mind control of AR devices, but without the need for physical implants.
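At its simplest, turning an EMG signal into a command comes down to comparing the signal’s short-window energy against a calibrated threshold. The sketch below illustrates only that general idea; the sample values, window size, and threshold are all invented, and it has nothing to do with Facebook’s actual processing pipeline:

```python
import math

def rms(window):
    """Root-mean-square amplitude of one window of EMG samples."""
    return math.sqrt(sum(s * s for s in window) / len(window))

def detect_activations(samples, window_size=4, threshold=0.5):
    """Return start indices of windows whose RMS exceeds the threshold,
    i.e. moments where a muscle contraction (a 'click') is inferred."""
    hits = []
    for i in range(0, len(samples) - window_size + 1, window_size):
        if rms(samples[i:i + window_size]) > threshold:
            hits.append(i)
    return hits

# Quiet baseline, a burst of activity resembling a pinch, then rest.
signal = [0.02, -0.03, 0.01, 0.02,   # rest
          0.90, -0.80, 0.85, -0.95,  # contraction
          0.01, 0.02, -0.01, 0.03]   # rest again
assert detect_activations(signal) == [4]
```

Detecting intent *without* visible movement, as Facebook describes, would require far more sensitive sensors and learned models rather than a fixed threshold, but the underlying signal-to-command translation starts from the same place.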
Further, with precise EMG sensors, Facebook can also support new gesture controls, like pinching your thumb and index finger together to create a “click.” In this way, people can translate what they do on a regular PC into a new set of AR-based gestures that Facebook someday hopes to expand into all sorts of controls and movements. Facebook even hopes to reimagine typing with the help of AI to make writing essays or emails faster and more accurate.
Facebook said it knows that all these technologies will need to evolve with each other, because simply being able to click on an AR object won’t be enough if the rest of the AR interface is constantly getting in the way. And once again, Facebook thinks AI can help, by intelligently knowing when you want to switch virtual workspaces or focus on a specific tool, or by pulling additional input from EMG sensors or even eye-tracking sensors.
Although touchscreens and virtual screens are useful, there’s simply no replacement for real physical stimulus. So in addition to touching something with your fingers, Facebook just showed off two different prototypes that deliver haptics in interesting ways.
With its “Bellowband” prototype, Facebook uses a string of eight pneumatic pumps attached to a wrist-mounted device that blow air to create various pressure and vibration patterns. When combined with its Tasbi (Tactile and Squeeze Bracelet Interface) prototype, Facebook has been able to create a device that squeezes your wrist to better mimic the sensation of moving or touching real objects.
The biggest issue, of course, is that Facebook’s track record on privacy is, well…we all know it’s not great. The company said safeguarding people’s data in AR is critically important, though Facebook Reality Labs science director Sean Keller added that “understanding and solving the full extent of ethical issues requires society-level engagement.” In short, Facebook needs feedback on how to improve privacy and security in AR (surprise, surprise), and is encouraging its researchers to publish relevant work in peer-reviewed journals.
Admittedly, while all of this does sound pretty far-flung, given the speed at which VR was adopted by certain sectors of business like engineering and design, it’s not that outlandish to imagine AR seeing similarly explosive growth over the next 10 to 15 years. And, as in other industries, if you’re the first company to define and control a market, there’s a good chance profits will follow. You can rest assured Facebook is going to do its best to try to stay ahead of competitors—but it sounds like Microsoft, Apple, and the rest all have the same idea. Let the games begin.
One thing that most Big Tech companies working on smart glasses haven’t figured out yet is how to effectively interact with an augmented reality environment. Apple is heavily rumored to be working on its own pair of AR glasses and apparently considered vibrating haptic socks to tackle this problem.
A new patent spotted by AppleInsider mainly describes a haptic output device that “may include foot-shaped structures with cavities configured to receive the feet of users.” The foot-wearable support structure would also feature an “array of haptic output components” that work to “apply feedback” to the bottom and top of a person’s foot, possibly to create a sense of movement even if the foot isn’t moving. “These forces may provide a user with a sensation of resting or sliding across a tiled surface or other surface with surface irregularities,” the patent reads.
Technically speaking, the patent says the “foot-wearable support structure” doesn’t have to be a sock. It could also be a shoe. Or just a thing you stick your foot into. The patent is also pretty vague as to what sort of device these haptic socks (or shoes) would be giving you feedback for. It mentions joysticks, buttons, scrolling wheels, touchpads, keypads, keyboards, microphones, speakers, tone generators, vibrators, cameras, and even cooling systems. It also discusses a whole host of sensors, including ones you’d expect like force and touch sensors, as well as sensors for detecting temperature, air pressure, and moisture. Apple, it would appear, doesn’t want sweaty feet to take away from the experience of whatever it was thinking of using these buzzy socks for.
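To see how an “array of haptic output components” could fake the feel of sliding across a surface, picture a pressure pulse sweeping along a row of actuators under the sole. This is a purely illustrative sketch; the actuator count and timing are invented and come from nowhere in Apple’s patent:

```python
def sliding_pattern(num_actuators, step):
    """Intensity (0.0-1.0) for each actuator at a given time step.
    A single pulse travels along the row, one actuator per step,
    which the foot could perceive as motion across a surface."""
    return [1.0 if i == step % num_actuators else 0.0
            for i in range(num_actuators)]

# Over successive steps, the active actuator marches along the sole.
assert sliding_pattern(5, 0) == [1.0, 0.0, 0.0, 0.0, 0.0]
assert sliding_pattern(5, 1) == [0.0, 1.0, 0.0, 0.0, 0.0]
assert sliding_pattern(5, 2) == [0.0, 0.0, 1.0, 0.0, 0.0]
```

A real device would presumably blend intensities across neighboring actuators and vary the sweep speed to match virtual movement, but the basic trick of spatially sequenced feedback is the same.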
Of all the things Apple’s purportedly working on, its niche VR headset and AR smart glasses are the most likely candidates for this kind of accessory. From a gaming perspective, something like this would definitely help make an Apple headset feel more immersive. It’s a pie-in-the-sky thought, but you could theoretically use these to simulate walking without requiring a user to actually move around.
As ridiculous as vibrating socks seem, it’s not totally out of left-field either. Facebook Reality Labs, the division of the social media giant that works on its AR projects, recently published a blog detailing a similar vision of “soft wearables” to help users interact in virtual environments. Granted, Facebook was talking about gloves and wristbands, which are a bit more intuitive than, well, socks. Still, this is an extension of that same line of thinking.
You shouldn’t bet on Apple launching any sort of VR or AR device with these babies. Big Tech files patents all the time just to put their stamp on an idea before a competitor—and right now it feels like all the major players are plugging away on some kind of consumer smart glasses. But it does sort of show us where the notoriously secretive Apple’s head is at with regard to one of AR’s biggest problems. Personally speaking though, I have no intention of ever showing f**t to Apple, or any other tech company.
It’s well-known that Facebook’s partnering with Ray-Ban to develop a pair of augmented reality glasses. What’s less clear is how Facebook envisions these glasses will function, and how the company imagines people will interact with the device. A new Facebook Reality Labs blog sheds a little light on that front—and it possibly involves haptic gloves and “soft” wristbands.
Facebook Reality Labs is essentially a group of researchers, developers, and engineers working on virtual and augmented reality. Every so often, they publish deep dives into the challenges and potential of AR. This time around, FRL is addressing the interface problem with smart glasses. Namely, even if you have a bunch of notifications popping up in your field of view, you need some kind of way to interact with what you’re seeing. The now-defunct Focals by North, as well as the Google Glass Enterprise Edition 2, both had discreet finger loops that let you navigate menus. Others, like Epson’s Moverio glasses, rely on your smartphone. Neither of these methods is particularly intuitive, and it’s one reason why smart glasses just really haven’t taken off.
The FRL blog lays out a theoretical day of wearing Facebook AR glasses, along with what it calls a “soft wristband.” Basically, you go to a cafe and your smart glasses ask if you want to play a podcast. Instead of having to answer via your phone or finger loop, you could flick a finger and the wristband would interpret that as clicking an invisible play button. The blog then outlines a scenario where you’d be able to pull out a pair of “soft, lightweight haptic gloves” that then signal to the glasses to project a virtual screen and keyboard.
What FRL is describing isn’t as futuristic as you might think. It’s essentially tapping into something called electromyography (EMG), which harnesses electrical signals traveling from your spine to your hand. This tech already exists—the Mudra Band is an Apple Watch band prototype that lets you control certain functions by flicking your fingers. It, too, does this by reading electrochemical signals produced by your nervous system. When I spoke to the Mudra Band’s creators at CES, they also envisioned the band potentially being used for AR and VR controls. Facebook isn’t the only company with this idea.
Then there are the haptic gloves, which Facebook believes to be an “ultra-low-friction input.” Or more simply put, gloves are much more natural to use than tech like hand-tracking cameras, microphone arrays, and eye tracking. Haptic feedback is also supposedly an easy way to give users feedback about the virtual objects they’re interacting with, kind of like a phone vibrating. Ultimately, it seems Facebook’s betting on “soft, all-day wearable systems,” or “devices worn close to or on the skin’s surface where they detect and transmit data.”
It’s admittedly a clever approach, and as the blog details, would enable a more intuitive way of interacting with smart glasses and virtual environments. If you could “click” buttons with discreet finger movements, you wouldn’t necessarily have to scroll through menus. The interface could then be designed around “yes” or “no” questions, so long as the AI was powerful enough to interpret what you want in a given situation. (You can peep a concept video of what that interface might look like.)
That is admittedly a big “if” and probably not something we’re going to see in whatever the first, forthcoming iteration of Facebook’s smart glasses is. Facebook Reality Labs itself says in the blog that the sensing technology and highly personalized data needed to train an AI inference model simply does not exist yet. Still, the concept is surprisingly thoughtful, considering just a few weeks ago Facebook stupidly said it was mulling facial recognition for future smart glasses. Honestly, it would be great if Facebook continued investing more in ideas like these for its smart glasses, instead of creating more privacy headaches.
While the rumblings over Apple’s planned venture into augmented reality, virtual reality, and mixed reality have been getting louder recently, we now have a series of dates for these devices from prognosticator Ming-Chi Kuo, a good source with a reliable track record on all things Apple.
In a research note with TF International Securities obtained by MacRumors, Kuo stated that Apple will release an MR helmet-type product by 2022, an AR glasses-type product by 2025, and an AR contact lens-type product by 2030-2040. The Apple prognosticator didn’t have a lot to say about the contact lenses, stating only that they will bring electronics from the era of “visible computing” to “invisible computing.” He added that there is “no visibility” for the product as of now.
“We predict that Apple’s MR/AR product roadmap includes three phases: helmet type by 2022, glasses type by 2025, and contact lens type by 2030–2040,” Kuo wrote, per MacRumors. “We foresee that the helmet product will provide AR and VR experiences, while glasses and contact lens types of products are more likely to focus on AR applications.”
When it comes to Apple’s MR headset, though, Kuo had a lot more to say. In terms of size, the analyst stated that several prototypes of Apple’s mixed reality headset weighed between 0.4-0.6 pounds (200 to 300 grams). However, Apple’s apparent goal is to reduce the weight to between 0.2-0.4 pounds (100-200 grams), which would make the company’s headset a lot lighter than many existing devices.
It will also be portable, Kuo stated in the report, and have independent computing power and storage. Nonetheless, this doesn’t mean that it will be truly “mobile,” like an iPhone, at least at first. Kuo stated that he expects the new helmet to improve its mobility as technology improves.
The analyst also added weight to the rumor that Apple’s headset will be equipped with sophisticated micro OLED displays. The company is working with Sony on this, he said, which is in contrast to previous reports that stated Apple was working with Taiwan Semiconductor Manufacturing Co. With the micro OLED displays and several optical modules, the headset will be able to provide a “see-through AR experience,” as well as a VR experience.
Now here’s the thing: Why should you buy Apple’s MR headset when there are plenty of less expensive options to choose from?
“Although Apple has been focusing on AR, we think the hardware specifications of this product can provide an immersive experience that is significantly better than existing VR products. We believe that Apple may highly integrate this helmet with video-related applications (e.g., Apple TV+, Apple Arcade, etc.) as one of the key selling points,” Kuo wrote.
He stated that Apple’s mixed reality headset is expected to cost around $1,000 in the U.S.
As far as Apple’s AR glasses go, which are expected to provide an “optical see-through AR experience,” the Apple prognosticator expects a 2025 launch at the earliest. We’ll see if that pans out, as Kuo said he doesn’t think there’s a prototype for this product yet.
All in all, those are some exciting predictions from Kuo. Now let’s remember: Although he is a noteworthy source, not everything he predicts comes true. If these devices do materialize, though, and they’re good, they could change the game for AR and VR.
Today at Microsoft’s annual Ignite conference, the tech giant revealed a bold glimpse at the future of digital collaboration with Mesh, a new mixed reality experience set to shape how people work and socialize online.
Powered by Microsoft’s Azure cloud platform and designed to run on a range of devices, including Microsoft’s HoloLens headsets, traditional VR goggles, phones, and more, Mesh is Microsoft’s vision of an evolution in current online work tools, which for most people generally consist of a bunch of shared documents, email, a messaging app (Teams, Slack, etc.), and a seemingly non-stop lineup of video meetings.
With Mesh, Microsoft is hoping to create a virtual environment capable of sharing data, 3D models, avatars, and more—basically, the company wants to upgrade the traditional remote-working experience with the power of AR and VR. In the future, Microsoft is planning for something it’s calling “holoportation,” which will allow Mesh devices to create photorealistic digital avatars of your body that can appear in virtual spaces anywhere in the world—assuming you’ve been invited, of course.
By taking advantage of things like eye-tracking, facial-monitoring, and more, Microsoft says it’s hoping to add an extra level of immersion and realness to virtual collaboration, with holoportation even mimicking your expressions and eye contact. Meanwhile, by using outward-facing cameras and object tracking, Mesh will allow people to share and interact with virtual objects across various mixed-reality environments in a more natural way.
Microsoft says the end goal is that Mesh “will also enable geographically distributed teams to have more collaborative meetings, conduct virtual design sessions, assist others, learn together and host virtual social meetups. People will initially be able to express themselves as avatars in these shared virtual experiences and over time use holoportation to project themselves as their most lifelike, photorealistic selves.”
During its presentation, Microsoft demonstrated Mesh’s ability to project a 3D model of a car in real space, allowing engineers to view a life-size AR rendering in a shared virtual space. And while Microsoft didn’t announce any concrete timelines for integrating Mesh with Microsoft Teams or its Dynamics 365 productivity suite, Microsoft is already planning to add support for Mesh-enabled apps in future versions of its enterprise collaboration software.
But remote work isn’t the only application Microsoft has in mind for Mesh. At Ignite, Microsoft partnered with Niantic and OceanX to present demos of how gaming and educational experiences might look in Mesh, right down to cute AR Pokémon roaming the world.
For Microsoft Technical Fellow Alex Kipman—who is one of the leads behind Mesh and helped demo it today at Ignite—the potential power and adaptability of Mesh have always been among its most tantalizing aspects.
“This has been the dream for mixed reality, the idea from the very beginning, you can actually feel like you’re in the same place with someone sharing content or you can teleport from different mixed reality devices and be present with people even when you’re not physically together,” Kipman said.
That said, much of Mesh’s potential lies in how Microsoft can take these concepts and turn them into reality, and even with a number of convincing demos shown today, it’ll probably be some time until they become a core part of our daily lives.
Currently, Mesh is available in preview form on Microsoft’s expensive HoloLens headsets, which will allow users to collaborate remotely, and as part of a new version of AltspaceVR with support for hosting virtual meetings and gatherings.
But even as a loose framework for what’s to come, Microsoft’s Mesh is certainly a bold dream for upgrading our remote working capabilities. A number of companies, including Facebook, Apple, and others, are thinking along similar lines, so it looks like the push to create grand and rich shared virtual spaces may end up being one of the next great tech races.
The e-commerce industry spends a lot of time and energy trying to keep the public interested in its wares, and traditionally this has involved spending cash on photography: taking nice pictures of your products has typically been the best way to keep users lingering on your page on the likes of Amazon or Shopify. But photos may soon seem old-fashioned, as companies invest in more “interactive” strategies to grab users’ interest.
The startup CGTrader is a good example. The company, which recently raised $9.5 million in a Series B funding round, is part of a blossoming industry that sees augmented reality and computer-generated imagery as a way to enhance online marketing. CGTrader reproduces images of household products and appliances for e-commerce vendors, with one key difference between its images and old-fashioned photos: the images it produces aren’t actually real. Or at least not in the traditional sense.
Take a look at the picture of a modern house interior below—which was produced entirely via CGTrader’s premier “ARsenal” enterprise platform.
Yes, call it the deepfake-ification of product visuals. You’ve probably noticed more and more online retailers reproducing images like the one above (most notably, IKEA recently revealed that a majority of the pictures in its catalog are actually CGI). Indeed, in recent years, firms have rushed to invest in marketing tools that can encourage new kinds of user engagement, while also driving down corporate costs by outsourcing human labor to computer-generated production (though there is some debate about how cost-effective these tools always are).
In the case of CGTrader, the images it produces are 3D, so that users can inspect the products from 360 degrees. Products can also be viewed in so-called “lifestyle scenes” (computer-generated house interiors like the one above) so that users can see the products in realistic settings—with the option to customize or shift them as desired. The company also maintains a large database of “stock” 3D image designs that can be integrated into a company’s marketing strategies.
Perhaps most impressive are CGTrader’s claims about creating an AR experience for consumers. The way that works is this: Vendors send CGTrader pictures of their products, which the firm then uses to create photorealistic 3D designs of said products. These designs can then be integrated into mobile and tablet-based AR experiences, wherein consumers can visualize whether the products would look good in their home or not:
And this gets pretty weird! Check out the video below of a bottle of wine sitting on a kitchen table that is definitely not actually there. Then just imagine where this technology may be in a couple of years:
CGTrader was founded in 2011 by former 3D designer Marius Kalytis (who now serves as the firm’s Chief Operating Officer) and by longtime entrepreneur Dalia Lasaite. The startup has seen increased financial interest over the last several years—with significant funding rounds involving large venture capital firms. It’s also experienced a swelling customer base, including the likes of Nike, Microsoft, Shopify, Crate&Barrel, and other large companies.
Lasaite summed up her company’s appeal like this:
“3D models are not only widely used in professional 3D industries, but have become a more convenient and cost-effective way of generating amazing product visuals for e-commerce as well. With our ARsenal enterprise platform, it is up to ten times cheaper to produce photorealistic 3D visuals that are indistinguishable from photographs. Unlike photos, 3D models can also be used to create not just static images but highly interactive AR experiences when users can see how products look and fit in their home. Traditional photography is quickly becoming obsolete in e-commerce.”
Covid-19 accelerated the e-commerce industry’s reliance on CGI and AR for marketing, as traditional media campaigns were hampered by health restrictions. It’s been predicted that such trends will continue to gain prominence in the coming years. It’s surely an exciting new step for marketing, though it also gives you an idea of where we’re headed with the whole “uncanny valley” effect: We’ll soon be living in an online world where you won’t be able to tell the difference between what’s real and what isn’t—whether it’s Tom Cruise golfing or a rattan ottoman you just scrolled past.