You Can Pet a Virtual Cat and Feel Its Simulated Fur Using This Elaborate VR Controller

For some pet owners, being away from their furry companions for an extended period can be heartbreaking. Visiting a beloved pet on a video call just isn’t the same, so researchers at National Taiwan University developed a VR controller that allows the user to feel simulated fur while petting a virtual animal.

Created at the university’s Interactive Graphics (and Multimedia) Laboratory, in collaboration with Taiwan’s National Chengchi University, “HairTouch” was presented at the 2021 Computer-Human Interaction conference this week, and it’s another attempt to bridge the real world and virtual reality to make simulated experiences feel more authentic by engaging more than just a user’s sense of sight and sound. A VR controller, the motions of which can be tracked by a virtual reality headset so the movements of a user’s hands are mirrored in the simulation, was augmented with an elaborate contraption that uses a couple of tufts of fake fur that a finger can feel.

The HairTouch controller not only presents the fake fur when a user touches a furry animal in VR, but it’s also capable of simulating the feeling of different types of fur, and other surfaces, by manipulating those hairs as they extend and contract. By controlling the length of hairs, the fake fur can be made to feel softer and more pliable when it’s fully extended, or stiffer and more coarse when only a small amount of the fibers are sticking up.
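HairTouch’s actual control code isn’t public, but the physics behind the length trick is straightforward: a fiber protruding a length L from the controller behaves roughly like a cantilever beam, whose tip stiffness is k = 3EI/L³, so extending the hairs makes them softer and retracting them makes them stiffer. Here’s a minimal Python sketch of that relationship; the material constants are assumptions for a generic nylon fiber, not values from the paper.

```python
import math

# Hypothetical illustration (not HairTouch's actual firmware): a fiber
# protruding a length L acts like a cantilever beam with tip stiffness
# k = 3 * E * I / L**3, so retracting the hairs (smaller L) makes them
# feel stiffer and extending them makes them feel softer.

E = 2.0e9      # assumed Young's modulus of a nylon fiber, in pascals
RADIUS = 1e-4  # assumed fiber radius, in meters (0.1 mm)
I = math.pi * RADIUS**4 / 4.0  # second moment of area of a round fiber

def extension_for_stiffness(target_k: float) -> float:
    """Exposed fiber length (m) for a target tip stiffness (N/m),
    from k = 3EI / L^3 solved for L."""
    return (3.0 * E * I / target_k) ** (1.0 / 3.0)

# Soft "fur" needs longer exposed fibers than stiff "bristles":
print(extension_for_stiffness(5.0))   # ~4.6 mm of hair for a soft feel
print(extension_for_stiffness(50.0))  # ~2.1 mm for a stiffer feel
```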

To accurately simulate a pet, whose fur coat doesn’t stick straight up like the fibers on a paint brush do, the fake fur on the HairTouch controller can also be bent from side to side, depending on the user’s hand and finger movements in the simulation, and the orientation of the virtual animal. Petting your dog from 3,000 miles away doesn’t seem like the best use of hundreds of dollars worth of VR gear (unless you’re a really devoted dog owner), but the controller can be used to simulate the feel of other textures, too, including fabrics, so the research could also be a welcome upgrade to virtual shopping—a promised use of the technology that hasn’t really moved past the concept stage.

Don’t expect to see the HairTouch available as an official Oculus accessory anytime soon (or even ever), as it’s currently just a research project and the prototype isn’t quite as sleek as the VR hardware available to consumers now. But it’s a clever idea that could find its way into other hardware, and other applications, helping virtual reality blur the lines with reality.

HTC Hopes Its Long-Awaited 5K Vive Pro 2 Headset Won’t Make You Sick

Image: HTC Vive

We haven’t seen a new HTC Vive virtual reality headset in a minute, but today, the company announced two new devices, including the Vive Pro 2.

The second-gen Vive Pro looks a lot like its predecessor, but nearly every core spec has been upgraded in some way. HTC’s latest high-end consumer VR headset sports a 5K resolution (2.5K per eye), a wider 120-degree field of view, and a faster 120Hz refresh rate, all of which combine to help prevent the motion sickness that people sometimes encounter on less sophisticated head-mounted displays, HTC said. The company also said it moved to a new display with fast-switching RGB sub-pixels, so in addition to the higher resolution, graphics on the Vive Pro 2 should look extra sharp and colorful.

In a first for a VR headset, HTC said it worked with both Nvidia and AMD to add support for Display Stream Compression via DisplayPort 1.2, which is a visual compression technique used to reduce the amount of bandwidth needed to output video with practically no loss in image quality. And in a somewhat pleasant surprise, HTC said the Vive Pro 2’s minimum hardware requirements only include an Nvidia RTX 2080 GPU or a Radeon 5000-series card, which is good news for anyone who has had trouble getting their hands on a current-gen graphics card (which is pretty much everyone).

Image: HTC Vive

The Vive Pro 2 features a handy knob for adjusting IPD (interpupillary distance) and built-in speakers that support 3D spatial audio, along with a revamped headband that delivers a more comfortable fit and a 50-50 weight balance.

One thing I was hoping to see that didn’t make the cut on the Vive Pro 2 is native wireless streaming of video from a nearby PC. This means you still need a physical video cable unless you opt for Vive’s Wireless Adapter, which is compatible with both the original Vive Pro and the new Vive Pro 2.

The other small bummer is that with a starting price of $800 for just the headset, the Vive Pro 2 is still rather expensive compared to something like the Oculus Quest 2. That said, the Quest 2 does have a lower-resolution display and a narrower FOV, so the old adage that you get what you pay for still applies. Also, for people who might not already have base stations or controllers to pair with the Vive Pro 2, the headset will be available as a kit with two Base Station 2.0 units and two Vive controllers for $1,400.

The other new HTC Vive headset, the Vive Focus 3, is intended primarily for enterprise and large corporations, and in some respects, it’s actually the more interesting gadget of the two.

Powered by Qualcomm’s Snapdragon XR2 chip, the Vive Focus 3 is in many ways like the Quest 2, but with even better optics. Not only does it have a 5K display similar to the Vive Pro 2’s (though with a 90Hz refresh rate instead of 120Hz), it also supports both standalone operation (no need for a nearby PC) and a wired mode, so you can get that full wireless experience, or higher-fidelity graphics from a tethered PC, depending on your needs.

The Focus 3 also features new controllers and a magnesium alloy frame that HTC said is 20% lighter and 500% stronger than typical plastic. You also get inside-out tracking thanks to the four cameras on the outside of the headset, front and rear gaskets that can be swapped out for easy cleaning, built-in speakers, and even a special audio privacy mode to prevent people from eavesdropping on you while you’re in a meeting. In a nod toward enterprise use, the Focus 3 comes with a swappable battery system that lets you slap on a fresh power pack in just a few seconds.

The Vive Focus 3 will cost $1,300 and includes a two-year enterprise warranty, in addition to a whole suite of new business-focused software support and apps to help companies more easily transition from traditional office collaboration to working in VR.

Now technically, anyone can pay $1,300 for a Focus 3 if what they want is essentially a Quest 2 with better specs. Unfortunately, the Focus doesn’t come with the kind of software and support the average consumer wants, so unless you’re planning on tinkering around on your own, the Vive Pro 2 is likely the better option.

The Vive Pro 2 is available for pre-order today and officially starts shipping on June 4. The Vive Focus 3 arrives June 27.

Facebook VP Says There Aren’t Any Plans to Release a Quest Pro or Quest 3 in 2021

Photo: Sam Rutherford

Despite dropping hints that Facebook could be working on a new, more powerful VR headset, Facebook VP Andrew Bosworth made it clear recently that the company doesn’t have any plans to release a Quest Pro or Quest 3 headset anytime this year.

The subject of a new headset from Facebook came up during a recent podcast (recorded by UploadVR) hosted by Facebook Reality Labs vice president Andrew Bosworth and Oculus CTO John Carmack, in which Bosworth admitted that even though he had previously mentioned the possibility of a more sophisticated Quest Pro headset, he wanted to make it clear that no such device is coming anytime soon.

When asked about future headsets from Facebook, Bosworth said “People are also asking about the Quest 3, which doesn’t exist yet, and everyone who is listening to us who is a reporter there isn’t a Quest 3, there’s only a Quest 2, but I did hint at an AMA earlier this year about Quest Pro because we do have a lot of things in development where we want to introduce new functionality to the headset along the kinds that people theorize that we would want to introduce, and that’s a little ways off still. It’s still not gonna happen this year.”

Bosworth then capped off the podcast by saying “For those who are curious, Quest 2 is going to be in the market for a while – for a long while, and it’s gonna be, you know, I think the best bet for the most accessible way to get into VR and have a great experience.”

Renewed speculation about Facebook’s plans for future VR hardware was recently spurred on by the announcement of the Resident Evil 4 VR remake, which won’t run on the original Quest and is the first new title made exclusively for the Oculus Quest 2. That caused a small panic among original Quest owners regarding Facebook’s long-term support for its last-gen VR headset, which originally came out back in the spring of 2019.

So far, both Facebook and Oculus developers have been rather slow to begin pulling support for the original Quest, with Bosworth claiming that there are over a million people still using Facebook’s last-gen headset. However, with Facebook having designated both the original Quest and the Rift S as products that have reached end-of-life, it’s pretty clear that the Quest 2 is Facebook’s flagship headset for both mobile and desktop VR experiences for the foreseeable future.

Thankfully—with Oculus having recently announced new features for the Quest 2, including support for native wireless VR streaming (called Oculus Air Link), improved productivity features, and a faster 120Hz refresh rate—it seems there’s plenty of room to continue improving Facebook’s current VR goggles without the need for all-new hardware.

And when it comes to what is still a relatively new branch of tech, updated components and more powerful hardware are always nice, but there’s something to be said for focusing on the stability of your platform, too, which is what Facebook seems to be doing with the Quest 2.

Is There VR for Senses Other Than Sight?

Illustration: Benjamin Currie/Gizmodo

Giz Asks: In this Gizmodo series, we ask questions about everything and get answers from a variety of experts.

In an ideal world, virtual reality would be just like regular reality, except without regular reality’s annoying restrictions on human flight, teleportation, endless/consequence-free consumption, etc. As it is, VR is inescapably VR—you can lose yourself, but only to a point. This has to do partly with the technology’s bulkiness, but also its sensory limitations. Smell and taste, two fairly major components of regular reality—key to memory, present-time enjoyment, and more—do not factor into commercial VR equipment. Of course, this problem isn’t unknown to the scores of researchers presently working on VR, and over the last few years, in labs all over the world, they have made real advances in the realm(s) of sensorial verisimilitude. For this week’s Giz Asks, we reached out to some of them for insight into these developments.

Krzysztof Pietroszek

Assistant Professor, Communication, American University, and Founding Director of the Institute for Immersive Designs, Experiences, Applications and Stories (Institute for IDEAS)

Technology that lets you simulate other senses is already available in various commercial forms. Teslasuit (no connection to Tesla) makes haptic suits that let you feel as if you were hit when you’re shot in a game, for instance—not a very strong hit, but still, a hit. HaptX, meanwhile, makes a glove that serves as a kind of exoskeleton, providing force feedback—simulating what it feels like when you close your hands on a steering wheel or an apple. There’s also the Digital Lollipop, made by Nimesha Ranasinghe at the National University of Singapore, which is a digital device you can put on your tongue to simulate different tastes. And a company called FeelReal is developing a mask you can wear under a VR headset onto which different smells are programmed—flowers in spring, forest fires, etc.

Sarah Ostadabbas

Assistant Professor, Electrical and Computer Engineering, Northeastern University

You pull a strap, cinching your suit, and turn on the game. Immediately you’re transported to a warm castle in a fantasy version of medieval Europe. You feel the warmth from the fire against the back of your legs. As you walk to the window, you hear your footsteps echoing off the stone walls. Opening the window, you feel the cold breeze against your cheeks and smell spring flowers mixed with the smell of sheep—it’s shearing time. You notice a tray of pastries and taste the sweet and slightly salty custard as you bite into the tart.

Theme parks have been creating experiences like this with 3D glasses, hydraulic seats, and machines for blowing air, spraying water, and emitting scent. But an interactive, multi-sensory VR experience like this feels about as far-fetched as VR felt in the ’80s. Of course, it seems possible with high-resolution direct brain stimulation and much greater computational power, but that technology is at least decades away. Meanwhile, there are technologies that can help us achieve part of this. Sound is the most integrated, with every VR system including some means of approximating 3D sound. Sound fidelity still lags significantly behind the visual experience, but it’s easy to produce and essential for immersion, so for the most part we can consider that covered. For smell, there are products and research based on liquid scent cartridges that are rapidly vaporized and mixed for immersive olfactory experiences. There has also been preliminary work in direct electrical stimulation of smell receptors. Given that taste centers on regionally distributed receptors on the tongue, most research has focused on direct electrical and thermal stimulation.

The final sense, touch, is complicated. It actually encompasses pressure, temperature, shear, acceleration (linear and angular), and physical resistance. For years, flight simulators have used hydraulics to simulate feelings of motion and acceleration. Recently, electrical motion bases have become available for home use. They are expensive, but within range of a well-funded gamer. Haptic systems, which resist motion and provide tactile feedback such as vibration, are commercially available and widely used in industry for surgical robotics, CAD input, and gaming. There has also been work on creating haptic textures for more nuanced experiences. There are even full haptic suits for better training.

As with most technologies, wide adoption would drop the price considerably (thanks to volume), which in turn would drive even wider adoption. As “normal” sight-and-sound VR catches on, interest in deeper immersion should grow and start to push those prices down. Right now, though, VR is commercially available but still relatively niche.

Robert Stone

Chair in Interactive Multimedia Systems and Director of the Human Interface Technologies Team at the University of Birmingham

My team and I have been evaluating commercial VR scent systems (more formally, “olfactory displays”) for the last five or six years. There are, at the moment, a few commercial products on the market. A company called Olorama has invented a cartridge injection system that releases synthetic smells; the process is similar to the perfume-dispensing systems sometimes used on a larger scale in stores and supermarkets. Other techniques embed synthesized smells in blocks of paraffin—you inject a high-speed jet of air into the paraffin, and that releases the smell. Some use a tray of small plastic wells that are heated up; as the contents vaporize, a fan blows the scent out. A new olfactory display company based in the States, OVR, has developed an innovative scent delivery system that clips to the bottom of most of the popular virtual reality headsets (HTC Vive, Oculus Quest, etc.); I believe their system can currently handle up to nine smells. We’re hoping to work with them this year to bring realistic smells to our 17th century “Virtual Mayflower” demo, so that users can smell just how bad Plymouth was in those days when the Pilgrims departed for the New World!

To trigger the smells in VR, we place an invisible “force field”, a bounding box, around a given object, so that when your movement path intersects with that force field it triggers the relevant smell or smells. The recreation of the old harbor in Plymouth gives us plenty of objects to act as triggers, and believe me, it’s going to smell pretty bad—fish, stale seawater, human excrement, animal excrement, etc. When you walk close to a 17th century latrine, a dung pile, or over a wooden plank spanning the human waste run-off channels into the sea, you’re going to get a whiff of something truly awful, but hopefully realistic for the period!
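Stone’s team hasn’t published the Virtual Mayflower’s code, but the trigger mechanism he describes maps to a few lines of logic: one axis-aligned bounding box per smelly object, checked against the user’s tracked position every frame. Below is a minimal Python sketch under those assumptions; the object names and the release_scent() stub are hypothetical stand-ins, not the project’s actual API.

```python
from dataclasses import dataclass

@dataclass
class ScentTrigger:
    name: str
    lo: tuple    # (x, y, z) minimum corner of the bounding box, meters
    hi: tuple    # (x, y, z) maximum corner of the bounding box, meters
    scent: str

    def contains(self, p: tuple) -> bool:
        """True if point p lies inside this axis-aligned bounding box."""
        return all(self.lo[i] <= p[i] <= self.hi[i] for i in range(3))

def release_scent(scent: str) -> None:
    print(f"releasing scent cartridge: {scent}")  # stand-in for hardware I/O

triggers = [
    ScentTrigger("latrine", (2, 0, 2), (4, 2, 4), "ammonia"),
    ScentTrigger("fish stall", (10, 0, 1), (12, 2, 3), "fish"),
]

def update(head_position: tuple) -> None:
    """Call once per frame with the headset's tracked position."""
    for t in triggers:
        if t.contains(head_position):
            release_scent(t.scent)

update((3.0, 1.6, 3.0))  # walking past the latrine triggers "ammonia"
```

A real implementation would also debounce the trigger so a user lingering inside the box doesn’t empty a cartridge, but the core idea is exactly this intersection test.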

But the one problem we’ve always had with olfactory displays involves the delivery mechanism. Take, for example, a project we undertook for the British Army. We wanted to recreate the smells of a Middle Eastern village, because we learned that when Army personnel were on patrol, certain smells—even missing smells—could alert them that something was about to go down. We could synthesize the smells (cooking, tobacco, rotting hanging meat, and so on) and release them, but the noise of the electromechanical and pneumatic components of the hardware alerted the users long before the smells did, which rendered the simulation useless.

Having said that, at the moment, it’s smell I’m most excited about in a VR context, because I think a good smell delivery system will definitely enhance immersion and evoke emotions and memories. We found many, many years ago that adding sound to a VR experience made all the difference to how users engaged with and believed the visuals. Bringing sound and smell together will, I have no doubt, revolutionize virtual experiences.

As for taste, which, of course, is closely related to smell: there are electronic techniques that have been around since the 18th century (described as “galvanic tongue stimulation”). But, in all honesty, I can’t see an acceptable, non-intrusive means of combining and displaying tastes being available for decades. Indeed, I believe we’re not going to achieve a truly immersive, all-senses-covered VR system—believable sight, sound, touch, smell, and taste (sweetness, bitterness, saltiness, sourness, and umami, or savoriness)—until we’ve got something akin to a real Star Trek “Holodeck,” something that avoids the need to don encumbering wearable tech!

Murat Akcakaya

Associate Professor, Electrical and Computer Engineering, University of Pittsburgh

Right now, we’re working on National Science Foundation-funded research on haptic interaction in simulated environments, including virtual reality.

We start with human beings interacting with objects in a real environment—touching, say, different textures (rough, smooth, etc.). These textures produce different brain patterns, as measured via EEG. Our work shows that, looking only at a person’s EEG—their real-time brain responses—we can identify what texture they’re touching.

So now our idea is: let’s go back into the VR environment and monitor people’s brain responses there. By applying certain haptic stimulations (electrical stimulation, really any vibrating tactile stimulus) and controlling parameters like amplitude, magnitude, frequency, phasing, and so on, can we generate or replicate responses in the brain similar to those generated by interacting with objects in a real environment?

We’re still in the phase of investigating different effects, but the overall objective is to integrate all of those real-time, brain-response-guided stimulation control capabilities in a simulated environment such as VR in order to provide a more realistic simulation of touch.
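The Pittsburgh group’s data and code aren’t public, but the “identify the texture from the EEG” step Akcakaya describes is commonly built as a band-power-plus-linear-classifier pipeline. Here is a sketch of that generic approach in Python, using synthetic stand-in signals; it illustrates a standard EEG workflow, not the lab’s actual method.

```python
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Generic EEG texture-classification sketch: band-power features plus a
# linear classifier. The "recordings" below are synthetic noise, so the
# score will hover near chance; real EEG data is the whole point.

FS = 250  # assumed sampling rate, Hz
rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 120, 8, FS  # one-second trials
X_raw = rng.standard_normal((n_trials, n_channels, n_samples))
y = rng.integers(0, 2, n_trials)  # 0 = "rough", 1 = "smooth" (fake labels)

def band_power_features(trials: np.ndarray) -> np.ndarray:
    """Average spectral power in canonical EEG bands, per channel."""
    bands = [(4, 8), (8, 13), (13, 30)]  # theta, alpha, beta
    freqs, psd = welch(trials, fs=FS, nperseg=FS, axis=-1)
    feats = [psd[..., (freqs >= lo) & (freqs < hi)].mean(axis=-1)
             for lo, hi in bands]
    return np.concatenate(feats, axis=-1)  # shape: (trials, channels * bands)

X = band_power_features(X_raw)
clf = LinearDiscriminantAnalysis()
print(cross_val_score(clf, X, y, cv=5).mean())
```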

Michael R.M. Jenkin

Professor, Electrical Engineering and Computer Science, York University, whose research interests include computer vision, virtual reality, and mobile robotics

I have a special interest in understanding the effects of long-duration microgravity on human perception. Let’s suppose you want to train someone to learn how to land a spaceship on the moon, and you train them on earth. Well, when they actually go ahead and do it on the moon, or near the moon, they’re going to be subject to a much-reduced gravity effect. So the question is: is your simulator, which you learned on earth, going to properly train you? Or is it going to get things wrong and negatively impact your performance, because gravity is different on the moon? You can ask the question the other way around: if you’re an astronaut on a long-duration space mission in zero gravity and you’re about to land back on earth, in a one-g environment, are there things you learned in your microgravity training that are going to negatively impact your performance in that scenario? The real question, then, is whether we can build virtual reality that simulates the different loading forces on the body. And I think this can be done.

Jurgen Schulze

Research Scientist, University of California San Diego

Already, we have VR for our ears via spatial audio, which pretty much every VR headset supports. Just hearing a noise in one ear louder than in the other, providing a directional sense, can provide a pretty immersive level of realism.

Spatial audio is arguably the most obvious of these, but I also find it the most important, and the most neglected. Programmers often don’t implement spatial audio very carefully. Often, they don’t factor in the way sound reflects off objects in the environment—they just assume the environment is empty, free of any objects, and simulate what a given act would sound like in empty space. You get directionality, but you don’t get the realism of sound bouncing off an object, or sounding different in different situations. When you enter a church made mostly of stone, it sounds very different than walking through a space full of carpets and curtains. And when a soccer ball hits a stone floor, it makes a different sound than it does when it hits a wooden floor, or a carpeted floor. But these differences are rarely modeled, and are often not supported by current VR apps.
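The simplest of the cues Schulze mentions, a sound arriving louder in one ear than the other, is easy to sketch. The toy Python example below uses constant-power stereo panning as a stand-in for real spatialization (my simplification, not his); production engines layer on head-related transfer functions and, ideally, the room reflections he says are so often skipped.

```python
import numpy as np

def pan_stereo(mono: np.ndarray, azimuth_deg: float) -> np.ndarray:
    """Constant-power pan: -90 = hard left, 0 = center, +90 = hard right."""
    theta = np.radians((azimuth_deg + 90.0) / 2.0)  # map [-90, 90] to [0, 90]
    left_gain, right_gain = np.cos(theta), np.sin(theta)
    return np.stack([left_gain * mono, right_gain * mono], axis=1)

sr = 44100
t = np.arange(sr) / sr
beep = 0.5 * np.sin(2 * np.pi * 440 * t)     # one second of a 440 Hz tone
stereo = pan_stereo(beep, azimuth_deg=60.0)  # source off to the right

# The right channel's peak is about 3.7x the left's, which the brain reads
# as the beep being off to the right: directionality without any reflections.
print(stereo.shape, np.abs(stereo).max(axis=0))
```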

Kevin Curran

Professor, Cyber Security, Ulster University

Virtual Reality (VR) of course fools the primal parts of our brains to usher us into immersive worlds. Other senses, however, can be used to complement the sense of vision in VR.

The senses that can be used in VR are sight (vision), audition (hearing), olfaction (smell), somatosensation (touch), and thermoception (the ability to sense heat). When sights, smells, sounds, haptic sensations, and thermal cues are all utilized together in a spatially representative environment, one is more sensorially aligned.

The first VR attempt to assemble various senses was in 1957 with “Sensorama.” It had a display enriched with physical-world stimuli (images, stereo sound, smell, wind, and seat vibration). Since that experiment, however, the most common senses stimulated in VR have been sight (through the use of 3D images) combined with hearing.

The Lost Foxfire game system, for example, consists of a virtual reality headset paired with a configurable multisensory suit that delivers thermal, wind, and olfactory stimuli to players as gameplay cues.

The suit contains five heat modules, allowing players to sense heat on the front, the back, the sides of their necks, and on their faces. When players encounter a fox character, they will catch a whiff of the scent of apples, and as players approach fire, they can feel the heat it emits. This leads to a more immersive gaming experience.

Sound is crucial in many VR experiences, but truly realistic immersive sound requires spatial audio, which simulates sounds coming from various distances and directions. Most of the leading VR companies include spatial audio in their hardware to ensure sound is localized correctly, matching the way sound naturally travels from one ear to the other.

The amygdala—the part of the brain that manages emotion and memory—is linked to smell, which is why smells can be so powerful. It is not trivial to incorporate smells into VR, as the chemicals required to create scents can linger or mingle with other scents. There have, however, been games where scents are released when the player is near the sea or in a forest.

It is not all about games, however. There have been studies that used tactile or olfactory stimuli with the specific purpose of treating specific phobias or post-traumatic stress disorder. They found that adding those other sensory stimuli increased the effectiveness of VR exposure therapy.

Haptic interfaces—devices that simulate physical sensations—rely on vibration to simulate sensory experiences. However, they have typically been bulky, needing large battery packs or wires to power them. Recently, Northwestern University developed a 15-cm-square patch that can be stuck onto the body and uses actuators that vibrate against the skin to simulate tactile sensations. This “synthetic skin” is controlled wirelessly via an app that transmits tactile patterns to the patch.

Virtual reality taps into evolutionary biology to make our brains believe, with all five senses, that we are experiencing something more real than previously thought possible. The COVID-19 pandemic has shown us how fragile the real world is and how much we miss human contact. Multi-sensory VR may be just the way out of the next pandemic. Remember, technology never gets worse—it only improves.

Do you have a burning question for Giz Asks? Email us at tipbox@gizmodo.com.

Apple’s Mixed Reality Headset Might Be Light AF

Photo: Mladen Antonov/AFP (Getty Images)

Apple prognosticator Ming-Chi Kuo is not done dispensing rumors about the company’s long-rumored mixed reality headset. This time, per 9to5Mac, Kuo has details on the headset’s weight: reportedly less than 150 grams thanks to the use of hybrid ultra-short focal length lenses.

If you struggle with metric conversions, 150 grams is absurdly light. It’s the equivalent of about five ounces. You know what else weighs five ounces? According to Weight of Stuff, that’s roughly equivalent to a baseball, half a deck of cards, a checkbook, a medium-sized apple, and a small bottle of glue. As far as smartphones go, that would be lighter than the iPhone 12, which weighs 164 grams, or 5.78 ounces.

Holy guacamole, Batman, that’s a game-changer when it comes to mixed reality headsets. Most are relatively bulky and are significantly heavier. The Oculus Quest 2, for instance, weighs 503 grams, or 17.7 ounces. Microsoft’s HoloLens 2 is even heavier at 566 grams, or close to 20 ounces. That’s 1.1 and 1.2 pounds, respectively. In the grand scheme of things, that’s not outrageously heavy—but it does start to weigh on you if you’re trying to wear a headset for an extended period of time. Plus, there’s no two ways about it—you look stupid in a bulky headset no matter how cool the tech is.

Part of how Kuo believes Apple will do this is by adopting a hybrid Fresnel lens design. More specifically, AppleInsider quotes Kuo as saying that each Fresnel lens “comprises three stacked Fresnel lenses.”

If you’re unfamiliar with Fresnel lenses, they’re basically lenses with a lot of concentric grooves etched into the material. There’s a lot of fancy optics involved, but the gist is that you can focus light in a way that’s similar to (or potentially better than) a conventional curved lens. The bonus is that you can use plastic instead of glass, thereby reducing weight, and Fresnel lenses can be much thinner than conventional lenses. They were initially used in lighthouses, but these days you can find them in VR headsets like the HTC Vive because they can help users focus on an image at an extremely close distance—all while reducing weight. That said, these lenses aren’t perfect. While they can improve your field of vision in a VR headset, they can cause distortion and unwanted light rings.
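To see why those grooves buy you thinness, here’s a toy numeric sketch (my own illustration with arbitrary numbers, nothing from Apple’s design): take the sag, i.e. the thickness profile, of an ordinary spherical lens surface and fold it into fixed-depth concentric rings. The local slope that actually bends the light survives the folding almost everywhere; the bulk material does not.

```python
import numpy as np

R = 50.0            # radius of curvature of the reference surface, mm (arbitrary)
GROOVE_DEPTH = 0.5  # depth of each Fresnel groove, mm (arbitrary)

r = np.linspace(0.0, 20.0, 2001)     # radial coordinate across the lens, mm
sag = R - np.sqrt(R**2 - r**2)       # spherical sag: thickest at the edge
fresnel = np.mod(sag, GROOVE_DEPTH)  # fold the profile into concentric grooves

print(f"conventional surface max thickness: {sag.max():.2f} mm")   # ~4.17 mm
print(f"fresnel profile max thickness:      {fresnel.max():.2f} mm")  # < 0.5 mm
# The groove steps are also where the downsides come from: light scattering
# off the discontinuities causes the distortion and rings mentioned above.
```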

On that front, Kuo says Apple’s Fresnel lenses will have “customized material and coating” and that “light transmission is not lower than glass.” The downside is that while Fresnel lenses are generally cost-effective, the way Apple seems to want to use them will be expensive. Perhaps that’s one reason why the rumored cost for this headset is so damn high.

Again, this is also just the newest rumor we’ve heard about this headset. Just a few days ago, Kuo also said the mixed reality headset could potentially forego hand controllers for eye tracking and iris recognition. Other rumors say the headset might sport 8K displays, an M1 chip, and more than a dozen cameras. If you’ve lost track of them all, I’m sorry to say that you’ll likely have to gear up for at least another year of leaks and speculation. Kuo’s latest timeline puts the mixed reality headset launching in 2022, with AR glasses to follow in 2025. The rumors will only grow more intense the closer we get to a launch date—so buckle up.

Apple’s Mixed Reality Headset Might Forgo Controllers for Eye Tracking and Iris Recognition

Photo: Ryan Anson/AFP (Getty Images)

There’s no lack of speculation around Apple’s rumored AR headset, but the latest is the most sci-fi of them all. According to reliable Apple analyst Ming-Chi Kuo, it’s possible the headset will eschew handheld controllers in favor of eye tracking and iris recognition.

Per AppleInsider, Kuo’s investor note on the topic says the headset will use a “specialized transmitter” to track eye movement and blinking. The way Kuo says the transmitter works is that it emits “wavelengths of invisible light”, which then get reflected off your eyeball. A receiver then picks up that reflected light and the changes in light patterns are then analyzed to determine where you’re looking.
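That description sounds like the pupil-center/corneal-reflection (PCCR) technique most commercial eye trackers use, though that’s my inference, not something Kuo confirms. As a rough Python sketch of how such a system turns reflected light into a gaze point: the offset between the pupil center and the glint (the emitter’s reflection) is mapped to screen coordinates through a calibration polynomial.

```python
import numpy as np

# Sketch of a textbook PCCR gaze pipeline, not Apple's implementation.
# The calibration data below is made up for illustration.

def features(pupil: np.ndarray, glint: np.ndarray) -> np.ndarray:
    """Quadratic feature vector of the pupil-minus-glint offset."""
    dx, dy = pupil - glint
    return np.array([1.0, dx, dy, dx * dy, dx**2, dy**2])

# Calibration: the user fixates known on-screen points while we record the
# eye-image offsets; least squares fits the feature-to-gaze mapping.
rng = np.random.default_rng(1)
cal_offsets = rng.uniform(-1, 1, (9, 2))  # fake pupil-glint offsets
cal_targets = rng.uniform(0, 1, (9, 2))   # fake normalized screen points
A = np.array([features(o, np.zeros(2)) for o in cal_offsets])
coeffs, *_ = np.linalg.lstsq(A, cal_targets, rcond=None)

def gaze_point(pupil: np.ndarray, glint: np.ndarray) -> np.ndarray:
    """Estimate normalized (x, y) screen coordinates for one eye frame."""
    return features(pupil, glint) @ coeffs

print(gaze_point(np.array([0.2, -0.1]), np.array([0.0, 0.0])))
```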

That data could then be used to better customize a user’s interactions within an AR environment. Another benefit is that it could allow people to control menus by blinking or, perhaps, learn more about an object by staring at it for a certain period of time. It could also save processing power, as anything in your peripheral vision could be rendered at a reduced resolution (a technique known as foveated rendering).

Where this gets kicked up another notch is iris recognition. While Kuo isn’t sure this is a bona fide feature, he says the “hardware specifications suggest the HMD’s [head mounted display] eye-tracking system can support this function.” Iris recognition is big, as we’ve all seen spy movies where it’s used as a form of biometric identification. This could potentially enable an additional layer of security, making sure no one else can use your device—because these devices will not be cheap. In a more everyday sense, it could also be used for services like Apple Pay.

One of the biggest problems with mixed reality and virtual reality is that there’s no great way to interact with what you’re seeing. Enterprise headsets like Microsoft’s HoloLens 2 and the Google Glass Enterprise Edition 2, as well as previous consumer versions like Focals by North, all relied on some iteration of hand controls or finger loops. They work, but calibration is an issue and the process can be clunky. Eye tracking, if done well, is a potential game-changer, as you don’t have to keep track of another accessory or memorize a set of controls.

This interface problem is well known among companies trying to create AR gadgets for consumers, and Apple isn’t the only company looking for a novel solution. Facebook recently revealed that it’s envisioning wrist-based wearables that could let you control AR with your mind. It’s far too early to tell which of these methods (or potentially one we haven’t even heard of yet) will win out in the end. Previously, Kuo noted that Apple’s mixed reality headset is likely to come in 2022, with smart glasses coming in 2025. Facebook is expected to launch some kind of smart glasses this year, but the futuristic methods it has described are likely further down the line. That said, I will definitely take eye tracking over haptic socks any day.

There’s a Hidden Code in Intel’s New Graphics Card Teaser

Screenshot: Intel

Intel posted a cryptic teaser on Twitter this morning that seems to suggest the chipmaker is getting ready to release more details about its long-awaited gaming graphics card, the Xe HPG.

What that information is, no one knows, but Intel has been dropping tiny crumbs about the Xe HPG for several months. Last August, Intel said its gaming GPU could ship sometime in 2021, and during its CES 2021 keynote, the company mentioned it had figured out a way to utilize both an integrated GPU and a discrete GPU at the same time.

Intel didn’t specify if this would be a feature just for its graphics cards and processors, or if it would be compatible with Nvidia and AMD cards, but it’s clearly something the company is working on. It’s possible we could hear more about the feature during an upcoming Intel webcast on March 23. Or maybe the reveal date will be March 26.

Why March 26? According to Wccftech, there’s a binary code in the video that, if translated properly, points to a hidden scavenger hunt website. “Welcome to the Xe HPG Scavenger Hunt” is displayed prominently in a small black box on the website. There’s a March 26, 9 a.m. PST “launch date,” and a note for people to come back on that day and enter their “secret code.” The code seems to have been cracked by a Twitter user.
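For the curious, decoding a teaser like that is simple: split the bit string into 8-bit chunks and map each chunk to its ASCII character. A quick Python sketch, where the example bits are my own stand-in rather than the actual code hidden in Intel’s video:

```python
def decode_binary(bits: str) -> str:
    """Decode a whitespace-separated string of bits as 8-bit ASCII."""
    bits = "".join(bits.split())  # drop any whitespace between bytes
    if len(bits) % 8:
        raise ValueError("bit string length must be a multiple of 8")
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))

example = "01111000 01100101 01101000 01110000 01100111"
print(decode_binary(example))  # -> "xehpg"
```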

There’s no other information about how to find that secret code. It’s possible it could be tied to the March 23 webcast, but that doesn’t seem likely at this point. Like most scavenger hunts, there’s presumably a series of hidden objects and puzzles to solve before you reveal the mystery at the end, or free yourself from the escape room. Gizmodo reached out to Intel for hints, but has yet to hear anything back.

Intel officially released its first Xe desktop GPU back in January. However, those cards are mainly for budget desktops, non-gaming ones at that, and most are included in pre-built PCs rather than sold as stand-alone, off-the-shelf items. The Xe HPG is supposed to be more of a mid-tier card, competitively priced against Nvidia and AMD.

So, Intel’s getting ready for something, but clearly it’s trying to create more of a mystery to drum up excitement. Consider us excited.

Sony’s Next-Gen VR Controllers Are Packing Some Major Upgrades

Image: Sony

Even though Sony has already said its upcoming VR headset won’t be available until sometime next year, the company is already showing off teasers of its new controllers. Based on what we’ve seen so far, the next version of Sony’s PS VR tech looks to be packing some serious upgrades.

Unlike the old PlayStation Move controllers (which are basically holdovers from back when console makers thought standalone motion controls were going to be the next “big thing”), Sony’s new VR controllers have an orb-like design that’s supposed to make them easier and more natural to hold while also supporting better motion tracking.

Sony says its new headset will have built-in motion tracking instead of requiring an external camera: the headset itself will detect a “tracking ring” positioned on the bottom of the controllers, so motion tracking should be more accurate and easier to set up, with one less component to worry about.

On top of that, Sony is borrowing some tech from the current DualSense controller for use in virtual reality, with its VR controller also getting adaptive triggers that can change their tension on the fly and more sensitive haptic feedback. And Sony is also adding support for finger touch detection that lets you make gestures and commands simply by touching parts of the controller’s grip.

Image: Sony

And because it wouldn’t be a PlayStation controller without Sony’s classic face buttons, you also get circle and x buttons on the right controller along with an analog stick, a grip button, and a shoulder button, while the left gets a mirrored setup with triangle and square face buttons instead.

In a lot of ways, it seems like Sony has been taking notes and learning from the designs of other VR controllers, most notably the Valve Index’s finger-sensing controllers and the Oculus Quest 2’s controllers, which are similarly shaped though slightly more minimalist in design.

Image: Sony

Unfortunately, because Sony has already announced that its new VR headset won’t be arriving this year, we’re going to have to be content, at least for now, with this slow drip of teases while we wait for more concrete info.

However, Sony’s VR controller does appear to be off to a promising start, and if Sony can nail the optics on its accompanying headset, PS5 owners should be in for a treat when the devices officially launch sometime in 2022.

Apple Patent Hopes We’ll All Show Feet in AR

Screenshot: USPTO

One thing that most Big Tech companies working on smart glasses haven’t figured out yet is how to effectively interact with an augmented reality environment. Apple is heavily rumored to be working on its own pair of AR glasses and apparently considered vibrating haptic socks to tackle this problem.

A new patent spotted by AppleInsider mainly describes a haptic output device that “may include foot-shaped structures with cavities configured to receive the feet of users.” The foot-wearable support structure would also feature an “array of haptic output components” that work to “apply feedback” to the bottom and top of a person’s foot, possibly to create a sense of movement even if the foot isn’t moving. “These forces may provide a user with a sensation of resting or sliding across a tiled surface or other surface with surface irregularities,” the patent reads.

Technically speaking, the patent says the “foot-wearable support structure” doesn’t have to be a sock. It could also be a shoe. Or just a thing you stick your foot into. The patent is also pretty vague as to what sort of device these haptic socks (or shoes) would be giving you feedback for. It mentions joysticks, buttons, scrolling wheels, touchpads, keypads, keyboards, microphones, speakers, tone generators, vibrators, cameras, and even cooling systems. It also discusses a whole host of sensors, including ones you’d expect like force and touch sensors, as well as sensors for detecting temperature, air pressure, and moisture. Apple, it would appear, doesn’t want sweaty feet to take away from the experience of whatever it was thinking of using these buzzy socks for.
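The patent describes the tiled-floor effect only in prose, but the idea reduces to sampling a virtual height map under each actuator as the foot slides, then driving the actuators from the local bumps. The Python sketch below is entirely my own hypothetical illustration; the actuator layout, names, and numbers are not from Apple’s filing.

```python
import numpy as np

# Hypothetical sketch of "sliding across a tiled surface": a 3x3 grid of
# actuators under the foot, each driven by the height field beneath it.
# Positions are in millimeters (integers) to keep the modulo math exact.

ACTUATOR_GRID = [(x, y) for x in (-40, 0, 40) for y in (-100, 0, 100)]

def tile_height(x_mm: int, y_mm: int, tile: int = 300, gap: int = 20) -> float:
    """Toy height field: 300-mm tiles separated by 20-mm grout lines."""
    return 1.0 if (x_mm % tile < gap) or (y_mm % tile < gap) else 0.0

def actuator_levels(foot_x: int, foot_y: int) -> np.ndarray:
    """Drive intensity (0 or 1 here) for each actuator at this foot pose."""
    return np.array([tile_height(foot_x + ax, foot_y + ay)
                     for ax, ay in ACTUATOR_GRID])

# Slide the foot forward in 30-mm steps: rows of actuators pulse as they
# pass over each grout line, which is the "surface irregularity" cue.
for step in range(8):
    print(actuator_levels(150, 50 + step * 30))
```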

Of all the things Apple’s purportedly working on, its niche VR headset and AR smart glasses are the most likely candidates for this kind of accessory. From a gaming perspective, something like this would definitely help make an Apple headset feel more immersive. It’s a pie-in-the-sky thought, but you could theoretically use these to simulate walking without requiring the user to actually move around.

An “illustrative foot and an associated array of haptic output components.” Show f**t sweeties.
Screenshot: USPTO

As ridiculous as vibrating socks seem, they’re not totally out of left field, either. Facebook Reality Labs, the division of the social media giant that works on its AR projects, recently published a blog post detailing a similar vision of “soft wearables” to help users interact in virtual environments. Granted, Facebook was talking about gloves and wristbands, which are a bit more intuitive than, well, socks. Still, this is an extension of that same line of thinking.

You shouldn’t bet on Apple launching any sort of VR or AR device with these babies. Big Tech files patents all the time just to put their stamp on an idea before a competitor—and right now it feels like all the major players are plugging away on some kind of consumer smart glasses. But it does sort of show us where the notoriously secretive Apple’s head is at with regard to one of AR’s biggest problems. Personally speaking though, I have no intention of ever showing f**t to Apple, or any other tech company.

No, Mark Zuckerberg, Virtual Reality Goggles Won’t Solve the Climate Crisis

Sorry!
Photo: Hannah McKay-Pool (Getty Images)

The billionaires are once again claiming that tech will save us. In an interview published Monday, Facebook CEO Mark Zuckerberg said that by the end of this decade, he expects we’ll all have access to virtual reality goggles that will allow us to feel like we’re teleporting into other people’s houses and hanging out. That could, in turn, help cut down on greenhouse gas emissions from travel.

“Obviously, there are going to keep on being cars and planes and all that. But the more that we can teleport around, not only are we personally eliminating commutes and stuff that’s kind of a drag for us individually, but I think that’s better for society and for the planet overall, too,” Zuckerberg told the Information.

Globally, the transit sector is one of the largest contributors to the climate crisis, and in the U.S., pollution from transportation makes up the largest share of the country’s greenhouse gas emissions. There’s no question that that needs to change.

But will virtual reality goggles help us cut down on carbon emissions? Maybe. Studies on the greenhouse gas output of VR technology are limited, but it’s true that the carbon emissions associated with an individual call on Zuckerberg’s prospective smart glasses might be lower than those associated with a plane trip across the country, for instance. Right now, though, we don’t have enough information to know for sure.

“Videoconferencing already [reduces travel-related carbon emissions] to a limited degree, but I don’t think anyone can say in advance whether VR will be cool enough to get people to forego leisure travel. If it substitutes for business travel and meetings it will be a net positive,” Jonathan Koomey, an energy and climate researcher who runs a sustainable IT consultancy, wrote in an email, though he added that “there are so many unknowns about this that it’s hard to say anything definitive.”

VR also isn’t a substitute for in-person jobs that require face-to-face interaction, from construction to food service. The millions of people employed in those industries will still have to commute. To make much of a dent in business travel, then, VR technology would have to be used quite widely in other sectors. And as for traveling for fun, Koomey doesn’t think VR has much of a shot.

“I doubt people will choose to use VR instead of going on a tropical holiday,” he said.

Of course, not everyone in the world is going on tropical holidays. In fact, one recent study found that just 11% of people globally traveled by plane at all, and no more than 4% of the world’s population took international flights. The global elite, though, are flying with abandon: the study found that just 1% of the global population was responsible for half of the world’s commercial flight emissions that year.

If flying on planes is bad, taking private jets is far worse. A single private jet flight across the country can produce nearly double the average American’s entire annual greenhouse gas output. Yet Facebook covered nearly $3 million in private jet costs for Zuckerberg in 2019, according to federal disclosures. In addition to flying for business, Zuckerberg also has a $100 million mansion in Hawaii. It’s hard to imagine him forgoing trips there to stay home in a VR headset instead.

In some sense, though, that’s not entirely Zuckerberg’s fault. We can’t expect people with his level of extreme wealth to simply act in good faith and make personal decisions that are better for our collective future on Earth! In other words, we can’t simply depend on people choosing to buy new, fancy technologies to fix the climate crisis. The world’s leading climate scientists say we need high levels of government regulation to clamp down on pollution from aviation, cruises, cars, and other polluting sectors.

Zuckerberg, though, has remained staunchly anti-regulation. But then, of course, he has to—his job is to make Facebook money. Making VR goggles the next big thing could be great for Zuckerberg’s quest to enrich Facebook and himself given the company’s stake in the technology. And branding the glasses as green could help him in doing so.

In every other sense, though, there’s little evidence that VR goggles will really do that much. So if Mark Zuckerberg truly cares about lowering planet-warming emissions, maybe he could start with a more robust effort to combat climate denial: axing avenues for deniers to advertise lies, and generally not funding them or their events in the first place.