Why Apple is pushing the term ‘spatial computing’ along with its new Vision Pro headset


SAN FRANCISCO (AP) — With Apple’s hotly anticipated Vision Pro headset hitting store shelves Friday, you’re probably going to start seeing more people wearing the futuristic goggles that are supposed to usher in the age of “spatial computing.”

It’s an esoteric term that Apple executives and their marketing gurus are trying to thrust into the mainstream while avoiding other, more widely used labels such as “augmented reality” and “virtual reality” to describe the transformative powers of a product that’s being touted as potentially as monumental as the iPhone that came out in 2007.

“We can’t wait for people to experience the magic,” Apple CEO Tim Cook gushed Thursday while discussing the Vision Pro with analysts.

The Vision Pro also will be among Apple’s most expensive products at $3,500 — a price point that has most analysts predicting the company will sell 1 million or fewer devices during its first year. But Apple sold only about 4 million iPhones during that device’s first year on the market and now sells more than 200 million of them annually, so there is a history of what initially appears to be a niche product turning into something that becomes enmeshed in how people live and work.

If that happens with the Vision Pro, references to spatial computing could become as ingrained in modern-day vernacular as mobile and personal computing — two previous technological revolutions that Apple played an integral role in creating.

So what is spatial computing? It’s a way to describe the intersection between the physical world around us and a virtual world fabricated by technology while enabling humans and machines to harmoniously manipulate objects and spaces. Accomplishing these tasks often incorporates elements of augmented reality, or AR, and artificial intelligence, or AI — two subsets of technology that are helping to make spatial computing happen, said Cathy Hackl, a long-time industry consultant who is now running a startup working on apps for the Vision Pro.

“This is a pivotal moment,” Hackl said. “Spatial computing will enable devices to understand the world in ways they never have been able to do before. It is going to change human to computer interaction, and eventually every interface — whether it’s a car or a watch — will become spatial computing devices.”

In a sign of the excitement surrounding the Vision Pro, more than 600 newly designed apps will be available to use on the headset right away, according to Apple. The range of apps will include a wide selection of television networks, video streaming services (although Netflix and Google’s YouTube are notably absent from the list), video games and various educational options. On the work side of things, videoconferencing service Zoom and other companies that provide online meeting tools have built apps for the Vision Pro, too.

But the Vision Pro could expose yet another disturbing side of technology if its use of spatial computing is so compelling that people start seeing the world differently when they aren’t wearing the headset and start to believe life is far more interesting when seen through the goggles. That scenario could worsen the screen addictions that have become endemic since the iPhone’s debut and deepen the isolation that digital dependence tends to cultivate.

Apple is far from the only prominent technology company working on spatial computing products. For the past few years, Google has been working on a three-dimensional videoconferencing service called “Project Starline” that draws upon “photorealistic” images and a “magic window” so two people sitting in different cities feel like they are in the same room together. But Starline still hasn’t been widely released. Facebook’s corporate parent, Meta Platforms, also has for years been selling the Quest headset, which could be seen as a platform for spatial computing, although that company so far hasn’t positioned the device in that manner.

The Vision Pro, in contrast, is being backed by a company with the marketing prowess and customer allegiance that tend to trigger trends.

Although it might be heralded as a breakthrough if Apple realizes its vision with the Vision Pro, the concept of spatial computing has been around for at least 20 years. In a 132-page research paper on the subject published in 2003 by the Massachusetts Institute of Technology, Simon Greenwold made a case that the automatically flushing toilet is a primitive form of spatial computing. Greenwold supported his reasoning by pointing out the toilet “senses the user’s movement away to trigger a flush” and “the space of the system’s engagement is a real human space.”

The Vision Pro, of course, is far more sophisticated than a toilet. One of the most compelling features in the Vision Pro is its high-resolution screens that can play back three-dimensional video recordings of events and people to make it seem like the encounters are happening all over again. Apple already laid the groundwork for selling the Vision Pro by including the ability to record what it calls “spatial video” on the premium iPhone 15 models released in September.

Apple’s headset also reacts to a user’s movements through hand gestures and eye tracking in an attempt to make the device seem like another piece of human physiology. While wearing the headset, users will also be able to use just their hands to pull up and arrange an array of virtual computer screens, similar to a scene featuring Tom Cruise in the 2002 film “Minority Report.”

Spatial computing “is a technology that’s starting to adapt to the user instead of requiring the user adapting to the technology,” Hackl said. “It’s all supposed to be very natural.”

It remains to be seen how natural it may seem if you are sitting down to have dinner with someone else wearing the goggles instead of intermittently gazing at their smartphone.
