Spatial computing: Why Apple’s headset is not the most important thing it announced this week
The Apple Vision Pro is not just a piece of tech, but a whole new way of thinking about computers, writes Andrew Griffin
When Tim Cook took to the stage at Apple’s event this week, he promised that he was about to make some of the company’s “biggest announcements ever”. Over the next couple of hours, he and his colleagues at Apple ran through a range of new updates: new Mac computers, and changes to the software that runs products such as the iPhone and Apple Watch.
Then he said there was “one more thing” left to announce, echoing the wording that came to be associated with Steve Jobs’ showmanship and the big reveals he would leave until the end of his keynotes. That last thing had been “years in the making”, and a “profound technology”, he said.
It was the long-rumoured, already much-discussed headset, which he said was to be known as Apple Vision Pro. It is like a pair of smart ski goggles, with powerful lenses and cameras that not only show you virtual reality experiences but also overlay them on an almost-realistic image of the world, with the intention that you are never cut off from your surroundings.
In use, at demonstrations on Apple’s campus this week, the headset works just as Apple says it does. Both the view of the world outside the headset and the virtual reality inside it are crystal clear, and the Vision Pro already looks set to be a critical success, though we are a long way from knowing whether Apple’s plans for it will actually come off.
Apple’s Worldwide Developers Conference is, as the name suggests, its annual event focused on the people who make the apps that run on its platforms. It is also, as a result, the time of year when it focuses on software rather than hardware, and reveals the big updates that are coming to its products.
It might seem unusual, then, that the hardware announcement that could decide the future fortunes of both Apple and its customers – it costs a small fortune itself, at $3,500 – would be announced during an event largely focused on software and Apple’s developers. But it was not an accident.
Nor was the wording that Mr Cook chose to introduce the headset with. After talking about just how excited he was about the product, he set about actually revealing it.
“So today I’m excited to announce an entirely new AR platform with a revolutionary new product. And here it is,” he said.
Tim Cook is a deeply deliberate man, and Apple’s keynotes are as tightly structured as the inside of an iPhone. It is no accident that he chose to say that he was announcing a new platform, and that the product was coming along with it.
At the end of the keynote, as he summed up what had been announced, he used similar wording. “Apple Vision Pro together with visionOS introduces an entirely new spatial computing platform,” he said. “A platform that presents incredible possibilities for our users and exciting new opportunities for our developers.”
That is because the most important thing that Apple announced this week was not actually the Vision Pro, even if it is what has grabbed the headlines and will decide Apple’s future. Instead, it is the software and operating system that underpins it, and which could be around for much longer than the headset itself.
“So in the same way that Mac introduced us to personal computing and iPhone introduced us to mobile computing, Apple Vision Pro will introduce us to spatial computing,” Mr Cook continued. “This marks the beginning of a journey that will bring a new dimension to powerful, personal technology.”
Apple did not invent the phrase “spatial computing”. Its first use may have been exactly 20 years ago, when Simon Greenwold published a master’s thesis that defined it as “human interaction with a machine in which the machine retains and manipulates referents to real objects and spaces”.
In that paper, Greenwold said that computers tended to be involved in a “denial of physicality”: moving parts had mostly been removed from personal computers, and at that time big, deep CRT monitors were turning into flat panels that freed up space on people’s desks and removed much of the remaining physical connection they had to computers.
That did not always make people happy, he argued. He noted how phones had shrunk and then grown bigger again, how small and thin computers had divorced people from the actual work going on inside them, and argued that “the physical shrinkage of the machine manifests itself as an embarrassment of the flesh”. But people remained fascinated with the “space inside the machine”, he said, pointing to the 1982 film Tron as a demonstration of how people were both intrigued by and fearful of being swallowed up by computers.
In response to those changes, Greenwold described a version of computing – mostly theoretical then, and still largely unrealised – in which virtual objects would be overlaid on the real world. Those virtual objects would be associated with places in the actual world and give us back some of the connection between the real and the digital.
Apple believes it has at least started some of that work, with the announcement of the Vision Pro this week. And it used the reveal of the headset to show how it imagined that kind of spatial computing might work in reality.
The most common and perhaps most unimaginative example used throughout its presentation was really just the recreation of screens within a real environment. During its announcement ad, Apple showed how users could put the headset on and surround themselves with boxes that look a little like floating TV screens, each showing a specific app.
Those screens are undeniably digital, not only in the sense that they do not actually exist in the real world but also in that they are showing the kinds of experiences we have been having on computers for years, such as looking through photos or messaging people. But the windows are also integrated within the real world: they cast shadows, the audio coming out of them sounds as if it is bouncing off the objects in the room, and, most importantly, if you move your head and look back again, they will be right where they were before.
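That persistence is handled by the operating system rather than by each app. As a rough sketch only – the app name and contents here are hypothetical, and this assumes the standard SwiftUI app structure on visionOS – a developer might declare one of those floating windows like this, leaving placement, lighting, shadows and anchoring to the system:

```swift
import SwiftUI

// Hypothetical visionOS app that opens one floating window.
// The app only declares its content; visionOS decides where the panel sits
// in the room, lights it, casts its shadow and keeps it where the user left it.
@main
struct FloatingPhotosApp: App {
    var body: some Scene {
        WindowGroup {
            Text("Photos")
                .font(.largeTitle)
                .padding(40)
        }
        // A flat, panel-style window; .volumetric would request a 3D volume instead.
        .windowStyle(.plain)
    }
}
```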
The advantage of such an arrangement is partly that it gives people significantly more space. A real computer screen might be only 13 inches across; in the virtual world, screens are limited only by your eyes and your room. Even the latter restriction does not entirely hold, since Apple’s demos showed that it was possible to watch films on a screen soaring up over Mount Hood.
This was Apple’s primary focus during the introduction. It called the displays “an infinite canvas”, arguing that it would allow people to be freed “from the boundaries of a display”.
It may be useful in other ways, too, that we are yet to understand. The human brain is wired to associate objects with locations, and spatial computing might helpfully exploit that: just as we remember to pick up our keys as we leave the house because they always sit by the door, we might remember to check our messages if they always hang in the same spot in the room.
And, in the future, that connection between virtual objects and the real world could be exploited even more. It will presumably be possible to overlay directions on the roads themselves, for instance, or to apply digital graffiti to real walls.
The same thing could happen in reverse, too, using the real world to illuminate our relationship with digital objects. Designers could look at renders of their work as if it had already been made, for instance, and customers could drop prospective purchases into their living rooms to make sure they will fit physically and aesthetically.
Apple already offers some apps along these lines, as part of the augmented reality work it has been doing in public for years – work that, it is now clear, was done at least in part to prepare for the launch of the Vision Pro and its operating system. In the visitor centre across from the Apple Park campus where it revealed the headset, visitors can look at a physical model of the building and point an iPad at it to see animated models of what is happening.
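Some of the plumbing for that sort of experience already exists in Apple’s developer tools. A minimal sketch, using RealityKit and ARKit as they ship on iPhones and iPads today – the class name and the grey cube are purely illustrative – of anchoring a virtual object to a real surface, the kind of “referent” to real space that Greenwold described:

```swift
import ARKit
import RealityKit
import UIKit

// Illustrative iOS view controller that anchors a simple virtual box to the
// first horizontal surface (a table or floor) that ARKit detects through the camera.
// (A real app would also need the camera-usage entry in its Info.plist.)
final class PlacementViewController: UIViewController {
    private let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)

        // Attach content to any horizontal plane the system finds in the real room.
        let anchor = AnchorEntity(plane: .horizontal)

        // A 10cm grey cube standing in for a prospective purchase or a design render.
        let box = ModelEntity(mesh: .generateBox(size: 0.1),
                              materials: [SimpleMaterial(color: .gray, isMetallic: false)])
        anchor.addChild(box)
        arView.scene.addAnchor(anchor)
    }
}
```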
In focusing on these kinds of interactions, the Vision Pro could be a kind of computer that puts us more in touch with the world rather than divorcing us from it, helping to answer growing concerns about what phones and other flat computing platforms might be doing to our relationship with our environment. That might be the reason Mr Cook was so keen to stress, both in the presentation and since, that the Vision Pro is the first Apple product designed to be looked “through, not at”.
(The Vision Pro was seemingly not designed to go outside, and its bulky design means that people would probably not want to take it there anyway. But it is easy to imagine that Apple’s eventual dream is to shrink the technology down so that it fits in an innocent-looking pair of glasses.)
But for all of the excitement, Apple did not say exactly why spatial computing is so important, mostly focusing on the fact that displays can be larger. That is helpful and important – much of the computing revolution that has taken us from room-sized machines to personal desktop PCs and then to mobile is really just a story of changing screen sizes – but the company did not explain why it believes spatial computing will actually improve how people work or enjoy digital content.
It said that the operating system “delivers powerful spatial experiences that can take advantage of the space around the user, unlocking new opportunities at work and at home”, for instance. But the new opportunities really just looked like big versions of the old opportunities, with the same apps blown up to a large screen.
That might be because Apple does not know exactly what those opportunities will be. It explicitly said that this week’s announcement was just the “beginning”, and that it was putting the headset in the hands of developers in the hope of finding out what they could do with it.
That is what has happened in the past, and what Apple is no doubt hoping will happen again this time around. The iPhone may have been an innovative piece of kit at launch – but it wasn’t until it actually made it into the world that apps from Twitter to Netflix showed what it really meant to have a portable display with an always-on internet connection, and revealed how the product would change the way we live.