Reality Check: Where The Vision Pro Is Going

Is it the Mac of the future or just a new iPad?

Last weekend I went to the Apple Store at University Village in Seattle to pick up my Apple Vision Pro. I have waited for this device since the first rumors of the project sprang up years ago. Those rumors, along with Meta’s (then Facebook’s) purchase of Oculus and the eventual release of Meta’s Infinite Office demo, made it easy for anyone to understand what the long-term vision of both companies was: a spatial computer, an infinite canvas free of the confines of a screen. The Apple Vision Pro, while by no means perfect, is the closest we have come to realizing that future so far. To me it feels almost like the dream device I have been waiting my entire life for, one that brings the experience of a computer out into the real world. Every morning I wake up, having just finished a dream, and the headset somehow doesn’t disappear from my reality. It feels like a device that shouldn’t exist, and yet it is finally in my life: the computer I have always wanted, the one that pulls the experience out of a screen and into the world around me. I am able to litter my apartment with windows and views into the digital world, a world I hold dear. I love computers. I always have, I always will. I just wanted to write that before I spend the next 2,500 words admiring and criticizing a device I truly adore more than almost any piece of technology I have bought in the past decade.

I wouldn’t go so far as to call myself a VR expert, but I have been an enthusiast for over a decade at this point. I used the first Oculus DK1 the month it released, tried the Crystal Cove prototype, and purchased a CV1 at launch, followed by the Touch controllers, the Oculus Go, the Quest 1, and the Quest 3. This is not to claim that I am the only one able to talk about the Vision Pro critically, or that I am the best person to analyze it, but rather to explain the previous experiences that set my expectations. In many ways the Vision Pro is a huge leap above those devices, and in a few surprising ways barely an evolution at all. The screens are the story here. I believe the Apple Vision Pro is the “retina moment” for XR devices. Unless I specifically try my best to look, I can’t see any pixels at all. That only just puts it under the “retina” banner, and I believe Apple has deliberately avoided using the term themselves, but the moment I put the Vision Pro on echoes the moment I saw my first retina Mac. While the device is not as sharp as my iPad or my Studio Display, you’d have to closely analyze text to notice much of a difference. Keep in mind, too, that while each eye has its own 4K screen, a movie window only ever covers part of your field of view, so the film never gets all of those pixels; in practice your movies render at sub-4K resolutions. That’s still better than almost any laptop or tablet you can find on the market. It’s not technically better than a 4K OLED, though, and I found myself not noticing much of a difference between my TV and the Vision Pro due to the “distance” of the virtual screen from my face, ultimately preferring my TV if only because I don’t have to wear a device on my face to use it.
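To put rough numbers on that “retina” feeling, here is a back-of-the-envelope pixels-per-degree calculation. The per-eye resolution and field of view below are my own assumptions, based on commonly cited estimates rather than anything Apple publishes, so treat the output as ballpark only; perceived sharpness also depends on the content and on where you’re looking.

    import Foundation

    // Back-of-the-envelope pixels-per-degree (PPD) estimate.
    // NOTE: both inputs are assumptions (commonly cited estimates), not official specs.
    let horizontalPixelsPerEye = 3_660.0  // assumed horizontal panel resolution per eye
    let horizontalFOVDegrees = 100.0      // assumed horizontal field of view

    let ppd = horizontalPixelsPerEye / horizontalFOVDegrees
    let retinaPPD = 60.0                  // ~1 arcminute per pixel, the usual "retina" rule of thumb

    print(String(format: "~ %.0f pixels per degree", ppd))
    print(String(format: "~ %.0f%% of the 60 PPD rule of thumb", ppd / retinaPPD * 100))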

What has underwhelmed me the most so far has been the passthrough. The Verge’s Nilay Patel called camera-based passthrough a “dead-end technology”, and after a week using this device I am inclined to agree with him. My first thought putting the Vision Pro on my head, after experiencing the passthrough of the Quest 3, was “is this it?”. It is very good, a bit better than the Quest’s, but it is not the step change I was expecting. Rather, it clears a usability bar the Quest 3 couldn’t, without surpassing the Quest the way the rest of its features do. It’s the first passthrough I feel fine using for extended periods of time. But while it looks better, it has not solved the problems of camera-based passthrough in the slightest. The increased resolution of the Vision Pro’s displays compared to the Quest 3 only highlights how inadequate the cameras on either headset are compared to the human eye. The biggest surprise for me is that the world is still blurry. You can read text off a monitor, but that text is a lot blurrier than text rendered in visionOS. You’re barely able to read the notifications on your watch, and there’s a noticeable dip in quality when watching my TV through the device. Make no mistake: if you are using this headset, you want to avoid passthrough for anything other than seeing where you are and talking to people outside the headset. Because of this, I believe camera-based passthrough is a huge blocker for mass adoption; it is the main reason I take off the headset. It is a larger tradeoff than I expected for the freedom you gain with the “infinite canvas”, as Apple puts it. This is one of the reasons I think the device works better as a laptop or desktop replacement. In my home office and at work I am essentially staring at walls anyway, and the benefits of that infinite canvas are easier to see. Without that context, all I see is my orange cat looking washed out and blurrier than in real life.

There’s no getting around the price though. The Vision Pro starts at $3,500. That’s for the base model with 256GB of storage, which, to be completely clear, you should not buy if you intend to download movies. This puts off anyone besides early adopters (me) and software developers (also me) from buying this device. It’s hard to judge the headset without talking about the price. If you disconnect the cost from it, it’s one of the most fantastic products Apple has ever built, both in terms of hardware and software engineering. It is the best way I have ever watched movies, besides my Sony A85J / Sonos surround setup in my living room or my local IMAX theater. It is the best portable computing interface I have used, allowing me to multitask beyond my wildest dreams for a device I can fit in my backpack. In essence the Apple Vision Pro is a screen with a computer built in, a portable workstation you can use on a plane, a giant flatscreen you can fit in a bag, and so on. However, you can’t talk about it that way because we live in a capitalist society, and while I am fortunate enough to be able to afford this product, most people aren’t yet. To me, it is worth the cost, but I would put it incredibly low on the hierarchy of technological needs for almost anyone deciding what computer to buy.

Talking about what this device can do often seems a little boring, but Apple’s ambitions are super clear when you first put it on your head. The “Hello” cursive is the first thing you see when visionOS boots to the setup screen, along with the familiar Mac chime. The tutorials in the OS that teach you how to interact with it state that touching your fingers together is “like a click on your Mac”. Apple is trying to pitch this as the laptop / desktop of the future, and minus a few changes to the hardware itself to make it lighter and cheaper, I believe they are on the right path. However, functionally, it is an iPad.

People online have made this comparison / complaint about the device: that from a functionality standpoint it isn’t capable of performing any tasks that my iPhone or my iPad can’t. I’ll break down that argument a bit. I’m not going to argue with it directly as a universal claim, because it’s easy enough to say “well, Super Fruit Ninja lets me cut the fruit with my arms instead of my fingers” and call it a day, but I think that circumvents the issue rather than engaging with it. By the same logic, my iPad can’t do anything my phone can’t either, besides use the Apple Pencil and keep multiple apps open at the same time. The Vision Pro may be an iPad on your face, but I distinctly remember when I called the iPad a “big iPhone”. I was right, and ultimately the modern product line resembles the Surface more than the original iPad, but at the time the iPad’s multitasking and writing functionality didn’t exist yet. Now that it does, the increased screen real estate, pen support, and window management give the iPad value over my phone.

The Vision Pro, in its least charitable reading as simply an iPad on your face, makes sense in that same context. My iPad, while able to multitask well enough for smaller tasks such as writing this blog post, has many limitations. Regular iPads can only use two to three apps at the same time with iPadOS’s Split View, and M-series iPads can only use four apps at once with Stage Manager (even when plugged into a Studio Display). If the Vision Pro has a limit on how many apps you can have “open” at once, I haven’t found it yet. I’m sure this is a clever sleight of hand from the operating system, which from my experience is far closer to iPadOS than macOS, and seems to suspend apps you aren’t actively using in the background. However, I have had more than four apps operational at the same time in my field of view, which puts the system’s capabilities above what my iPad can do, and my iPad has the exact same system on a chip as the Vision Pro. Since the early 2010s, the goal of companies like Microsoft, Apple, and Google has been to build a product ecosystem where software transcends the form factors of devices, and where devices are designed for specific parts of your life rather than for the software they are capable of running. The app I am writing this post in is on my MacBook, on my iPad, and on my iPhone, synchronizing between the three with all functionality intact. I could write this post from my Mac if I wished, but I prefer the portability and focus of my iPad. The idea that primarily using apps that also exist on other devices disqualifies the Vision Pro as a product category is an outdated and silly one. That’s just not how computers work anymore. I’m sure most of your time on a desktop computer is spent in a web browser and in Electron apps that are on your phone as well, but the user interface and input mechanisms of a desktop or laptop are better for certain circumstances (extended periods of productivity).
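If you want to see that suspend-in-the-background behavior from the app’s side, SwiftUI surfaces it through the scene phase on visionOS just as it does on iPadOS. A minimal sketch (the app and the logging are mine, purely illustrative):

    import SwiftUI

    @main
    struct ScenePhaseDemoApp: App {
        // scenePhase reports whether this scene is active, inactive, or in the
        // background -- the hook an app gets before the system quietly parks it.
        @Environment(\.scenePhase) private var scenePhase

        var body: some Scene {
            WindowGroup {
                Text("Hello, spatial world")
                    .padding()
            }
            .onChange(of: scenePhase) { _, newPhase in
                switch newPhase {
                case .active:     print("scene active")
                case .inactive:   print("scene inactive")
                case .background: print("scene backgrounded; expect suspension soon")
                @unknown default: break
                }
            }
        }
    }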

This is not to say there aren’t complaints to be made about the device and its position within the Apple ecosystem. Apple has been trying to build what they believe is the successor to the Mac for years. Steve Jobs pitched the iPad as the device that would kill netbooks (he was right), and that product ultimately morphed into a Surface competitor with keyboard and mouse support. The iPad and the Mac have been on a collision course for years, with the limitations of the former and the relative complexity of the latter making it hard to recommend one over the other. On the one hand, a person might want a lower-priced, simpler device to take handwritten notes on in school as their main computer, in which case an iPad is perfect. On the other hand, they might be majoring in a field that requires software the iPad can’t run, or, in the case of computer science, will never be allowed to run. This means entire professions and career paths are locked out of ever using the iPad as their primary computer without the assistance of either a cloud machine or a laptop. One of those professions is software engineering. Apple does not allow unsigned code to run on the iPad, which includes any code a person writes themselves.

The Vision Pro has the same restrictions. This means that I will never be able to do my job on the headset alone. For the past week I have been using a combination of the Mac Virtual Display feature to mirror my laptop and the Windows 365 app in Safari to reach my cloud PC, which together enable me to do my actual job. This makes the Vision Pro the most expensive thin client in the world. I’m not particularly upset by this; I knew this was the deal going in and accepted it as a limitation of the device. My plan was always to use my Mac to handle VS Code and my headset to handle browsing the web, email, chats, meetings, and so on. However, I now believe a significant part of the device’s value proposition is missing from its functionality. Spending time in the device and experiencing the interface for the better part of a week made me understand just how much Apple needs this to replace the Mac in order for it to succeed in the short term, but even as their primary target for the device I am unable to leave my laptop in my backpack. As of now there is no way to access a native shell in visionOS. There is no way I can download VS Code or any equivalent, and even if there were, there is no way for me to build my projects locally without connecting to a computer running macOS, Windows, or Linux. Even if every professional program in the world gets ported to iPadOS / visionOS, my profession will never be able to complete its work on this platform, even though people in my exact profession are the ones who built the Vision Pro in the first place.
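To make “no native shell” concrete: Foundation’s Process, the API a Mac app would use to spawn a compiler or a shell, simply isn’t available on visionOS (or iPadOS), so the usual escape hatch is compiled out before App Store policy even enters the picture. A sketch of what that looks like in practice, with the function name being my own:

    import Foundation

    // Process (NSTask) only exists on macOS, so anything that shells out to a
    // toolchain has to hide behind a platform check.
    func buildProject() {
        #if os(macOS)
        let task = Process()
        task.executableURL = URL(fileURLWithPath: "/usr/bin/swift")
        task.arguments = ["build"]
        do {
            try task.run()
            task.waitUntilExit()
        } catch {
            print("Failed to launch swift build: \(error)")
        }
        #else
        // visionOS / iPadOS: no Process, no shell, no local compiler.
        // The work has to be handed off to another machine (Mac Virtual Display,
        // a cloud PC, a remote build server, ...).
        print("Local builds aren't possible on this platform")
        #endif
    }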

When looking at the competition, Apple’s interface is maybe a decade ahead of the Meta Quest OS in terms of functionality and capability. It has a native window UI framework written in Swift, where the Quest mostly falls back to wrapping web apps. It has an actual windowing system, where the Quest only supports three windows of different programs side by side. It allows free window resizing, where the Quest only has two options (in front of your face and movie theater), and more; there’s a short sketch of what that looks like from the developer’s side after this paragraph. It’s so far ahead of Meta right now that if I were Mark Zuckerberg I might be genuinely frightened that Apple could win this space. He’s not frightened though, because he understands what an extraordinary advantage being the more open platform is in the race to replace the desktop computer. Once Meta’s operating system catches up to visionOS’s functionality in however many years that takes, the Vision Pro’s hardware superiority will fundamentally cease to matter, unless the target use case of spatial computing shifts away from productivity and work. If the bet that these devices will replace laptops and desktops is correct, the one with fewer limitations will always win. It doesn’t matter if the UX is worse, and it doesn’t matter if the screens are worse; just look at the current laptop / desktop market for professional use and try to prove me wrong. If Apple’s ambition is to sell these to offices and companies as laptop replacements, it can’t do that while remaining a closed operating system with limited functionality. Meta will open the doors to the Quest more than they already have and sell a ton of them, because by then the Quest will be capable of running .NET and visionOS won’t, so guess which one I’d have to buy to do my job.
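Here’s that sketch. On visionOS a second window is just another SwiftUI scene that the user can place and freely resize anywhere in the room; the IDs and views below are illustrative, not from any real app:

    import SwiftUI

    @main
    struct MultiWindowApp: App {
        var body: some Scene {
            // The main window.
            WindowGroup(id: "main") {
                MainView()
            }

            // A second, independently placeable and resizable window.
            WindowGroup(id: "notes") {
                Text("Notes")
                    .padding()
            }
            .defaultSize(width: 400, height: 300)
        }
    }

    struct MainView: View {
        // openWindow asks the system to open another scene by its identifier.
        @Environment(\.openWindow) private var openWindow

        var body: some View {
            Button("Open notes window") {
                openWindow(id: "notes")
            }
            .padding()
        }
    }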

This device is amazing. It has already replaced so much of what I use my iPad for (except writing this post, as Ulysses hasn’t enabled visionOS support yet) and is a genuine glimpse into the future of computing. For that alone, it was worth the purchase for me. It has brought the future so close to my everyday life that I will appreciate and cherish it as long as I can. However, it is only one possible future. I would love to see that future come to pass in a world in which Apple is less restrictive about the use cases of their products, but I have no fear that a software ecosystem similar to the iPad’s will ever replace the Mac, because it functionally can’t. Maybe it’ll replace the iPad, but I’d hope Apple has more ambition for this product than simply being an iPad killer. The amount of time, energy, and effort they put into visionOS makes me think the ambition is there, at least. I desperately want this computer to be the Mac of the future, but the limitations of the operating system itself make me believe that Apple may be doomed to repeat the path of the iPad. I vastly prefer visionOS to the current Quest OS, but in order to win the future both Apple and Meta are chasing, Apple’s approach to control of their operating system needs to change a bit. We know Meta is hungry to succeed here, and if this product category is the real deal, they are willing to do whatever it takes to own it. Apple just needs to give up a tiny bit of control in order to compete effectively with them over the next decade, but the jury is still out on whether that’ll happen. I hope they do, because if I had to pick right now, this is the OS I’d rather write code on.