One Month with Apple Vision Pro
Thoughts on using Apple Vision Pro for one month.
On Feb 2, 2024, at 9:30 AM, I went to my local Apple Store to pick up my Apple Vision Pro. The staff did a demo for me using one of their units, then sent me home with my own. I was excited to dive in but had to wait for UPS to drop off my prescription lenses, which didn’t arrive until around 6:30 PM.
That first night I didn’t have much time to spend with it. I got it set up, tried a few apps, and downloaded Project Graveyard from the App Store so I could finally try it on a real device. (It has some issues that I’m working on, but generally it does what I want it to do.) I watched an episode of For All Mankind and called it a night.
I spent a little time using it the rest of that weekend, but I was also busy getting ready for a conference. I then had to spend four agonizing days away from it while talking about it to anyone who would listen, and at Claris Engage listeners weren’t hard to find. The weekend after I got home, I started to dive into this new platform more. I downloaded lots of apps from the App Store and started keeping track of them in the VR Library section of my database. I took some time off around then, so I didn’t do much with it but play and explore.
By mid-February I was back to work and ready to dive into Xcode. For me, the killer feature of Apple Vision Pro has been the “closed loop” development process: I can connect to my Mac from within the headset, fire up Xcode, and build and deploy the app without ever taking it off. It isn’t perfect, but it’s awesome nonetheless. A build takes 5-15 seconds to launch on the headset, compared to about one second in the Simulator on the Mac, and the app’s first window opens wherever I happen to be looking when it’s ready. I often work at a standing desk, and I find myself doing the same little dance with my Volume-based app: build and run, take three steps back, turn 45 degrees, and wiggle and dance around until it shows up. It’s become sort of a ritual for me.
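For anyone curious what a Volume-based app looks like in code, here’s a minimal sketch of a volumetric SwiftUI scene. The app and view names are illustrative, not taken from Project Graveyard:

```swift
import SwiftUI

// Hypothetical minimal visionOS app with a volumetric window.
// Names here are placeholders, not from Project Graveyard.
@main
struct VolumeSketchApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        // A volumetric window is the kind of scene that appears in the
        // room after "build and run" from Xcode, wherever you're looking.
        .windowStyle(.volumetric)
        .defaultSize(width: 0.6, height: 0.6, depth: 0.6, in: .meters)
    }
}

struct ContentView: View {
    var body: some View {
        // Placeholder content; a real app would load a RealityKit scene.
        Text("Hello, volume")
    }
}
```

The `.volumetric` window style is what makes the scene a bounded 3D volume rather than a flat window, which is why the app shows up somewhere in the room instead of in front of you.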
For the rest of the month, the main thing I did with my device was visionOS development. I worked on version 1.1 for Project Graveyard and I’ve been building labs in Canvatorium to aid that process. I also play-tested a couple of WebXR sites for some developers who don’t have a device.
It has only been within the last week or so that I’ve started to use Apple Vision Pro outside of visionOS development. A few use cases so far:
- Using the virtual display to work on my Mac, but with a much larger display than the one in my office. Doing the same sort of things I normally do at work: FileMaker Pro, WordPress, PHP, JS, Vue, and Babylon JS.
- Writing. I write a lot. Notes, sales proposals, documentation, emails, messages, and articles. I’ve always liked getting out of my office for this type of work. Opening a text editor or web browser and entering an environment has replaced many trips to the coffee shop.
- Working in new postures. I have some RSI issues with my right hand and neck and spending too much time at a desk or table is a huge cause of pain. I can work for a while with a laptop in a comfortable chair or sofa, but the smaller screen and fixed position are often a hindrance. With this headset, I can treat the laptop as a “keyboard and trackpad deck” while I move the screen anywhere I want. Thanks to this setup I’m always moving around my home, working in a variety of postures.
- Reading has been great. I’ve used Apple Books mostly. My favorite way to read in Apple Vision Pro is to turn on Dwell Control in the accessibility settings and lie down. I can position the reader above me and turn pages with my eyes. This gives my hands and neck a rest while my mind explores another world. It’s even better with an environment and a bit of music.
That’s not to say that I spend all day in the headset. On average I think I’m spending 4-5 hours a day on the Mac by itself, with an additional 1-2 hours in Apple Vision Pro with the virtual display. Then maybe another 30-45 minutes a day exploring apps and the capabilities of the device.
I’m happy with what I’ve gotten out of it so far. I’m also excited to keep building for this new platform. I’m going to keep working on some native apps and also dive back into WebXR development for some truly cross-platform projects.