A few weeks ago, I was flying cross-country with a big pile of digital photos I wanted to edit and sort with my preferred tool for the job, Adobe Lightroom. But the seat in front of me was so close I could barely open my laptop lid. Disappointed, I ended up solving some crossword puzzles and watching a lame movie.
But what if I’d had a virtual or augmented reality headset — the Apple Vision Pro, for example — that projected my photos onto a big screen only I could see?
This week, I got to try out that exact technology. I used Adobe’s new Lightroom app for the Apple Vision Pro. I operated the whole thing just by looking at what I wanted and tapping my fingertips together.
And it works. In this exclusive first look at the app, I can say it took only a few minutes to figure out how to use the headset for standard Lightroom actions like tweaking a photo’s exposure, applying some editing presets or gradually dehazing the sky.
Color me impressed. My experience helped convince me not only that Apple has done a good job figuring out an effective interface for what it calls “spatial computing,” but also that developers should have a reasonably easy time bringing at least their iPad apps to the system.
And that bodes well for the long-term prospects of Apple’s headset. The more productivity and fun the Apple Vision Pro and its successors offer, the better the chance they’ll appeal to a sizable population, not just some narrow niche like Beat Saber fans.
For me, the most compelling possibility with the Apple Vision Pro is using the virtual and augmented reality headset to have a private workspace in a public area. Lightroom fits right into that idea. I’m not ashamed or embarrassed by my photos, but I don’t exactly enjoy sharing them with everybody on a plane flight.
Lightroom support isn’t enough to get me to buy an Apple Vision Pro — starting price: $3,499 — but if I bought one eventually, I’d definitely use Lightroom on it.