Over the holiday weekend, my wife and I took some family photos. There was a particular photo of us that she really, really liked - but when the photo was taken, she blinked and her eyes were caught in a slightly closed state. Of course.
The photo was taken with an iPhone 6S and used the new “Live Photo” feature. I’ve heard it described as the “Harry Potter Effect”, and I have to admit, when I first saw it I thought it was pretty magical.
Here’s how it works: the iPhone camera captures 1.5 seconds of video before and after the photo is taken. When you touch your finger to the photo, it plays that short clip, making the picture “move” at 12 frames per second.
Viewing the Live Photo of my wife with her eyes closed, I could see that there were a handful of frames in the moving picture when her eyes were definitely open. This means that there must be a way to retrieve that moment, right? Nope. There is no way within the iOS interface to jump back to a particular frame in the photo and save it out. The Live Photo feature is strictly ornamental, and a missed opportunity to tangibly improve the product’s experience.
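The frustrating part is that the data is all there. A Live Photo stores a short movie alongside the still image, and Apple’s own developer frameworks can read it. Here’s a rough sketch of how an app could pull a single frame out of a Live Photo using the Photos and AVFoundation frameworks; the function name, the choice of a temporary file, and the minimal error handling are my own assumptions, not anything Apple ships in the Photos UI:

```swift
import Photos
import AVFoundation
import UIKit

// Sketch: given a PHAsset that is a Live Photo, copy out its paired video
// resource and grab the frame nearest the requested time.
func extractFrame(from asset: PHAsset, at time: CMTime,
                  completion: @escaping (UIImage?) -> Void) {
    // The Live Photo's movie is stored as a "paired video" resource.
    guard let videoResource = PHAssetResource.assetResources(for: asset)
        .first(where: { $0.type == .pairedVideo }) else {
        completion(nil)
        return
    }

    // Write the paired video to a temporary file so AVFoundation can read it.
    let tmpURL = FileManager.default.temporaryDirectory
        .appendingPathComponent("livephoto.mov")
    try? FileManager.default.removeItem(at: tmpURL)

    PHAssetResourceManager.default().writeData(for: videoResource,
                                               toFile: tmpURL,
                                               options: nil) { error in
        guard error == nil else { completion(nil); return }

        // Ask AVAssetImageGenerator for the exact frame at `time`.
        let generator = AVAssetImageGenerator(asset: AVAsset(url: tmpURL))
        generator.requestedTimeToleranceBefore = .zero
        generator.requestedTimeToleranceAfter = .zero
        if let cgImage = try? generator.copyCGImage(at: time, actualTime: nil) {
            completion(UIImage(cgImage: cgImage))
        } else {
            completion(nil)
        }
    }
}
```

If a third-party developer can do this in a few dozen lines, there’s no technical reason the Photos app couldn’t offer a “save this frame” button.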
In its marketing campaign, Apple rationalizes the feature as a way to “relive a moment any time your phone leaves your pocket.” With only three seconds at 12 frames per second, you’re not reliving anything. A user who wants to relive moments will simply choose to take a full-length video.
I’m still a huge fan of Apple’s design, but in the past their approach was always driven by the user experience. Don’t get me wrong, the Live Photo feature does add a “cool”, eye-popping effect. However, if all products were designed using that form of rationalization, then all we’d have in our pockets would be a bunch of shiny, useless toys.