All the major publications which were given review units of the new iPhones ahead of the launch published their reviews last week. I got my review unit last Friday (the same day many Apple customers got theirs), but now that I’ve been able to spend a few days with it, I wanted to share a few thoughts. I’m going to try very hard not to rehash everything everyone else has said, and to add a bit more insight in areas I haven’t seen others write about yet. I’m also going to spend a bit more time on the cameras than most of the other reviews – I’m not the world’s greatest photographer, but I do enjoy taking pictures, and my phone is by far the camera I use the most to take pictures of my kids, so it has to be good. As such, I’ve spent a good chunk of time over the last few days taking pictures and videos of various things to test the camera specifically, and I’ll share some examples below.
The new devices are virtually unchanged on the outside from the previous versions, and that’s certainly the first impression they give too. They are a hair thicker and a tiny bit heavier than their predecessors, but if I hadn’t known that I might well not have noticed. Along with the iPhone 6S Plus I’m testing, Apple also sent one of its new leather cases (I have the saddle brown one) and I’ve been using that for the last few days, which has made physical comparisons between this phone and my old iPhone 6 Plus less meaningful.
As with previous iPhones, the hardware feels very solid, well balanced, and high quality. Nothing’s changed there. The aluminum and glass are both supposed to be more durable than last year’s, but I can’t think of a way to test either claim that doesn’t involve trying to break the phone, so I haven’t.
The new vibration engine (apparently now one and the same as the Taptic Engine) is very nice, too – I don’t use vibrating alerts much anymore since I started wearing the Apple Watch, but on the odd occasions when I still get them (mostly for phone calls), they’re more substantial than they used to be. It’s hard to know for sure, but I feel like the phone speaker has gotten better too – calls sound clearer and louder than before.
3D Touch is arguably the headline feature on the new iPhones, and from the moment I got to use it in person at Apple’s September event I’ve said I thought it was going to be important.
Just tried 3D Touch on the new iPhones. This is going to be huge. Both on home screen and in apps including third party apps
— Jan Dawson (@jandawson) September 9, 2015
Having now used it for more than just a few minutes in a tightly-controlled demo environment, I have a few additional thoughts:
- This is a big deal, but for now it’s mostly used by Apple’s own apps and just a handful of third-party apps. That has two implications. One, if you tend to use third-party replacements for key things like Mail, Calendar, and so on, you’ll find 3D Touch a lot less useful, at least for now. Two, that may mean you migrate back to Apple’s own apps in some cases, to make use of this feature. I’m curious to see how quickly most third-party app-makers add support – if I were them, I’d do it quickly, especially the Quick Actions functionality. I suspect this will be like the Apple Watch, in that apps that fail to support it will find users replacing them with ones that do.
- Especially on the 6S Plus, which is the one I’m testing, 3D Touch makes apps on the top half of your home screen less useful than those on the bottom. Yes, you can use the Reachability feature to bring those higher-up apps within easier reach of your thumb, but that adds friction in the use of a feature that’s all about reducing friction. I haven’t done this yet, but I can see myself rearranging the icons on my main home screen based on which I’m likely to use Quick Actions with.
- Speaking of which, I’ve always kept the Camera app in the top right of my first home screen, because I do want access to it when the phone is unlocked, but I most often trigger it when the phone is locked, and therefore use the camera button on the lower right of the lock screen. However, the introduction of 3D Touch and the much-faster Touch ID sensor (on which more below) means I rarely see the lock screen anymore, and even when I do, the lock-screen camera button is less flexible than the Camera app icon on the home screen. I wish I could use 3D Touch in some way on the lock screen – that’s something Austin Mann predicted Apple would do, but it didn’t. I suspect that’s because the lock screen is becoming less relevant, but it also means I likely need to put my Camera app icon somewhere closer to the bottom of the screen, and maybe even on my home row.
- For now, I’ve been using 3D Touch more in apps than on the home screen, and that’s partly because I tend to use third-party apps more than Apple’s own. I’ve used it most in Instagram and the Photos app, where I’m using it both to view Live Photos and to quickly review recently-taken pictures when I’m still in camera mode. The latter is a really great addition, and I think third-party developers will likely come up with lots of cool ideas for using this feature.
- One thing I think developers should be thinking about is making Quick Actions user-customizable. Instagram, for example, chose to make access to the Direct inbox, Search, and View Activity the three additional Quick Actions beyond the obvious New Post option. If I had my way, I’d probably choose other aspects of the app to get quick access to, and I’m betting I’m not alone in that. Launch Center Pro does a great job of this as a key feature, and I think it’s brilliant (h/t @rjonesy).
- Related to this, the order in which functions appear in Quick Actions is interesting too – I think we’re accustomed to reading menu-type lists (along with everything else we read) from top to bottom, but depending on where an app sits on your home screen, the menu items may appear in what seems to be reverse order (I think the rule of thumb is that the thing you’re most likely to want to use is closest to the app icon itself, for easy thumb access, but that may mean it’s at the bottom of the list). That means something of a learning curve for users, but is probably also something developers should think about in designing the order of items (and any icons they use alongside the labels).
- Lastly, I’ve noticed some of the negative side effects of the introduction of 3D Touch. One of the things that’s happened to me several times is tapping on web links without any result. I think what’s happening is that I’m tapping just hard enough to trigger 3D Touch, but not holding it at all, which leaves me in a sort of limbo where I get neither the desired result nor any visual signal that I’ve accidentally activated 3D Touch. John Gruber has talked about the problem of trying to delete apps, a function many of us are conditioned to trigger by pressing down fairly hard on the screen until the icons start wobbling. I’ve had this problem too, and there are several other places where I’ve previously pressed fairly hard for the “long press” but now have to get used to pressing only gently. No doubt the mental and physical adjustments involved will come in time.
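For developers weighing how quickly to add Quick Actions support, the static variety is just a few lines of configuration. As a rough sketch – the bundle-style type strings and titles here are hypothetical examples, though the `UIApplicationShortcutItems` keys and icon-type constants come from Apple’s iOS 9 SDK – an app could declare home-screen shortcuts in its Info.plist like this:

```xml
<!-- Hypothetical static Quick Actions for an app's Info.plist.
     The type strings and titles are invented for illustration;
     the keys and UIApplicationShortcutIconType constants are Apple's. -->
<key>UIApplicationShortcutItems</key>
<array>
    <dict>
        <key>UIApplicationShortcutItemType</key>
        <string>com.example.app.newpost</string>
        <key>UIApplicationShortcutItemTitle</key>
        <string>New Post</string>
        <key>UIApplicationShortcutItemIconType</key>
        <string>UIApplicationShortcutIconTypeCompose</string>
    </dict>
    <dict>
        <key>UIApplicationShortcutItemType</key>
        <string>com.example.app.search</string>
        <key>UIApplicationShortcutItemTitle</key>
        <string>Search</string>
        <key>UIApplicationShortcutItemIconType</key>
        <string>UIApplicationShortcutIconTypeSearch</string>
    </dict>
</array>
```

Apps can also register shortcuts at runtime via UIApplication’s `shortcutItems` property, which is presumably the route an app would take to offer the kind of user-customizable Quick Actions I’d like to see.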
I won’t spend lots of time on this, as it’s been well-covered elsewhere, but the Touch ID sensor is dramatically faster now, and frequently completes authentication before the lock screen even pops up fully. That’s wonderful for quick access to functionality, but as others have pointed out (and as I alluded to in the context of the camera above) it does mean the lock screen becomes a lot less useful, unless you trigger the home button with a finger not registered for Touch ID, or use the side button to turn on the phone. Training the Touch ID sensor is also much quicker now – not something you have to do a lot, but I added several fingers to the new phone more or less immediately, whereas it took me a long time to get around to doing so on the iPhone 6 Plus because the process was so much slower.
My initial reaction to the Live Photos demo was that this reminded me of something out of Harry Potter (it may have helped that my daughter has recently been reading the books and seeing the films for the first time).
Apple brings Harry Potter technology to the iPhone – Live Photos — Jan Dawson (@jandawson) September 9, 2015
Again, I got a brief demo of the feature at the September event, but it’s very different to be working with your own limited skills as a photographer rather than with carefully chosen images pre-installed on a demo phone. I’m glad I read some of the reviews last week ahead of trying to use it, because it meant I was immediately aware that I needed to change my past behavior slightly and hold the phone steady both before and after taking the still image (though the software on the phone now knows to cut off the video if the phone is lowered prematurely). That probably shortened the learning curve somewhat, but it’s still an interesting process to figure out how best to use Live Photos. Apple’s demo photos were an interesting mix of moving objects and people, and I’ve definitely had more luck with the latter than the former in terms of getting compelling Live Photos out of the process.
Below are some examples of Live Photos from the last few days – they’re a mix of objects and people/animals, and you’ll see how variable the results can be. For what it’s worth, sharing these anywhere other than in iOS is still difficult – I connected my phone to my Mac and used QuickTime to record the screen of my iPhone as I used 3D Touch to interact with them, which resulted in this series of 7-second videos you see below. In each case, the video starts with the still image, then shifts to the video, and returns to the still (you may hear background noise from my home office on the audio on some of them – no idea why QuickTime records microphone noise when capturing the iPhone screen).
Overall, I’m really enjoying Live Photos, and there are some interesting things to note:
- Even when in Live Photo mode, you can capture multiple pictures in quick succession – at first, I was waiting for the yellow Live indicator to disappear before taking the next picture, but I found that it works just fine even when the pictures are taken close together. The video still attaches itself to each picture in the same way, which means you get an interesting effect when scrolling through pictures quickly and playing the Live Photo – you’ll hear almost the same background noise on each, with the beginning and end shifting a fraction of a second each time. This is very clever stuff on Apple’s part.
- I kind of wish Apple had made these Live Photos auto-play as you scroll through your camera roll (or gave users the option of selecting this) – it would make your camera roll come alive in a completely different way, whereas for now your camera roll looks entirely static until you decide to engage with an individual picture. Maybe it’s the Harry Potter thing again, but I like the idea of these pictures looking alive from the get-go, rather than having to be prodded into action. I’m sure the team at Apple responsible for the feature spent at least some time discussing this decision, and ultimately came down on the side of having them be still by default – perhaps because scrolling through moving photos was too distracting visually, perhaps because of the impact on battery life, or for some other reason. Perhaps it’ll change in time or become a user option.
- Related to this, the blurry transition between the still and the video isn’t my favorite element here. I can see why the engineers thought it needed a clear visual transition from one mode to the other, but when you’re reviewing a bunch of pictures it’s an unnecessary visual obstacle and delay that adds little once you’re used to how the feature works. At the very least, it feels like it should be quicker.
The cameras have always been one of my favorite features of the iPhone, and I continue to find the cameras on the iPhone better than any other smartphone camera out there, at least for general use. The new cameras offer improvements over last year’s, which were already very good (see my review from last year here and this Flickr set for lots of pictures from last year’s phones).
My wife and I went to pick up my kids from her parents’ farm on Saturday, and I had a chance to take some pictures and video while we were there. We then went on a drive up the canyon near our home on Sunday afternoon, and I went on a brief hike with my son this morning too, so I’ve taken pictures and video in a few different settings over the last few days. Overall, I’ve been very impressed by the camera, both for photos and videos.
Below is a panorama I took this morning – it won’t look all that impressive below, because I’ve reduced it to fit here, but if you click on it, it’ll open the full-size image in a new tab or window.
The full image is over 13,000 by 3,600 pixels, and I think it looks fantastic (not because of my composition, but because of how well the various levels of light and shade have come out while retaining so much detail). This is one of the huge strengths of the new cameras – the combination of high resolution and retention of detail, which will allow for much more usable cropping of pictures.
The two images below are a virtually complete crop and a partial crop of the same picture, both of which I’ve edited using Snapseed. I’m including them because of the detail that remains in the cropped version.
I’ve amped up the color in these pics a little, but I’ve included some other unedited shots below so you can get a sense of how these come out of the camera. In both cases, you can click on the picture and it’ll show full-size. For more photos, mostly unedited, see this Flickr set.
As for video, the iPhone continues to have a great slo-mo camera, but of course it now also has 4K video. I haven’t spent a ton of time using this, but one of the most striking things with video on the iPhone 6S Plus is how good the image stabilization is getting. I’m including below a few YouTube embeds which show off this capability – apologies for the slightly dizzying cinematography on some of the videos, but I was trying to test the camera’s ability to adjust to changes in lighting.
4K video – pan across mountain landscape:
This one was shot by one of my kids out the car window while driving on a bumpy, winding road through the canyon – it’s 1080p only:
This is another 1080 rather than 4K video, but it shows off the image stabilization quite well, as well as the quality of the video capture:
Other than the specific features I’ve reviewed, the one overarching theme with the new iPhone is speed. Touch ID is faster, as I’ve already mentioned, but everything else is noticeably faster too, as a result of the new chip, more RAM, and a variety of other improvements. There’s almost no lag now for a number of tasks which used to take time. And the overwhelming impression you’re left with is that you and your clumsy fingers are now the biggest source of latency for a lot of what you’re doing. I find myself more drawn to Siri and to voice dictation for text entry than before, simply because it now feels like I’m slowing everything down when I type things in.
All part of the pattern
In conclusion, the iPhone 6S range feels like a continuation of the pattern for Apple. In a piece I wrote on Techpinions a while back, I talked about the fact that Apple often builds new features and functionality incrementally over time, and it’s often not clear where a particular feature is heading until several years after its original launch. The iPhone 6Ss feature both examples of earlier features maturing (e.g. Force Touch on the Watch growing into 3D Touch on the phone, the Touch ID sensor getting enormously faster) and, likely, the beginnings of new things that aren’t yet apparent. 3D Touch in particular feels like it’s just getting started, and could spread both to other parts of Apple’s product line and to other parts of iOS (count how many of Apple’s own apps don’t yet support it, for starters). But I’m sure there’s far more here, too, though it will probably only become clear as Apple launches future devices.