Android manufacturers could learn a thing or two from the iPhone 11 when it comes to video

iPhone 11 Pro (Image credit: iMore)

No matter your feelings toward iOS as a platform, you can't deny that the iPhone takes phenomenal video. Stabilization, color, dynamic range … all of it has been consistently great on iPhones for years, often outpacing even Android phones that take subjectively better photos. With today's announcement of the iPhone 11, Apple just kicked things up another notch.

In case you missed the live stream, here's the short of it: with the iPhone 11, Apple is finally adding an ultra wide-angle lens (120° field of view, to be exact) to its phones. Yes, plenty of Android devices in the last few years have had wide-angle lenses, but it's never bad news when good features reach more users. The iPhone 11 Pro and 11 Pro Max (horrendous naming scheme, I know) even introduce Apple's first triple camera array.

The iPhone 11's ability to seamlessly switch between lenses while filming is a potential game changer, if you ask me.

With those cameras, you'll get the same video features that have become table stakes on most flagship phones these days: 4K video at 60fps, slow motion capture at up to 240fps in 1080p, time lapses, and great video stabilization. But new to the iPhone 11 is what Apple's calling extended dynamic range, which works on any of the three rear lenses at 4K60 as well as on the front camera at 4K30.

This should do an even better job than before at keeping the highlights from blowing out and the shadows from underexposing, even in challenging lighting situations.

I'm always a fan of more dynamic range while shooting video, so that's a great new addition, but what I'm even more excited about (and preemptively jealous of) is the iPhone 11's ability to seamlessly switch between lenses while shooting.

Hot-swapping lenses isn't entirely new to mobile videography, but Apple says that it keeps each lens ready in the background so that when you switch, everything from color to exposure should be nearly identical, to the point you might not even notice the change.

If you've ever tried switching lenses while shooting video on your Android phone, you'll know that this is a huge deal; far too often, the most noticeable change isn't the reduced quality when switching away from your primary lens, but the wildly different color science and, ultimately, reduced dynamic range.

While it's hard to put too much stock in a camera's quality from a controlled video shown on stage, Apple's demo showed that you can even pinch to zoom and smoothly transition across lenses, seemingly without any noticeable change to exposure or color, almost as if you were shooting with a constant-aperture zoom lens on a dedicated camera. That's something truly useful that I've yet to see on any Android phone.

Android phones have had great video-centric features for years, but I say the more the merrier.

The real showstopper of today's event though, at least if you ask me, was when Apple showed off how the new lenses and A13 Bionic chip work with the popular third-party app Filmic Pro. You can film with all three rear cameras and even the front camera at the same time, saving each feed as a separate clip to then edit at will in post-production. That's incredible for a smartphone.

It's also an amazing feat of processing power (and likely an enormous tax on your system memory), and it could be a massively helpful feature for projects where you'd otherwise need to do multiple takes for different perspectives.

Of course, it has to be said that there are plenty of Android phones with similarly great video-centric features, like the LG G8 and Sony Xperia 1. I've used the latter extensively and have come to love its included Cinema Pro app, which lets me adjust shutter speed, frame rate (including 24fps), and white balance, and set manual focus and exposure on the fly. You can even apply LUTs to get different color profiles on your videos in an instant.

Similarly, LG offers powerful features like audio zoom, manual controls, and a microSD card slot for easily expanding your storage, which means you don't have to spend as much as $1,450 on an iPhone 11 Pro Max to get a usable amount of space. Even the Galaxy Note 10, which doesn't specifically claim to be a video-centric phone, has the benefit of precise editing in apps like Adobe Premiere Rush thanks to the S Pen. Our own MrMobile even shot, narrated, and edited his most recent video entirely on the Note 10!

Obviously, Apple isn't the first company to offer great video-centric features on its phones, but frankly, it's never productive to get caught up on who did what first. The important thing is that it's becoming even easier to shoot and produce amazing-looking video, no matter what phone you have, and the iPhone just got a huge upgrade in that regard.

I'm hoping this move will lead to Google putting a larger focus on video with its phones and software, since despite its fantastic photography, the Pixel 3 never excelled in the video department. When a major company like Apple adds these kinds of great features, it tends to start a chain reaction in which other companies race to offer similar ones, and that's a win for everyone involved. So I say the more, the merrier! I think it's great that Apple put such a large focus on video with this year's iPhone refresh, and I can't wait to see video become less of a niche priority on Android as a result.

Hayato Huseman

Hayato was a product reviewer and video editor for Android Central.