Examining the differences between iPhone X Face ID and Samsung iris scanning

At the September 2017 Apple event, the iPhone X was revealed. It seems like Apple went all out on its "Anniversary" model, and one of the new features is Face ID.

Unlocking your phone with your face isn't exactly new. Android has had the feature for a while, and Samsung has used a special iris scanner since the Galaxy Note 7. But Apple is doing things very differently, as it is wont to do. Rather than use a pattern to create an unlocking token, Apple is using the shape of your face. And it has some pretty specialized hardware in place to do it.

I haven't used the iPhone X just yet, but this is an area where I have a good bit of experience. Acquiring spatial distortion maps with modulated light, then turning that data into something a piece of software can use as a unique identifier, has been around for a while, and products you have in your house right now were built, packaged or quality-checked using it. I've been involved in designing and deploying several systems that use depth image acquisition to sort produce (apples, peaches, plums, etc.) by grade, shape, and size, and I understand how the technology used in Face ID will work.

Let's compare.

Android's facial recognition

Face unlock debuted on the Galaxy Nexus in 2011.

Unlocking your phone with your face has been part of Android since version 4.0, Ice Cream Sandwich. This is the least complicated and least secure of the three things we're comparing.

Using the front-facing camera, your Android phone grabs an image of your face, and Google's facial-recognition software processes it to build a set of data based on that image. When you hold the phone up to your face to unlock it, a new image is collected, processed and compared to the stored data. If the software can match the two, a token is passed to the system and your phone unlocks.
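
Here's a rough sketch of that enroll-and-compare flow in Python. The feature extraction below is a made-up placeholder (Google's actual recognition model isn't public), but the overall shape of the process is the same: build a template once, then compare fresh captures against it and return a pass or fail.

```python
import numpy as np

def extract_face_features(image: np.ndarray) -> np.ndarray:
    # Placeholder: a real system runs a face-recognition model here.
    # This sketch just downsamples the image and normalizes it into a vector.
    # Captures are assumed to be the same resolution at enroll and unlock time.
    small = image[::16, ::16].astype(np.float32).ravel()
    return small / (np.linalg.norm(small) + 1e-9)

def enroll(image: np.ndarray) -> np.ndarray:
    """Build the reference template (kept on-device and encrypted)."""
    return extract_face_features(image)

def try_unlock(image: np.ndarray, template: np.ndarray,
               threshold: float = 0.90) -> bool:
    """Compare a fresh capture to the stored template.

    If the similarity clears the threshold, the caller would hand a token
    to the lock screen; here we simply return True or False.
    """
    score = float(np.dot(extract_face_features(image), template))
    return score >= threshold
```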

Face unlocking came to Android in 2011, and Samsung has made it much better on its newest phones.

The data isn't sent anywhere and is collected and processed all on the phone itself. It is stored securely and encrypted, and no other process is able to read the raw data. Android face unlocking also doesn't need any special lights or sensors or cameras — it uses the same front-facing camera you use to take selfies with.

Samsung has improved the experience with the Galaxy S8 and Note 8 by starting the scan as soon as the screen is tapped, and the processing is faster and more accurate because of the better camera and CPU. Face unlock on the Galaxy S8 is fast and generally works well once you get a feel for how to hold the phone while you're using it.

The biggest problem with face unlock is that it's not secure. It's not advertised as being secure, even by Google or Samsung. It's a convenience feature that was built to showcase (and refine) Google's facial recognition algorithms, and a printed photo of your face will defeat face unlock.

Thankfully, Samsung also offers an alternative way to recognize your face.

Samsung's iris scanning

Galaxy Note 8 iris scanning

Samsung first brought iris scanning to the Galaxy line with the Galaxy Note 7. Having a computer scan your eyeballs to authenticate you is something we've all seen in movies, and it is used for secure entry in real government facilities. Samsung is using the same concept with its iris scanning system, just scaled back so it can work faster and work with the limited resources of a smartphone. It's more than secure enough for your phone, even if it's not 100% foolproof.

Every eye has a different pattern, and your right eye is even different than your left.

Every eye has a unique pattern in its iris. Your left eye even has a different pattern than your right. Iris patterns are actually more distinct than fingerprints. Because every eye is unique, Samsung is able to use your eyes to identify you and act as your credentials, and those credentials can be used for anything a fingerprint or even a passcode could. You hold the phone so the special camera can see your eyes, and your phone unlocks.

To do this, Samsung is using specialized hardware on the face of the phone. A diode emits near-infrared light and illuminates your eyes. It's a wavelength of light that humans can't see, but it's fairly intense and "bright." Near-infrared light is used for two reasons: your pupils won't contract and your vision won't change, and it illuminates anything with a color pattern better than the wavelengths we can see. If you look closely at your iris you'll see hundreds of different colors in a distinct pattern. Under near-infrared light there are thousands of colors, and they contrast with each other very well. It's just better for grabbing an image of your iris; even though you can't see any of this, your phone can, and it uses what it sees to build a dataset.

Samsung uses near-infrared light and a special camera to collect and process data about your eyes.

Once the iris is illuminated, a specially tuned narrow-focus camera grabs an image. The regular front-facing camera on your Galaxy S8 could register color information under infrared illumination, but it wasn't designed for the job. That's why a second camera is needed.

This image is analyzed, and a distinct set of data is created and stored securely on your phone. All the processing, analysis and storage of the data is done locally and is encrypted so only the process that recognizes your iris has access to it. This data is used to create a token, and if the iris scanner process provides the right token, the security check is passed: those are your eyes, so any software that needs your identity is able to proceed.
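
Samsung hasn't published its matching algorithm, but classic iris-recognition systems encode the iris texture as a string of bits (an "iris code") and compare two codes by how many bits differ. Here's a generic illustration of that idea; the bit count and threshold are typical values from the literature, not Samsung's real parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def hamming_distance(code_a: np.ndarray, code_b: np.ndarray) -> float:
    """Fraction of bits that differ between two iris codes."""
    return float(np.count_nonzero(code_a != code_b)) / code_a.size

# Enrolled template (in a real system, derived from the NIR iris image).
enrolled = rng.integers(0, 2, size=2048, dtype=np.uint8)

# A fresh scan of the same eye: mostly the same bits, plus some noise.
noise = rng.random(2048) < 0.08
fresh_same_eye = np.where(noise, 1 - enrolled, enrolled)

# A different person's eye: essentially uncorrelated bits.
other_eye = rng.integers(0, 2, size=2048, dtype=np.uint8)

THRESHOLD = 0.32  # a commonly cited operating point for iris codes

print(hamming_distance(enrolled, fresh_same_eye) < THRESHOLD)  # True
print(hamming_distance(enrolled, other_eye) < THRESHOLD)       # False
```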

Of course, Samsung also collects some data about your face using the normal front-facing camera. Most likely, the facial data is used to help position your face so the iris scanner has a clear view.

Your eyes need to be in the right spot to set up and use the iris scanner.

There are some inherent drawbacks. Because using iris scanning to unlock your phone needs to be very fast, not as much data is collected about the pattern in your eyes as could be. Samsung had to find the right balance of security versus convenience, and since nobody wants to wait five or 10 seconds for each scan, the iris scanning algorithms can be fooled with a high-resolution photo laser-printed in color and a regular contact lens to simulate the curvature of an eye. But, honestly, nobody is going to have a photo of your eye that is clear enough to unlock your Galaxy S8 or Note 8. If they do, you have a much bigger problem on your hands.
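
To see why collecting less data matters, here's a back-of-the-envelope calculation using the iris-code idea from the sketch above: if a match only has to agree on a small number of bits, the odds that a completely random pattern sneaks under the threshold rise quickly. The bit counts and the 0.32 threshold are illustrative, not Samsung's real numbers.

```python
from math import comb

def false_match_probability(n_bits: int, threshold: float = 0.32) -> float:
    """Chance that a random, unrelated code lands under the match threshold.

    Each bit of a random code disagrees with the template with probability
    0.5, so this is just a binomial tail.
    """
    max_mismatches = int(threshold * n_bits)
    return sum(comb(n_bits, k) for k in range(max_mismatches + 1)) / 2 ** n_bits

# Fewer bits of iris data means a dramatically higher chance of a fluke match.
for n_bits in (256, 512, 1024, 2048):
    print(n_bits, f"{false_match_probability(n_bits):.2e}")
```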

Samsung's iris scanning works well as long as your eyes are in the 'sweet spot.'

The bigger issue is accuracy. Enough of your iris pattern needs to be analyzed to pass the software check, and because the camera that grabs the image for recognition has a very narrow focus, there's a "sweet spot" your eyes need to be in. You need to stay in that sweet spot long enough to pass the checks. The system is of no use if it doesn't collect enough data to prevent someone else's eyes from identifying as you, so this is just how it has to work.

It's a good system as far as biometric security goes, and for many it's great. Only your eyes will work (ignoring the off chance some spy agency has photos of your eyeballs) and it's fairly fast. You just have to learn to use it correctly — and yes, that typically comes after plenty of attempts spent holding your phone unnaturally high with your eyes wide open.

Apple's Face ID

Apple has entered new territory when it comes to biometric security on a phone. It wasn't so long ago that you needed specialized lighting, multiple cameras with special lenses and a very expensive image processing computer board for each of them to collect enough shape data for unique recognition. Now it's done with some components on the face of the iPhone X, Apple's new A11 chipset, and a separate system to crunch the numbers.

Face ID projects an intense infrared light to illuminate your face. Just like the light used by Samsung's iris scanner, it's a wavelength a human can't see but it's very "bright." It's like a flood light — an equal amount of light across a wide area that washes your face and will fall off quickly at the edges of your head.

Apple is trying something very different with Face ID and how it gathers data about your face.

While your face is illuminated, a matrix of dots from infrared laser diodes is projected onto your face. These dots use a wavelength of light that contrasts with the flood illumination, and thousands of individual points of light cover your face. As you move (and we can never be perfectly still), the points of light reflect the changes.

With your face illuminated by the infrared flood light and the dot matrix projected over it, a special camera collects image data. Every point of light is marked, and as you move and the points shift, those changes are logged as well. This is known as depth image acquisition using modulated pattern projection. It's a great way to collect data that shows shape, edges and depth while an object is in motion, under any type of lighting conditions. A ton of data can be collected and used to build a distinct shape that can be recreated in 3D.
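
The basic geometry behind this kind of structured-light depth capture is simple triangulation: the dot projector and the infrared camera sit a known distance apart, so how far each dot appears to shift in the camera image tells you how far away that part of your face is. The numbers in this sketch are made up for illustration; Apple hasn't published its parameters.

```python
import numpy as np

FOCAL_LENGTH_PX = 600.0   # camera focal length in pixels (assumed)
BASELINE_M = 0.01         # projector-to-camera baseline, 1 cm (assumed)

def depth_from_disparity(disparity_px: np.ndarray) -> np.ndarray:
    """Convert each dot's apparent shift (pixels) into depth (meters)."""
    return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

# Dots that land on your nose shift more than dots on your cheeks or the
# background, so their disparity is larger and their computed depth smaller.
disparities = np.array([24.0, 20.0, 15.0])   # nose, cheek, background
print(depth_from_disparity(disparities))     # -> [0.25 0.3 0.4] meters
```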

The data is then passed to what Apple is calling the A11 Bionic Neural Engine. It's a separate subsystem with its own processor(s) that analyzes the data in real time as it is being collected. The data is used to recreate your face as a digital 3D mask. As your face moves, the mask also moves. It's an almost perfect mimic, and Apple does an excellent job showing it off with its new iMessage animated emojis in iOS 11.

Face ID uses some of the same technology as Android phones with Tango.

For authentication purposes, the data set is also used to calculate a unique identifier. Just like Samsung's iris scanner, Face ID securely stores this data and can compare it against what the special camera is seeing while Face ID is actively running. If the data set matches what the camera can see, the security check is passed and a token that verifies that "you are really you" is given to whatever process is asking for it.
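
In code, that authentication step boils down to "compare, then mint a token." This is only a conceptual sketch: Apple's matching runs inside dedicated secure hardware, and the real similarity function isn't public. The point is that software asking "is this really you?" only ever sees the token, never the face data itself.

```python
import hashlib
import hmac
import secrets
from typing import Optional

import numpy as np

DEVICE_KEY = secrets.token_bytes(32)   # stands in for a hardware-bound secret

def face_matches(live: np.ndarray, enrolled: np.ndarray,
                 threshold: float = 0.95) -> bool:
    """Placeholder similarity check between live and enrolled face data."""
    sim = float(np.dot(live, enrolled) /
                (np.linalg.norm(live) * np.linalg.norm(enrolled)))
    return sim >= threshold

def authenticate(live: np.ndarray, enrolled: np.ndarray) -> Optional[bytes]:
    """Return a signed one-time token only if the biometric check passes."""
    if not face_matches(live, enrolled):
        return None
    nonce = secrets.token_bytes(16)
    # The raw face data never leaves this function; callers only see the token.
    return nonce + hmac.new(DEVICE_KEY, nonce, hashlib.sha256).digest()
```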

While Apple is also making a few concessions to ensure Face ID is fast and easy, there are some clear advantages from a user's perspective. Face ID is actually more secure because you're moving (so more data is being analyzed), and there is no "sweet spot": all of your face is being used and the camera has a wider field of view. The matrix projected on your face contrasts well against whatever is in the background, because depth is used to isolate your face's shape.

As a bonus, the shape data of your face in real time can be used for other purposes through what Apple calls the TrueDepth camera system. We saw examples of this with the new portrait mode for selfies, the animated emojis, and Snapchat masks. Apple has built the Bionic Neural Engine in a way that lets it share simple shape data with third-party software without exposing the data it uses to build a secure identifying token.
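
Here's a rough sketch of the separation that implies: third-party code gets coarse face-shape data (enough for masks and animated emojis), while the identity data used for authentication never crosses that boundary. The class and method names are hypothetical, not Apple's actual API.

```python
import numpy as np

class FaceDataBroker:
    """Hands coarse shape data to apps while keeping identity data private."""

    def __init__(self, depth_map: np.ndarray, identity_template: np.ndarray):
        self._depth_map = depth_map                  # full-resolution depth data
        self._identity_template = identity_template  # never handed to apps

    def shape_mesh_for_apps(self, stride: int = 8) -> np.ndarray:
        """Return only a downsampled copy of the face shape for effects."""
        return self._depth_map[::stride, ::stride].copy()

# An app can ask for the coarse mesh, but not the identity template.
broker = FaceDataBroker(np.random.rand(480, 360), np.random.rand(128))
print(broker.shape_mesh_for_apps().shape)   # (60, 45)
```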

Which is better?

We can't say anything is really better until we've tried it.

Better is subjective, especially since we've not yet used Face ID or the iPhone X in the real world. For authentication purposes, the important thing is that the process is accurate and fast. Samsung's iris scanner can be both, as long as you point the phone so it can find the data it needs, but on paper Face ID will be easier to use because it doesn't need to lock onto any particular spot to work. And for many of us, neither is better: we would prefer a fingerprint sensor, which the Galaxy S8 and Note 8 both still have.

Whichever you prefer, there's little doubt that Apple has outclassed the competition in this regard. Extensive hardware to build and collect data about your face's shape and features, combined with its own processing system to analyze it all, is more akin to Tango than to any previous facial recognition we've seen on a phone. I'm excited to see this level of technology come to mobile devices, and I can't wait to see how future products build on what we see from Apple.

Jerry Hildenbrand
Senior Editor — Google Ecosystem

Jerry is an amateur woodworker and struggling shade tree mechanic. There's nothing he can't take apart, but many things he can't reassemble. You'll find him writing and speaking his loud opinion on Android Central and occasionally on Twitter.

136 Comments
  • What if I have an identical twin?
  • Iris scanner, no problem.
    Face ID, as stated on stage, "bad luck".
  • I agree, iris scanner and fingerprint are the way to go. I had the iris scanner on my Lumia 950 XL and I have contacts; if I put glasses on or took out my contacts it did not work for me. More secure than face, though. Fingerprint is the quickest, especially on my XZ Premium.
  • Unlike your Lumia, Apple's system is sophisticated enough to recognize if you are wearing glasses. Please watch the presentation.
  • They also said that the iPhone is bezel-less and it isn't
    ...so what's your point? Half of what Apple claims is BS.
  • Where'd they say that?
  • Lumia 950/HP Elite X3 works with glasses and sunglasses. Not with contacts.
  • It worked with my contacts, but if I took them out it would not unlock the phone; that's what I was trying to say. It was like it could see the contacts. But I do have hard contacts, not soft ones, which change the shape of my eye.
  • Did you take the time to train it for different lighting, glasses on/off, etc.? It has the capability to learn these differences, and that's all it takes. I use my 950 XL with or without glasses, contacts, etc., and the iris scanning works great, but I've trained it for these different scenarios. As Apple said, they'll be continuously training Face ID. That's the only way to make it work reliably. The question I have is *when* will the iPhone X be doing all this face learning?
  • I did, but I think it was because I have an issue with my eyes called keratoconus (I think that's how it's spelled), and I use hard contacts, so my eye shape is different when I have them in or out, and the lenses on my glasses are special ones too, so that might be why I had the issue.
  • Iris scanner is useless in daylight (it doesn't even have to be that sunny).
    Also, as mentioned in the article, it requires you to position yourself directly at the camera. Face ID will probably be better than the iris scanner. But I still agree a fingerprint scanner outclasses both those methods. If only Samsung didn't put it in such a weird place...
  • I am guessing the problem with the Iris scanner in daylight has to do with the light coming from the sun that we can't see. I am not sure Face ID would be any better at it since it also uses the same light.
  • Yeah. Can't understand why Samsung didn't hero the fingerprint scanner on the front in those huge bezels.
  • Where in the presentation did Apple say it will not work with twins? Fake news.
  • They said if you have an evil twin use a passcode. I watched. It was pretty funny. But yes if you have a twin supposedly it will unlock.
  • They did not say it does not work. What they said was that where ordinarily there is a 1 in 1,000,000 chance of someone breaking into your phone, those chances increase if you have an evil twin, meaning you get very close to having someone who might possibly be mistakenly identified as you.
  • I have identical twin girls, and the iris scanner isn't fooled.
  • You should be able to find it pretty easily. The slide was showing Spock and Evil Spock 😀
  • As much as I dislike Apple, they usually don't release core things that are half-baked. Map applications are an exception, but I don't consider that a core hardware function.
  • I don't think it worked during the Apple presentation.
  • Honestly I felt bad for the guy. Imagine being up there and having that happen on the biggest stage possible...... He probably had to change his underwear after that.
  • I couldn't help thinking of this: https://www.penny-arcade.com/comic/2010/06/09/an-inside-job
  • Lol!
  • Same thing happened when they introduced the face unlock feature on the Galaxy Nexus back in 2011. It seems unlocking your phone with your face is destined to fail when you're on stage. :P
  • I think that's because they had just turned the phone on. If you look at the image from the event, it says 'Passcode required to enable Face ID.' It's the same when you turn on an iPhone or Android phone: it requires your pattern or code the first time before the fingerprint reader works.
  • Not according to quite a few journalists who have reported it failing in the demo room after the keynote.
    There are also videos of portrait lighting failing as well, and the Apple reps quickly trying to change the subject.
  • They forgot to enter the password after restarting the phone or being left alone for a while.
  • Honestly, I'm not big on face scanning still.
  • I just think they should have had Touch ID too. Face ID is cool though. Thanks for this great explanation Jerry.
  • I'm sure Touch ID will be back once sub-glass fingerprint scanning is ready. I hope that's set for next year.
  • Yeah I agree with you.
  • They couldn't get the FPS under the OLED glass working correctly this year, so Apple went with this. As much as they try to sell folks on it being better than a FPS, in practice I would bet anything that there will be way more problems using it. By a mile. They should have just stuck the FPS on the back, but I guess that isn't proprietary enough for Apple :-)
  • Nah, putting the finger sensor on the back is just not their style. 3D Face ID it is, until they can get the embedded sensor working under the OLED screen. In the meantime I was pretty curious to see how they would replace the home button functionality. Way to go bringing in BlackBerry 10 swiping gestures; I found that hilarious to see. Well, they implemented it well. Love it.
  • I was thinking the same thing about the gestures, but I was reminiscing about my old Palm Pre. I absolutely loved the swipe-based UI Palm had in WebOS.
  • A lot of the stuff on the new phones was already done in WebOS. That system was way ahead of its time...
  • So why not put a fingerprint scanner on the back as the backup?
  • Thank you for explaining the differences Jerry.
  • I thought Samsung had one of the best applications of Iris tracking by keeping your phone screen awake when you are looking at it. I'm not a fan of the use of facial/Iris scanning for unlocking. Similar to voice activation, there are those frustrating situations when it just doesn't work.
  • I use Iris scanning which works correctly 99% of the time, but I still have fingerprints and pattern set up as backup methods for that 1% where it fails.
  • Does it work with sunglasses on?
  • Mine does, yes.
  • I remember when my local Telus had a Note7 on display. I tried iris scanning about a dozen times through my prescription Oakley Ice Iridium lenses, and it worked every time.
  • Sunglasses block some UV light but usually let IR pass through. Unless the frames or the curvature of the lenses interferes, iris scanning should work normally. Face ID should be able to reject the frames as well and intelligently draw the face as if they weren't there using a projected path. Safety glasses made for use with lasers will block both, because they filter out UV and IR light.
  • Apple always delivers when it comes to execution. On the other hand, Samsung always rushes so it can claim "first!" No doubt Face ID will work as advertised. It will be faster and more consistently accurate than iris scanning.
    But when it comes to true security, all of these are a joke. Nothing beats an 18-digit alphanumeric password (symbols included). I keep my phone unlocked with a trusted device (Pebble or BT headphones), and any time I leave without it, the phone locks and the password is required. I can also turn off my headphones or watch to activate the lock when needed. The same can be done by turning BT off with a quick touch on the BT toggle.
  • Agreed! Apple always delivers on execution and quality, minus their map apps, as the comment above mentioned. We'll see in time how good their 3D facial ID is, but one thing is sure: they didn't rush it. A lot was put into it. If they can get the fingerprint sensor embedded underneath the screen as originally planned for next year, that'll be pretty sweet!
  • I'm actually surprised at how good the iris scanner on my Galaxy S8+ is... it usually works even when looking down at the phone.
  • Samsung wasn't first, nor did they claim it. It also wasn't rushed and it works great.
  • It works pretty well now. Issues with iris unlock on the Note 7 were pretty widely reported before a certain other issue with the device took the headlines.
  • It's pretty terrible for me. Iris scanner that is. I do wear glasses though.
  • Rushed, did you say? Actually, Samsung has been working on iris scanner tech for quite a while, and the technology has been around even longer. Oh, and by the way, it works... of course you wouldn't know that, because you don't own the phone ^^^ udazavlanje
  • They can work on their tech for a decade and still release a product half-baked. I'm not talking about Galaxy S8 series performance, but about every single thing they announced and released as an all-new feature: their FPS sucked, their face recognition was inconsistent, and keeping the screen on while watching also sucked. My wife had every single Note series phone and I had the S series until the GS5, and a lot of my friends and family still have the latest and greatest from Samsung. That's more than enough to know about Samsung delivering on its promises.
  • Samsung's implementation of Iris scanning is very consistent and very fast. While I like the step forward in 3D mapping that Apple has tak