In the current iOS 13 beta, Apple is testing a new feature called FaceTime Attention Correction, which adjusts your gaze during a video call so you appear to be looking at the camera. To try it, you'll need the beta operating system installed on your phone, and even then it's only available on the iPhone XS and iPhone XS Max.
Gizmodo and app designer Mike Rundle were among the first to notice the feature while testing the iOS 13 beta. Victoria Song of Gizmodo FaceTimed a friend, who said Song's eyes looked "kinda funny", though the gaze correction was "not that noticeable".
You can see how the feature works in comparison photos from Rundle's FaceTime conversation with his friend Will Sigmon. Rundle calls it "insane" and "some next-century shit". He's not wrong. The correction is very subtle and it actually looks good, or rather, makes you look good. But it's still really, really weird!
Still, such a feature is useful for calls where you want to appear attentive while you're actually looking elsewhere or simply multitasking. Dave Schukin, a software engineer, responded with a video on Twitter suggesting that Apple uses ARKit "to grab a depth map/position of your face, and adjusts the eyes accordingly". In his video, you can clearly see a straight line warp as it passes across the eyes and nose.
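Schukin's theory is plausible because ARKit already exposes exactly the inputs such a warp would need: a live face mesh, per-eye transforms, and an estimated gaze point. A minimal sketch of reading that data is below. To be clear, this is an assumption about the ingredients, not Apple's actual implementation, and the warping step itself is not public API; face tracking also only runs on a device with a TrueDepth camera, so this won't run on a Mac or simulator.

```swift
import ARKit

// Sketch: subscribe to ARKit face tracking and read the per-frame data
// (face mesh, eye transforms, gaze estimate) that a gaze-correction warp
// would plausibly consume. Requires an iPhone with a TrueDepth camera.
final class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking is only supported on TrueDepth-equipped devices.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            let mesh = face.geometry              // live face mesh (vertices)
            let leftEye = face.leftEyeTransform   // eye pose in face space
            let rightEye = face.rightEyeTransform
            let gaze = face.lookAtPoint           // estimated gaze target
            // A gaze-correction pass would warp the eye region of the video
            // frame using these values; that step is not exposed by ARKit.
            _ = (mesh, leftEye, rightEye, gaze)
        }
    }
}
```

The warped line in Schukin's video is consistent with a mesh-based image warp around the eye region, which is exactly what the face geometry above would let you drive.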
So far, there's no word from Apple on whether the feature will come to other phones. If it does, does that concern you, or do you find it interesting? To me, it's definitely interesting, but it opens a whole lot of questions about where this feature might lead in the future.