iOS 13 addresses several complaints Apple users had with iOS 12, and one of its more notable changes concerns the FaceTime problem that arises during conversations.
What was wrong with FaceTime?
Apple offers its own video chat application, FaceTime, which lets users make secure video calls on their Apple devices. To appear as if you are making eye contact with the person you've called, you need to look into the camera, but people usually look at the screen instead, so the conversation never feels quite natural.
According to manufacturers, the front-facing camera has to sit above the screen, so this is not a flaw in the hardware. Still, Apple is always looking for ways to improve its devices, and FaceTime now feels as personal as it was meant to be from the beginning.
Mike Rundle discovered the FaceTime Attention Correction feature Apple introduced in iOS 13 beta 3, and his screenshot of it has spread all over the internet. Now, even when a person is looking at the display rather than directly into the front camera, they appear to be making eye contact.
How would iOS 13 correct FaceTime?
Dave Schukin, a Twitter user, explained in a video how FaceTime Attention Correction works. As he describes it, the feature uses ARKit to capture a map of your face's position and then adjusts your eyes relative to that map.
The AR trickery becomes obvious when Dave moves his hand in front of his face: you can see a slight warp across his eyes and nose as the hand passes over them. So even if you are not looking directly into the camera, you still appear to be making eye contact with the person you are talking to. We'll reportedly get this in the iOS 13 version of FaceTime.
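Apple's actual implementation is private, but ARKit does expose the face-tracking data such a feature would need on devices with a TrueDepth camera. The minimal Swift sketch below, with an illustrative class name GazeReader, only shows how an app can read the gaze point and per-eye transforms from an ARFaceAnchor; it is not FaceTime's code and does not perform the image warping itself.

```swift
import ARKit

// Minimal sketch: read the face and eye data that ARKit exposes.
// An attention-correction effect could use geometry like this to decide
// how to shift the eye region so the gaze appears to meet the camera.
final class GazeReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a device with a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Called whenever ARKit updates its anchors, including the face anchor.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }

        // Where the user is actually looking, in face-anchor coordinates.
        let gaze = face.lookAtPoint

        // Per-eye transforms relative to the face anchor.
        let leftEye = face.leftEyeTransform.columns.3
        let rightEye = face.rightEyeTransform.columns.3

        print("gaze target:", gaze, "left eye:", leftEye, "right eye:", rightEye)
    }
}
```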
Dorothy has been a journalist for ten years and has been working with the Tech News Watch staff since the news site launched. Her main contributions to Tech News Watch are mobile, IT and science news, with a focus on software updates and outer space discoveries.