Next week at the Neural Information Processing Systems conference in Long Beach, California, Google researchers Hee Jung Ryu and Florian Schroff will present a project they call an “electronic screen protector”: a Google Pixel handset uses its front camera and eye-detecting artificial intelligence to spot when someone else is looking at your screen. A demo video by Ryu shows the software interrupting a Google messaging app to display the camera view, with the peeking perpetrator identified and adorned with a Snapchat-style vomit rainbow.
Ryu and Schroff claim the system works across a wide range of lighting conditions and can recognize that a person is staring at your screen in as little as two milliseconds. The software is apparently so fast because it runs entirely on the phone, rather than sending images off to the company’s powerful cloud servers for processing.
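To make the on-device claim concrete, here is a hypothetical, heavily simplified sketch of the per-frame loop such a feature might run. The `detect_gaze` stub is invented for this example, not Google’s model; the real system would replace it with a compact neural network analyzing the front-camera feed.

```python
import time

def detect_gaze(frame):
    # Stand-in for the on-device gaze model. Here we merely threshold
    # mean pixel brightness so the sketch stays self-contained; the real
    # system runs a trained eye-detection network instead.
    total = sum(sum(row) for row in frame)
    score = total / (len(frame) * len(frame[0])) / 255.0
    return score > 0.5

def screen_protector_loop(frames):
    # Every frame is analyzed locally: nothing is uploaded to a server,
    # which is what makes millisecond-scale latency plausible.
    results = []
    for frame in frames:
        start = time.perf_counter()
        peeking = detect_gaze(frame)
        elapsed_ms = (time.perf_counter() - start) * 1000.0
        results.append((peeking, elapsed_ms))
    return results

# Two simulated 8-bit grayscale camera frames.
frames = [
    [[200] * 160 for _ in range(120)],  # bright frame -> stub reports a gaze
    [[30] * 160 for _ in range(120)],   # dark frame -> stub reports none
]
results = screen_protector_loop(frames)
```

The point of the loop is architectural: because no frame ever leaves the handset, there is no network round trip to pay for on each check.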
We can’t say whether this will ever become a shipping feature, and we’ve asked Google if it can share more about the project. Still, it would be good to have such an option, even if it’s not something you’d want to run non-stop: you could switch it on while reading sensitive email or watching a video. This wouldn’t stop people from catching things in their peripheral vision, but it could encourage peeping types to mind their own business.
Google has recently made a big push to make it easier for developers to integrate artificial intelligence into Android and other mobile devices. The company is moving more and more machine learning on-device with its TensorFlow Lite software, and has added simple tools that automatically detect things like numbers and addresses. TensorFlow Lite requires less storage and computing power than TensorFlow, the company’s standard AI framework released in late 2015.
The company also recently released a new API to make it easier for developers to use TensorFlow Lite in their own mobile apps.
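As an illustration of the workflow this tooling targets, here is a small sketch that converts a toy Keras model to the TensorFlow Lite format and runs it with the lightweight interpreter, roughly as a mobile app would. The toy model and input are invented for the example and have nothing to do with Google’s gaze detector.

```python
import numpy as np
import tensorflow as tf

# A toy one-layer classifier standing in for a real on-device model.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# Convert the model to the compact TensorFlow Lite format.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_bytes = converter.convert()

# Run inference with the lightweight interpreter, as an app would on-device.
interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

x = np.zeros((1, 4), dtype=np.float32)
interpreter.set_tensor(inp["index"], x)
interpreter.invoke()
score = interpreter.get_tensor(out["index"])
```

On a phone, the same flow runs through the Android or iOS interpreter bindings against a bundled `.tflite` file, keeping both the model and the data on the device.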