After almost a year of testing, Google has finally launched its AI-powered Google Lookout app, which is designed to help blind and visually impaired users by identifying the objects around them.
To use the app, the user simply points the phone forward after starting it. Google Lookout identifies people, text, objects, and other things as the person moves around and narrates to the user what it sees.
The app also avoids giving users unnecessary information, telling them only the things it deems important. Google explained that once the app is launched, which can be done by asking Google Assistant to ‘start Lookout’, there is no need to tap any more buttons.
Lookout has three modes of operation. The default ‘Explore’ mode is best for everyday tasks, chores, and exploring new places. The ‘Shopping’ mode helps with barcodes and currency, and the ‘Quick’ mode is best for sorting mail and reading signs and labels.
Google cautions that the app won’t always work with 100% accuracy, and says it will continue to develop the app as it receives more feedback from users.
For now, the app is available only in the US and only on Google Pixel devices, but Google says it is working to bring Lookout to more devices, countries, and platforms soon.