The app uses artificial intelligence to identify objects through the smartphone's camera. Lookout has three modes: Explore, Shopping, and Quick read. Google says it has designed Lookout to work in the situations where visually impaired people may need it most. If Google Lens can identify a dog's breed from a photo, there's nothing stopping Google from using the same technology to help visually impaired people, and that's where Lookout comes in.
Lookout is designed to help people learn about new places, read text, and handle everyday tasks such as cooking and shopping.
Users only have to open the app once to use Lookout; they don't need to tap any other buttons in the app afterward.
Google does caution that, "As with any new technology, Lookout will not always be 100 percent ideal."
To use Lookout, Google recommends that users wear their Pixel phone on a lanyard around their neck or in the front pocket of their shirt. In February, the tech giant launched two new apps for Android, Sound Amplifier and Live Transcribe, which are created to "make life a little easier" for those with hearing loss.
According to a Google blog post, Lookout uses artificial intelligence, much like Google Lens, and lets users search for and act on objects around them simply by pointing their phone.
Google also rolled out automated closed captioning in Google Slides for US English in October, a feature the company said makes presentations more accessible to audiences who are deaf or hard of hearing.
Judging by today's Google logo, which spells "Google" in Braille, the company is either (a) celebrating Louis Braille's birthday (he was born on January 4, 1809), (b) about to launch a new accessibility initiative, or (c) both.
Australia finally has a telecommunications products resource dedicated to people with disabilities.