How will Google’s new Lookout app help blind and visually impaired users?

9 May 2018


A new app from Google could help visually impaired people become more independent. Image: pppp1991/Shutterstock


Lookout is a new app due later this year that will help visually impaired people to navigate their environment.

Google I/O 2018 is in full swing, with a flood of exciting announcements coming from the company across its many departments.

An area of particular focus at this year’s event is accessibility and technology’s role in creating a better quality of life for those with disabilities and impairments.

In that vein, Google announced that it would be launching an accessibility app, known as Lookout, later this year for Android on the US Play Store.

The new app is aimed at helping the millions of visually impaired and blind people gain more independence with their smartphones.

Lookout: A new accessibility app

Lookout will use auditory cues to inform users about the text, people and objects in their environment.

Used in conjunction with a device worn on a neck lanyard or placed in a shirt pocket with the camera facing outwards, Lookout will help people become more aware of their surroundings.

The app is the latest in a wave of accessibility tools, which have all but replaced the expensive dedicated devices previously used to magnify screens and speak directions aloud.

Patrick Clary, product manager for Google’s central accessibility team, wrote: “Lookout delivers spoken notifications, designed to be used with minimal interaction, allowing people to stay engaged with their activity.”

The app has four modes to choose from: Home, Work and Play, Scan, and Experimental (where new testing features can be used). The modes are designed so the app can focus on elements likely to be found in these environments, eg furniture in the home or the layout of an office.

Lookout will be able to let users know if there is a chair in their way that they need to avoid, and it can also read text such as ‘push’ and ‘pull’ signs on doors. Machine learning will figure out what users deem important – the more people use it, the smarter it will become.

It will likely be a Pixel-only app at first and will be available later in 2018.

Lookout is just one of several accessibility-focused announcements made at Google I/O. New features pegged for Android P will also improve the quality of life for many users.

Accessibility boosters for Android P

Sound Amplifier will make it easy for users to turn down background noise in crowded places such as airports by using a smartphone and a pair of headphones. Two sliders will control the loudness and tuning, and each ear has separate controls.

Google has also added Morse code input to its mobile keyboard app, Gboard, a feature that is currently available in beta.

Select to Speak, with optical character recognition, is also available in beta. Once the feature is selected, users can point their camera at text or open a previously taken image, and the device will read the text aloud while highlighting it.

Meanwhile, Accessibility Menu makes it easier for people with physical disabilities to power off, take screenshots, control volume and swipe, among other things. It also allows users to enable Google Assistant or tap straight through to accessibility settings. It works in both portrait and landscape, and has large touch targets.

Ellen Tannam is a writer covering all manner of business and tech subjects

editorial@siliconrepublic.com