
This is how Google makes accessibility features useful for everyone

New advances in speech recognition will soon make Android smartphones easier to use for the deaf and hard of hearing. And perhaps for everyone else, too.

For blind and visually impaired people, smartphones and other electronics have included software that reads aloud or describes what is on the screen for many years now. That is essential: without such a screen reader, people with a visual impairment would get no further than the home screen.

For the deaf and hard of hearing, most smartphone functions are perfectly usable, but phone calls and media with sound can still be an obstacle. With the rapid improvement of speech-to-text technology, that is starting to change.

At its I/O developer conference, Google presented, among other things, Live Caption. Later this year, the newest Pixel phones will be able to generate live captions for any video or audio file. At the moment, subtitles are only available for some videos, for example on Netflix or the NPO website, but with Live Caption it should be possible to generate subtitles anywhere.

That also works without an internet connection. Google has shrunk the complex language models needed to decipher speech to the point that they can now be installed on a smartphone. That makes speech recognition not only available in more situations, but also a lot faster.
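Google has not published how Live Caption works under the hood, but the basic idea of on-device streaming recognition can be illustrated with the open-source Vosk library, which also runs entirely offline. This is only a rough sketch under that assumption, not Google's actual stack; the model directory and audio file names are placeholders.

```python
# Minimal offline streaming speech-to-text sketch using the open-source Vosk
# library. This is not Live Caption's code; it only illustrates an on-device
# model turning audio into text as it streams in.
import json
import wave

from vosk import Model, KaldiRecognizer

model = Model("model")               # path to a downloaded Vosk model (placeholder)
audio = wave.open("clip.wav", "rb")  # 16 kHz mono WAV file (placeholder)

recognizer = KaldiRecognizer(model, audio.getframerate())

while True:
    chunk = audio.readframes(4000)
    if not chunk:
        break
    if recognizer.AcceptWaveform(chunk):
        # A completed phrase: roughly what one caption line would show.
        print(json.loads(recognizer.Result())["text"])

print(json.loads(recognizer.FinalResult())["text"])
```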

The demo of Live Caption was one of the highlights of Google’s annual press presentation last Tuesday. “It stole the show,” says product manager Brian Kemler, who works on Live Caption. “We knew it was cool, but not that the reaction would be this positive.”

Live captioning works well in a first test

A test version of the technology is already impressive. Live Caption is activated by pressing the smartphone’s volume button and then tapping a caption icon.

The English subtitles are almost completely accurate, whether for a radio segment from the American public broadcaster NPR, a Fortnite live stream on Twitch, or a clip from a talk show on YouTube. Only some names are misspelled.

By double-tapping the subtitles, you can switch between a display of two lines at a time and a larger block of text. The caption box can be dragged around the screen so it does not block videos or on-screen buttons.

“Useful not only for people with disabilities”
Google product manager Patrick Clary

Because the subtitles are generated entirely on the device, the text appears almost without delay, as soon as the words are spoken. The feature also works offline, for example with downloaded podcasts. A downside of the offline processing is that a language model has to be downloaded to the smartphone. For now, such a model is only available for English, so Dutch users will have to wait a while longer for the feature.


Google presents live captioning for smartphones

The “curb cut effect” brings accessibility to a wider audience

That Live Caption is valuable for people who do not hear well is obvious, but the subtitles can also be useful for other smartphone users. Anyone on the train without headphones who still wants to watch a video, for example, can do so without disturbing other passengers.

The developers at Google call this the “curb cut effect”, named after the ramps cut into curbs that make it possible for wheelchair users to get on and off the sidewalk. That design was conceived for the accessibility of people with disabilities, but it also benefits parents with strollers or travelers with a heavy wheeled suitcase.

“When we think about accessibility, we want to ensure that everyone benefits, not just people with a severe disability,” says Google product manager Patrick Clary, who is himself a wheelchair user.

At the same time, accessibility for people with a disability remains central, says Kemler. He also works on Live Transcribe, an Android app that has been in testing since earlier this year. With Live Transcribe it is possible, though only with an internet connection, to convert conversations in seventy languages into text live.

Kemler says his team of developers immediately realized that this function can also be useful for people without a hearing impairment, for example for journalists who want to produce a transcript during an interview. “We really had to step back and say: ‘We cannot let ourselves be tempted by all the wonderful possibilities; we should focus on accessibility first,’” says Kemler. (Good news for journalists: a feature to export the text from Live Transcribe is now in the works.)

Project DIVA makes the Google Home accessible without voice commands

A final accessibility project that may have broader appeal is Project DIVA. It is an initiative of the Italian Google developer Lorenzo Caggioni, whose brother Giovanni has Down syndrome and is also blind.

Because he cannot control the smart Google Home speaker with his voice, Caggioni built a way to send commands to the Home using buttons in various shapes and sizes. One button can be used to start music, for example, and another to play Finding Nemo on the TV.

With so-called RFID tags, it is also possible to trigger actions by placing an object on a scanner. Put a Peppa Pig toy on the scanner, and the series starts on the TV.
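The article does not describe Project DIVA’s internals, but the core idea, mapping a physical trigger to a command that would otherwise be spoken, can be sketched in a few lines. Everything below is hypothetical: the tag IDs, the simulated scanner, and the send_command function are illustrative placeholders, not the project’s real code.

```python
# Hypothetical sketch of the Project DIVA idea: each RFID tag maps to a
# command that would otherwise have to be spoken to the Google Home.
# The tag IDs and send_command() are placeholders, and the scanner is
# simulated by typing a tag ID; this is not the project's actual code.

TAG_COMMANDS = {
    "04:A3:2B:1C": "play Finding Nemo on the TV",
    "04:7F:90:DE": "play Peppa Pig on the TV",
    "04:11:5E:22": "play some music",
}

def send_command(command: str) -> None:
    # Placeholder: a real build would forward this to the smart speaker.
    print(f"Sending command: {command}")

def main() -> None:
    while True:
        # Simulated scanner: a real build would poll an RFID/NFC reader here.
        tag = input("Scan a tag (type its ID, or 'q' to quit): ").strip()
        if tag == "q":
            break
        command = TAG_COMMANDS.get(tag)
        if command:
            send_command(command)
        else:
            print("Unknown tag, ignoring.")

if __name__ == "__main__":
    main()
```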

Caggioni thinks that new input methods for smart speakers can be useful for everyone. Children can assemble their own Project DIVA buttons and scanners using instructions that have been published online. Caggioni expects many new applications to emerge: “If something works for Giovanni, it works even better for the rest of the family.”
