Google I/O: Reduced audio and visual barriers with Android

Accessibility has long been a priority for Android. This year, Google, the company behind Android, is concentrating primarily on improvements for blind, deaf, hard-of-hearing, and visually impaired users. At its Google I/O developer conference, the company also presented new accessibility aids for app developers.

The Android screen reader TalkBack has reached version 13.1. It now supports the new HID (human interface device) protocol over USB as well as braille tables in 38 languages, with more languages in the works. New "actions" cover tasks such as cutting, copying, or pasting text in editable fields, or archiving and deleting emails in Gmail. A spell-check action in TalkBack's Reading Controls, for example, lets you jump to the next or previous word that might be misspelled.

TalkBack notifies the user when actions are available. They are activated by swiping right with three fingers; a one-finger swipe down reveals the list of available actions, from which one can be selected with a double tap. The text input variant split-tap typing also relies on tapping: the user selects the first letter on the on-screen keyboard with one finger and confirms the selection by tapping anywhere on the touchscreen with a second finger. Then, without lifting the first finger, they move to the second letter, tap again with the other finger, and so on.

Google gave a brief preview of TalkBack 14. This version will support NLS eReaders from Humanware and adds new braille commands for moving the cursor and for cutting, copying, and pasting text with the on-screen keyboard. There are no further details yet.

In December, Google released Reading Mode, a free app for visually impaired users and people with dyslexia. It adjusts the on-screen display of documents, websites, and emails and can read texts aloud using speech output, highlighting the passage currently being read. The voice output can be adjusted to personal preference. The app also suits other users, of course, for example anyone who wants a text read aloud while washing the dishes.

The next Android version, 14, changes font magnification: scaling will no longer be linear. Fonts that are already large are enlarged less than smaller fonts. For app developers, this means testing their user interfaces at the maximum font size to ensure no text is hidden or truncated.
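
To see what this means in practice, here is a minimal Compose sketch (the composable name is illustrative) that reads the effective font scale at runtime; text sized in sp picks up the new nonlinear scaling automatically:

```kotlin
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.platform.LocalDensity
import androidx.compose.ui.unit.sp

// Illustrative only: with Android 14's nonlinear scaling, a large
// headline grows less than a small caption at the same user setting,
// so assumptions like "scaled size = base size * fontScale" break.
@Composable
fun FontScaleAwareLabel() {
    // The user's current font scale setting, as applied by the system.
    val fontScale = LocalDensity.current.fontScale
    Text(
        text = "Current font scale: $fontScale",
        fontSize = 14.sp // sp sizes pick up the system scaling automatically
    )
}
```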

On the audio side, Google promises that Android 14 will work more intuitively with hearing aids. In addition to easier Bluetooth pairing, users will get finer control over what is streamed to their hearing aid(s): settings will be separate for calls, notifications, and media content, a significant improvement for hearing aid users. Notifications will also be able to attract attention not only with ringtones and vibrations but also with the camera flash or an illuminated screen. This can benefit everyone, for example in noisy environments or in places where you want to disturb others as little as possible.

Live Caption can also work wonders in such situations: it not only recognizes speech in telephone calls, podcasts, or videos and converts it to text offline; conversely, it can also turn text typed during a telephone call into speech that the other party can hear. This helps not only deaf people or those with vocal cord inflammation, but anyone in noisy environments, or in particularly quiet ones where you don't want to attract attention, such as a library or a cemetery. Live Caption is already available on some Android phones, including all Google Pixel phones from the Pixel 5 onward. At I/O, Google vaguely announced further Live Caption improvements for this year.

Android 14 brings a number of new APIs for programmers. At its developer conference, Google highlighted several accessibility APIs. The attribute accessibilityDataSensitive restricts access to a view's information to genuine accessibility services. The Play Store ensures that apps do not falsely identify themselves as accessibility applications in order to read sensitive data. Security-sensitive applications such as banking apps will particularly benefit from this.
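
A minimal sketch of how an app might mark such a view, assuming API level 34; the function and view names are illustrative, and the constant and setter names follow the platform's View documentation, so treat this as a sketch rather than vetted production code:

```kotlin
import android.os.Build
import android.view.View
import android.widget.TextView

// Sketch: mark an account-balance field so that only services that
// declare themselves as accessibility tools can read its content.
fun protectBalanceField(balanceView: TextView) {
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.UPSIDE_DOWN_CAKE) {
        // Assumption: API-34 constant/setter names as documented.
        balanceView.setAccessibilityDataSensitive(View.ACCESSIBILITY_DATA_SENSITIVE_YES)
    }
}
```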

setRequestInitialAccessibilityFocus is an attribute that tells the accessibility service which element to focus automatically when a screen opens. The Android Accessibility team generally advises against using this feature because it creates inconsistencies between apps; in exceptional cases, however, it can be helpful.
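
For those exceptional cases, the attribute would typically be set through an accessibility delegate. A sketch, assuming API level 34; the helper function and view name are illustrative:

```kotlin
import android.os.Build
import android.view.View
import android.view.accessibility.AccessibilityNodeInfo

// Sketch: ask accessibility services to place initial focus on a
// specific view when the screen opens. Use sparingly, per Google's
// advice; the delegate pattern is standard, the setter lives on
// AccessibilityNodeInfo.
fun requestInitialFocus(headline: View) {
    headline.accessibilityDelegate = object : View.AccessibilityDelegate() {
        override fun onInitializeAccessibilityNodeInfo(host: View, info: AccessibilityNodeInfo) {
            super.onInitializeAccessibilityNodeInfo(host, info)
            if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.UPSIDE_DOWN_CAKE) {
                info.setRequestInitialAccessibilityFocus(true)
            }
        }
    }
}
```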

With setMinDurationBetweenContentChangesMillis, an element tells TalkBack (version 14 and later) how often changes should be announced while that element is focused. A stopwatch counting up in seconds served as the example in the presentation: do you really want every single second announced, or is every ten seconds enough?
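
A sketch of that stopwatch example, assuming API level 34. Note one assumption: the framework setter appears to take a java.time.Duration and to be spelled without the Millis suffix, which may belong to the AndroidX compat variant, so treat the exact signature as unverified:

```kotlin
import android.os.Build
import android.view.View
import android.view.accessibility.AccessibilityNodeInfo
import java.time.Duration

// Sketch: throttle announcements for a stopwatch view so TalkBack
// speaks at most once every ten seconds instead of every second.
fun throttleStopwatchAnnouncements(stopwatch: View) {
    stopwatch.accessibilityDelegate = object : View.AccessibilityDelegate() {
        override fun onInitializeAccessibilityNodeInfo(host: View, info: AccessibilityNodeInfo) {
            super.onInitializeAccessibilityNodeInfo(host, info)
            if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.UPSIDE_DOWN_CAKE) {
                // Assumption: framework spelling setMinDurationBetweenContentChanges(Duration).
                info.setMinDurationBetweenContentChanges(Duration.ofSeconds(10))
            }
        }
    }
}
```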

The Android Accessibility Test Framework is a collection of automated accessibility checks. Although they do not replace tests with real users, they can catch certain errors in advance and suggest fixes. This now also works for user interfaces created with Compose.
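
For View-based UIs, the established entry point is enabling the checks once in an Espresso test class, as sketched below; the Compose integration announced here may expose a different hook:

```kotlin
import androidx.test.espresso.accessibility.AccessibilityChecks
import org.junit.BeforeClass

// Sketch: turn on Accessibility Test Framework checks for all view
// interactions in this test class. Detected issues such as missing
// labels or insufficient touch-target size fail the test.
class AccessibilityCheckedUiTest {
    companion object {
        @JvmStatic
        @BeforeClass
        fun enableAccessibilityChecks() {
            AccessibilityChecks.enable().setRunChecksFromRootView(true)
        }
    }
}
```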

Compose 1.4 also introduces a new algorithm for determining focus order. The decisive factor is no longer the tree structure of the UI but the size and position of controls. This is particularly advantageous for a bar at the edge of the screen, as on the new Pixel Fold. More focus-order APIs are in the works.
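
Compose's semantics API is where such focus-order controls surface. A sketch using isTraversalGroup and traversalIndex, which landed in Compose releases around this time; exact availability depends on the Compose version, and the composable is illustrative:

```kotlin
import androidx.compose.foundation.layout.Column
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.semantics.isTraversalGroup
import androidx.compose.ui.semantics.semantics
import androidx.compose.ui.semantics.traversalIndex

// Sketch: keep a container's children together in reading order and
// nudge one element to be read earlier than its siblings.
@Composable
fun SidebarReadFirst() {
    Column(modifier = Modifier.semantics { isTraversalGroup = true }) {
        Text(
            "Sidebar item, read first",
            modifier = Modifier.semantics { traversalIndex = -1f }
        )
        Text("Main content, read second")
    }
}
```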

In addition, Google wants to support the development of accessible apps with further materials. One document covers interface specifics on Wear OS, while new code samples for accessible programming with Compose are available on GitHub. Google also invites app developers to submit accessibility feedback.


(ds)
