Last week was our Summer Miniconf, and we had some fantastic speakers. One of them was Jamie Hale, who spoke to us about adaptive apps for Android. It was so good, we thought we would share it again. Have a read!
While iPhone has quite a few accessibility settings built in, Android offers a lot more variety. I like that it's very customisable, with a lot of options. I use Android on both my phone and my tablet, with a variety of apps and equipment that I either use (or have used) myself, or have experimented with.
My phone is the Samsung Galaxy S10, while my tablet (which I use less) is the Galaxy S5e. I have the S10 in an Otterbox case, with a tempered glass screen protector, and I have the S5e in a Spigen Tough Armor case. I own two Meru Flexzi mounts, the two-strand mount and the three-strand mount, and I have extension kits for both. I then have Velcro attached to the back of both my phone and my tablet, to hold them onto the mount. This means I can keep them positioned in a way that works for me, without having to hold their weight.
I’m going to go through some problems you might have with accessing your Android device, and then look at solutions. They aren’t all things I use myself but are things I’ve tried for personal use or interest.
“I can’t easily reach the whole screen of my device”
In the built-in Accessibility menu (Settings // Accessibility // Interaction and dexterity // Accessibility menu), there is a cursor function which lets you control the screen using a small cursor pad placed on one side, as if it were the touchpad of a laptop.
There is also an app called Reachability Cursor, which lets you use a pre-defined area of your screen as a cursor pad for reaching the rest of your screen. A free version gives you access to some of the functionality, while there's a Pro version for £3.49 that allows you to do everything you could do by tapping your screen (such as long clicks, or clicking and dragging). I find that this works better than the built-in one, because you can more easily define where you can touch and which parts of the screen you need it to reach.
“I can’t touch the screen of my device (or can tap a tiny part of it, but not use a cursor)”
Using switches (I have the Blue2 Bluetooth switch, and a handful of micro light-touch switches (these ones, I think) for the Xbox), you can either have a grid separating each item, or two lines moving over your screen, and you merely tap the switch when you want to click on something. If you look in the accessibility menu, you can also use a keyboard as a switch. I'm currently trying to figure out if I can use a Bluetooth camera shutter to do this, because I want to have a switch to start Google Voice Access.
You can also use tapping anywhere on the screen as a switch, if you can reach a part of it but not enough to control a cursor – these are good instructions.
“I can use my device but struggle to type on it”
Gboard – the Google keyboard – is pretty good at taking dictation, though it can struggle with speech patterns and noise from NIV machines. I find it quite good, especially for quick messages (and have it in my email signature that there may be errors!). It also has a swipe keyboard, where instead of having to touch individual letters you can drag a finger/stylus across the word without having to lift it.
“I can’t touch my device, but I can speak clearly”
Again, this may struggle with NIV, but Google Voice Access controls the whole device using your voice (or a mixture of that and a switch). It numbers every item on the screen for you to select, and is quite intuitive once you're used to it, though the learning curve can be steep.
The other main problem with it is that it responds to the same wake-up command (“OK Google”) as the Google Home/Nest series, meaning that both respond when I talk in the house (which gets quite confusing). You can set up a trigger switch to start this, so I'm trying to figure out how to set up some form of voice-controlled switch that responds to a different command.
“I can’t touch my device or a switch, but I have good head control”
There are two options here.
One is that Android can use the front facing camera for switch access (turning head left, right, up, down, blinking, or opening mouth). This means you don’t need an external button, but you do need the camera to be pointing at your face, so it’s better when you’re still than when you’re on the move.
The other is a head mouse, which controls the device by tracking your head movements through the front-facing camera to move a pointer. I've used Eva Facial Mouse in the past, and there is also Eva Facial Mouse Pro, an updated version with a similar interface. This relies on you being still and in good lighting, but it actually works very well if you can move your head enough.
“I can’t move at all”
There are systems for controlling an Android device using Eyegaze technology, but they’re not something I know much about.
I hope that was helpful for someone – and sorry that I don't know more about Eyegaze control for Android! If you have any questions, contact me at @jamierhale on Twitter.