deborah ([personal profile] deborah) wrote, 2014-05-14 08:02 pm

hands-free computing part 1: non-mobile

Happy Global Accessibility Awareness Day! I'm very excited about the presentation I'll be giving tonight (that's grown from this one) at Fresh Tilled Soil's Boston GAAD event. I'm looking forward to the other speakers, as well; I've been reading Kel Smith's book, actually.

I want to give a very brief overview of how I use technology, since enough people have asked. I'm including the various technologies (hardware and software) I use, as well as some of their perks and frustrations. Part one is my non-mobile experience: Windows, Linux, Mac.

For context, I used to be about 99% hands-free, and now I am more like 80% for actual coding/writing and maybe 50% for just dicking around online. Hooray, vast improvement! But I still have 100% hands-free days, and I need to be able to control the computer completely. I'm a programmer in my day job, and in my free time I sysadmin, code open source, write book reviews, and spend a lot of time on social media. In other words, I'm on a device the vast majority of my waking hours.

Operating System and Environment

Mac is easy to explain: I can't do it. I know people with limited typing who use Mac Dragon Dictate, and it is a reasonably good tool for dictating text. But it's not designed for command-and-control, and Macs are simply not tools for fully hands-free computer use. It's getting much better, and if you need dictation as a supplement it may well serve you splendidly. But it's not adequate for full control, and I can't imagine programming via Mac Dragon Dictate.

Linux is also pretty easy to explain: I use command-line Linux via PuTTY terminals. The issues limiting speech recognition on Linux platforms are a whole new blog post (and one day I might make that post; I used to contribute to XVoice, so I have Many Feelings about the topic). When I must use Linux GUI apps, I either use Cygwin/X on Windows, and struggle with how slow it is, or I use virtual network connections or virtual machines, and struggle with limited voice control.

So on the desktop (and laptop and server), I'm married to Windows. I am not religious about Windows, and in fact if a Unix-backed OS would solve my hands-free problem, I would be a happy camper. But I do not have a choice about Windows as environment. Sorry, evangelists, I know Windows makes you sad. It makes me sad, sometimes, as well -- but it's also the only environment that is astonishingly good at hands-free and mouse-free computing. Convince Tim Cook and Richard Stallman to prioritize non-mouse accessibility as much as Microsoft has.

Software: Speech To Text / Speech Recognition and other tools

The software tools I use are:
  • Dragon NaturallySpeaking Professional
  • Natlink / Vocola
  • PC By Voice SpeechStart
  • WorkRave
  • Firefox with the add-ons:
    • Mouseless Browsing
    • NoScript and RequestPolicy
    • A number of tools that improve general usability, such as DownThemAll, Tab Mix Plus, Session Manager, etc
  • Old versions of Opera :(
  • The Windows setting Sticky Keys


Dragon NaturallySpeaking, for all its myriad flaws, is a dream of a program. Without it, I would never be able to be employed full-time, let alone as a programmer. NatSpeak allows complete hands-free control of a Windows computer (with a very small number of exceptions). I use it in conjunction with Vocola, a voice command language enabled by Natlink, a Python-based NaturallySpeaking extension. I only started using Vocola and Natlink last summer, after well over a decade of using NaturallySpeaking, because it took that long to make the cognitive space available to learn yet another dictation-based skill set. And yet, now that I've learned it, I'm awed by how much simpler it makes my life. Rather than building all of my Dragon commands laboriously in a complex, poorly documented Visual Basic interface, I can rapidly create simple, easily shared macros using Vocola.
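For the curious, a Vocola command file is just a plain-text list of spoken-phrase = action pairs, one file per application. A minimal sketch (the file name and the specific commands here are invented for illustration, not ones I actually use):

  # notepad.vcl -- hypothetical commands, active only in Notepad
  save this = {Ctrl+s};
  close window = {Alt+F4};
  # quoted text is typed out literally:
  insert signoff = "Thanks much, Deborah";

Say the phrase on the left and Dragon sends the keystrokes or types the text on the right. Compare that to wiring up the same commands through the Visual Basic interface and you can see why I'm a convert.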

PC By Voice SpeechStart was a lifesaver after a deeply annoying NatSpeak behavior that was introduced a couple of Dragon versions ago. NaturallySpeaking has two options for telling the microphone not to listen to you: "go to sleep," which tells the microphone to keep listening until it hears the phrases "wake up" or "listen to me"; and "microphone off," which turns off the microphone. If you are controlling your computer hands-free, you don't want to turn the microphone off completely, because then you will need to use your hands to re-enable it. But in that version, Dragon made the sleeping microphone much too sensitive, so that it heard completely random sounds and words as if you were saying "wake up." There was about a year in there where I sent some pretty ridiculous IMs, just by not noticing the microphone turning on. And then, to the rescue, PC By Voice. SpeechStart takes advantage of the native Windows Speech Recognition (which is actually an adequate pre-installed speech-to-text solution if you can't afford NaturallySpeaking) to let a NaturallySpeaking user safely use the "microphone off" command: it programs Windows Speech Recognition to listen only for the phrase "switch on microphone." I think in the years I've been using it I've had maybe two false positives ever, which is a fantastic rate.

WorkRave is a break timer. Once I learned how to change the audio tones to less aggravating sounds, I became much better at leaving it on and doing what it tells me. I am still prone to disabling it when I'm feeling irritable or rushed, but it is pretty much always right about when I need to stop, since WorkRave barely registers me dictating at all, and only complains when I type. Which I shouldn't be doing.

I find that any browser add-ons that make my life slightly easier are actually a huge productivity win. When you are dictating, so much computer use is cumbersome and cognitively wearying. Anything that speeds it up even a tiny bit makes a big difference. But I do use a few add-ons specifically for accessibility purposes. Mouseless Browsing is the big one -- as Opera's accessibility support has deteriorated over the years, Firefox plus Mouseless Browsing is what makes it possible for me to use the web. Basically, it puts a tiny actionable number next to every clickable link, so I can type/dictate the number to follow the link, open it in a new tab, download it, etc. Development is currently suspended and one of these days it's going to stop working and I'm going to have to fork the code and start maintaining it myself. :( As for NoScript and RequestPolicy, I use them because there are still sites on the web I can't use at all unless I disable JavaScript. Which is bad JavaScripting; it's not hard to write accessible JS, people.

Opera used to be amazing for accessibility. I haven't even updated to the newer WebKit-based versions and never will; when Opera 12.16 gets too old for modern web pages I'll have to give up on its wonderful single-keystroke control keys. Dammit. But for now I love it, interchangeably using Opera or Firefox as the need demands.

Windows Sticky Keys is an accessibility built-in that I believe all operating systems share: the ability to enter chorded key bindings (e.g. Alt+F4) as a sequence of separate keystrokes instead of one simultaneous chord. Since I can't reliably use my pinkies or two simultaneous fingers of the same hand on the best of days, I wouldn't be able to type without Sticky Keys. ...Which, come to think of it, would be a good reason not to use it at all. Freaking keyboards, man. Anyway it's trivial to enable on any Windows computer (through the Ease of Access section of the Control Panel, or just by hitting Shift five times in a row) and is moderately configurable in its behaviours.

Hardware


I actually don't use too much specialised hardware.

I use wireless microphones to talk to Dragon. Their recognition is worse than that of high-quality wired mics, but I found wired microphones too fragile; I was having to replace them annually.

My keyboard is a re-mappable Kinesis Advantage Pro. It's not perfect: there's too much action on the keys, which limits how long I can use it before the vibration causes too much pain. But the ability to remap keys from the weaker outside fingers to the Kinesis' thumb keypads shifts much of the base typing to much stronger parts of my hands.

I use the LightIO Touchless Touchpad. It's actually deeply annoying for any fine control, but almost entirely pain-free since you don't actually need to touch it. And it's so annoying to use that I dictate more (which I should) instead of using the touchpad. I also use programmable touchpad controls as if I had a simple switch attached to my computer. For example, I've programmed "rest three fingers on the built-in touchpad" to mean "start NaturallySpeaking and Speech Start." Very handy when they crash!

I also use an ultraportable ThinkPad with a fingerprint sensor, which means I don't have to type my password, or move a heavy object.

And that's it for special hardware.

[personal profile] jordanwillow 2014-05-15 02:14 pm (UTC)(link)
This was super interesting. Thank you.

[personal profile] yendi 2014-05-15 06:14 pm (UTC)(link)
FWIW, there are definitely folks at Apple evangelizing about physical access issues within the company -- at the UMB event I'm at today, Jon Landis, a chem professor who now works for Apple, spent a couple of minutes on an aside kvelling about how the education division helped drive Apple's requirement that all of their laptops be able to be opened with one finger or one point of contact.

That said, they've really spent much more time focusing on the BVI range of accessibility issues than ones revolving around non-mouse access. It's helped make the mobile side amazing, but I'd love to see accessibility tools on the computers beyond VoiceOver.

Thanks for this post, btw (and the follow) -- it's going in the ever-growing list of toolkit things to keep in mind.

[personal profile] synecdochic 2014-05-15 10:43 pm (UTC)(link)
Weirdly, I didn't have any problem teaching Dragon on the Mac to do command-and-control -- in fact, I found it much, much easier than dictation (but that's because the cognitive load of composing meaningful communication to speak out loud, for me, is orders of magnitude higher than the cognitive load of composing meaningful communication to type -- there's something really fucking broken in my brain-to-mouth pathway that's not broken in my brain-to-fingers pathway). I'm not sure if that reflects the fact I was using a very recent version of Dragon and things have improved considerably, or if it's because of my willingness to program all kinds of shortcuts, macros, etc.

It also helped, I think, that Dragon for Mac uses AppleScript instead of VB, and I'm reasonably competent in AppleScript (or used to be, and was warming it back up out of cold storage until I decided dictating-qua-dictating wasn't working for me and it wasn't worth investing more time and cognitive load in adapting the tools more).

If it weren't for the struggles I was having in finding a wireless microphone that worked for me and worked on my OS, I probably would still be using Dragon for command-and-control and saving my typing for the actual composition bits. I found it very helpful, especially since I have always been a keyboard C&Cer (even during the dark years when Mac operating systems made it as hard as possible to C&C via keyboard) and many of the keyboard shortcuts force the largest burden of chording onto the left hand, which is my worse hand. (I was going to say 'my bad hand', but, well, they're both pretty bad.) I often find myself, when doing heavy formatting work or rapid program switches, with my left hand splayed in very unnatural positions and held there unconsciously to minimize time: thumb on the clover key, pinky on the tab key, middle finger on the W key, index finger going back and forth between X, C and V. (Clover-X, clover-tab, clover-V, clover-tab, clover-W, clover-X, clover-tab... cut, switch program, paste, switch program, close window, cut, yadda.) It lets me get through data entry or formatting or whatever ten times faster than other people, but it does take a physical toll.

(I'm not kidding on the 10x faster. I had a job in my early 20s that involved basically robotic data entry with a little bit of discretion, and it wasn't scriptable. I processed over 80% of stuff, in a department of 15 people.)

Anyway. I do not mean to suggest that I am in any way characteristic of what others' experience would be, just that I found C&C using Dragon Dictate to be very easy (and much less frustrating than the actual dictation/composition part).

[personal profile] alierak 2014-05-16 09:23 pm (UTC)(link)
You mean Linus Torvalds, right, not Richard Stallman? My friend Alex was a typist for RMS in the early '90s when he had to dictate everything due to hand pain. I'm pretty sure this would have involved typing only, since everything in Emacs can be done without a GUI.