Mar. 25th, 2011

deborah: the Library of Congress cataloging numbers for children's literature, technology, and library science (Default)
Yesterday, on the DCA blog, I posted "Accessibility and back office archives tools", for which I made a screencast of myself using NaturallySpeaking to drive a less-than-accessible tool. There was enough positive feedback about the screenreader screencasts I linked to that I thought there might be some interest in these as well.




In an entirely unrelated aside, when did it become acceptable for un*x programs to start shoving everything -- configuration, logs, state, data -- into /usr/local? (Yes, Tomcat, I'm looking at you.) In my day, whippersnappers, you put your configuration into /etc, your logs into /var/log, your state into /var/run, and your data into whatever was appropriate based on your file system. With obvious modifications based on what operating system you are actually running, maybe using /opt or something instead of /usr/local, etc. In theory, you should be able to get by without even backing up /usr/local, because you could rebuild it completely from source or package, what with all your configuration and state and logs being stored in other places. And as a side effect, it always had a very controllable and knowable size, because it didn't have things like logs that grow arbitrarily if unexpected things happen, and sometimes are exceedingly difficult to roll on a regular basis, and yes, Tomcat, I am still looking at you.

Is this based on a theory of file system management that changed while I wasn't paying attention, or is it just sloppiness enabled by the new ubiquity of good un*x package management?
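For anyone who came of age after package managers made this invisible, the traditional split described above can be sketched as a simple table. This is an illustrative sketch of the general convention, not any one distribution's exact rules; the paths vary by operating system, as noted above.

```python
# A rough sketch of the traditional un*x file layout described above.
# Illustrative only: exact paths vary by OS and distribution
# (e.g. /opt instead of /usr/local, /run instead of /var/run).

FILE_LOCATIONS = {
    "configuration": "/etc",        # hand-edited config files
    "logs": "/var/log",             # grows arbitrarily; must be rolled
    "runtime state": "/var/run",    # pid files, sockets, lock files
    "software": "/usr/local",       # rebuildable from source or package
}

# The payoff: everything under /usr/local is reproducible, so a backup
# of /etc and /var is enough to rebuild the machine.
NEEDS_BACKUP = [what for what, where in FILE_LOCATIONS.items()
                if not where.startswith("/usr/local")]
```

The point of the split is the last comment: because /usr/local holds only rebuildable software, it has a controllable, knowable size and can be left out of backups entirely.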
deborah: Kirkus Reviews: OM NOM NOM BRAINS (kirkus)
You know how [livejournal.com profile] diceytillerman and I are always talking about the beauty of criticism and deconstruction?

This is what we are talking about:

[XKCD comic: "Beauty", behind a cut]

[Transcript, behind a cut]
deborah: the Library of Congress cataloging numbers for children's literature, technology, and library science (Default)
An open letter to those implementing mobile device accessibility:

I know that hands-free mobile device control is difficult, and I am grateful for the amount of voice control which has been implemented so far. The ability to dial a number, send a text, send an e-mail, or leave a memo are all useful. Now here's what I would like to see next:

  • A microphone which stays on until turned off, rather than tap-to-speak. I understand this could have implications for users who don't know how to use it, but then, the same goes for having a telephone in the first place.

  • A 36-item vocabulary, probably native to the phone, of the letters in the alpha-bravo alphabet and the digits 0-9.

  • The ability to start an app installed on the phone by saying "start [app name]". E.g. "start Angry Birds". (No, I have no idea how to control Angry Birds by voice. I just don't know the names of many mobile applications, as I don't have one, because I still can't use one. Hence this post.)

  • A seven-item vocabulary, probably native to the phone, that can be used in webpages: page up; page down; back; forward; show numbers; go to address; press enter. "Show numbers" would put a number next to every clickable or selectable element (much like the Firefox extension mouseless browsing), allowing those items to be selected by dictating from the digit vocabulary.

  • The command "microphone off".

  • The command "dictate here", allowing the user to open up a remote-processed standard dictation window in any field or application.


Now, I will admit that I have never done any mobile programming, and I have no idea what the limitations are for vocabulary recognition. Am I mistaken in my belief that adding another 46 items to the local-to-the-device vocabulary (on top of the ones that already exist such as "send a memo to") is something a contemporary mobile device should be able to handle?

As a bonus, I see in the Android accessibility best practices that all applications should be designed to pay attention to the directional controller and not just the touchscreen. Great, that opens up the possibility for four more voice commands: up, down, left, and right. That brings us up to 50 desirable items in the native vocabulary.
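The arithmetic above can be checked by just writing the whole wishlist vocabulary out. This is purely a hypothetical sketch of the proposal in this post; none of these names correspond to a real platform API.

```python
# Hypothetical sketch of the proposed native command vocabulary.
# Nothing here is a real mobile API; it just enumerates the wishlist above.

NATO_LETTERS = [
    "alpha", "bravo", "charlie", "delta", "echo", "foxtrot", "golf",
    "hotel", "india", "juliett", "kilo", "lima", "mike", "november",
    "oscar", "papa", "quebec", "romeo", "sierra", "tango", "uniform",
    "victor", "whiskey", "xray", "yankee", "zulu",
]                                          # 26 letters
DIGITS = [str(d) for d in range(10)]       # 10 digits -> 36 so far

BROWSING = ["page up", "page down", "back", "forward",
            "show numbers", "go to address", "press enter"]  # 7 web commands

CONTROL = ["microphone off", "dictate here",
           "start [app name]"]             # 3 more (app name filled at runtime)

DIRECTIONAL = ["up", "down", "left", "right"]  # bonus: d-pad navigation

base_vocabulary = NATO_LETTERS + DIGITS + BROWSING + CONTROL
full_vocabulary = base_vocabulary + DIRECTIONAL
```

Counting it out: 26 + 10 + 7 + 3 = 46 items beyond what phones already recognize, and 50 with the directional commands.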

Can your phone handle that? And if not, can the next generation of your phone handle that? And if not, why not?



(Geeze, I'm starting to feel like I should add HV1569.5 to my default icon.)