Thread: About Windows 11's high system requirements. You know, a lot of blind people, who don't have jobs, live on social security and disability money, and who definitely don't have the newest computers, won't get Windows 11. This could have been a great chance for Linux to step up and say loud and proud "Because we support every person's ability to choose their system, and use and learn about computers, we will never force upon users what system they must run. And because we stand proudly with people with disabilities, all blind people are welcome in the world of free and open source software, where they can learn and create just like everyone else."
But no. GNOME, one of the most popular desktops on Linux, is trash at accessibility. KDE is working on it, but that'll take years. Who's ever heard of MATE? And who makes current software for the command line aimed at users, not other developers?
Also, it's not enough that GNOME is trash, or that KDE is slowly trying, or that the command line is mainly for developers. When a user installs Linux and needs assistive technology, like Orca, they can't just enable it and go on their way. They have to check a box in settings to "enable" assistive technologies. That's a huge barrier, and it shouldn't exist. But it does. Another roadblock. Why do these exist in a supposedly welcoming community? Why do these exist if Linux is open to all? Why? If FOSS is communal, why are blind people, due to the huge barrier of entry, shut out of the FOSS OS? These are hard questions we should be working through. Why does the GUI require assistive technology support to be enabled before Orca will work with many apps? Why can't it be enabled by default? Does it slow things down? If so, why? And should we have to live with a slower OS because we're blind?
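For what it's worth, the "enable assistive technologies" checkbox can be flipped from a terminal too. This is a minimal sketch, assuming a GNOME/GTK session; the GSettings key names here come from GNOME's schemas and may vary by version or desktop, so treat it as an illustration of the barrier rather than a universal fix.

```shell
# Sketch: turn on assistive technology support without hunting for the
# Settings checkbox. Assumes GNOME/GTK and the gsettings tool.
enable_a11y() {
  if command -v gsettings >/dev/null 2>&1; then
    # Tell GTK apps to load their accessibility (AT-SPI) support.
    gsettings set org.gnome.desktop.interface toolkit-accessibility true
    # Ask GNOME to start the screen reader (Orca) for this session.
    gsettings set org.gnome.desktop.a11y.applications screen-reader-enabled true
  else
    # Not a GNOME/GTK setup; other desktops store this elsewhere.
    echo "gsettings not found; not a GNOME session" >&2
  fi
}

enable_a11y
```

Of course, a blind user who hasn't yet enabled assistive technologies may not be able to reach a terminal either, which is exactly why "on by default" matters.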
I think you are right. And there is something you can do about it: provide a checklist for barrier-free UI design. Most devs don't need assistive support themselves and have no clue what it involves. They just don't know about it, because it's beyond their perception.
Just TELL them what to do!
One of the things Microsoft is good at is funding UI research to see how people use computers, and to meet national procurement laws they put the hard work into accessibility.
I watched https://emacsconf.org/2019/talks/08/ to try to understand how to use a computer without vision, but it could also help to have examples of use.
Someone once posted a recording of a screen reader going through an emoji-heavy post, to make it very clear how annoying that is.
@tychosoft @devinprater @alienghic
When I think back, the first computers I used (Apple II, C64) I could actually use "headless", because you had to know all the basic commands by heart. Then, with Windows 3, there was a kind of GUI, but it wasn't what a GUI is now. The description "desktop" fit, because it was just a surface where you could put your items - 1/3
just where you wanted them.
I organized my desk in a bunch of "drawers", folders with launchers.
I think we should go back to the idea of a desk.
Everyone uses desks, but not the same way. A desktop for the blind must be a non-visual UI.
I think it's serious business to think through how that could work. Is a mouse involved? Hover-over with speech output?
Maybe we start a hashtag to have a wider @tychosoft @devinprater @alienghic - 2/3