Posts in "assistive tech"

Giving My Blog a Voice

As a visually impaired person, I tend to listen to most of the content I find on the web, whether it’s blog posts or social media entries. For the most part, I rely on the built-in text-to-speech feature on my iPhone, iPad, and Macs. It’s great: it lets me “read” almost anything I come across, and it’s reliable. I love it for the freedom it gives me.

That said, it could be better. The voice is pretty robotic, and it takes some getting used to, especially at the higher speeds I tend to listen at. Luckily, I’ve been using it for so long that it doesn’t bother me. Still, I know there are better options out there today. ElevenLabs, for instance, has some fantastic voices. They sound natural and, to my ear, are much more pleasant to listen to.

So I thought… “Why not combine my posts with ElevenLabs and create audio versions?” After playing around for a bit, I did just that. Here’s my current workflow:

  1. Write the first draft as I normally would.
  2. Edit it until it’s just the way I want it.
  3. Run it through ElevenLabs to generate an MP3 file.
  4. Download the file to my computer.
  5. Upload it to my blog host, copy the HTML, and paste it into the post.
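Steps 3 and 4 could in principle be scripted. Here’s a minimal sketch that assumes ElevenLabs’ public text-to-speech REST endpoint; the voice ID, model ID, environment variable name, and file paths are placeholders, and step 5 (uploading to the blog host) stays manual:

```python
# Hypothetical automation of steps 3-4 of the workflow above.
# Assumes the ElevenLabs REST text-to-speech endpoint and an API key
# in the ELEVENLABS_API_KEY environment variable; VOICE_ID is a placeholder.
import json
import os
import urllib.request

API_URL = "https://api.elevenlabs.io/v1/text-to-speech/{voice_id}"


def generate_mp3(text: str, voice_id: str, api_key: str, out_path: str) -> None:
    """Send the finished draft to ElevenLabs and save the returned MP3."""
    req = urllib.request.Request(
        API_URL.format(voice_id=voice_id),
        data=json.dumps({"text": text, "model_id": "eleven_multilingual_v2"}).encode(),
        headers={"xi-api-key": api_key, "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp, open(out_path, "wb") as f:
        f.write(resp.read())


def audio_embed_html(mp3_url: str) -> str:
    """Build the HTML snippet pasted into the post (step 5)."""
    return f'<audio controls src="{mp3_url}"></audio>'


if __name__ == "__main__":
    post_text = open("post.txt").read()
    generate_mp3(post_text, "VOICE_ID", os.environ["ELEVENLABS_API_KEY"], "post.mp3")
    print(audio_embed_html("https://example.com/audio/post.mp3"))
```

After uploading the MP3 to the blog host, the `audio_embed_html` snippet (or whatever embed code the host provides) goes at the top of the post.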

It’s a very manual process, and I’m sure someone out there will tell me there’s a faster, smarter way to do this. I’m all ears for constructive criticism if it improves my workflow.

Now anyone who wants to listen to my most recent blog posts can just hit “play” and enjoy, whether they’re visually impaired or simply don’t feel like reading.

My intention was to do this for all of my posts, but I quickly found out that this costs real money. I blew past the token limit on the ElevenLabs free trial and had to upgrade to a paid plan. Right now it’s only a few bucks a month, but even that won’t cover converting every single post. That’s why I’m only able to offer this for the latest ones. I’ll slowly work on converting older posts, but it may take a while.

Again, if anyone knows of a service that offers ElevenLabs-level quality at a lower price, please leave a comment below. Thanks, and I hope you enjoy the new audio versions.

🎧 Listen to this post. Audio was generated using ElevenLabs.

#Accessibility #ElevenLabs #Blogging

Two Small Updates That Make a Big Difference

Two essential accessibility tools for us visually impaired computer users are the magnifier and text-to-speech. Personally, I use the Magnifier/zoom feature to navigate around, but I lean heavily on text-to-speech when “reading” anything on websites, PDFs, or other documents. Both Windows and macOS offer these tools, and they’re virtually the same across platforms.

I’ve had an easier time using them on macOS because I use a trackpad, where a three-finger swipe zooms the screen in and out. This is very smooth, and I can control it precisely. Windows, on the other hand, relies on keyboard shortcuts for the Magnifier: you typically press the Windows key and the plus key to zoom in, or the minus key to zoom out. The problem? It’s very clunky. I’ve always had to hold the Windows key and repeatedly tap the plus or minus key to reach my desired zoom level. A workaround was to get a configurable mouse like the MX Master series and map the mouse buttons to zoom in and out more easily. Well, apparently one of the latest Windows 11 updates has changed this.

Most of us have a preferred zoom level we like to work at comfortably, but we do need to zoom out occasionally to get our bearings. In the past, it was especially annoying to repeatedly press a key to zoom out in steps. This new update apparently lets you press Control + Tab, then use the plus or minus keys to zoom very quickly: once you’re zoomed in to whatever level you want, the minus key zooms all the way out in one go, and the plus key returns you to your previous zoom level.

The other improvement involves the text-to-speech voices. If you’ve ever heard anyone use text-to-speech, or use it yourself, you know the voices tend to be pretty robotic. I have to say, they’re getting better, something you really notice if you’ve been using these voices for a while. They were horrible before and are pretty good now, although they still don’t sound completely natural. Well, Microsoft, along with other companies, has started working on that. This Windows 11 update includes new voices that sound more natural and are more comfortable to listen to, especially for the longer articles you might come across.

The YouTube video below, by The Blind Life, walks through both of these updates in more detail. If you or someone you know uses these features and would benefit from watching it, please share.

#Accessibility #AssistiveTech

Retinal Implant

This retina implant lets people with vision loss do a crossword puzzle

Thirty-eight patients in Europe received a PRIMA implant in one eye.

Since I suffer from vision loss myself, I’m always on the lookout for articles or videos about the latest technology to address it. This particular technology is a chip implanted in the human eye, designed to provide vision gain using an external camera mounted on a pair of glasses. It allows someone with peripheral vision but no central vision to make things out.

In this particular study, researchers implanted the chip in 38 people. On average, participants were able to read five more lines on the eye chart, with each line progressively smaller.

That’s amazing to me because even seeing one line below my current level would be a Herculean feat. I assume it’s the same for these people suffering from low vision.

It reminds me of Ray Kurzweil’s book The Singularity Is Near, where he argues that the technology we have as humans will one day merge with us to create a superhuman. He continues by saying that eventually, we will lose our humanity and human traits and become all cyborg, if I remember correctly. I don’t like that part, but I do subscribe to the idea of using advanced technology to improve our quality of life, like in this case.

I understand it’s not perfect because it requires a chip implanted in your eye as well as an external camera to help you see. But if you’ve ever met or spoken to someone with a visual impairment, you quickly discover that we already have to use a lot of external technology, and usually, it’s very bulky and extremely expensive because it’s in that vertical market.

I have my hopes set high for this kind of study and these trials. To me, it always seems like this field is stagnant, but that’s only because I want it to speed up and come up with a permanent solution for all people with visual impairments.

More Accessible Kindle

Better accessibility coming to Kindle books

Kindle Assistive Reader is… fully available in the update that was released a couple of days ago. It will essentially convert any Kindle book into an audiobook, utilizing an advanced text-to-speech system that sounds natural.

I had the old Kindle Keyboard and loved it because it had built-in speakers, which made it possible for me to listen to my ebooks. Even though the text-to-speech was terrible, at least it worked. Then Amazon removed the speakers altogether in favor of Bluetooth connectivity alone, and that experience wasn’t worth it to me. I’m glad to see that they’re once again making an effort to make their Kindles more accessible.

Smart Glasses

I haven’t been a fan of Facebook (now a Meta subsidiary) since it was a standalone website back in my college days. Even then, that experience was short-lived. However, for the first time since then, I paused to consider setting up a “burner” Meta account. My sole reason? To try out the Ray-Ban Meta Glasses with Be My Eyes and see what they offer the visually impaired community.

Apparently, people with vision loss are doing some pretty interesting things with this combination, gaining more freedom in their daily lives. For example, they’re using the Meta glasses to call up a volunteer for help with tasks like:

  • Configuring computer BIOS settings
  • Playing video games
  • Performing daily tasks hands-free

All of these possibilities are extremely enticing to me. My only real drawback to jumping on this technology right away is Meta.

Then again, I recently read about a similar pair of glasses that isn’t released until October, if memory serves: the Ally Solos Glasses, a product of the partnership between Solos and Envision. Since Envision is an established assistive technology company with extensive experience serving visually impaired users, these glasses might be better tailored to that community.

Maybe I just talked myself into a solution that will work for me? I’ll definitely look more into the Ally Solos Glasses. If I like what I see, I may just pre-order them to see how they work in practice.

Follow Me On Mastodon | Buy Me A Coffee