Apple adding new disability-friendly features to iOS

The Blind Travels logo, which features a silhouette of a photographer and a guide dog in front of the Apple logo.

Apple today is announcing improvements to the iOS operating system, adding new features for users with a variety of disabilities. Of particular interest to the visually impaired community are the improvements to the VoiceOver screen reader, which will now allow users to identify the contents of images onscreen. At its core, the screen reader will be able to identify people and their emotional state – smiling, crying, etc. – and read the text on memes, which will be a boon to the blind and low vision communities. I know Apple has been working on this technology for a few years now, and I for one am happy to see it implemented as a core feature of the operating system.

I’m a huge advocate for accessibility in social media, and as part of that I am constantly on a crusade to educate social media influencers on the importance of proper hashtag usage and on how image descriptions can not only make the social media experience better for those who are blind and visually impaired, but also have the added benefit of reaching a new audience for their content. Good job to Apple for continuing to make accessibility in its products a priority.

Meanwhile, improvements are coming to Apple’s VoiceOver screen reader to allow users to explore objects within images. Apple Watch will become more sensitive to muscle movement and tendon activity to give people with limb differences control over the device without using touch. And, the company is launching a new service called SignTime to allow customers to communicate with Apple support and retail representatives using American Sign Language.

I love to hear from my readers; if you have comments or questions about this or any other article on Blind Travels, please drop me a message on my contact form. You can read my article about making social media more accessible below, and also read the entire article on the upcoming developments Apple is working on for its iOS platform.

Disability Scoop’s article

Apple Launching New Accessibility Features For Those With Disabilities – Disability Scoop

My article on making social media more accessible. 

Making your social media content accessible to the visually impaired


Honda creates new shoe-based navigation for the visually impaired

Honda's new in-shoe navigation system. The photo shows a pair of shoes and a smartphone with a map app open.

With the rise of vision impairment in the population, Honda is joining other companies like Microsoft in creating products for the blind and visually impaired market. From Honda on the creation of Ashirase, Inc.:

Honda Motor Co., Ltd. recently announced the establishment of Ashirase, Inc. It is the automaker’s first business venture to come up from Ignition, Honda’s new business creation program.

The Product

Ashirase, Inc., has created an in-shoe navigation system (also called Ashirase) which utilizes a smartphone app and GPS to aid the user in navigating a route entered into the app. Navigational alerts are delivered to the user through vibrators in the shoes. If the user is “on track” for their route, the vibrator in the front of the shoe activates. If the user is off course, there are vibrators in the left and right sides of the shoe to steer the user in the proper direction. The product is slated for market availability before March 31, 2023. According to the articles about the new device, it is intended to replace the white cane and free up the user's hands when traveling.
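Based on that description, here is a rough sketch of how I imagine the left/front/right vibration logic could work. To be clear, this is just my own illustration of the idea; the function, the 20-degree threshold, and the angle math are my assumptions, not Ashirase's actual implementation.

    # Illustrative sketch of directional haptic feedback.
    # My guess at how such a system could work, not Ashirase's actual code.
    def vibration_cue(bearing_to_route: float, heading: float, threshold: float = 20.0) -> str:
        """Pick which in-shoe vibrator to fire, given the direction the route
        continues in and the direction the walker is facing (both in degrees)."""
        # Signed difference between the route's bearing and the user's heading,
        # normalized to the range -180..180.
        error = (bearing_to_route - heading + 180) % 360 - 180
        if abs(error) <= threshold:
            return "front"   # on track: pulse the vibrator at the front of the shoe
        return "left" if error < 0 else "right"  # nudge the walker back toward the route

    # Example: the route bears due east (90 degrees) but the walker has drifted to 140 degrees.
    print(vibration_cue(bearing_to_route=90, heading=140))  # -> "left"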

Questions

When I hear about new products that are intended to replace the white cane, I always have questions. The first would be obstacle avoidance. According to the available information, the shoes are intended to have a route input into the smartphone app, then the vibrators keep you on track to your destination. For me, 90% of traveling from point A to point B with my white cane is avoiding things in my path. If I am on a known route, then I know when I am off course and have landmarks along the path to rely on. A navigation aid like this would give a second source of reassurance that I am indeed headed on the proper path to my destination.

I also wonder why there are only three vibration points in the shoes. I pass my destination all the time, and having a vibrator on my heel to let me know that I have passed it would be a big help.

We reported recently on shoes that use LiDAR to detect obstacles in the user's path and help them get around their environment. It sounds like a navigation system that tells you where you are going, combined with a LiDAR system that tells you what is in your path, could really be the next step in visually impaired navigation. These two companies should combine their technology and create an all-in-one solution.

Conclusion

I love to hear from my readers! Feel free to drop me a message here if you have questions about this or any other article on blindtravels.com. Follow me on social media – I'll follow you back.

Instagram: @nedskee

Twitter: @nedskee

My Photography: www.tahquechi.com


Follow us on Facebook!

A silhouette of a man holding a camera standing next to a silhouette of a lab guide dog in harness. The Facebook logo can be seen in the background.

Did you know that Blind Travels has a Facebook page? Come and follow the page for updates on my upcoming trips, meetups, and speaking engagements. I also post a lot of my travel photography there, as well as tips and tricks to get better images when you are traveling. See you there!

Blind Travels | Facebook

 


Google I/O and awesome accessibility features coming

Google I/O logo

Along with all the fun travel and destination reviews I do here on Blind Travels, I love to talk about technology, especially when it relates to travel. This year's Google I/O conference was a gathering of the minds that allows Google to show off all the cool new software and hardware it has developed. The software is generally in development and not ready for consumer use, but viewers of the conference get a glimpse into some of the great features for applications like Google Maps that are headed our way. What was on tap for this year's conference, and how will it help me travel more effectively? Let's dive in!

LaMDA

The Language Model for Dialogue Applications is a natural sounding conversational language interface. Anyone who uses Google Assistant or Siri will know that there is not much dialogue; it is more that you ask the interface to do something and it responds. With LaMDA, Google is trying to get a natural sounding dialogue going by delivering information to the user in a way that invites continued interaction with the interface. Google (and Apple) are continually developing their voice assistants' capabilities and improving the way they interact with you. LaMDA seems like the next step in this evolutionary process, and I look forward to the increased capabilities and more natural sounding interaction with my assistants. LaMDA currently works only with text, but Google plans to add the ability to interact with audio, video, and images.

Google Maps

For those of us who rely on our feet for transportation, the new features in Google Maps are going to be great. Not only is Google refining the granularity of the data you get, like where the sidewalks are (which is awesome) and pointing out landmarks or even where your hotel is in relation to your location, it is also mapping train stations, transit stations, malls, and airports, finally making indoor travel easier. This is great, and the good news is that the airport mapping features are rolling out later this week.

I’m not a big Android user, so I am going to skip the accessibility features for the new phones and OS. Overall, the coming improvements are welcome, especially the airport mapping. Here is a video that summarizes the features I spoke about, along with all the new improvements coming for Google phones and photography.

Follow me!

I am running a contest on Instagram right now: anyone who follows me (@nedskee) before June 1 will be entered into a drawing for a limited-edition photo print, so follow me now and I will follow you back! I love to hear from my readers; if you have a question or comment on this article, feel free to drop me a message on my contact form here or on my social media links below.

My Photography site: http://www.tahquechi.com/

My travel site: http://www.blindtravels.com/

Twitter and Instagram: @nedskee

Ted | Blind Photographer (@nedskee) • Instagram photos and videos

Follow me and I will happily follow you back.


Do blind people use Instagram?

Instagram logo and blind photographer with guide dog

Visual impairment is not black and white; there are many levels between fully functional vision and being completely blind. Hearing impairment and mobility impairment also have many varying levels, but in the case of mobility impairment those differences are more clearly visible – or at least you might think they are. Someone traveling in a wheelchair might or might not be able to support their own weight, just like someone traveling with a cane might or might not have some level of functional vision. This applies to the hearing impaired as well; just because someone's primary method of communication is sign language does not mean they have no hearing at all. So, if you see me on my phone browsing social media, I might actually be looking at photos, but in a different way than you do.

It might surprise you to learn that many blind people regularly use Instagram, Facebook and Twitter, but depending on how the content creator posts their information, some of these services are easier to use than others.

What is alt text?

Alt text is a short phrase used to identify an image, typically on web pages. Screen readers use the alt text tag to give visually impaired viewers information about the image being displayed. Most social media platforms have options for content creators to use alt text to add accessibility to posted content. Facebook (and soon Instagram) automatically adds alt text to memes and other images, rather successfully. This doesn't mean you should rely on Facebook or Instagram to generate the alt text for you, because we all want control of our posted content.

The alt text tag comes historically from HTML. You can see it in action on many websites: hovering over a web image will often display a short description, which comes from the alt text entered for that image. The alt text is also used by many content engines in place of an image that does not load. You might be asking how alt text can help content providers expand their audience; to see how, you might need to change your mindset about what alt text is really intended for.
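To make this concrete, here is a small sketch of how a tool might pull the alt text out of a page, much the way a screen reader or a search crawler does. This is purely illustrative: the use of the requests and BeautifulSoup libraries, and the choice of my own site as the example URL, are assumptions for the demo, not anything specific to how Facebook or Instagram work.

    # Illustrative sketch: list the alt text (or lack of it) for every image on a page.
    # Assumes the 'requests' and 'beautifulsoup4' packages are installed; the URL is just an example.
    import requests
    from bs4 import BeautifulSoup

    page = requests.get("https://www.blindtravels.com/")
    soup = BeautifulSoup(page.text, "html.parser")

    for img in soup.find_all("img"):
        alt = img.get("alt", "").strip()
        if alt:
            print("Described image:", alt)
        else:
            # This is all a screen reader user gets when alt text is missing.
            print("Missing alt text:", img.get("src", "unknown source"))

Running a check like this against your own site or blog quickly shows which of your images are silent to a screen reader.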

SEO vs. ALT Text

Anyone reading articles about getting started on Instagram will quickly notice that most of the authors highly recommend using the image description and/or alt text field as another opportunity to add a bunch of tags for Search Engine Optimization (SEO). However, I will show you that, used properly, alt text and image descriptions can not only increase your search ranking but also improve the experience of blind and visually impaired viewers of your content.

Visual Storytelling

Instagrammers are often referred to as visual storytellers. Because of the limitations of the platform, you have a finite amount of space to describe your image in a way that the algorithm will bring you to the top of the heap in viewers' feeds. Often content creators have to rely on tag clouds (comments with tags) and location tags to add additional information to the image in hopes of getting a better ranking on a very congested social media platform. This is where the improper use of alt text comes into play for many creators: they flood the alt text field with tags rather than adding useful information for blind and visually impaired viewers, as it is intended. As a storyteller, look at the alt text for Instagram posts as another opportunity to tell the story of your image. First, start thinking about what goes into writing a good image description.

Straightforward and clear

Screen readers often break up long text into smaller, more manageable pieces, so it is easier for those using screen readers if you provide a clear, straightforward description of your image. With this in mind, look at the image above. You might be tempted to say something like “girls on the beach” as your description, but does that really tell the story? Don't just think about whether they are males or females, how many of them there are and where they are; think about it in terms of the story. “Six girls wearing rainbow swimsuits facing away from the camera with their hands in the air, sitting on a sandy beach under a clear blue sky” is a description that would let you close your eyes and imagine the content of the image.

If you were selling a product like swimsuits, you could add the brand name or the specific colors of the products. This description could also be expanded to include the location of the shot if it were taken in Cozumel, Mexico, for example. I opted not to include the location because there were no identifying landmarks or structures in the image, but if you were promoting a vacation property there is no reason you could not include it. Screen readers will read tags if you prefer to put them in #hashtag form, but it is a good idea to limit yourself to a couple. Search engines will also process this additional information and these hashtags when they crawl your post. Once you have your description done, how do you add it to your new Instagram post?

Adding alt text to an Instagram post

  1. Start by uploading (or taking) a photo in Instagram
  2. Add filters and edit the image, then tap Next
  3. Scroll down and tap “Advanced Settings” at the bottom of the screen
  4. Tap “Write Alt Text”
  5. Write your alternative text in the box provided and tap Done (iOS) or Save (Android)
  6. Tap back to return to your post, then finish it with a caption, tagged accounts, etc. as you normally would
  7. Tap “Share” once you are ready

Revisit old posts?

Is it worth revisiting old posts and adding alt text? Of course. Any post you would like additional search visibility for, or a larger audience for, would benefit from a properly implemented alt text description.

Why add alt text?

  • Makes your content more inclusive for people with visual impairments
  • Adds additional information for the Instagram algorithm
  • Expands your content’s discoverability way beyond Instagram

Follow me!

I am running a contest on Instagram right now: anyone who follows me (@nedskee) before June 1 will be entered into a drawing for a limited-edition photo print, so follow me now and I will follow you back! I love to hear from my readers; if you have a question or comment on this article, feel free to drop me a message on my contact form here or on my social media links below.

My Photography site: http://www.tahquechi.com/

My travel site: http://www.blindtravels.com/

Twitter and Instagram: @nedskee

Ted | Blind Photographer (@nedskee) • Instagram photos and videos

Follow me and I will happily follow you back.


Railway stations ‘mapped’ for visually impaired passengers

A camera with a blind person walking with a cane in the lens.

Railway stations in Sussex are taking accessibility to the next level. The improvements implemented to make the stations more accessible to blind and visually impaired travelers include some great looking tactile maps manufactured by the Royal National Institute of Blind People, higher contrast markings on stairs, and public address system quality upgrades.

I love to see public transportation accessibility being upgraded. I live in the United States, and I wish there were more programs to increase the accessibility and functionality of public transportation here. Granted, our transportation infrastructure is nowhere near the UK's in terms of quality and functionality, but we can strive to get there. Better quality maps and higher contrast markings for stairs are beneficial to all riders. I really like the improvement of the public address systems; it's a simple thing, but making it easier to understand which trains are arriving and departing can reduce stress for all riders, not just the hearing and vision impaired. As they continue to improve their mapping and accessibility for the railway stations, I will report back. In the meantime, here is a list of the currently upgraded stations.

Stations in the south to benefit from the investment

  • Brighton – update of tactile maps

  • Crawley – ramp enhancements

  • Goring by Sea – stairs enhancement

  • Haywards Heath – update of tactile maps

  • Shoreham – stairs enhancement

  • Three Bridges – new ticket gate wide enough for wheelchairs, buggies and people with luggage

  • Worthing – stairs enhancement

You can read more about the improvements they are making at the link below.

https://www.itv.com/news/meridian/2021-05-16/sussex-railway-stations-mapped-for-visually-impaired-passengers

I love to hear from my readers; please drop me a line and let me know what you think of this article on my contact form or at my social media links below.

My Photography site: http://www.tahquechi.com/

My travel site: http://www.blindtravels.com/

Twitter and Instagram: @nedskee

Follow me and I will happily follow you back.


The importance of real time audio descriptions for the news

The Blind Travels logo inside of a television with the closed captioning and audio description logos.

I have been thinking about something for a while now, and I finally sat down to write an article about it. Why don't television stations offer real time audio descriptions for blind and visually impaired viewers during live broadcasts, like they do real time subtitles for those who are hearing impaired? It seems like a service that should be available out of fairness and accessibility for all.

Real time

When an important breaking news story happens, or during a premier sporting event like the Super Bowl, provisions are made for viewers who are hearing impaired, but not for those who are visually impaired. As recently as 2019, the FCC was debating the importance of real time closed captioning, especially for news programs, something that it states benefits the general public. From an article on tvtechnology.com:

Citing a recent study that noted that 80% of viewers who use captioning are not hearing impaired, Suzy Rosen Singleton, chief of the CGB Disability Rights Office for the FCC, noted that “captioning really has become ubiquitous and is a huge benefit for the general population.”

Here is a link to the article from tvtechnology that talks about the FCC and real time closed captioning. 

https://www.tvtechnology.com/news/fcc-debates-evolution-of-live-captioning-for-news

Since such a large number of viewers use closed captioning, it makes monetary sense for media companies to justify the cost of hiring employees dedicated to real time closed captioning. I fear that the same is not true for real time audio descriptions for those of us who are vision impaired. This is disheartening, because real time news matters just as much to the visually impaired community as it does to the hearing impaired community. What would be involved in offering real time audio description for news programs and live events? Infrastructure change and cost.

The cost

Adding real time audio descriptions to news (especially breaking news) for blind and visually impaired viewers would of course require media companies to bring on staff to voice the content, just as they do staff for real time closed captioning. The bigger problem is where the content would be delivered to viewers. Most televisions in households today only have one Second Audio Program (SAP) per channel, and that is generally utilized for non-English languages. This means media companies would likely have to utilize a dedicated channel for the foreseeable future until industry-wide changes could be made. I don't work in the television industry, so I am unsure whether changes could even be made to allow channels to carry more than one SAP. Adoption would need to happen across the board, from the media companies which broadcast and produce the content, to cable providers, to television and set-top box manufacturers. This all seems like an insurmountable amount of change to provide valuable content to blind and visually impaired viewers, but the same changes were made for the hearing impaired; it just took time and a loud voice to advocate for change.

What is happening?

First and foremost, with real time audio descriptions, blind and visually impaired viewers would no longer be limited to whatever the news anchors happen to say when a national disaster or large breaking news event occurs. I'm not going to make this political in any way, but I will reference the latest major breaking news event, the January 6th riots at the Capitol. As I watched the news that day and listened to the anchors talk about what was going on, it was crystal clear that they were not delivering the moment-to-moment happenings; they were saying “look at that” and “can you believe that” because they were delivering commentary on the assumption that their viewers could see the content on the screen. This leaves blind and visually impaired viewers out in the cold in terms of knowing what is happening on screen. Again, I'm not making this political, which is why I am not mentioning the stations I watched the events unfold on. The Capitol riots are only the latest example; I can remember back to 9/11, listening to the anchors gasping in disbelief when the towers fell, and wondering what was going on.

Benefits

As I mentioned earlier, since the infrastructure is not there in the televisions and set-top boxes in people's living rooms, it would likely fall to media companies to secure another channel for delivering audio described content. Media companies have to lease the space for each channel they utilize on a service, so a dedicated channel for audio described content is not likely. This is a sad truth, even though the new channel would give the media company a targeted audience for advertising. It is unlikely the delivery of real time audio descriptions would be seen as useful to the general public.

Solutions

It seems like the only way to get this sort of service offered would be for companies like Apple or Roku, which are content providers and also offer streaming boxes with an audio channel for audio described content when it is available, to step up and offer some sort of news programming with real time closed captioning and audio descriptions. Those of us who are disabled and would value such content would certainly support an additional charge for such a service. The problem with companies like Apple offering real time news programming lies in the politically charged climate and division over news programming; it would likely have the negative consequence of seeming to align the company providing the content with a political party.

Inventors like Amir Mujezin, who is visually impaired himself, are also a good place to look for a solution to adding real time audio descriptions to content. He recently debuted a tool which will allow audio descriptions to be added to movies at a much-reduced cost compared to traditional methods. Innovative individuals like Mr. Mujezin will, I am sure, eventually be able to create a device which can give real time descriptions of on-screen content; it is just going to take time. Here is a link to an article which explains the new tool Mr. Mujezin has created:

Amir Mujezin designed a Tool for visually impaired People to better experience the Content

We can do it!

To get the ball rolling on the needed change, the blind and visually impaired community needs to find a member of Congress who would be willing to pick this up as a pet project. The amount of change required to implement a viable audio described content solution is staggering, but just because the problem is large doesn't mean it is impossible. If the media companies hear from their subscribers in large enough numbers, they will sit up and start to take notice of the situation. This is a valuable service, and it has already been implemented industry-wide for the hearing impaired, so with that precedent in place, why should we think they would not be willing to implement the same convenience for the visually impaired community?

Your opinion?

With a problem that requires a solution as large as this one, I am sure that I have missed some important points on the topic. I'd love to hear what you think. Drop me a message here or on my social media links below and let me know your thoughts. I'm willing to help get this started if I can get some help.

My Photography site: http://www.tahquechi.com/

My travel site: http://www.blindtravels.com/

Twitter and Instagram: @nedskee

Follow me and I will happily follow you back.


IBM creates app to help blind people socially distance

A camera with a blind person walking with a cane in the lens.

When the COVID-19 pandemic first hit, many things changed very quickly, and accessibility was not considered in many of the new guidelines and rules put into place. Stores placed signs on the floor marking aisles for one-way traffic, and there were a million signs put up instructing patrons to stay six feet apart and socially distance. Being blind or visually impaired instantly became an even larger issue than it was previously. We could not read the signs, we could not see the one-way markings, and socially distancing when you can't see where the person in front of you is was a nightmare. Personally, I found it even more difficult because my guide dog Fauna has no idea about social distancing. She is a social animal and is trained to walk right up to the person in front of us in a line. More than once I was yelled at in a store for going the wrong way down an aisle or standing too close to a person in line; so frustrating.

The researchers at IBM have created a smartphone app called LineChaser which uses vibration and audible cues to help blind and visually impaired users stay the proper distance apart. IBM is well known for creating cutting-edge object detection devices, including their suitcase-sized navigation system which uses LiDAR and RGB-D cameras to detect objects and aid users in navigating the world.

LineChaser does not require any special hardware; the app runs on an off-the-shelf smartphone. It would be really easy for a blind or visually impaired user to sit back and say that this app should have been out last year, but development takes time. I worked in product development for 20 years and fully understand that this app likely took time to design, implement, and staff. Regardless of the release date, I love to see companies, especially large ones like IBM, developing apps that will make life easier for their blind and visually impaired users. LineChaser is not only for use during pandemic times; waiting in a line at the airport is one of the most frustrating things you can do as a blind or visually impaired person. First you have to find the end of the line, which can change over time, then you need to follow the person in front of you as the line moves. This task is significantly easier if you have a guide dog, but of course not everyone does. LineChaser has some great possibilities for alleviating the frustration of one aspect of blind life.
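To give a rough sense of how an app like this might turn a distance estimate into feedback, here is a tiny sketch. The thresholds, the cue wording, and the function itself are my own assumptions for illustration; they are not details of how LineChaser actually works.

    # Illustrative sketch only: turning an estimated distance to the person ahead
    # into a feedback cue. The thresholds and cues are my assumptions, not LineChaser's.
    def distance_cue(distance_m):
        """Translate an estimated distance in meters (or None if nobody is detected)
        into a spoken or haptic cue for the user."""
        if distance_m is None:
            return "audio: looking for the end of the line"
        if distance_m < 1.5:
            return "strong vibration: too close, hold back"
        if distance_m <= 2.5:
            return "gentle pulse: good spacing, hold your position"
        return "audio: the line has moved, step forward"

    for estimate in (None, 1.0, 2.0, 4.0):
        print(estimate, "->", distance_cue(estimate))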

You can read more about the LineChaser app here

https://www.slashgear.com/smartphone-app-helps-blind-people-stand-in-socially-distanced-lines-10672088/

I love to hear from my readers, feel free to drop me a line and connect. I love to hear what you think about the articles I post and am interested to hear what you would like to see more of in terms of content. 

My Photography site: http://www.tahquechi.com/

My travel site: http://www.blindtravels.com/

Twitter and Instagram: @nedskee

Follow me and I will happily follow you back.

 


Youth Not So Bored Game Night every week via Zoom

A camera with a blind person walking with a cane in the lens.

The Lighthouse for the Blind offers a ton of great resources and programs for the blind and visually impaired. I can personally vouch for the Lighthouse because I have worked with them and visited their campus a few times. I have found everyone there, from the students to the team, incredibly welcoming and friendly. One of the cool programs they offer is a weekly board game night via Zoom for students under the age of 18 who are blind or visually impaired.

Who: students under 18 who are blind or have low vision
What: weekly game night and guest mentor spotlight via Zoom
When: Tuesday evenings from 7:00 pm – 8:30 pm
Where: anywhere you can access a Zoom meeting
RSVP: Zoom meeting information will be shared with those who RSVP by 5:00 pm the day before the program
Parents & Guardians: we'd love your help getting your children connected to our virtual program; however, once they have joined, we ask that you please give them the space to participate individually.

It can be difficult to socialize when you are blind or visually impaired, and the COVID-19 pandemic has made it even worse. Many of my blind and visually impaired friends are finding themselves lonely and feeling trapped in their own homes because of pandemic fear. Programs like the ones offered by the Lighthouse for the Blind can alleviate some of that lonely or isolated feeling. They are really great people; give them a call.

You can read more about this program here

Not So Bored Game Night (online)

I love to hear from my readers, feel free to drop me a line and connect. I love to hear what you think about the articles I post and am interested to hear what you would like to see more of in terms of content. 

My Photography site: http://www.tahquechi.com/

My travel site: http://www.blindtravels.com/

Twitter and Instagram: @nedskee

Follow me and I will happily follow you back.


I’m giving away a limited edition photo print!

A camera with a blind person walking with a cane in the lens.

I’m giving away a limited-edition print for my yearly follow-fest. If you follow me on Instagram (@nedskee) between now and June 1, 2021, you will be entered to win this print. I will choose one random winner from all the entries and contact the winner via direct message. If you also follow me on Twitter (@nedskee), you will get an additional entry. There is no cost to enter; all you need to do is follow me.

Instagram

https://www.instagram.com/nedskee/

Twitter

@nedskee

I don’t sell my work. So, the only way to get one of my prints is through this yearly contest. Thanks and I look forward to connecting with you!

My Photography site: http://www.tahquechi.com/

My travel site: http://www.blindtravels.com/

Twitter and Instagram: @nedskee

Follow me and I will happily follow you back.

