Captions in Everyday Use
Yesterday Henny Swan asked a simple question on the Twitters:
I'm curious to know, who uses subtitles on web content (X device) who's not deaf or hard of hearing? For example I did when breastfeeding.— Henny (@iheni) November 12, 2013
Adam Banks put together a Storify of the responses, which shows there are plenty of ways people who are not deaf or hard of hearing get value from closed captions.
In general, captions have value for all users in any context where the audio track is loud enough that the viewer doesn’t want to disturb those nearby, or where the background noise is too loud to hear the audio clearly. Other cases that popped up include multi-tasking, learning a new language, or just parsing tough accents.
In short, closed captions have value for all users.
There is also no reason to panic about providing them, particularly if you use a video service that can do them for you. For example, back in 2010 YouTube committed to enabling auto-captioning for everyone, and Google has documents to help, plus tutorials from others, such as this step-by-step guide or this video.
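If you host video yourself, HTML5 also makes captions straightforward: a `track` element pointing at a WebVTT file of timestamped cues. A minimal sketch (file names and cue text here are placeholders, not from a real project):

```html
<!-- index.html: the video references a sidecar WebVTT caption file -->
<video controls>
  <source src="talk.mp4" type="video/mp4">
  <!-- kind="captions" includes non-speech cues; default turns them on at load -->
  <track src="talk-captions.vtt" kind="captions" srclang="en" label="English" default>
</video>

<!-- talk-captions.vtt (a separate file): a header line, then timestamped cues -->
WEBVTT

00:00:01.000 --> 00:00:04.500
Hello, and welcome.

00:00:05.000 --> 00:00:07.000
[audience applause]
```

Because the captions live in a separate text file, they can be edited, translated, or swapped without touching the video itself.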
Of course, as I was writing this post, Henny posted her own reference to the Twitter conversation: The weird and wonderful reasons why people use subtitles / captions
The Storify of responses I mentioned above is embedded here to spare you all the hassle of clicking the link and to bloat my page with unnecessary script blocks:
Update: November 14, 2013
While I was writing this, Dave Rupert was putting together a very neat experiment, Caption Everything: Using HTML5 to create a real-time closed captioning system.
It’s a neat proof-of-concept showing that real-time closed captioning is possible with current technology, albeit an imprecise and cumbersome one. If nothing else, hopefully it can bring more attention to a technique that, as demonstrated above, can benefit all users in everyday situations.
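The experiment leans on the Web Speech API (vendor-prefixed in WebKit browsers at the time). As a rough sketch of the approach, and emphatically not Dave Rupert’s actual code: recognition listens to the microphone, so it captions whatever speech the browser can hear while the video plays, painting interim results into a live region below the player.

```html
<video id="vid" src="demo.mp4" controls></video>
<div id="caption" aria-live="polite"></div>

<script>
  // Sketch only; real-world results are imprecise, as noted above.
  var SpeechRecognition = window.SpeechRecognition || window.webkitSpeechRecognition;
  var recognizer = new SpeechRecognition();
  recognizer.continuous = true;      // keep listening instead of stopping after one phrase
  recognizer.interimResults = true;  // show words as they are recognized, not only final results

  recognizer.onresult = function (event) {
    var text = '';
    for (var i = event.resultIndex; i < event.results.length; i++) {
      text += event.results[i][0].transcript;
    }
    document.getElementById('caption').textContent = text;
  };

  // Caption only while the video is playing.
  var video = document.getElementById('vid');
  video.addEventListener('play', function () { recognizer.start(); });
  video.addEventListener('pause', function () { recognizer.stop(); });
</script>
```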
It’s such a nifty experiment that I am embedding it here (remember, this isn’t mine; this is Dave Rupert’s code):