Chris Owen is a regular contributor for Vision Foundation. You can find his previous blogs here.
Since starting my new job in September my feet have hardly touched the ground and I’ve had very little time to think, let alone write! So, when I realised I had one more day’s holiday to take before Christmas, it seemed the perfect opportunity to take a day out, go for a swim and finally put pen to paper. When I write I’m one of those old-fashioned people who still sits with a notebook and pen, allowing the pen to jump over the page as the words flow. I used to write in pencil, using an old Cross mechanical pencil that my wife bought me for Christmas many years ago but, as my sight has faded, this has been replaced with a simple felt-tip Pilot Signpen. The combination of a thick felt tip and my heavy writing hand makes for interesting times as I attempt to decipher the text whilst typing it up on my laptop!
So, what has gone on since last picking up my pen?
Aside from the new job, the most notable event was my first foray into public speaking, when I was invited to speak at Durham University Advanced Computing’s Accessible Inclusion for Visual Impairment workshop. To be invited as a blind person to speak at an event like this was a real honour.
As someone who works in tech but is by no means technical himself, it was quite daunting to be tasked with talking about inclusion to a virtual room full of people far more intelligent than I. Rather than attempting to teach the proverbial granny to suck eggs, I instead focussed on the evolution of VI assistive technology – both physical and digital – and how putting the person at the heart of the development led to some of the most beneficial designs, some of which have had far-reaching benefits beyond the visually impaired community (to find out more you can watch the whole event on Open Life Science’s YouTube channel here or you can read about the evolution of the talking book in my article Agatha Christie to Aretha Franklin).
The other notable thing to happen in early Autumn was going to see the opening night of Dave Gorman’s current show “PowerPoint to the People”. As the blind spot in my macular grows, my detailed vision has virtually disappeared. As a result, I now only watch audio described TV shows and rely on Alt Text when flicking through photos on social media. Many photos shared on the family WhatsApp group are incomprehensible until I save them down from my phone to look at on my iPad.
Unlike subtitles, audio description isn’t widely available and, with the increase in streaming, the availability is sporadic at best. The thing about AD is that it’s proprietary to the broadcaster or streaming provider. This means that each time a movie or show is broadcast or added to a streaming service such as Netflix, Amazon Prime, Disney+, iPlayer, etc., the provider must commission and record the description soundtrack. Much like the original script, writing the description is an art form. The writer has to find the balance between describing the scene in sufficient detail but succinctly enough that it doesn’t detract from the actors’ lines. It’s not as simple as just sitting down in front of a microphone, watching a programme, and talking about what’s happening.
But, because responsibility to record the AD sits with each individual provider, you can watch the same programme on multiple platforms and get descriptions of different quality – sometimes even recorded by the same person! As if that wasn’t bad enough, some broadcasters may simply decide the effort and cost is too great and not bother at all. AD availability is so hit and miss that you can watch something on live TV with description but, streaming it online from the same broadcaster’s service, there is nothing. A recent example of this was Film 4’s Star Trek marathon. Last Sunday I sat down to watch The Voyage Home (by far the best of the whole series, closely followed by First Contact) in the lounge, where the movie was being very well described. Half-way through I moved to the kitchen to cook the roast dinner of slow-roasted brisket, roast potatoes, and Yorkshire pudding. The kitchen doesn’t have an aerial, so I carried on streaming it on All 4 on the Apple TV but, here’s the thing, audio description isn’t available when you stream live TV! Ironically, as soon as the movie has finished, I can watch it on catch-up and the description is available again. At a time when more people are forgoing traditional ways of watching TV to stream everything online, this limitation has the potential to disenfranchise an entire demographic of viewers. It’s not just Channel 4 that has this limitation. The BBC is exactly the same – MasterChef live via aerial – perfect, catch-up – spot on, stream live on iPlayer – nothing.
This lack of consistency highlights how badly visually impaired viewers are treated by the broadcasters and how much content is inaccessible to us. Movies and series I used to love are no longer available – Spooks, Hustle, Doctor Who before Peter Capaldi’s 12th Doctor met Bill Potts, any Bond film before Daniel Craig took over in Casino Royale to name but a few.
So how can this be addressed?
In 2017, the UK government changed the law to introduce minimum quotas for audio description on streaming services, but an Ofcom poll in 2020 found that over 80% of providers still had no AD provision and, in 2022, the government still had not begun enforcing the new legislation (you can read more about RNIB’s campaign here).
This raises the question: are the broadcasters and distributors the right people to make content accessible? If I’d taken the time to write a screenplay, setting the scene to tell the story in just the right way, I’d be mortified if the description didn’t match my own vision. So many TV series and movies are written by a team; is it unreasonable to expect there to be a dedicated writer for the AD?
One of the best descriptions I’ve heard to date is for the Amazon Prime series The Marvelous Mrs. Maisel, which follows the ups and downs of divorcee comedian Midge Maisel as she tries to build a name for herself in 1950s New York. Real care was taken to create a soundtrack that matched the tone of the series itself, and even the narrator’s intonation reflected the scenes: sarcasm as Midge argues with her ex-father-in-law, sadness when she is turned down for another gig, or amusement during one of her many sets.
If responsibility for new content should sit with the producers, it still begs the question of who should take care of looking back at existing programming to ensure consistency. At present, because each broadcaster owns the rights to the audio description they record, we end up with different descriptions of varying quality across platforms. Surely it would be better for these platforms to pool their resources and collaborate. What if, every time a programme was described, the details were added to a shared industry resource that any other broadcaster or distributor could search to find out what had already been created? This resource could be enhanced by including ratings from those of us who use audio description on a regular basis. The other broadcasters would then have the choice of buying the existing recordings at significantly less expense and time, or creating their own.
But what has all this got to do with going to see Dave Gorman at the Wycombe Swan? I’ve been to see a few of his shows over the years and, much like his TV shows, he uses many visual props to tell the story. As his shows are always so visual, I had concerns about being able to follow it properly, especially after reaching out and discovering that he does not typically describe his shows – with around 700 slides* for a 90-minute show, the worry was that the description would overshadow the monologue. This raises an interesting question – can accessibility go too far? Is there a point when adding audio description for the sake of it gets too much and begins to detract from the main performance? Is there any value in describing a comedian on a Netflix show walking from one side of a stage to another if they’re just walking?
I watch blind comedians including Chris McCausland and Jamie MacDonald on shows like BBC One’s “Have I Got News For You?” and, when the images appear on screen, there is no obvious description. To a blind audience member this seems especially cruel (especially as HIGNFY is not one of the BBC’s audio described shows), but what if they’re being fed the descriptions over their earpieces or have had a detailed preview during the rehearsal? Sometimes you need to be able to follow the flow of the conversation – especially in comedy where timing is everything – and having someone talking in your ear stops you from being able to do that.
The same goes for Dave Gorman’s stage show: it was hilarious even though there were times when I had no idea what was showing on screen, because the images are only there to reinforce the joke.
The $64,000 question is whether audio description would have improved my experience.
Possibly, but only if it was done in a very specific and targeted way such as simply saying “a packet of cornflakes” or “Dion Dublin in a suit” at the appropriate time.
This piece has disappeared down an audio description rabbit hole, so if you’ve stuck with me then I congratulate you!
I’ll leave you with this final thought before putting down my pen and getting ready to take the dog for a wet and windy afternoon walk. We have come so far with accessibility and inclusion that we should take a moment to recognise this but, with the ever-changing technical landscape, we cannot become complacent. Whether it’s calling on our favourite football team to include alt text on their social media images (hats off to Jurgen Donaldson aka NotJustABlindGuy for that one), petitioning the government to enforce their own legislation, or just swapping a few messages on Twitter with an established comedian, we’re all working to raise awareness and, with luck, bring about change.
*[17 January 2023 amended to correct total number of slides in show]