SubComm Infographics – Who Watches Subtitles?

By Selma Akseki

Selma Akseki is currently working as a freelance translator and subtitler. She holds an M.A. degree in English Translation and Interpreting from Hacettepe University. Her research interests lie in the field of audiovisual translation, with a focus on subtitling, media accessibility, and audience reception. Reflecting these interests, she wrote her master’s thesis on subtitling for the d/Deaf and hard of hearing (SDH), presenting a profile of Turkish d/Deaf and hard of hearing viewers and their preferred subtitling strategies in SDH.

Recently, subtitles have been making headlines in various media. The majority of those headlines seem to be asking the same question: “Why is everyone watching TV with subtitles on?” There are several reasons, ranging from personal needs and wants to technical factors related to the way we consume content. Let’s go back to where it all began and trace the steps that turned subtitles into an integral part of our daily lives.

Silent films displayed shots of text at certain intervals during a screening, serving as an aid to the viewer’s understanding of the plot. These texts would include quoted speech, the explanation of an action, a character’s motivation, and so on; anything to help the narration. While they were initially called ‘sub-titles’, they were later renamed ‘intertitles’ so they wouldn’t be confused with the new form of subtitles emerging with sound films. These new ‘subtitles’ were displayed at the bottom of the screen (coexisting with the image, unlike intertitles) and gave a translation of the speech for audiences who didn’t know the language spoken in the film (Dupré la Tour 2005). In the infographic, we see Abhishek using subtitles to watch a film in the cinema for the very same reason. Another example of this type of use is Sofia, who is watching a K-pop (Korean popular music) video with English subtitles because she doesn’t understand the language of the video. 76% of the UK adults and teenagers who participated in the 2022 Ofcom VoD survey said they watched non-English content on VoD services with English subtitles. Overcoming the language barrier is the most common reason for using subtitles.

Photo by Monica Flores on Unsplash

The rise of sound films created a new audience who needed a different type of subtitles, as they weren’t able to access the sound of a film. This audience needed subtitles that presented a text version of the spoken content (not a translation), with the addition of relevant sound information (sound effects and music) and speaker identification; anything that couldn’t be understood or inferred from the images on screen. This type of descriptive subtitling is referred to as captions, closed captions [CC], or subtitling for the d/Deaf and hard of hearing (SDH) in different countries or contexts. It might be timely to elaborate on the three subgroups of this viewer group, i.e. the d/Deaf and hard of hearing audience. Hard of hearing viewers generally have mild to moderate hearing loss, which means they have residual hearing. Since they use oral and written language, they benefit greatly from subtitles. But what about viewers who have severe or profound hearing loss, who are deaf? Here we need to make another distinction, between deaf (with a lowercase d) and Deaf (with a capital D). For deaf viewers, who lost their hearing after acquiring speech and language abilities and continue to use the spoken language, subtitles are still the main access tool for video content. On the other hand, for Deaf viewers, who were either born deaf or experienced pre-lingual hearing loss and use sign language as their primary means of communication, the spoken language and its writing system are a foreign language which they may or may not have a good command of. For those viewers, the best way to access video content is sign language interpreting. But this doesn’t mean they are not making use of subtitles. On the contrary, they use them when sign language interpreting is not available, and the truth is that the majority of video content doesn’t offer sign language interpreting, especially content on paid TV, streaming platforms, and social media. In the infographic, Fred is watching a TV show with subtitles because he is Deaf or has hearing loss, showing how subtitles are an indispensable access tool for all d/Deaf and hard of hearing viewers, who constitute over 5% of the world’s population (430 million people).

There is also another group who benefits greatly from SDH and often goes unmentioned: prosopagnosics, i.e. people with face blindness. Prosopagnosics often have difficulty recognizing faces and detecting facial expressions. “One of the most common complaints of prosopagnosics is that they have trouble following the plot of television shows and movies, because they cannot keep track of the identity of the characters” (Faceblind.org, para. 3). Speaker identification in captions is thus a vital feature for prosopagnosics, who make up an estimated 3.08% of the population, according to research.

People also use subtitles as an educational tool, be it in a formal setting like a classroom or as part of an individual’s lifelong learning journey. In the infographic, Weiwei, a language teacher, is using same-language subtitles, which are essentially a transcription of the speech in a video, to help their students learn the language. Studies show that same-language subtitles benefit hearing persons learning that language, even more than subtitles in their native language do. Colm, likewise, chooses to watch a French film with auto-generated French subtitles on YouTube because he is learning the language. As you might have noticed, these two examples concern learning a second or foreign language. But what about the educational value of same-language subtitles in one’s mother tongue or native language? The idea of using same-language subtitling (SLS) as a mass education tool for literacy on television was put forward by Brij Kothari (1998). He argued that since people seemed to prefer television over reading, offering SLS on Indian television could “add entertainment value to already popular song programmes; and as a consequence, contribute to the development of literacy skills, nationally” (p. 2507). After two decades of implementation and research, SLS became part of India’s Accessibility Standards in 2019, improving the reading skills of a billion Indian viewers of all ages. The non-profit initiative PlanetRead (founded by Kothari) aims to promote the use of SLS globally, both on entertainment TV programs and on digital platforms. Turn on the Subtitles, another non-profit in the UK, likewise encourages the use of SLS to improve children’s literacy. In the infographic, we see an example of this type of use in Deepal, who is not a strong reader and is therefore using SLS to learn to read better.

Photo by charlesdeluvio on Unsplash

Who else watches subtitles? Nour is using subtitles because they help her focus. According to the findings of a recent survey by Stagetext, a deaf-led charity in the UK, 42% of 2,000 participants said they used subtitles or captions to help them concentrate. Subtitles were used the most by participants under the age of 25, and ‘concentration’ was a reason for more than half of this age group. Another survey, by Preply, an online language learning marketplace, revealed that ‘to stay focused on the screen’ was the fourth most common reason (27%) for using subtitles among its 1,260 American participants. When participants were asked whether subtitles distracted them or helped them focus, 68% stated that subtitles helped them ‘to hold attention on the screen’. These results seem to suggest that subtitles are used by a wide range of viewers who find it difficult to concentrate. Several factors may affect concentration. For instance, Kit is using subtitles because they are in a noisy environment. If we put external factors aside, e.g. a noisy environment, digital distraction, stress, etc., there are also individual reasons that may affect a person’s focus. For instance, autism, dyslexia, and attention deficit hyperactivity disorder (ADHD) seem to affect concentration. These conditions, alongside many others, fall under the category of neurodiversity (a term coined by Singer 1998), which emphasizes how neurological differences shape how we think, process information, and interact with the world around us. Estimates based on studies suggest that the percentage of neurodiverse people (also called neurodivergent or neurodifferent) ranges between 15% and 20% of the world’s population (Doyle 2020). Although there isn’t extensive data yet regarding the use of subtitles by neurodiverse people, some preliminary data exists.
A 2012 study showed that individuals with ADHD retained more information from a video with subtitles, despite the expectation that subtitles would be more visually demanding. In another, very recent 2023 study, 8 out of 15 TikTok video creators described themselves as neurodivergent, either by naming their neurodivergence as ADHD (most of them) and/or autism, or by saying they shared traits with the neurodivergent participants, and they pointed out that they were adding captions to their content as a reflection of how they themselves wanted to consume video content. We also see that Ofcom’s newly published revised guidelines for access services place emphasis on taking into account audiences with cognitive, neurodevelopmental, and complex disabilities, who used subtitles “to help them validate what they heard, retain information, and/or improve their reading or spelling”, when designing access services, and also on considering broader benefits, such as viewers using subtitles in noisy or public environments, or children using them to improve their literacy levels (p. 15). As studies in this field become more widespread, we will hopefully gain a better understanding of the ways subtitles can be of service to neurodivergent audiences. (For more information on how captions and subtitles can benefit viewers with different conditions, see Closed captions and neurodivergence, The power of inclusion recognizing neurodiversity, and The impact of subtitles on video accessibility and inclusivity.)

Photo by Immo Wegmann on Unsplash

When we turn to Annie, we see that she’s using subtitles because she’s struggling to understand the speaker’s accent, or because the sound is unclear. A recent YouGov survey shows that 40% of adult U.S. citizens who watch TV with subtitles use them because they help them understand accents, whereas 33% use them in a noisy environment and another 22% use them to avoid disturbing others. In the Preply survey, the audio being muddled (i.e. unclear sound) was the top reason (72%), an accent being hard to understand was the second (61%), and watching quietly at home was the third (29%). The sound mixer Gunter Sics says that the muddled-audio issue is mainly related to advances in technology. Portable microphones transformed acting and allowed for more naturalistic speech, bordering on mumbling. Audio technology became more sophisticated and gave filmmakers the opportunity to include more sounds. However, the sound mixed and edited at the end is designed for an experience in a movie theater with a specific sound system. When these productions are watched on a TV set or on mobile devices, which don’t have the intended sound system capabilities, many details are lost, leaving the audio muddled and the dialogue hard to understand. So it is safe to say that audio quality is also affected by the way we consume video content. It seems we are all using subtitles for one reason or another, and findings from various surveys and research seem to confirm this. A 2022 Ideal Insight survey found that 85% of Netflix users in the UK watch content with subtitles, which makes one wonder whether we will reach the 100% mark and, if we do, how long it will take.
Sources

Dupré la Tour, C. (2005). Intertitles and titles. In The Encyclopedia of Early Cinema, ed. Richard Abel, 326–331. London: Routledge. https://www.routledge.com/Encyclopedia-of-Early-Cinema/Abel/p/book/9780415234405 

Kothari, B. (1998). Film songs as continuing education: Same language subtitling for literacy. Economic and Political Weekly, 33(39), 2507– 2510. https://www.jstor.org/stable/4407205

Singer, J. (1998). Odd People In: The Birth of Community Amongst People on the “Autistic Spectrum”: A personal exploration of a New Social Movement based on Neurological Diversity. [Bachelor’s thesis, University of Technology Sydney].

Are there further open access resources you would recommend on this topic? Please comment below with links!
