Closed Captioning: More Ingenious than You Know

Published 2018-12-12
You can support this channel on Patreon! Link below
Over the years I’ve found that there are plenty of people who use captions, not just those who need them. But did you know that it took until the 1970’s for anyone to think of putting captions on TV? I can sort of understand that, since captions might be annoying if you don’t want them, but closed captioning would save the day and provide access to all, but only when needed.

Here's the link for making translated captions for my videos:
youtube.com/timedtext_cs_panel?tab=2&c=UCy0tKL1T7w…
Please don't feel compelled to do it, but if you'd like to give a bit of your time, I promise you'll get that imaginary badge!

Technology connections on Twitter (I’m just as weird on the social Internet as I am here on YouTube):
twitter.com/TechConnectify

The TC Subreddit (warning: I’m not good at Reddit)
www.reddit.com/r/technologyconnections

Technology Connections 2 (my second channel where I upload weird stuff from time to time)
youtube.com/@technologyconnextras

Here’s that nifty Chicago Tribune article:
www.chicagotribune.com/news/ct-xpm-1989-05-05-8904…

And here’s those captioning glasses:
www.smithsonianmag.com/innovation/teen-inventors-c…
(sadly, it looks like the creators had started an Indiegogo for this, and it failed pretty badly. Quite a shame, as this could be really useful! However, I suppose a phone app might be good enough. Might be.)

Some history from the NCI:
www.ncicap.org/about-us/history-of-closed-captioni…

You can support this channel on Patreon! It has been amazing what Patreon has done for this channel, but also for me (your dorky host) personally. Through the support of people just like you, Technology Connections has become my job and I am so excited and thankful for it! If you’d like to join the fine folks in a pledge to help the channel grow, please check out my Patreon page. Thank you for your consideration!
www.patreon.com/technologyconnections

And thank you to the following patrons!
Sha Nasti, Charles Surett, Ed Green, Stephen B. Hinton, Daniel Bernard, thegeoffreak, annoying and reprehensible idiot, Piotor Kowalski, Bob Slovick, Aleksei Besogonov, Michael Sims, Recycled, Meetupvideo, Jason Burgett, Wayne Marsh, Jib Systems, Lars Kuur, Alan Nise, Matt Dancer, Andrew Rosenwinkel, Fran Rogers, Tero Janhunen, Bob B, Mike Noe, Alan Smith, Philip Cosgrove, Joshua Doades, Rob Rymarczyk, Scarfacecapwn, Andreas Lunderhage, Ennex The Fox, LEONARD PEZZANO, Steve Kralik, Νικο Σα, Hank Eskin, Kirill Polstainen, Felix Winkelnkemper, Christopher Lawhead, BoostCookie, 98abaile, JustWusky, Dan Jones, Exilis, Till Bockemühl, Owen O Byrne, Project A118, Charles, Sebastian Sparrer, timeslapsey, George Stamoulis, Sarmad Gilani, Paul Moffat, Linh Pham, Laria, Michael Greb, Max, Alessandro Robert Nilsen, Ryan Benson, fussel, Brannan Barber, Jonathan Haas, Neil Forker, Vincent Beetle, Warmo, James Pinakis, Bruce Davis, Conor Kileen, Johnni Winther, Marke Hesse, Brian M Knoblock, Sean Sandercock, Robert Wolfanger, Cannon Fodder, Andre van Soest, lululombard, Nicholas Boccio, Armando Fox, Nelson, bluegoose, Kajico, Jason Hughes, Eli Krumholz, Angelo van der Sijpt, William Evans, Philip, Martin, X39, Richard Lantz, Dustin Crain, Gideon Rigger, Oliver Lee, AJay Janschewitz, Lennart Sorensen, Mitch Radoll, Viorel, Betsy Ecklund, Reachan Kekeis, Michael Scott, Sha Nasti, Loh Phat, Vivian Pypher, Torin Zaugg, John Donaldson, Brandon Whiting, Robert Tait, Zachary Hazlett, Steven Lynch, Nathan Blubagh, Joel, Peter Stewart, Liam O’Flynn, Russell Brower, Brannan Barber, Patrick Barry, Tyler, Robin Johnsen, Brian Wolman, Deryn Rouge, Ed Green, Eric, Phia Westfall, Markus Schumacher, Besenyei Adam, Colin Grimshaw, Mats Svardal, Shannon Potter, Jeremy Hastings, Mark Wayt, Jose Miguel Castillo, Matthew Reynolds, Andrew Mertzenich, John Lavallée, Dave Howlett, Matthew E. 
Cooper, Sonic the Anonymous Hedgehog (nice), Guilherme Vieira Dutra, Lee Tustain, Nathan Bruer, Lauren Hahn, Scott Rowland, Will Wren, Weird shortwave listener, kalleboo, Colin Hill, William Gray, Vaughn B., Sven Slootweg, Braden McDorman, Stephen Bank, Matthew Walster, Julien Oster, Joseph Dufour, Dr. Bjoern Bieber, Anders Enger Jensen, Phil Sowers, Juan Molinari, Jake Hickman, Trae Palmer, Ray, Robert McCullough, Gary Hampson, Lennart Rosam, Chris Wallace, Matt Shea, Andrew Miller, PJ Gunasekera, Justin Teixeira, Charles Zilinski, Aaron Helton, Michael Holmes, Reto Jost, Ken Schafer, Thomas Beaver, AwesomeGuy64, John Wagner, Trey Harris, Benjamin Kier, Fredrik Grufman, Peter Pfundstein, Carlos V, Wilhelm Screamer, Jeffrey Frasure, Mat Stu, RYAN NGOGLIA, R_T, Harald E. Westlie, Charlycobra, Thomas Kolanus, Jeff Bigs, Brett Morgan, Isabell Reine, William Kisley, Daniel Johnson, Jesse Kempf, Tyler DeWitt, Reemt Rühmkorf

All Comments (21)
  • A few notes/corrections: It's 525 times per frame, not per second. I messed that up. It's 15,750 times per second!
    Many of you have pointed out that you can indeed define the position of captions. But it's a really clunky ordeal and has its downsides. If I were to upload a specific file type (which, funnily enough, is based on the Line 21 caption standard), then I could do it. However, I hadn't realized that the viewer can reposition captions at will, and if I do upload the file, this overrides that functionality. So, uh, it's still not great. I would really like YouTube to let me move them from their built-in editor, which is what the vast majority of creators use.
    I admittedly made an assumption about the Rear Window Captioning System: that it's still in widespread use. Nope. It may still be out there, but there are now wireless receivers that can be incorporated as a heads-up display or a little LCD panel, and apparently these are cheaper to implement than the Rear Window system (I'm surprised by that, honestly, but whatever). I still think Rear Window is the most clever, even if it's no longer the most prevalent.
    Regarding Blu-ray and Line 21: apparently this is kinda supported, but it's limited to older players. I guess there was some deal with movie studios where Blu-ray would phase out analog connections by 2009 or something like that, so the fact that my PS3 couldn't output it was just because it's too new. Or something like that; I'm fuzzy on the details. But now I'm curious whether new releases are still encoded with text metadata for Line 21 captions. Maybe I need to find an older Blu-ray player to test it.
    And lastly, a few people have asked over many videos how I film CRTs without flickering or framerate issues. I'm about to answer that in a TC2 video, and you can check it out here! https://youtu.be/0j0IC0bu3dg
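    (Editor's note: the corrected numbers in the pinned comment check out arithmetically. A minimal sketch, assuming nominal black-and-white NTSC timing of exactly 30 frames per second, plus the two-bytes-per-frame Line 21 capacity described in the comments below:)

```python
# Sanity check of the NTSC numbers from the pinned correction.
# Assumption: nominal NTSC timing (exactly 30 frames per second);
# real color NTSC runs at ~29.97 fps, giving ~15,734 lines/sec.
LINES_PER_FRAME = 525      # scan lines per full interlaced frame
FRAMES_PER_SECOND = 30     # nominal frame rate

line_rate = LINES_PER_FRAME * FRAMES_PER_SECOND
print(line_rate)  # 15750 -> the electron beam scans 15,750 lines per second

# Line 21 carries 2 bytes of caption data per frame, so the maximum
# caption throughput is:
caption_bytes_per_second = 2 * FRAMES_PER_SECOND
print(caption_bytes_per_second)  # 60 -> at most 60 characters per second
```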
  • @JohnMichaelson
    If Youtube's auto-generated captions are any indication of what we can expect from real time captioning, the hearing-impaired will probably conclude people are insane.
  • 19:10 or so: yeah, that aged well. YouTube basically said screw community captions. They're gone, which is why Alec is such a hero for continuing to caption his own stuff
  • "That's right YouTube, this old VCR and this box are more sophisticated than your captioning system. Maybe work on that." YouTube: disables community-contributed subtitles
  • @krab9479
    My parents are both deaf, and I’ve used captions my whole life because of that. I’ve always wondered how they worked, especially with live TV broadcasts. Thank you for the explanation!
  • @RussellFlowers
    On the flip-side of closed captions... I once accidentally turned on the SAP (second audio program) feature on my old Sony TV. Apparently this was being used by the PBS program I was watching to describe the visual components of the show for the visually impaired. But I knew none of this, I just thought I was watching the most over-narrated program I had ever seen.
  • "Look at Sony being all backwards compatible... Wait" I laughed out loud.
  • This was a great overview on captions. A few clarifications: I was one of two programmers who wrote the first PC-based "real-time" captioning software used at NCI and WGBH, and I was working in this world in the late '80s/early '90s. We ran the captions in the Senate committee hearing where they debated the 1990 law you discussed, as a demo to the Senators.
    You mentioned that there was "timing information" in the captions. This is not really true. In reality, with "pop-up" captions like you see in movies and pre-scripted shows, the captions being sent are stored in a secondary frame buffer, and then at the moment they want the screen to change, a "flip pages" type command is sent to change which of two pages is the active one being displayed (and then new captions can be pre-loaded into the non-visible frame). So the only "timing" is really that the "flip screen" command has to be sent at the exact instant they want the new captions to appear.
    With roll-up captions, which are called "real-time" captions and are used for live events such as news or sports (with the stenographers, as you mentioned), there is no secondary screen, because the intent is to get the words on the screen as fast as possible, so characters go up as quickly as possible. Every frame of video carries two bytes of caption data, so with 30 interlaced frames per second, the MOST you can send is 60 characters per second.
    One of the reasons roll-up captions are delayed is not only the human delay of typing the words that were heard, but also that the stenographers are actually typing phonemes, basically syllables. It is the computer's job to decide what word(s) are being typed, and for that, you have to wait for multiple syllables to make sure you can caption a multi-syllabic word. For example, if someone was captioning the word "Ratatouille" on a cooking show, that would be many syllables. If the computer started sending out words on every syllable, it might come out as "Rat a two E," which obviously doesn't make sense. Occasionally you'll still see a mis-caption like this when a word isn't in the dictionary being used for translation, but generally our software stored a certain number of syllables and tried to make the longest match. We let the captioner select a buffer of generally 4-6 syllables, so there was a natural delay of up to six syllables before the words would come out after they were typed. Some competing software would send out the shorter words and then backspace over them, correcting with a longer match if one was found; but with a limit of 60 characters per second, typing the words and then backspacing and re-typing caused delay problems of its own, and created even larger problems if you needed to backspace to a previous line, which was much harder.
    Our PC software, as originally installed at NCI and WGBH, ran on DOS 3.x on IBM AT 286 and eventually 386 PCs, and it replaced the same software running on a "Jaquard" brand minicomputer ("mini" being used loosely, compared to a mainframe of the time; it was physically huge!). I still have the old software somewhere, but unfortunately I lost my last stenograph input device in a house fire, so I have no real way to input to it as intended. Those were fun days, and I am proud to have had a small part in helping this technology mature!
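    (Editor's note: the "buffer syllables, emit the longest dictionary match" scheme this commenter describes can be sketched as below. This is an invented illustration, not the original NCI/WGBH software: the function name, the tiny dictionary, and the offline greedy matching are all assumptions; the real system matched incrementally as strokes arrived.)

```python
# Illustrative sketch of longest-match steno translation. The dictionary
# maps tuples of syllable strokes to English words; everything here is
# hypothetical and chosen to reproduce the "Rat a two E" example above.
STENO_DICT = {
    ("rat",): "rat",
    ("a",): "a",
    ("two",): "two",
    ("e",): "E",
    ("rat", "a", "two", "e"): "Ratatouille",
}
MAX_BUFFER = 4  # the commenter says captioners picked roughly 4-6 syllables

def translate(strokes):
    """Greedy longest-match translation over a list of syllable strokes."""
    words, i = [], 0
    while i < len(strokes):
        # Try the longest window first, shrinking until something matches.
        for length in range(min(MAX_BUFFER, len(strokes) - i), 0, -1):
            chunk = tuple(strokes[i:i + length])
            if chunk in STENO_DICT:
                words.append(STENO_DICT[chunk])
                i += length
                break
        else:
            words.append(strokes[i])  # unknown stroke passes through raw
            i += 1
    return " ".join(words)

print(translate(["rat", "a", "two", "e"]))  # Ratatouille
print(translate(["rat", "a", "two"]))       # rat a two  (the mis-caption case)
```

    Waiting for the buffer to fill is what creates the natural multi-syllable delay the commenter mentions: emitting on every stroke would produce the "rat a two" output above, while holding four strokes lets the full word win.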
  • @leandervr
    As someone who is hearing impaired, I deeply appreciate that you include close captions in your videos!
  • I love it when you see the CC stenographer backspace a word on live tv.
  • @h3llbring0r
    In Europe, we use(d) Teletext for captions. You have to select a specific page and then a semitransparent text layer with captions is visible. And it supports multiple colors as well as two font sizes.
  • One big factor in the channels I choose on YouTube is how accessible they are. I'm not HoH (hard of hearing), but I have auditory processing disorder, which means my brain doesn't always process audio accurately or quickly. Without captions, I miss a lot, especially when someone isn't enunciating well, is whispering or speaking too fast, or there's too much ambient noise. That being said, I am so thankful to creators like you who not only make pristine captions, but enunciate really well. Also, for those who don't need captioning but can tolerate it, I suggest you turn captions on, especially on large streaming platforms. The more people who use them and speak up when they're badly written, the better they will get. Same for audio description (initially for the blind and visually impaired, but useful for ADHD and for multitasking while watching). Disabled people will always be a minority, which means progress is pretty slow.
  • @NoNameAtAll2
    "maybe work on that, Youtube" meanwhile, youtube: "wE TuRn oFf cOmMuNitY cAptIoNs CauSe nOonE uSeS thEm"
  • Alright, but when's the Teletext video coming my guy? I need that juicy European captioning standard info presented to me via this channel.
  • @CosmosisAjax
    Love the literal "took an act of congress" joke at the end. Also, instead of "Special Thanks to the Following Patreons:", would have loved if it had said "Closed Captions provided in part by:" :)
  • @PingTheAwesome
    I rely on captions to be able to understand content online. I am deaf and need them. Thank you for captioning your videos. Thank you for putting this video out. Also, thanks for being a bit more...upfront about the fact that they are not a choice for us and that they are explicitly needed.
  • @jamestatum748
    Shout out to the new name for closed captioning, subtitles for the deaf and hard of hearing, abbreviated as SDH. I also want to call out a new audio track that’s gaining prominence on a lot of streaming services, audio description - a track that adds description of the visual elements on screen for the blind. It’s also a way to “watch” TV on road trips.
  • @Porygonal64
    And he neeeeeeeeeeever talked about teletext again
  • @luketurner314
    Just figured out YouTube captions are draggable, but that's not the same as the content creator specifying where a particular caption should appear
  • Hi Alec, thanks for another great video. Did you make that Teletext / Ceefax video you mentioned? I can't seem to find one.