"Take on Fake:" How AI-Generated Content Is Impacting Elections | Amanpour and Company

Published 2024-03-14
We’ve already seen digitally manipulated video and audio files – or deepfakes – infiltrating the 2024 election cycle. Just how will this impact voters? Sam Gregory, Executive Director of Witness, and Claire Wardle, Co-director of the Information Futures Lab at Brown University, are disinformation experts. They joined Hari Sreenivasan to discuss what’s at stake, both politically and technologically. This conversation is part of The WNET Group’s series "Take on Fake," which analyzes fake or altered video, images and audio to debunk the viral spread of misinformation and get to the truth.

For more "Take on Fake" with Hari Sreenivasan, click below:
youtube.com/c/takeonfake

Originally aired on March 14, 2024

----------------------------------------------------------------------------------------------------------------------------------------

Major support for Amanpour and Company is provided by The Anderson Family Endowment, Jim Attwood and Leslie Williams, Candace King Weir, the Leila and Mickey Straus Family Charitable Trust, Mark J. Blechner, the Filomen M. D'Agostino Foundation, Seton J. Melvin, Charles Rosenblum, Koo and Patricia Yuen, Barbara Hope Zuckerberg, Jeffrey Katz and Beth Rogers, Bernard and Denise Schwartz, the JPB Foundation, the Sylvia A. and Simon B. Poyta Programming Endowment to Fight Antisemitism and Josh Weston.

Subscribe to the Amanpour and Company channel here: bit.ly/2EMIkTJ

Subscribe to our daily newsletter to find out who's on each night: www.pbs.org/wnet/amanpour-and-company/newsletter/

For more from Amanpour and Company, including full episodes, click here: to.pbs.org/2NBFpjf

Like Amanpour and Company on Facebook: bit.ly/2HNx3EF

Follow Amanpour and Company on Twitter: bit.ly/2HLpjTI

Watch Amanpour and Company weekdays on PBS (check local listings).

Amanpour and Company features wide-ranging, in-depth conversations with global thought leaders and cultural influencers on the issues and trends impacting the world each day, from politics, business and technology to arts, science and sports. Christiane Amanpour leads the conversation on global and domestic news from London with contributions by prominent journalists Walter Isaacson, Michel Martin, Alicia Menendez and Hari Sreenivasan from the Tisch WNET Studios at Lincoln Center in New York City.

#amanpourpbs

All Comments (6)
  • @whalesong8040
    This kind of coverage is so incredibly critical in this day and age: thank you all SO very much! It is truly terrifying, how our world is changing (or not!). Such an apt metaphor of releasing a super fast new car without seat belts or regulations, etc....
  • @franknunez7204
The examples shown here are easy to identify as fake; they are full of obvious lighting, contrast, and color inconsistencies. However, some high-end retouching professionals who specialize in the veracity of image compositing have the skill to eliminate many of these examples' telltale signs of AI image synthesis, so the capability for evading human detection, even by professionals, is actually here already. There is even a company that invented totally undetectable retouching called VFRT (violation free retouching). They use chemicals and microscopic systems to help make composited or synthesized imagery totally undetectable.
  • People, in general, need to learn to think critically! Critical thinking should be taught in all levels of education, from Pre-Kindergarten through college. It should be a requisite to be able to receive any degree.
  • Uh, how do you trace back an image? We need the exact app to check, please use an example and take us step by step to authenticate if something is real or not, like Trump surrounded by smiling African Americans.
  • @kymskiver8862
We need some form of regulation of A.I. for accountability. It can't be worth taking the risk to generate such content. Short of laws in place about marketing and political ads, misleading constructs created to make you believe a lie, it gets left to us to essentially guess. But by then, the damage is done. It's like the lawyer asking or mentioning something knowing the jury are going to be told by the judge to dismiss it. Can't unhear, can't unknow. However, there is the flip side of censorship. But if the children misbehave, you just can't let it continue, can you?