I Challenged My AI Clone to Replace Me for 24 Hours | WSJ

Published 2023-04-28
New AI voice and video tools can look and sound like you. But can they fool your family—or bank?

WSJ’s Joanna Stern replaced herself with her AI twin for the day and put "her" through a series of challenges, including creating a TikTok, making video calls and testing her bank's voice biometric system.

0:00 How to make an AI video and voice clone
2:29 Challenge 1: Phone calls
3:36 Challenge 2: Create a TikTok
4:47 Challenge 3: Bank Biometrics
6:05 Challenge 4: Video calls
6:45 AI vs. Humans

Tech Things With Joanna Stern
Everything is now a tech thing. In creative and humorous videos, WSJ senior personal tech columnist Joanna Stern explains and reviews the products, services and trends that are changing our world.

#AI #Tech #WSJ

All Comments (21)
  • @DannyIvan86
    Boss: "We need you to train an AI that looks, talks and acts like you." Two weeks later, Boss: "You have been let go."
  • @gatodario
The scary part is how good these services have become in so little time.
  • @ManPlusRiver
    So anyone who has their voice recorded and on the internet (YouTubers, podcasters, Hollywood actors, radio personalities) can be cloned against their will.
  • @38Unkown
This is so scary. What is worse is that people have no idea how to regulate these types of systems or protect others against them if they're used maliciously.
  • @bonzo7681
    The scariest part of this video isn't AI, it's the dude drinking from a bottle of mustard at 1:39
  • @Smojero
The thing with A.I. is that it's a snowball effect. Once a milestone is reached, it just improves on it so quickly.
  • @GlennHanna8
    Watch out for scammers calling parents with their offspring's voice as if they were in serious trouble. Even if they don't succeed, hearing your child's voice in great distress can haunt the parent for decades.
  • @marcus_b1
The primary issue with her experiment was the inability to add emotion to the avatar, particularly in the voice. Once that is incorporated, it could 100% pass simple interactions and be expanded from there. An individual would need their own personal, ever-growing learning model to pull this off fully.
  • The crazy thing is that this is the worst the technology will ever be.
  • @Sawpainter_td
    The bigger problem is that people have such a cavalier attitude about taking part in these AI stunts. This is not a joke, and I think we're going to find that out very soon. I could not believe she referred to the AI as her better self. Pay attention people, this is the mentality that is out there right now regarding AI or AGI, and we are feeding right into it.
  • @tacobell1299
I think it's a bit scary that the AI can replicate your voice and possibly steal your bank information
  • @MrTeff999
    “Stay human everyone.” Love it!
  • I'm not sure why she labeled Challenge 1 as "PASS" when it was obviously a fail. Both people said it sounded like her but they could tell that something was off.
  • @04heinm
    "Good luck. I am inevitable." ... killer signoff!
  • @tristx7832
While she talks of using AI to give herself more quality time, she did not mention the future possibility of it replacing her. Right now companies are looking at ways to harness the power of AI to reduce costs. Just remember that a few decades ago, the US never thought its factories would move to China, but they did and many jobs were lost. As a result, we had many supply chain issues during the pandemic since we no longer produce most products. The time will come when companies realize that replacing humans with AI will lead to their demise, when a hacker can simply control all the AI to do their bidding.
  • @gameon2000
The scariest part is how easily and quickly most people could be replaced.
The Chase thing is insane, and it seems like the bank has no answers yet.
ChatGPT wasn't making stuff up about iOS 16. ChatGPT is based on a data scrape that predates iOS 16, so as far as it could tell, it was being accurate.
  • @k22kk22k
This video convinced me we are already on the edge of the AI singularity. What a time to be alive.
  • @MubinNoor
I feel like during the takes used to generate her avatar, she read in a stern, cold tone, which is why the avatar had an abnormal intonation. I think if she had read the prompts in a more personable, conversational way, the avatar would have had those qualities as well.