I Challenged My AI Clone to Replace Me for 24 Hours | WSJ

Published 2023-04-28
New AI voice and video tools can look and sound like you. But can they fool your family—or bank?

WSJ’s Joanna Stern replaced herself with her AI twin for the day and put "her" through a series of challenges, including creating a TikTok, making video calls and testing her bank's voice biometric system.

0:00 How to make an AI video and voice clone
2:29 Challenge 1: Phone calls
3:36 Challenge 2: Create a TikTok
4:47 Challenge 3: Bank Biometrics
6:05 Challenge 4: Video calls
6:45 AI vs. Humans

Tech Things With Joanna Stern
Everything is now a tech thing. In creative and humorous videos, WSJ senior personal tech columnist Joanna Stern explains and reviews the products, services and trends that are changing our world.

#AI #Tech #WSJ

All Comments (21)
  • @DannyIvan86
    Boss: "We need you to train an AI that looks, talks and acts like you." Two weeks later, Boss: "You have been let go."
  • @gatodario
    The scary part is how good these services became in so little time.
  • @38Unkown
    This is so scary. What is worse is people have no idea how to regulate or protect people against these types of systems if used maliciously.
  • @ManPlusRiver
    So anyone who has their voice recorded and on the internet (YouTubers, podcasters, Hollywood actors, radio personalities) can be cloned against their will.
  • @GlennHanna8
    Watch out for scammers calling parents with their offspring's voice as if they were in serious trouble. Even if they don't succeed, hearing your child's voice in great distress can haunt the parent for decades.
  • @Smojero
The thing with AI is that it's a snowball effect. Once a milestone is reached, it just improves on it so quickly.
  • @bonzo7681
    The scariest part of this video isn't AI, it's the dude drinking from a bottle of mustard at 1:39
  • @marcus_b1
    The primary issue with her experiment was the lack of ability to add emotions to the avatar, primarily via voice alone. Once that is incorporated it could 100% pass simple interactions that can be expanded from there. An individual would need their own personal ever growing learning model to successfully pull this off fully.
  • The crazy thing is that this is the worst the technology will ever be.
  • @Sawpainter_td
    The bigger problem is that people have such a cavalier attitude about taking part in these AI stunts. This is not a joke, and I think we're going to find that out very soon. I could not believe she referred to the AI as her better self. Pay attention people, this is the mentality that is out there right now regarding AI or AGI, and we are feeding right into it.
  • @04heinm
    "Good luck. I am inevitable." ... killer signoff!
  • @tacobell1299
I think it's a bit scary that the AI can replicate your voice and possibly steal your bank information.
  • @MrTeff999
    “Stay human everyone.” Love it!
I'm 55 years old and have been following this tech almost from the beginning. Folks, AI moved into the real world long, long ago. Let me just say this: you have interacted with people who weren't people at all.
  • I'm not sure why she labeled Challenge 1 as "PASS" when it was obviously a fail. Both people said it sounded like her but they could tell that something was off.
  • @gameon2000
    The most scary part is: how easily and fast most people could be replaced.
The Chase thing is insane, and it seems like the bank has no answers yet.
  • @MubinNoor
    I feel like during the takes they used to generate her avatar she used a stern and cold read type voice, hence why her avatar had an abnormal intonation. I think if she would have read the prompts more personably and more conversational, the avatar would have had those qualities in it as well.
ChatGPT wasn't making stuff up about iOS 16. ChatGPT is based on a data scrape that predates iOS 16, so as far as it could tell, it was being accurate.
One huge issue with her experiment is that the voice model was trained on her reading a script. So it sounded exactly like her reading a script, haha. Ideally she would have used recordings of genuine conversation (captured while she's talking on the phone, for example, so only her voice is recorded). I have no doubt she would have ended up with a much, much more convincing model.