How Hackers and Mechanics Unearth Tesla’s Hidden Autopilot Data | WSJ

Published 2024-07-29
Tesla closely protects data of each car’s Autopilot system. The Journal obtained a set of this data as a part of its investigation into Autopilot crashes. Here’s how it’s extracted from deep within the car.

Chapters:
0:00 Tesla Autopilot computers
1:03 Tesla data
1:31 Removing the computer
2:39 Hacking into the computer

Watch the full Tesla autopilot video investigation here: on.wsj.com/4fslzEv

#Tesla #Autopilot #WSJ

All Comments (21)
  • @andrewright6974
If it’s a trailer, SAY IT’S A TRAILER. Your attempt to manipulate people into paying for a subscription with cliffhangers is pathetic.
  • @billywest1307
Let me get this straight. You post this video about how you extract circuits from Teslas, all with ominous music and how they don't want you to see it. But you never hint at what you found? Was it cool, evil, incompetent? Give me a reason to have to pay for your ridiculous site!
  • @TestMyVidsOut
Where is the rest of the video, WSJ? It was just getting interesting…
  • @johnl.7754
Reports on Tesla safety should include the rate of accidents vs. human drivers; otherwise it's hard to compare.
  • Up until the last frame (3:29), in the bottom left corner: Acceleration: ON, Brake: no 🤯
  • @SB-zj2pm
    I don’t think any human would have been able to avoid that crash
  • @blip-hn6is
Why does every hacking video contain ping and htop and random commands that aren't even recognized?
  • @Numb_
Wow, if I had known this was just a 3-minute call to action, I would've skipped it. What a waste of time.
  • @mapl3mage
A key point made in their video is that Tesla does not use lidar sensors, unlike most other carmakers like BMW, Nissan, etc. Instead, Tesla decided to cut corners by using only cameras and ultrasonic sensors, which works fine when there is sufficient ambient light, but can struggle under very low-light conditions. The nighttime car crash shown in the video is one example of the limitation of the technology in any Tesla car. Because it was too dark, the camera had trouble detecting the stopped car blocking the road. By contrast, lidar sensors work well even under extremely low-light conditions because they use their own light source (laser). Maybe the car could have detected the obstacle and safely stopped if it had a lidar sensor, but eh... profit over safety is Tesla's motto.
  • @Sect10n31
    I’m so glad I read the comments first instead of just clicking on the wsj link.
  • @nufh
Relying solely on cameras is not enough, especially at night.
  • @maxmeier532
Oh, it's just clickbait to get you to subscribe to the WSJ.
  • You know this is what China is doing to get Tesla's FSD technology
  • @NIAtoolkit
    As a human i wouldn’t have seen that truck
  • @bossanesta
So this crash video is about Autopilot, not FSD. That's basically what's called advanced cruise control in other cars. That means the driver needs to be awake and aware of the road; it's just insurance in case you can't react. Even Tesla FSD is not Level 3; only Benz has that, and it's limited to a max of 40 mph. And I think the headlights should be on high beam at such speeds and in such conditions?
  • @choiswimmer
So this confirms everything we knew about neural networks and computer vision. Exactly why camera vision alone isn't enough and radar/lidar is necessary.
  • @iano4027
This is FUD. Give a thumbs-down and unsubscribe from WSJ. These crashes are not factually linked to Tesla FSD. Use the Twitter community to fact-check press like this.