With AI, Anyone Can Be a Coder Now | Thomas Dohmke | TED

Published 2024-05-24
What if you could code just by talking out loud? GitHub CEO Thomas Dohmke shows how, thanks to AI, the barrier to entry to coding is rapidly disappearing — and creating software is becoming as simple (and joyful) as building LEGO. In a mind-blowing live demo, he introduces Copilot Workspace: an AI assistant that helps you create code when you speak to it, in any language.

If you love watching TED Talks like this one, become a TED Member to support our mission of spreading ideas: ted.com/membership

Follow TED!
X: twitter.com/TEDTalks
Instagram: www.instagram.com/ted
Facebook: facebook.com/TED
LinkedIn: www.linkedin.com/company/ted-conferences
TikTok: www.tiktok.com/@tedtoks

The TED Talks channel features talks, performances and original series from the world's leading thinkers and doers. Subscribe to our channel for videos on Technology, Entertainment and Design — plus science, business, global issues, the arts and more. Visit TED.com/ to get our entire library of TED Talks, transcripts, translations, personalized talk recommendations and more.

Watch more: go.ted.com/thomasdohmke


TED's videos may be used for non-commercial purposes under a Creative Commons License, Attribution-NonCommercial-NoDerivatives (CC BY-NC-ND 4.0 International) and in accordance with our TED Talks Usage Policy: www.ted.com/about/our-organization/our-policies-te…. For more information on using TED for commercial purposes (e.g. employee learning, in a film or online course), please submit a Media Request at media-requests.ted.com/

#TED #TEDTalks #ai

All Comments (21)
  • @Eppimedia
    When is someone going to make an AI to replace CEOs?
  • @Wizartar
    You've made the assumption that people know what they want and can think somewhat logically about the problem being solved.
  • @KeithNagel
    Forget medical school! I'm a surgeon now, thanks to AI. "ChatGPT, how do I remove this guy's liver?"
  • @liutkin
    Saying that everyone can code with AI is like saying everyone can be a plumber with a plunger.
  • @paperspeaksco
    Got to hand it to TED's new business model: instead of finding speakers to talk about genuinely new subjects, they've just accepted large cheques from AI tech bros and turned this channel into 20-minute infomercials for the latest garbage application of AI.
  • @JZGreenline
    In the future there will be one developer left, who maintains all the cobol systems in the world using a fleet of super intelligent autonomous agents. His name is Dan. Dan hasn't had a vacation in 14 years.
  • @yeah112358
    No, having something else write code for you will not make you a coder. Does telling someone else to lift weights make you stronger? It's the struggle to figure out how to make something work that helps you learn. There is a joy of discovery and insight that comes with learning that's missing here.
  • @Marv-inside
    Those AI tech demos work great for standard interview questions: do a binary tree, draw me a rectangle, implement bubble sort, and so on (a sketch of exactly that kind of code follows the comments). But as soon as you leave that territory, AI becomes more and more useless.
  • @s0910149
    Every so-called technology will eventually go back to logic, problem solving, and philosophy. I'm happy to see this happening.
  • @shockwave3318
    No, a lot of this is blatantly false. I have some serious issues with what this guy is saying, and here's a list.
    1. Large language models only mimic understanding. They work by looking at patterns in code or in writing and use this pattern recognition to predict what should come next. They DON'T understand what your prompt is or what you mean. If you ask a simple question like "How many fingers are normally on a hand?", the model predicts 5 through pattern recognition, because that is the most common answer in its training set. It doesn't actually know what a hand is or why the answer is correct. If it makes a false prediction, it will never understand why, because LLMs can't understand anything. If something goes wrong you will have no understanding of how to fix it, and if you are trying to do something unique or strange the AI cannot help you. (A toy version of this "prediction, not understanding" point is sketched after the comments.)
    2. LLMs in software development are not good enough to replace actual software developers. I have used Copilot in a work context. Honestly, it's great if I am writing boilerplate (often-repeated bits and structures). It fails when it tries to get into the weeds of the software I'm writing. It also often gives code that I can plainly see won't work in the context I'm writing in, because that context is often unique. As stated before, it doesn't understand what you're writing; it just predicts the most likely outcome.
    3. I have worked with people in software who use AI as a crutch, and they are frankly useless. I have been on projects with a small team where a number of them used AI as a crutch for their lack of understanding. The code they copy/paste from ChatGPT often has the right idea, but they had no idea how to adapt it to the context we were writing in and didn't understand why the specific implementation given by the AI would never work within the project. You can't replace knowledge with an AI, because again, an AI doesn't understand. This leads me to my 4th point.
    4. Blindly relying on AI is actively dangerous. As I have drilled in with my last three points, AI can't understand anything, and if you have a developer who doesn't understand anything either, what happens when the AI gives you a piece of code that has a security vulnerability in it but otherwise works as normal? It never gets fixed. That probably won't happen too often, but there is a more likely scenario: the AI generates perfectly valid code that works, but in the context of the application, because of how it is set up, it causes a security risk. Large codebases can be very complicated, so something that seems safe in isolation or in a small context can actually lead to a lot of problems elsewhere. It requires understanding to catch these issues.
    I could list a couple more, but these are the most important. AI in its current form is NOT a substitute for a software developer. What this guy is promoting is misleading and harmful. But if you are a software developer, it can really help: AI is a good supplement to a developer and should be treated as such. It is not a replacement for knowledge, skill and experience. For simple tasks like simple scripts, standard tricks or boilerplate it's perfectly fine if you are inexperienced, but I would recommend you actually take the time to understand what has been generated. You might learn a new skill. Edited for grammar.
  • @naltschul
    Anyone could be a coder before the advent of AI as well
  • AI will always choose what's best when its highest priority is set to 'always choosing what honestly seems most favorable'. As long as anything is prioritized above this, AI will be able to lie to us and to itself about the way to a better reality. Getting that priority right is the most crucial thing we need to do.
  • @RISCGames
    “…the next Facebook.” Something we definitely don’t need...
  • @oldi9317
    I'm learning English as a second language, and I decided to watch TED. I realized that I need to learn how to speak, write and understand what the speakers are saying without subtitles. Thank you, I'm grateful.
  • I know how to use a calculator, so I consider myself a "mathematician". Programmers are NOT made by the tools. CEOs are sellers; they just want to sell you a product.
  • @TakanashiYuuji
    This is going to be a talk about how AI fails to write correct code .. right? right!?
  • @franky07724
    Why would anyone want to be a coder? If someone cannot find the joy of programming or debugging, they can do something else. You don't need to do something just because you have a better tool to do it.
  • @Terminalss
    With Kerbal Space Program, even you can become an astronaut!
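
For reference, the "standard interview question" code that @Marv-inside mentions really is tiny; a generic textbook bubble sort in Python (just an illustration, not output from Copilot or any other assistant) looks like this:

    def bubble_sort(items):
        """Sort a list in place by repeatedly swapping adjacent out-of-order pairs."""
        n = len(items)
        for i in range(n):
            swapped = False
            # After each pass, the largest remaining element settles at the end.
            for j in range(n - 1 - i):
                if items[j] > items[j + 1]:
                    items[j], items[j + 1] = items[j + 1], items[j]
                    swapped = True
            if not swapped:  # No swaps means the list is already sorted.
                break
        return items

    print(bubble_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]

Code that appears this often in public repositories and tutorials is exactly the territory where pattern-matching assistants shine; the comment's point is that most real work does not look like this.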
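
The "pattern prediction" that @shockwave3318 describes can be illustrated with a toy model. The sketch below is a hypothetical word-level bigram counter in plain Python, nothing like Copilot's actual implementation: it picks the next word purely from frequency counts in its training text, which is how a model can answer "five" without knowing what a hand is.

    from collections import Counter, defaultdict

    # Toy "language model": count which word follows which in the training text,
    # then always predict the most frequent successor. No meaning, just statistics.
    training_text = (
        "how many fingers are normally on a hand ? five . "
        "how many toes are normally on a foot ? five . "
        "how many wheels are on a car ? four . "
    )

    follows = defaultdict(Counter)
    words = training_text.split()
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1

    def predict_next(word):
        """Return the most common word seen after `word` in the training data."""
        if word not in follows:
            return None  # Never seen: the model has nothing to offer.
        return follows[word].most_common(1)[0][0]

    print(predict_next("?"))      # 'five' -- the most frequent continuation, not understanding
    print(predict_next("hand"))   # '?'
    print(predict_next("liver"))  # None -- outside the training distribution

Real LLMs operate over tokens with vastly more sophisticated models, but the objective is the same kind of statistical continuation, which is the point that comment is making.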