Why the Future of AI & Computers Will Be Analog

Published 2024-04-09
Secure your privacy with Surfshark! Enter coupon code UNDECIDED for an extra 3 months free at surfshark.deals/undecided

Since digital took over the world, analog has been sidelined into what seems like a niche interest at best. But this retro approach to computing, much like space operas, is making a comeback because of both its power and speed. What I really wanted to know, though, was this: How can analog computing impact our daily lives? Why might it be the future of AI? And what will that look like?

Veritasium’s video on analog computers:    • Future Computers Will Be Radically Di...  

The Reserve Bank of New Zealand’s (RBNZ) MONIAC display:    • Making Money Flow: The MONIAC  

Watch How We Solved The Home Wind Turbine Problem    • How We Solved The Home Wind Turbine P...  

Video script and citations:
undecidedmf.com/why-the-future-of-ai-computers-wil…

Get my Achieve Energy Security with Solar guide:
link.undecidedmf.com/solar-guide

Follow-up podcast:
Video version -    / @stilltbd  
Audio version - bit.ly/stilltbdfm

Join the Undecided Discord server:
link.undecidedmf.com/discord

👋 Support Undecided on Patreon!
www.patreon.com/mattferrell


⚙️ Gear & Products I Like
undecidedmf.com/shop/

Visit my Energysage Portal (US):
Research solar panels and get quotes for free!
link.undecidedmf.com/energysage

And find heat pump installers near you (US):
link.undecidedmf.com/energysage-heatpumps

Or find community solar near you (US):
link.undecidedmf.com/community-solar

For a curated solar buying experience (Canada), get EnergyPal's free personalized quotes:
energypal.com/undecided

Tesla Referral Code:
Get 1,000 free supercharging miles
or a discount on Tesla Solar & Powerwalls
ts.la/matthew84515


👉 Follow Me
Mastodon
mastodon.social/@mattferrell

X
twitter.com/mattferrell
twitter.com/undecidedMF

Instagram
www.instagram.com/mattferrell
www.instagram.com/undecidedmf

Facebook
www.facebook.com/undecidedMF/

Website
undecidedmf.com/


📺 YouTube Tools I Recommend
Audio file(s) provided by Epidemic Sound
bit.ly/UndecidedEpidemic

TubeBuddy
www.tubebuddy.com/undecided

VidIQ
vidiq.com/undecided


I may earn a small commission for my endorsement or recommendation of products or services linked above, but I wouldn't put them here if I didn't like them. Your purchase helps support the channel and the videos I produce. Thank you!

All Comments (21)
  • @ShawnHCorey
    (circa 1960) "It would appear that we have reached the limits of what it is possible to achieve with computer technology, although one should be careful with such statements, as they tend to sound pretty silly in 5 years." — John von Neumann
  • @billmiller4800
    The biggest drawback of analogue circuits is that they are quite specific to a problem, so making something that will work for most/generic problems is difficult, whereas a digital computer is trivially easy in comparison. But when you need something specific, the analogue computer can be significantly faster and more energy efficient. I look forward to the hybrid components that will be coming out in the future.
  • I think the best way to sum it up is: Analog is fast and efficient, but hard to design (or at least hard to formulate the problem for). Once you build it, it is only good at solving that specific problem. Digital, on the other hand, is much more flexible. You have an instruction set and can solve any problem that you can write an algorithm for using those instructions, so you can solve multiple problems with that same machine. The tradeoff is the aforementioned slower speed and lower efficiency (compared to analog). My favourite story about digital vs. analog is the Iowa-class battleships: They were built in the 1940s and were reactivated in the '80s. The fire control computers (electromechanical analog computers using cams, differentials and whatnot) were state of the art back in the day, but given the invention of the transistor and all that since then, the Navy did look at upgrading them to digital. What they found is that the digital system did not offer greater accuracy over the analog. While technology advanced quite a bit over 40 years, the laws of physics remained the same, so the old analog computers worked just as well.
  • @brucefay5126
    I studied analog computers/computing in the 1970s as part of my electrical engineering education. At one time (after that) I worked for a company in Ann Arbor, Michigan that made some of the most powerful analog computers in the world. (I was in marketing by then.) They were used, among other things, to model nuclear reactors and power plants. Incredibly powerful.
  • @olhoTron
    1:32 No, it may seem like an infinite set, but in practice it's limited by the signal-to-noise ratio. The SNR is effectively the number of "bits" of an analog computer; the rule of thumb is 6 dB ≈ 1 bit. Also, each component adds its own noise on top of the signal, so you lose "bits" as your computation becomes more complex. BTW, that is also kind of true on digital computers: if you use floating-point numbers, you lose some precision with each rounding. However, on digital it's easier to just use more bits if you need them; on analog, decreasing noise is not so trivial.
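    To put numbers on the 6 dB ≈ 1 bit rule of thumb above, here is a minimal Python sketch (illustrative only, not from the video or the comment) that converts an SNR figure into an approximate effective bit count, alongside the standard ENOB relation SNR_dB = 6.02·N + 1.76:

        # Approximate "effective bits" of an analog signal path, from its SNR in dB.
        # Rule of thumb from the comment: ~6 dB per bit.
        # Standard ENOB formula: N = (SNR_dB - 1.76) / 6.02.
        def effective_bits(snr_db: float) -> tuple[float, float]:
            rule_of_thumb = snr_db / 6.0
            enob = (snr_db - 1.76) / 6.02
            return rule_of_thumb, enob

        for snr_db in (40, 60, 80, 100):
            rough, enob = effective_bits(snr_db)
            print(f"SNR {snr_db:3d} dB -> ~{rough:4.1f} bits (rule of thumb), {enob:4.1f} ENOB")

    A 60 dB signal chain works out to roughly 10 bits of precision, which is why stacking many noisy analog stages erodes accuracy so quickly.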
  • @Legg99
    I built and used analogue computers (although I didn't call them that) to control the temperatures in a multi-layered cryostat for my Ph.D. work in the mid '70s. I did the number crunching on a digital computer that filled the basement of the maths dept building, using punch card input. 😮 20-odd years later I found myself working with an engine management computer for a helicopter that was pure analogue. When I approached the system engineer at a large, well-known aerospace company who had the design control authority for the system to ask some fundamental questions about it, he didn't have a clue - he was purely digital. I'm retired now, but if I drag my knowledge out of the closet along with my flared jeans and tie-dyed T-shirts, perhaps I'll come back into fashion. 😁
  • @tbix1963
    Great video, thanks for sharing. The biggest problem with analog computers is there are so few people who know how to work on them. I'm reminded of a hydroelectric plant I toured once that had an electromechanical analog computer that controlled the units. At the time I visited, it was already considered ancient, and they were actively attempting to replace it simply because nobody knew how it worked. They only knew how to turn it on, turn it off, and wind the clock spring once a shift the exact number of spins it needed to keep running. They had been trying to replace it with a new computer, but none of the many attempts could match its precision in operating the plant and maintaining proper water flows. They were in constant fear that it might break. I checked back maybe 20 years later to ask how it was, and no one working there knew what I was talking about. Sad that it was long forgotten by everyone at the plant. I thought it should have been retired to a museum, and still hope that possibly it was.
  • I started my career with analogue computers in the 1970s, as they were still being used in industrial automation for motor control (PID: Proportional, Integral, Derivative). I worked in a repair centre and built some test gear to allow me to calibrate them. It's no surprise to me that they have come back; within certain niche applications they are very powerful, although not particularly programmable, unless you count circuit design as programming :-)
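    For readers who haven't met PID control: the sketch below (the gains, time step, and toy motor model are illustrative assumptions, not values from the comment) shows the proportional-integral-derivative law that those analog boards computed with op-amp networks, written as a few lines of Python.

        # Minimal discrete PID controller: the control law those analog boards
        # implemented with op-amp summers, integrators, and differentiators.
        class PID:
            def __init__(self, kp, ki, kd, dt):
                self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
                self.integral = 0.0
                self.prev_error = 0.0

            def update(self, setpoint, measurement):
                error = setpoint - measurement
                self.integral += error * self.dt                 # accumulate error over time
                derivative = (error - self.prev_error) / self.dt # rate of change of error
                self.prev_error = error
                return self.kp * error + self.ki * self.integral + self.kd * derivative

        # Toy example: drive a first-order "motor speed" model toward 100 rpm.
        pid = PID(kp=2.0, ki=5.0, kd=0.05, dt=0.01)
        speed = 0.0
        for _ in range(500):                                     # 5 simulated seconds
            drive = pid.update(setpoint=100.0, measurement=speed)
            speed += (0.01 / 0.2) * (drive - speed)              # crude plant: first-order lag, tau = 0.2 s
        print(f"speed after 5 s: {speed:.1f} rpm (setpoint 100)")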
  • @icedreamer9629
    Analogue computing is analogous to the P vs NP problem in pure mathematics. It is fantastic at anything which is hard to calculate but quick to check. In this case, anything hard to figure out how to express, but with solidly defined parameters. It works by shunting a good deal of the difficulty of solving the problem up front to the designers of the machine. It can take years of incredibly hard work to figure out how to express a single problem in analogue form, but once you DO, computing the answer for any combination or variant of the problem is virtually instantaneous.
  • I gave a talk at DataVersity on context. I explained that in the '70s we could build a nuclear power plant instrument panel with gauges of different ranges - this allowed the techs to scan the whole wall and immediately see if anything looked out of normal range (usually straight vertical). However, everyone wanted digital, not realizing that with that change, each number had to be individually read, consciously scaled, and then thought about (compared with 'normal'). With digital came the necessity for alarms, because it took too much mental effort to scan the wall. Something few consider to this day...
  • @BenGrimm977
    I'm skeptical about the broad assertion - 'Why the Future of AI & Computers Will Be Analog' - that analog computing will dominate in the future. Analog computing clearly has its niche, particularly with tasks involving continuous, real-world signals—such as audio and visual processing, or interpreting sensor data—where this technology presents a clear advantage. However, framing this niche strength as 'the future' and implying a universal superiority over digital computing seems a bit overstated to me.
  • @DantalionNl
    An analog clock is not continuous; its movement is a function of the internal gear ratios, and the arm moves in discrete, quantifiable steps.
  • @circattle
    The large hall-sized computers you show at the start of the video are actually digital computers. They just used vacuum tubes instead of transistors, so they took up a lot of space and used a lot of electricity.
  • @debrainwasher
    Whenever I design electronics, I often use analog preprocessing, since it takes much less energy to amplify, integrate, or filter signals with op-amps (using summing stages, PT1 and PT2 models, integrators, and differentiators) than to use convolutions or FFTs to build FIR or IIR filters, which need a lot of processing power.
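    As a point of comparison, here is a minimal Python sketch (sample rate and time constant are illustrative assumptions) of the discrete PT1 low-pass that a single op-amp RC stage performs continuously in the analog domain; in software, every output sample costs a multiply-add, which is exactly the processing the commenter is offloading to the analog front end.

        # Discrete-time PT1 (first-order low-pass), the software counterpart of
        # one op-amp RC stage. Assumed values: fs = 1 kHz, tau = 50 ms.
        import random

        fs = 1000.0              # sample rate in Hz (assumption)
        tau = 0.05               # filter time constant in seconds (assumption)
        dt = 1.0 / fs
        alpha = dt / (tau + dt)  # smoothing factor of the discretized RC filter

        def pt1_filter(samples, alpha=alpha):
            """y[n] = y[n-1] + alpha * (x[n] - y[n-1]) -- one multiply-add per sample."""
            y, out = 0.0, []
            for x in samples:
                y += alpha * (x - y)
                out.append(y)
            return out

        # Example: smooth a noisy step input.
        noisy_step = [1.0 + random.gauss(0, 0.1) for _ in range(200)]
        smoothed = pt1_filter(noisy_step)
        print(f"last smoothed value: {smoothed[-1]:.3f}")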
  • It's important to understand in the clock example that even though the clock's second hand was moving discretely (moving only every 1 second, not continuously), it is still analog. It is not necessarily the continuous nature that makes something analog; it's whether the physical process happening is analogous to the real event. The twist of the second hand is directly proportional to how much of a minute has passed. In a digital clock, however, the voltages in the bits (0-5 V) are not analogous or directly proportional to the time that has passed.
  • @JohnSostrom
    When I was in the Navy, the ships I was stationed on had a single gun on the bow. This was a 5"/54 and was operated by an analog computer. If you watch the SyFy movie Battleship, you can see this type of system aiming large gun rounds at a given target. One of the big advantages of analog computers is that they are not subject to EMP.
  • @Hippida
    Like, the first 75% of this video talks about what analog computers are not. Add some snappy comments and a twist, and you still don't come close to what is mentioned in the video headline.
  • @BunkerSquirrel
    One of the biggest hurdles to analog ASICs has been the extreme difficulty of designing the IC circuitry. In fact, the most expensive and labor intensive part of a processor has always been the analog circuitry to regulate and supply power, among other things. With the help of AI we’re going to see a massive explosion in this field, since it can easily pull from natural laws, fab constraints and desired outcomes to take much of the guesswork and iteration out of analog ASIC engineering. Literally machines building the next generation of machines.