Future Computers Will Be Radically Different (Analog Computing)

Published 2022-03-01
Visit brilliant.org/Veritasium/ to get started learning STEM for free, and the first 200 people will get 20% off their annual premium subscription. Digital computers have served us well for decades, but the rise of artificial intelligence demands a totally new kind of computer: analog.

Thanks to Mike Henry and everyone at Mythic for the analog computing tour! www.mythic-ai.com/
Thanks to Dr. Bernd Ulmann, who created The Analog Thing and taught us how to use it. the-analog-thing.org/
Moore’s Law was filmed at the Computer History Museum in Mountain View, CA.
Welch Labs’ ALVINN video: Self Driving Cars [S1E2: ALVINN]

▀▀▀
References:
Crevier, D. (1993). AI: The Tumultuous History Of The Search For Artificial Intelligence. Basic Books. – ve42.co/Crevier1993
Valiant, L. (2013). Probably Approximately Correct. HarperCollins. – ve42.co/Valiant2013
Rosenblatt, F. (1958). The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain. Psychological Review, 65(6), 386-408. – ve42.co/Rosenblatt1958
NEW NAVY DEVICE LEARNS BY DOING; Psychologist Shows Embryo of Computer Designed to Read and Grow Wiser (1958). The New York Times, p. 25. – ve42.co/NYT1958
Mason, H., Stewart, D., and Gill, B. (1958). Rival. The New Yorker, p. 45. – ve42.co/Mason1958
Alvinn driving NavLab footage – ve42.co/NavLab
Pomerleau, D. (1989). ALVINN: An Autonomous Land Vehicle In a Neural Network. NeurIPS, (2)1, 305-313. – ve42.co/Pomerleau1989
ImageNet website – ve42.co/ImageNet
Russakovsky, O., Deng, J., et al. (2015). ImageNet Large Scale Visual Recognition Challenge. International Journal of Computer Vision, 115, 211–252. – ve42.co/ImageNetChallenge
AlexNet Paper: Krizhevsky, A., Sutskever, I., Hinton, G. (2012). ImageNet Classification with Deep Convolutional Neural Networks. NeurIPS, (25)1, 1097-1105. – ve42.co/AlexNet
Karpathy, A. (2014). Blog post: What I learned from competing against a ConvNet on ImageNet. – ve42.co/Karpathy2014
Fick, D. (2018). Blog post: Mythic @ Hot Chips 2018. – ve42.co/MythicBlog
Jin, Y. & Lee, B. (2019). 2.2 Basic operations of flash memory. Advances in Computers, 114, 1-69. – ve42.co/Jin2019
Demler, M. (2018). Mythic Multiplies in a Flash. The Microprocessor Report. – ve42.co/Demler2018
Aspinity (2021). Blog post: 5 Myths About AnalogML. – ve42.co/Aspinity
Wright, L. et al. (2022). Deep physical neural networks trained with backpropagation. Nature, 601, 549–555. – ve42.co/Wright2022
Waldrop, M. M. (2016). The chips are down for Moore’s law. Nature, 530, 144–147. – ve42.co/Waldrop2016

▀▀▀
Special thanks to Patreon supporters: Kelly Snook, TTST, Ross McCawley, Balkrishna Heroor, 65square.com, Chris LaClair, Avi Yashchin, John H. Austin, Jr., OnlineBookClub.org, Dmitry Kuzmichev, Matthew Gonzalez, Eric Sexton, john kiehl, Anton Ragin, Benedikt Heinen, Diffbot, Micah Mangione, MJP, Gnare, Dave Kircher, Burt Humburg, Blake Byers, Dumky, Evgeny Skvortsov, Meekay, Bill Linder, Paul Peijzel, Josh Hibschman, Mac Malkawi, Michael Schneider, jim buckmaster, Juan Benet, Ruslan Khroma, Robert Blum, Richard Sundvall, Lee Redden, Vincent, Stephen Wilcox, Marinus Kuivenhoven, Clayton Greenwell, Michael Krugman, Cy 'kkm' K'Nelson, Sam Lutfi, Ron Neal

▀▀▀
Written by Derek Muller, Stephen Welch, and Emily Zhang
Filmed by Derek Muller, Petr Lebedev, and Emily Zhang
Animation by Ivy Tello, Mike Radjabov, and Stephen Welch
Edited by Derek Muller
Additional video/photos supplied by Getty Images and Pond5
Music from Epidemic Sound
Produced by Derek Muller, Petr Lebedev, and Emily Zhang

All Comments (21)
  • @anishsaxena1226
    As a young PhD student in computer science, I found your explanation of how neural networks came to be and evolved, and the math behind them, the cleanest and most accessible I have come across. Since I focus on computer architecture, I came to this video without much expectation of learning anything new, but I am glad I was wrong. Keep up the great work!
  • @5MadMovieMakers
    Hyped for the future of computing. Analog and digital could work together to make some cool stuff
  • Analogue was never meant to die; the technology of the time was the limiting factor, IMO. It looks like an analogue-digital hybrid system could do wonders in computing.
  • AI researcher here: you did a great job on this. For anyone interested, the book Perceptrons by Minsky and Papert is a classic, with many proofs of limitations and explorations of the limits of the paradigm. It still holds up today, and it's fascinating to read what scientists were thinking about neural networks during the year of the moon landing!
  • @KraftyB
    Fun fact: the graphics card he's holding at 18:00 is a Titan Xp with an MSRP of $1,200. He says it draws 100 W, but it actually draws about 250 W, so the tiny chip that draws only 3 W is even more impressive.
  • @belsizebiz
    For amusement only: my first day at work was in 1966, as a 16-year-old trainee technician, in a lab dominated by a thermionic-valve analogue computer (or two). These kept us very warm through the winter months and cooked us during the summer. The task was calculating miss distances for a now-obsolete missile system. One day I was struggling to set up a resistor/diode network to model the required transfer function, but the output kept drifting. It turned out the resistors were warming up and changing value. Such are the trials and tribulations of analogue computing...
  • FYI, the company in this video ran out of cash in 2022 and has now folded.
  • I had a business analysis course that tried to explain the perceptron, and I didn't understand anything; I don't have a strong maths background. This video is pure genius. The way you explain ideas is amazing and easy to understand. Thank you so much. This is my favourite channel.
  • @jeffc5974
    One of the first things I learned in Electrical Engineering is that transistors are analog. We force them to behave digitally by the way we arrange them to interact with each other. I'm glad there are some people out there that remember that lesson and are bringing back the analog nature of the transistor in the name of efficiency and speed.
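
    To make the comment above concrete, here is a minimal sketch (with made-up, illustrative values, not any real chip's parameters) of the analog trick the video describes: store each weight as a transistor's conductance, apply each input as a voltage, and physics does the math. Ohm's law (I = G*V) performs each multiplication, and Kirchhoff's current law (currents on a shared wire add) performs the accumulation.

        # Analog multiply-accumulate, sketched in Python with illustrative values:
        # each weight is a conductance in siemens, each input a voltage in volts.
        weights_as_conductance = [0.2, 0.7, 0.1]
        inputs_as_voltage = [1.0, 0.5, 2.0]

        # Ohm's law per transistor (I = G*V); Kirchhoff's current law sums the
        # resulting currents on a single shared wire.
        output_current = sum(g * v for g, v in zip(weights_as_conductance, inputs_as_voltage))
        print(output_current)  # 0.2*1.0 + 0.7*0.5 + 0.1*2.0 = 0.75
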
  • @Septimius
    I see Derek is getting into modular synthesizers! Also, it's funny to see how the swing in musical instruments from analog to digital and back is being mirrored in computing generally.
  • @NoahSpurrier
    I remember seeing an analog differential calculator in my high school physics and electronics teacher's lab. It was more of a museum piece; it was never used. RIP Mr. Stark
  • @asg32000
    I've watched a lot of your stuff for years now, but this is the best one by far. Great job of explaining something so complex, difficult, and relevant!
  • @funktorial
    started watching this channel when I started high school and now that I'm about to get a phd in mathematical logic, I've grown an even deeper appreciation for the way this channel covers advanced topics. not dumbed down, just clear and accessible. great stuff! (and this totally nerd-sniped me because i've been browsing a few papers on theory of analog computation)
  • My professor always said that the future of computing lies in accelerators -- that is, more efficient chips that do specific tasks like this. He even mentioned analog chips meant to quickly evaluate machine learning models in one of his lectures, back in 2016! It's nice seeing that there's been some real progress.
  • @bishalpaudel5747
    This is a very well explained video on analog computing. Never would I have thought the topic of analog computing could be laid out in a 20-minute video with such phenomenal animation and explanation. Respect for your work and the effort to make science available to all for free. Respect 🙏
  • @zebaerum7
    I used this content and visuals for my Electronics Engineering final year technical seminar. I loved the content, and the way it's put together. Thanks for choosing the most interesting stuff to put in my feed.
  • @dust7962
    The problem with this system of computing is that interference is a huge factor. When you only test whether there is voltage or not, you don't need to worry about interference. But when you build systems that use varying voltages, say 0 V, 0.5 V, and 1 V, you do need to worry about it, and the more levels you add, the bigger the issue becomes. Interference can come from microwaves, radiation, mechanical vibration (think of the fans cooling a server rack), and the list goes on, as almost anything can cause interference. The oscilloscope used in the example is an expensive piece of equipment that minimizes interference, but at a cost far higher than with a binary number system.
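
    The shrinking noise margins described above are easy to simulate. A small sketch (my own construction; the noise level is an arbitrary assumption): encode symbols as evenly spaced voltages within a fixed 0-1 V swing, add Gaussian noise, and decode to the nearest level. As levels are added, the spacing between decision thresholds shrinks and the error rate climbs.

        # Symbol error rate vs. number of analog levels under Gaussian noise.
        # All figures here are illustrative assumptions, not measured values.
        import random

        def symbol_error_rate(levels: int, noise_sigma: float, trials: int = 100_000) -> float:
            step = 1.0 / (levels - 1)  # voltage spacing between adjacent levels
            errors = 0
            for _ in range(trials):
                sent = random.randrange(levels)
                voltage = sent * step + random.gauss(0.0, noise_sigma)
                received = min(levels - 1, max(0, round(voltage / step)))  # nearest level
                errors += received != sent
            return errors / trials

        for levels in (2, 4, 8, 16):
            print(levels, symbol_error_rate(levels, noise_sigma=0.05))
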
  • @suivzmoi
    As a NAND flash engineer, that bit about using floating-gate transistors as analog computers is interesting, particularly because in flash memory there is a thing known as "read disturb": even the low voltages applied to the gate to query its state (as during a flash read) can eventually cause the state itself to change. You would think it's a binary effect, where a low enough voltage would just give a harmless read, but no: eventually electrons build up in the gate (reading it many times at low voltage has an effect similar to programming it once at high voltage). In this particular application, the weight would increase over time the more you query it, even though you didn't program it to be that high in the beginning. It's interesting because it's sort of analogous to false memories in our brains: the more we recall a particular memory, the more inaccurate it can become.
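
    The read-disturb drift described above can be captured in a toy model (the linear disturb-per-read rate below is an arbitrary assumption of mine, not real device physics): every query nudges the stored charge slightly, so a weight that is read often enough creeps away from the value it was programmed with.

        # Toy model of read disturb in a floating-gate cell used as an analog weight.
        class FloatingGateCell:
            def __init__(self, weight: float, disturb_per_read: float = 1e-5):
                self.weight = weight                       # weight stored as gate charge
                self.disturb_per_read = disturb_per_read   # assumed drift per query

            def read(self) -> float:
                value = self.weight
                self.weight += self.disturb_per_read       # querying disturbs the state
                return value

        cell = FloatingGateCell(weight=0.5)
        for _ in range(10_000):
            cell.read()
        print(f"weight after 10,000 reads: {cell.read():.3f}")  # ~0.600, up from 0.500
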
  • @lc5945
    I remember the first time I heard the term "deep networks". It was back in 2009, when I was starting my MSc (working mainly with SVMs); a guy at the same institute was finishing his PhD and introduced me to the concept and the struggles (the Nehalem 8-core era). The leaps in performance made in neural networks since then, thanks to GPGPU, are enormous.
  • @HrLBolle
    Mythic's approach reminds me of the memory planes of copper wires threaded through ferromagnetic rings, each ring representing a bit, used as memory for the AGC (Apollo Guidance Computer). The video this memory of mine is based on was released by Destin, aka Smarter Every Day, and accompanied his and Linus Sebastian's meeting with Luke Talley, a former IBM employee and, at the time of the Apollo missions, a member of the data analysis teams responsible for the analysis, evaluation, and processing of the telemetry data received from the Apollo instrument ring.