Future Computers Will Be Radically Different (Analog Computing)

Published 2022-03-01
Visit brilliant.org/Veritasium/ to get started learning STEM for free, and the first 200 people will get 20% off their annual premium subscription. Digital computers have served us well for decades, but the rise of artificial intelligence demands a totally new kind of computer: analog.

Thanks to Mike Henry and everyone at Mythic for the analog computing tour! www.mythic-ai.com/
Thanks to Dr. Bernd Ulmann, who created The Analog Thing and taught us how to use it. the-analog-thing.org/
Moore’s Law was filmed at the Computer History Museum in Mountain View, CA.
Welch Labs’ ALVINN video: Self Driving Cars [S1E2: ALVINN]

▀▀▀
References:
Crevier, D. (1993). AI: The Tumultuous History Of The Search For Artificial Intelligence. Basic Books. – ve42.co/Crevier1993
Valiant, L. (2013). Probably Approximately Correct. HarperCollins. – ve42.co/Valiant2013
Rosenblatt, F. (1958). The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain. Psychological Review, 65(6), 386-408. – ve42.co/Rosenblatt1958
NEW NAVY DEVICE LEARNS BY DOING; Psychologist Shows Embryo of Computer Designed to Read and Grow Wiser (1958). The New York Times, p. 25. – ve42.co/NYT1958
Mason, H., Stewart, D., and Gill, B. (1958). Rival. The New Yorker, p. 45. – ve42.co/Mason1958
Alvinn driving NavLab footage – ve42.co/NavLab
Pomerleau, D. (1989). ALVINN: An Autonomous Land Vehicle In a Neural Network. NeurIPS, (2)1, 305-313. – ve42.co/Pomerleau1989
ImageNet website – ve42.co/ImageNet
Russakovsky, O., Deng, J. et al. (2015). ImageNet Large Scale Visual Recognition Challenge. International Journal of Computer Vision, 115(3), 211–252. – ve42.co/ImageNetChallenge
AlexNet Paper: Krizhevsky, A., Sutskever, I., Hinton, G. (2012). ImageNet Classification with Deep Convolutional Neural Networks. NeurIPS, (25)1, 1097-1105. – ve42.co/AlexNet
Karpathy, A. (2014). Blog post: What I learned from competing against a ConvNet on ImageNet. – ve42.co/Karpathy2014
Fick, D. (2018). Blog post: Mythic @ Hot Chips 2018. – ve42.co/MythicBlog
Jin, Y. & Lee, B. (2019). 2.2 Basic operations of flash memory. Advances in Computers, 114, 1-69. – ve42.co/Jin2019
Demler, M. (2018). Mythic Multiplies in a Flash. Microprocessor Report. – ve42.co/Demler2018
Aspinity (2021). Blog post: 5 Myths About AnalogML. – ve42.co/Aspinity
Wright, L. et al. (2022). Deep physical neural networks trained with backpropagation. Nature, 601, 549–555. – ve42.co/Wright2022
Waldrop, M. M. (2016). The chips are down for Moore’s law. Nature, 530, 144–147. – ve42.co/Waldrop2016

▀▀▀
Special thanks to Patreon supporters: Kelly Snook, TTST, Ross McCawley, Balkrishna Heroor, 65square.com, Chris LaClair, Avi Yashchin, John H. Austin, Jr., OnlineBookClub.org, Dmitry Kuzmichev, Matthew Gonzalez, Eric Sexton, john kiehl, Anton Ragin, Benedikt Heinen, Diffbot, Micah Mangione, MJP, Gnare, Dave Kircher, Burt Humburg, Blake Byers, Dumky, Evgeny Skvortsov, Meekay, Bill Linder, Paul Peijzel, Josh Hibschman, Mac Malkawi, Michael Schneider, jim buckmaster, Juan Benet, Ruslan Khroma, Robert Blum, Richard Sundvall, Lee Redden, Vincent, Stephen Wilcox, Marinus Kuivenhoven, Clayton Greenwell, Michael Krugman, Cy 'kkm' K'Nelson, Sam Lutfi, Ron Neal

▀▀▀
Written by Derek Muller, Stephen Welch, and Emily Zhang
Filmed by Derek Muller, Petr Lebedev, and Emily Zhang
Animation by Ivy Tello, Mike Radjabov, and Stephen Welch
Edited by Derek Muller
Additional video/photos supplied by Getty Images and Pond5
Music from Epidemic Sound
Produced by Derek Muller, Petr Lebedev, and Emily Zhang

Comments
  • @anishsaxena1226
    As a young PhD student in Computer Science, I find your explanation of how neural networks came to be and evolved, and the math behind them, the cleanest and most accessible I have come across. Since I focus on computer architecture, I came to this video without much expectation of learning anything new, but I am glad I was wrong. Keep up the great work!
  • @funktorial
    Started watching this channel when I started high school, and now that I'm about to get a PhD in mathematical logic, I've grown an even deeper appreciation for the way this channel covers advanced topics. Not dumbed down, just clear and accessible. Great stuff! (And this totally nerd-sniped me, because I've been browsing a few papers on the theory of analog computation.)
  • FYI, the company in this video ran out of cash in 2022 and has now folded.
  • Analogue was never meant to die; the technology of that time was the limiting factor, IMO. It seems an analogue-digital hybrid system could do wonders in computing.
  • @belsizebiz
    For amusement only: My first day at work was in 1966, as a 16-year-old trainee technician, in a lab dominated by a thermionic-valve analogue computer (or two). These kept us very warm through the winter months and cooked us during the summer. The task was calculating miss distances for a now-obsolete missile system. One day I was struggling to set up a resistor/diode network to model the required transfer function, but the output kept drifting. I found the resistors were warming up and changing value. Such are the trials and tribulations of analogue computing...
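The drift this commenter describes is easy to model: resistance changes roughly linearly with temperature, R(T) = R0(1 + alpha*dT), so any transfer function set by resistor ratios shifts as the parts warm up. A minimal Python sketch; the temperature coefficient and divider values are invented for illustration, not taken from that lab:

    # Resistance rises roughly linearly with temperature:
    # R(T) = R0 * (1 + alpha * dT), so a divider ratio drifts as parts heat up.
    alpha = 0.004                    # hypothetical tempco, per deg C
    R1_0 = R2_0 = 10_000.0           # divider resistors at ambient, ohms
    v_in = 10.0

    for dT in (0, 10, 25, 50):             # degrees above ambient
        R1 = R1_0 * (1 + alpha * dT)       # R1 warms up (sits near hot valves)
        v_out = v_in * R2_0 / (R1 + R2_0)  # R2 assumed to stay cool
        print(f"dT = {dT:>2} degC -> V_out = {v_out:.3f} V")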
  • I had a business analysis course that tried to explain the perceptron, and I didn't understand anything; I don't have a strong maths background. This video is pure genius. The way you explain ideas is amazing and easy to understand. Thank you so much. This is my favourite channel.
  • AI researcher here: you did a great job on this. For anyone interested, the book Perceptrons by Minsky and Papert is a classic, with many proofs of limitations and explorations of the limits of the paradigm. It still holds up today, and it's fascinating to read what scientists were thinking about neural networks during the year of the moon landing!
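A minimal sketch of the central limitation Minsky and Papert formalized: Rosenblatt's perceptron learning rule converges on linearly separable targets like AND, but no single-layer perceptron can classify XOR, since no line separates its classes. The Python below is illustrative, not code from the book:

    import numpy as np

    # Rosenblatt's rule: nudge the weights toward each misclassified point.
    def train_perceptron(X, y, epochs=100, lr=0.1):
        w, b = np.zeros(X.shape[1]), 0.0
        for _ in range(epochs):
            for xi, yi in zip(X, y):
                pred = 1 if xi @ w + b > 0 else 0
                w = w + lr * (yi - pred) * xi
                b = b + lr * (yi - pred)
        return w, b

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
    for name, y in [("AND", [0, 0, 0, 1]), ("XOR", [0, 1, 1, 0])]:
        w, b = train_perceptron(X, np.array(y))
        preds = [1 if xi @ w + b > 0 else 0 for xi in X]
        print(f"{name}: learned {preds}, target {y}")  # AND matches; XOR never does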
  • My professor always said that the future of computing lies in accelerators: more efficient chips that do specific tasks like this. He even mentioned analog chips meant to quickly evaluate machine learning models in one of his lectures, back in 2016! It's nice seeing that there's been some real progress.
  • As a 70-year-old boomer my technical education involved building and testing very basic analog devices. Thanks for this video, it helped me to a better understanding of neural networks.
  • @lc5945
    I remember the first time I heard the term "deep networks": it was back in 2009, when I was starting my MSc (using mainly SVMs). A guy at the same institute was finishing his PhD and introduced me to the concept and the struggles (the Nehalem 8-core era)... the leaps in performance made in neural networks since then, thanks to GPGPU, are enormous.
  • @NoahSpurrier
    I remember seeing an analog differential calculator in high school in my physics and electronics teacher’s lab. It was more of a museum piece. It was never used. RIP Mr. Stark
  • @5MadMovieMakers
    Hyped for the future of computing. Analog and digital could work together to make some cool stuff
  • I’ve been an engineer for 44 years. Great video. I actually worked on analog computers in the 70s when digital processing was still new. Never to this level though. Great job!
  • It's interesting how circular technology is. Back in the 1970s, my first job out of uni was with a national research association in the UK focused on all things to do with ships. Whilst the primary focus of my work was providing QA services to the various research teams, including maintaining and enhancing language systems like RATFOR, and system management of the ICL, IBM, Perq, CV & DEC systems, I was also involved in developing two analogue/digital hybrid projects. One focused on managing and monitoring loading balances for bulk cargo ships, and the other on simulating ship navigation into ports in real time. Both projects involved interfacing analogue data from real-time sensors to digital monitoring and mapping algorithms. Unfortunately, at that time, analogue was seen as a historical burden, and both were eventually canned. Now, almost 50 years later, it's great to see the ideas of the 70s coming back into fashion.
  • @asg32000
    I've watched a lot of your stuff for years now, but this is the best one by far. Great job of explaining something so complex, difficult, and relevant!
  • @melanezoe
    Freaked me out to see that opening analog plug board. That’s how I learned programming in my first data processing class at Fresno State University—in 1964. Eerie to have that memory rise.
  • @suivzmoi
    As a NAND flash engineer, I find that bit about using floating-gate transistors as analog computers interesting, particularly because flash memory suffers from a phenomenon known as "read disturb": even the low voltages applied to the gate to query its state (as during a flash read) can eventually cause the state itself to change. You would think it's a binary effect, where a low enough voltage is just a harmless read, but no... eventually electrons build up in the gate (reading it many times at low voltage has a similar effect to programming it once at high voltage). In this particular application, the weight would increase over time the more you query it, even though you didn't program it to be that high in the beginning. It's interesting because it's sort of analogous to false memories in our brains: the more we recall a particular memory, the more inaccurate it can become.
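A toy model of the read-disturb creep this comment describes; the per-read increment is an invented constant for illustration, not a real device figure:

    # Each low-voltage read injects a tiny charge into the floating gate,
    # so a stored analog weight drifts upward with query count.
    disturb_per_read = 1e-6   # hypothetical increment per read (arbitrary units)
    weight = 0.500            # programmed analog weight (e.g. a conductance level)

    for n_reads in (0, 1_000, 100_000, 1_000_000):
        drifted = weight + n_reads * disturb_per_read
        print(f"after {n_reads:>9,} reads: stored weight ~ {drifted:.3f}")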
  • @Psrj-ad
    This makes me want Derek to talk about neural networks and AI-related topics a lot more. They're not just extremely interesting but also constantly developing.
  • @activision4170
    Great video. Never knew this was a thing. Very useful. It might one day just be an extra part on the motherboard, designed for fast approximation calculations.