"Why is AI everywhere right now?"
- patrickihalainen
- Apr 30
- 2 min read
Updated: May 2

Had lunch with a colleague yesterday. He asked me something simple, sharp, and surprisingly hard to answer:
"Why is AI everywhere right now?"
Here's a solid try at an answer. The short version: it is not sudden. But the timing? That is no accident.
Here’s a take on the forgotten part of AI development.
1956 Dartmouth Conference. AI gets its name. Turing's influence is still in the air. But the machines? Nowhere near ready.
The computers of the time were massive, rigid structures.
Most computing in the 1950s relied on electromechanical systems or early vacuum tube machines like the IBM 701 and UNIVAC I.
These systems used punched cards, a method invented in the late 1800s and still standard well into the 1970s. Programs and data were literally encoded by punching holes in rectangular cards that were fed into machines.
By the 1960s, computers had advanced, but they still required hands-on work. The Apollo missions were coded line by line, and the memory itself was literally woven.
NASA’s Apollo Guidance Computer, designed by the MIT Instrumentation Lab, was among the first computers to use integrated circuits. Compact for its time, but still deeply manual. Its firmware lived in core rope memory, assembled in textile-like workshops.
Margaret Hamilton led the software team. The memory was woven by women often referred to, with far too little credit, as “little old ladies”. A wire threaded through a core meant 1. Around it meant 0. Their precision got us to the moon.
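To make that weave concrete, here is a toy sketch. It is nothing like the AGC's actual word format or tooling, just the through-means-1, around-means-0 rule written out in code, with a 16-bit word and a made-up weave pattern:

```cuda
// Toy illustration of the core rope convention described above:
// a wire threaded through a core reads as 1, routed around it reads as 0.
// The 16-bit word size and the weave pattern below are illustrative only.
#include <cstdio>

// Pack one word from a per-core "threaded or not" pattern.
unsigned short weave_word(const bool threaded[16]) {
    unsigned short word = 0;
    for (int bit = 0; bit < 16; ++bit) {
        if (threaded[bit]) {      // wire goes through the core -> bit is 1
            word |= (1u << bit);
        }                         // wire goes around the core  -> bit stays 0
    }
    return word;
}

int main() {
    // One hand-picked weave: alternate through/around across 16 cores.
    const bool pattern[16] = {true, false, true, false, true, false, true, false,
                              true, false, true, false, true, false, true, false};
    printf("woven word: 0x%04X\n", (unsigned) weave_word(pattern));
    return 0;
}
```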
Then came portable hardware. In 1980, John B. Goodenough unlocked the lithium-ion battery with his cobalt oxide cathode. Sony commercialized it in 1991, powering laptops and cameras. Mobility began.
John was more than Goodenough. Look into his life. Even though we lost him in 2023, he remains one of our true innovators. In my humble opinion, we have not heard the last of John just yet. https://en.wikipedia.org/wiki/John_B._Goodenough
But the real shift came with companies like Nokia paving the way for powerful handheld computing, and Apple advancing it. Not through batteries, but through efficiency-focused chip design. Apple picked up P.A. Semi in 2008 and went all in on ARM. Smaller, faster, cooler. Suddenly computing did not sit on desks. It lived in our pockets.
And with scale came signal. Context. Behavior. Data that felt real.
Then came CUDA. In 2007, NVIDIA opened up the GPU for general-purpose computation. A graphics chip became the new workhorse of AI.
Not because of vision. Because of cycles. Because the stack matured.
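To make "opened up the GPU" concrete, here is a minimal sketch: just the standard CUDA runtime and a toy vector addition, with the array size and launch numbers picked arbitrarily. The point is the shape of it, thousands of threads each doing one small piece of math in parallel, which is the same pattern behind the matrix operations in neural networks.

```cuda
// A toy CUDA kernel: one thread per array element, all running in parallel.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void vector_add(const float* a, const float* b, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's global index
    if (i < n) {
        out[i] = a[i] + b[i];                       // each thread handles one element
    }
}

int main() {
    const int n = 1 << 20;                          // ~1M elements, chosen arbitrarily
    const size_t bytes = n * sizeof(float);

    float *a, *b, *out;
    cudaMallocManaged(&a, bytes);                   // unified memory keeps the sketch short
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&out, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vector_add<<<blocks, threads>>>(a, b, out, n);  // launch thousands of threads
    cudaDeviceSynchronize();                        // wait for the GPU to finish

    printf("out[0] = %.1f\n", out[0]);              // expect 3.0
    cudaFree(a);
    cudaFree(b);
    cudaFree(out);
    return 0;
}
```

Compile it with nvcc and the same kind of chip that once pushed pixels is doing general-purpose arithmetic. Swap the add for multiply-accumulate loops and you are looking at the raw ingredient of a neural network layer.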
So no, AI did not suddenly show up. It is just that all the layers finally lined up. Compute, context, and curiosity.
This journey is not over. Next steps, who knows. AGI combined with technologies such as Google's Willow quantum chip would be a huge shift. In fact, so huge that I cannot even wrap my head around it. If we remove friction from the UI and get a more direct connection between operator and machine, we unlock something new again. This could take decades, or it could be here tomorrow.
Let's enjoy the ride. Whatever this is, it has been, and will continue to be, a once-in-history moment.