The Intel 4004 microprocessor, 1971. Photo: Getty Images

The world changed on Nov. 15, 1971, and hardly anyone noticed. It is the 50th anniversary of the launch of the Intel 4004 microprocessor, a computer carved onto silicon, an element as plentiful on earth as sand on a beach. Microprocessors unchained computers from air-conditioned rooms and freed computing power to go wherever it is needed most. Life has improved exponentially since.

Back then, IBM mainframes were kept in sealed rooms and were so expensive that companies used argon gas instead of water to put out computer-room fires. Workers were told to evacuate on short notice, before the gas would suffocate them. Feeding decks of punch cards into a reader and typing simple commands into clunky Teletype machines were the only ways to interact with the IBM computers. Digital Equipment Corp. sold 250-pound PDP-8 minicomputers to labs and offices.

In 1969, Nippon Calculating Machine Corp. asked Intel to design 12 custom chips for a new printing calculator. Engineers Federico Faggin, Stanley Mazor and Ted Hoff were tired of designing different chips for various companies and instead suggested a set of four chips, including one programmable chip that could be used in many products. Using only 2,300 transistors, they created the 4004 microprocessor. Four bits of data could move around the chip at a time. The half-inch-long rectangular integrated circuit had a clock speed of 750 kilohertz and could do about 92,000 operations a second.
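
Those last two figures fit together if you assume, per Intel’s 4004 documentation, roughly eight clock periods per instruction cycle. A rough sketch of the arithmetic, in Python:

    # Rough arithmetic behind the 4004's quoted throughput.
    # Assumption: about eight clock periods per instruction cycle,
    # as described in Intel's 4004 documentation.
    clock_hz = 750_000                     # ~750-kilohertz clock
    clocks_per_instruction = 8
    ops_per_second = clock_hz / clocks_per_instruction
    print(f"{ops_per_second:,.0f} operations per second")   # ~93,750, i.e. "about 92,000"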

Intel introduced the 3,500-transistor, eight-bit 8008 in 1972; the 29,000-transistor, 16-bit 8086, capable of 710,000 operations a second, was introduced in 1978. IBM used the next iteration, the Intel 8088, for its first personal computer. By comparison, Apple’s new M1 Max processor has 57 billion transistors doing 10.4 trillion floating-point operations a second. That is at least a billionfold increase in computer power in 50 years. We’ve come a long way, baby.
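
That estimate survives a quick back-of-envelope check in Python; the data-width adjustment at the end is an assumption, not a measurement:

    # Back-of-envelope check on the "billionfold" claim.
    m1_max_flops = 10.4e12        # ~10.4 trillion floating-point operations a second
    intel_4004_ops = 92_000       # ~92,000 operations a second
    print(f"raw throughput ratio: {m1_max_flops / intel_4004_ops:.1e}")   # ~1.1e8
    # Each modern operation also handles far wider data (32- or 64-bit floating
    # point vs. the 4004's 4-bit integers), which plausibly carries the effective
    # gain past a billionfold (an assumption, not a measurement).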

When I met Mr. Hoff in the 1980s, he told me that he once took his broken television to a repairman, who noted a problem with the microprocessor. The repairman then asked Mr. Hoff why he was laughing.

Now that everyone has a computer in his pocket, one of my favorite movie scenes isn’t quite so funny. In “Take the Money and Run” (1969), Woody Allen’s character interviews for a job at an insurance company and his interviewer asks, “Have you ever had any experience running a high-speed digital electronic computer?” “Yes, I have.” “Where?” “My aunt has one.”

Silicon processors are little engines of human ingenuity that run clever code scaled to billions of devices. They get smaller, faster and cheaper and use less power every year as they spread like Johnny Appleseed’s seeds through society. These days, everyone’s aunt has at least one. Today’s automobiles often need 50 or more microprocessors to drive down the road, although with the current chip shortage, many new cars are sitting on dealer lots waiting for chips.

Mobile computing paved the way for smartphones, robotic vacuum cleaners, autonomous vehicles, moisture sensors for crops—even GPS tracking for the migratory patterns of birds. It also has created an infinitely updatable world—bugs get fixed and new features are rolled out without a change in hardware.

The separation of hardware and software, of design and control, is underappreciated. It enables global supply chains, for good or bad (mostly good). Apple can design in California, manufacture anywhere, and add its software at any point in the manufacturing process.

I’m convinced that most wealth created since 1971 is a direct result of the 4004. All of tech. All of finance. All of retail—ask Walmart about its inventory system. Oil? Good luck finding and drilling it without smart machines.

So 50 years later, have we reached the limit? For the past 20 years, microprocessors have boosted performance by adding more computing cores per chip. That Apple M1 Max has 10 processor cores. Graphics processing units, often used for artificial intelligence and bitcoin mining, can have thousands of processor cores.
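
That shift toward parallelism is easy to see in miniature: split one big job across several cores and combine the partial results. A toy sketch in Python, with an arbitrary core count and problem size not tied to any particular chip:

    # Toy illustration of multicore scaling: one job split across worker processes.
    from multiprocessing import Pool

    def partial_sum(bounds):
        lo, hi = bounds
        return sum(range(lo, hi))

    if __name__ == "__main__":
        n, cores = 10_000_000, 4                      # arbitrary problem size and core count
        step = n // cores
        chunks = [(i * step, (i + 1) * step) for i in range(cores)]
        with Pool(cores) as pool:                     # one worker process per core
            total = sum(pool.map(partial_sum, chunks))
        print(total)                                  # same answer, work spread across cores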

Someday Gordon Moore’s Law from 1965—the number of transistors per chip doubles about every two years—will poop out. Someday John von Neumann’s architecture of processor and memory, first described in 1945, will no longer meet our computing needs. My guess is that we have another decade or two to squeeze more gains out of our current chip technology and computer architecture.
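
As a sanity check on that pace, project the 4004’s 2,300 transistors forward at a strict two-year doubling, an idealization the industry has only roughly kept to. In Python:

    # Idealized Moore's Law projection, 1971 to 2021.
    transistors_1971 = 2_300
    doublings = (2021 - 1971) / 2                  # one doubling every two years
    projected = transistors_1971 * 2 ** doublings
    print(f"{projected:.2e} transistors")          # ~7.7e10, vs. ~5.7e10 in the M1 Max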

Luckily for us, computing doesn’t stand still. The neural networks used by Amazon’s Alexa to recognize your voice and by Google to pick out faces in photos won’t replace the microprocessor, but they will likely serve as a complementary technology that can scale up for the next 50 years. Google is about to introduce a next-generation Tensor chip, updating the ones found in the company’s Pixel 6 phones. It is basically an artificial-intelligence accelerator in your pocket, allowing your phone to adapt to you—your own personal neural network to train. It is exciting and scary at the same time. What will we do with all this power? That question is like seeing the 4004 and being asked what microprocessors would be used for in the future besides calculators. Your answer would probably be off by a factor of a billion.

Write to kessler@wsj.com.