Jean-Louis Gassée, writing for Monday Note:
In the old days, circuits were prototyped by hand using a primitive breadboard. After the circuit was debugged and pronounced fit, it was translated into masks for printed circuit boards.
As integrated circuits grew to comprise thousands and then millions of logic elements, breadboards were virtualized: The circuit-to-be was designed on a computer, just as we model a building using architectural Computer-Aided Design (CAD).
A multibillion-dollar industry of software modules that could be plugged into one's own circuit specifications soon emerged. Companies such as Synopsys, Cadence, and Mentor Graphics offered circuit design tools, and an ecosystem of third-party developers offered complementary libraries for graphics, networking, sensors… The end result is a System on a Chip (SoC) that's sent off to semiconductor manufacturing companies commonly called foundries.
This was the fertile ground on which ARM has prospered. ARM-based chips aren't simply more efficient and cheaper than Intel's x86 designs; they're also customizable: They can be tuned to fit the client's project.
And this on Intel’s reaction to ARM:
Intel didn't get it. "Just you wait!" the company insisted, "Our superior semiconductor manufacturing process will negate ARM's thriftier power consumption and production costs!" But that opportunity has passed. Intel misjudged the iPhone, failed to gain any traction in the Android market, and had to resort to bribing (er…incentivizing) tablet manufacturers to use its low-end Atom processors. Earlier this year, the company threw in the towel on mobile and is now focused on PCs and cloud data centers.
Great post.