I was recently invited to be the keynote speaker at a “Technology Day” event in Milwaukee, WI. (If you have never attended one of these events, you should seriously consider it, as they provide an excellent venue for valuable interactive technology discussions between us and our customers.) Seizing upon the opportunity, I chose to speak about a topic that is near and dear to my heart, and perhaps the greatest threat facing the microcontroller industry today. I titled my speech “Will microcontrollers go the way of DRAMs?” For those of us old enough to remember the DRAM saga, it was a fairy-tale story of a high-tech product with handsome margins and excellent growth potential. But it didn’t end happily-ever-after, as DRAM margins began to erode, and the whole industry sank into the commodity space. One by one, manufacturers dropped out when they realized they couldn’t use innovation as a competitive weapon, and the whole situation degenerated into all-out fab wars. The problem was exacerbated when some unscrupulous DRAM manufacturers who were determined to own this market began dumping product below cost, thus hastening the inevitable end.
Could the microcontroller industry be headed toward a similar cataclysmic finale? As George Santayana put it, “Those who cannot remember the past are condemned to repeat it.” So, to understand where the microcontroller industry is going, we need to look at where it’s been.
The microcontroller had a quiet and humble nativity in the wee morning hours of July 4th, 1971, when Gary Boone and his team at Texas Instruments got their first prototype of a “computer on a chip” to work. Unfortunately, TI underestimated the full potential of this breakthrough, and for a couple of years the TMS1000 was used exclusively in TI calculators. The microcontroller world would be quite a different place today had TI not waited until 1974 to make this chip available to the general market. By then, Intel had already established a dominant foothold in this fledgling market.
The first fifteen years of the microcontroller age saw relatively few suppliers jump into the fray. If you wanted to join this exclusive country club of MCU players, you had to pay the hefty entry fees, and not many could afford the dues. The barriers to entry included:
1. Expensive silicon foundry and fabrication facilities.
2. The priesthood of silicon mages who understood how to design MCU cores was small indeed, and those who could wield this magic were well compensated.
3. Unlike other semiconductor products that were pretty much stand-alone, an MCU family required additional resources to support it with assemblers, monitors, debuggers, development systems, etc.
Put all of these pieces together, and an image begins to form of a product with a huge appetite for development dollars which could only be satiated by big companies with deep pockets and/or well-endowed investors.
Now fast-forward to the present day. The microcontroller market has matured into a global presence representing estimated annual revenue of $18B in 2015, split amongst 40 or so suppliers. What happened to cause this explosive proliferation? Let’s take a look at three main events that changed the microcontroller world forever...
1. Owning a silicon fab was a prized, but very expensive, proposition. Operating margins were directly linked to your ability to keep your fab running all the time at full capacity. As a result, many MCU manufacturers who owned fabs started renting out spare fab capacity in order to maintain a predictable backlog of silicon starts. In parallel, third-party fabs began to spring up all around the globe. Even many of the “big guys” started outsourcing some of their fab requirements to lighten their asset costs. Suddenly the first barrier was down, and smaller companies now had access to fabs to manufacture their own MCU products.
2. As the uC/uP market matured, the mystique of core design began to dissipate. Mature players in this market were looking to recoup early R&D costs by licensing older core technologies (like the 8051) to the general market. But perhaps the most devastating body-blow to the elitism of proprietary cores was delivered by a British company named Acorn Computers. In the late ’80s they developed a core which they called the “Acorn RISC Machine”. In 1990, Acorn spun off the core design team, in a joint venture with Apple and VLSI Technology, into a new business called “Advanced RISC Machines, Ltd”, or simply “ARM” for short. Now, a non-proprietary core with an upwardly mobile migration path was generally available for a relatively small licensing fee. This concept caught on like wildfire, and by 2005, 98% of all mobile phones contained at least one ARM processor. The second barrier came down with a resounding “thud” that could be heard around the world.
3. Even with the first two barriers down, playing in the MCU space was still an expensive proposition due to the huge overhead burden required for support products and services. But it was inevitable that as the MCU industry matured, third-party support services would begin to pop up everywhere. If you have ever been to Austin, TX, you know exactly what I’m talking about. Guys on street corners carried cardboard signs that read “Will test silicon for food”. Neon signs glowed softly in the night…“Bubba’s Barbecue and Peripheral Design Services”. (OK, maybe I’m exaggerating a little, but you get my point.) At the end of the ’90s, I recall a conversation I had with a division manager at the semiconductor company I worked for at the time. To paraphrase, he said, “Just about everything we do here can be outsourced. With my connections, I’m sure I could start my own MCU company if I wanted to, and just outsource everything. And if I can do it, I’m sure there’s a whole bunch of other people around here who can do it too!” As I left his office that afternoon, I finally realized that all the barriers were down, and the MCU industry was turning into the Wild West.
In the next part of this blog, I will discuss the strategies taken by some of the entrenched MCU suppliers to remain competitive in the face of almost certain commoditization.