Chipmaker increasingly on outside after decades of ‘Intel Inside’

Intel risks being left behind through lack of innovation

Intel’s recently departed chief executive Brian Krzanich addressing CES 2018 in Las Vegas in January. His departure in June rocked the company. Photograph: Mandel Ngan/AFP/Getty Images

When disgruntled Fairchild Semiconductor engineers Robert Noyce (co-inventor of the integrated circuit) and Gordon Moore jumped ship to set up their own California chip company 50 years ago this week, they could hardly have imagined that their new venture would transform the world to quite such a degree.

But even in 1968, Intel – short for "integrated electronics" – quickly attracted a team of some of the world's best electrical engineers (among them the hard-headed Hungarian émigré Andy Grove). Soon, it produced its first breakthrough product: a single integrated chip for Japanese calculator company Busicom that could do the work of the 12 specialised chips a surprised Busicom had originally contracted for.

By 1971, Intel had cemented its place in history by producing the first commercially available microprocessor, a "computer on a chip" that brought together, on one tiny piece of silicon, operations that previously required a range of electronics and considerable space. The $200 4004 chip, just 4mm by 3mm, packed in 2,300 transistors and had the same computing power as the famed top-secret ENIAC computer, which was a room-filling 3,000 cubic feet in size and required 18,000 vacuum tubes to perform similar calculations.

Moore’s Law

Thanks to the now famous Moore's Law – the Intel co-founder's 1965 observation that the number of transistors (the tiny switches that do a chip's work) on a chip doubles roughly every year, while the cost per transistor halves – Intel's chips continued to leap in capability while the devices they went into shrank in size and cost.
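To see what that compounding means in practice, here is a minimal, purely illustrative sketch in Python (not part of the original article) that projects transistor counts forward from the 4004's 2,300, with the doubling period left as an adjustable assumption, since Moore's original yearly pace was revised to roughly every two years in 1975:

    # Illustrative only: compound growth under Moore's Law.
    # Starting point: the 4004's 2,300 transistors in 1971. The doubling
    # period is an assumption (about one year in Moore's 1965 paper,
    # revised to roughly two years in 1975).
    def transistors(year, base_year=1971, base_count=2300, doubling_period=2):
        """Projected transistor count for a chip in the given year."""
        return base_count * 2 ** ((year - base_year) / doubling_period)

    for year in (1971, 1981, 1991, 2001, 2011):
        print(year, f"{transistors(year):,.0f}")

At the two-year pace, the projection reaches roughly 2.4 billion transistors by 2011 – the right order of magnitude for the multibillion-transistor processors that were actually shipping by then.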

Computing power that once required large mainframes migrated down into ever-smaller servers, desktop computers, laptops, tablets and mobiles – and, these days, into tiny bits of intelligence that are helping to create an Internet of Things, with computing power everywhere.

Not that these pioneers, whose innovations on silicon would launch the explosive, inventive growth of the region now nicknamed Silicon Valley, necessarily realised where this was going or how those chips would move from big industry into our homes.

“In the mid-1970s, someone came to me with an idea for what was basically the PC,” Moore once said. “The idea was that we could outfit an 8080 processor with a keyboard and a monitor and sell it in the home market. I asked: ‘What’s it good for?’ And the only answer was that a housewife could keep her recipes on it. I personally didn’t see anything useful in it, so we never gave it another thought.”

In a way, that sums up the strength and the weakness of Intel ever since. On the one hand, extraordinary invention and creativity (and the ruthless business drive personified for years by eventual chief executive Andy Grove and his 'Only the paranoid survive' motto) have kept it on the top rung of iconic, industry-shaping Valley companies.

But since the glory years of the desktop computer revolution, when its “Intel Inside” chips, partnered with Microsoft’s dominant Windows operating system, powered the vast majority of PCs, Intel has sometimes struggled to find, much less define, new markets.

"Intel was big when the primary purchase we made was a desktop or laptop computer running Windows," notes Frank Gillett, vice-president and principal analyst at industry watcher Forrester. For years, Intel's only notable competitor was chipmaker AMD, which only ever carved out about 15 per cent of the semiconductor market, he says.

“Intel rode the drive to PCs to prominence, then did the same in enterprise [large business] servers,” says Gillett. Those servers form the core of business networks and the power behind the fast-growing web.

But even as the company dominated the big server market, it failed to capitalise on a different trend that also emerged a decade ago, towards devices that are small but powerful: smartphones.

And Intel misfired spectacularly by deciding not to go after supplying chips for Apple chief executive Steve Jobs's brand-new device, the iPhone, which launched in 2007.

"We ended up not winning it or passing on it, depending on how you want to view it. And the world would have been a lot different if we'd done it," admitted former Intel chief executive Paul Otellini in an interview five years ago, as he prepared to retire.

"At the end of the day, there was a chip that they were interested in that they wanted to pay a certain price for and not a nickel more and that price was below our forecasted cost. I couldn't see it. It wasn't one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100 times what anyone thought," Otellini told TheAtlantic.com.

Ouch.

"So around the same time, Intel invades the server industry and successfully takes out a whole swathe of that high-margin industry, but failed to go after mobile," says Gillett.

Specialised chips

The company then also failed to notice that some sectors, such as gaming and architectural design, wanted specialised chips for power-hungry, high-end 3D work.

Instead, says Gillett, Intel focused on producing general-purpose server chips that could do many things well, rather than a differentiated range that could do particular things extremely well.

The company has since repeatedly “tried to branch out” into specialised areas, such as neural processors and graphics processors, but without significant success, he says.

“All these related, competing markets have emerged, but Intel has been unsuccessful in expanding to compete.”

The company has also made attempts to scale down its chips to compete in the mobile sector, on devices such as phones and tablets, and to go smaller still, with microcontrollers for Internet of Things devices, but without much impact.

Intel recognised the growing area of the IoT maybe five years ago, says Gillett, but failed to gain a firm foothold. Instead, longtime competitor ARM has dominated.

Intel even produced its own brand of devices for a while, and dabbled in wearables such as smartwatches, but later abandoned those efforts.

More recent news hasn’t been great, either. Media reports in April said Apple – which made a dramatic move from IBM’s PowerPC chips to Intel chips for its desktops and laptops in 2005 – would instead use its own chips by 2020.

Meanwhile, Intel's lucrative server market margins have been squeezed by the consolidation of the buyer market into a handful of big, powerful companies offering cloud and data services, such as Microsoft, Google and Amazon. Because they buy so many servers, they can also demand cut-rate prices at scale, notes Gillett.

And last month, the company lost its chief executive, long-time Intel veteran Brian Krzanich, who resigned after acknowledging a consensual relationship with an employee years ago – a violation of corporate policy.

The resignation isn’t likely to noticeably damage the company, says Gillett, and, on the positive side, it opens the door to new leadership that could see the company rebound.

Lack of innovation

Perhaps Intel’s biggest problem now is that, as noted in its financial statements, most of its revenue still comes from its traditional x86 chip architecture products, but – as Apple’s move highlights – Intel hasn’t dramatically innovated on chip design in years.

Industry observers say Intel finally hit the wall of the physical limits of Moore’s Law in 2015. Since then, chips simply have not been able to get significantly smaller, more powerful or less costly – at least not with today’s materials and microelectronics.

The question for the company now, says Gillett, is whether Intel will find new markets and services, or continue with its ageing x86 architecture and just add varied capabilities on top of it.

His own view is that Intel needs to innovate in numerous cutting-edge technological directions, including continued experimentation and hard research into quantum computing and other game-changers that could pick up where Moore’s Law left off, as well as bringing a fresh eye to areas such as the Internet of Things and the company’s recent push into the autonomous vehicle sector.

He also thinks Intel needs to consider a fundamental structural change: splitting the business to hive off its chip manufacturing side into one operation, with a separate company focusing on new technologies, markets and opportunities.

“Intel still has an enormous variety of technology and business fronts,” Gillett says. “But they are not in places where it is easy to innovate.”

Perhaps it’s time for some of Grove’s innovation-driving paranoia. Competitors underestimate the company at their own risk.