When you're strapping on the latest smart watch or ogling an iPhone, you probably aren't thinking of Moore's Law, which for 50 years has been used as a blueprint to make computers smaller, cheaper and faster.
Without Moore's Law it's quite possible that new types of computers like Microsoft's HoloLens, a holographic wearable with which users can interact with floating images, would not have been developed. For decades, Moore's Law has been a guiding star for the development of modern electronics, though in recent years its relevance has been subject to debate.
Moore's Law isn't a scientific theory, but a set of observations and predictions made by Intel co-founder Gordon Moore in an article first published in Electronics Magazine on April 19, 1965, and subsequently revised. His core prediction states that transistor density -- the number of transistors on a given die area -- would double every two years, bringing a comparable jump in performance. Loosely translated, that means that in 18 to 24 months you could buy a computer significantly faster than what you have today, for the same money.
The tech industry originally interpreted this to mean that making chips would get cheaper with scaling: as transistor density doubles, chips shrink in size, processing speeds up, and the cost per processor declines. For the past five decades, the tech world has based product plans and manufacturing strategies around this concept, leading to smaller, cheaper and faster devices.
Manufacturing advances have also made chips power-efficient, helping squeeze more battery life out of devices.
Without Moore's Law, "I don't think we could have a smartphone in the palm of our hand," said Randhir Thakur, executive vice president and general manager of the Silicon Systems Group at Applied Materials.
But engineers have predicted that Moore's Law will die in the next decade because of physical and economic challenges. Conventional computers could be replaced by quantum computers and systems with brain-like, or neural, chips, which function differently than current processors. Silicon could also be replaced by chips made using new materials, such as graphene or carbon nanotubes.
Intel applied Moore's observations first to memory products, where the benefit was a lower cost per bit. It then applied Moore's Law to integrated circuits: Intel's first microprocessor, the 4004, had 2,300 transistors when it debuted in 1971. Intel's latest chips have billions of transistors and, by the company's count, are 3,500 times faster and 90,000 times more power-efficient than the 4004.
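Those numbers line up with simple arithmetic. A minimal sketch, assuming a clean two-year doubling period from the 4004's 1971 baseline (real process cadences varied, so this is illustrative, not Intel's roadmap):

```python
def projected_transistors(base_count: int, base_year: int, year: int) -> int:
    """Project a transistor count forward, doubling every two years."""
    doublings = (year - base_year) // 2
    return base_count * 2 ** doublings

# Intel's 4004 (1971) had 2,300 transistors. Projecting to 2015:
print(projected_transistors(2_300, 1971, 2015))  # 9646899200, i.e. ~9.6 billion
```

Twenty-two doublings over 44 years turn 2,300 transistors into roughly 9.6 billion, which is the right order of magnitude for chips of the era.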
Since then, Moore's Law has been flexible enough to adapt to changes in computing. It was the force behind supercharging computer performance in the 1990s, and lowering power consumption in the last decade, said Mark Bohr, senior fellow at Intel.
"The type of performance we had on desktops 15 years ago is matched by laptops and smartphones in our hand today," Bohr said.
Moore's Law is being used as a guiding principle in the development of wearables, Internet of Things devices and even robots that can recognize objects and make decisions. It also affects a diverse range of products such as cars, health devices and home appliances, which are relying more on integrated circuits for functionality, Bohr said.
But engineers agree that Moore's Law could be on its last legs as chip features approach atomic scale, and even Intel is having a tough time keeping pace. Gordon Moore has revisited Moore's Law over the last 50 years and has at multiple times expressed doubts about its longevity. In a recent interview with IEEE Spectrum, Moore said keeping up was getting "more and more difficult."
Intel's innovations have kept Moore's Law chugging along, the most recent being FinFET, in which the transistor's channel is raised into a three-dimensional fin that the gate wraps around, improving control over current and allowing more features to be packed onto chips. Intel has spent billions of dollars establishing new factories, and innovations such as strained silicon, high-k metal gate and FinFET have helped give Moore's Law a long lease on life.
"Because Intel works hard on it, new, computing-hungry applications are emerging every day," said Xian-He Sun, distinguished professor of computer science at the Illinois Institute of Technology in Chicago.
But it is becoming difficult to etch an increasing number of features on ever-smaller chips, which are increasingly susceptible to a wide range of errors and defects. More attention is required in designing and making chips, and additional processes and personnel need to be put in place to prevent errors.
In addition, with research under way into new materials and technologies, silicon may be on its way out, a change that could fundamentally transform Moore's Law. There's a lot of interest in a family of so-called III-V materials -- compounds based on elements from the third and fifth columns of the periodic table -- such as gallium arsenide or indium gallium arsenide.
"Moore's Law is morphing into something that is about new materials," said Alex Lidow, a semiconductor industry veteran and CEO of Efficient Power Conversion (EPC).
EPC is making a possible silicon replacement, gallium nitride (GaN), which is a better conductor of electrons, giving it performance and power-efficiency advantages over silicon, Lidow said. GaN is already being used for power conversion and wireless communications, and could make its way to digital chips someday, though Lidow couldn't provide a timeline.
"For the first time in 60 years there are valid candidates where it's about superior material rather than smaller feature size," Lidow said.
The economics of manufacturing smaller, faster chips are also deteriorating. Advanced fabrication plants are getting more expensive to build, and the returns on the chips they produce are diminishing. Key tools like EUV (extreme ultraviolet) lithography, which transfers circuit patterns onto silicon wafers, would make it possible to shrink features further but aren't yet ready for production.
"The semiconductor has always faced challenges, which have been speed bumps. Now we're going up against a wall," said Jim McGregor, principal analyst at Tirias Research.
Experts can't predict where Moore's Law will be years from now, but it will eventually falter as the physics and economics of making smaller chips no longer make practical sense. Nevertheless, the legacy of Moore's Law will live on as a model for driving down the price of components, which leads to cheaper devices and computers, McGregor said.
Moore's 1965 article ushered in an era of ever-increasing technological change. "We've taken servers the size of a room down to a mobile chip. It's amazing what we've done in that period of time," McGregor said.