The computing power that once fit inside an entire room now fits into your smartphone, all made possible by cramming more transistors onto a single integrated circuit over the past 50 years.
This exponential growth in computing power was predicted by Moore’s Law, an observation and projection by Gordon E. Moore that the number of transistors in a dense integrated circuit doubles approximately every two years thanks to improved design and manufacturing methods.
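The doubling described above can be written as a simple formula, N(t) = N0 · 2^(t/2), where N0 is a baseline transistor count and t is the number of years elapsed. A minimal sketch in Python, using the Intel 4004's roughly 2,300 transistors (1971) as an illustrative baseline not taken from this article:

```python
def transistors(years_elapsed: float, baseline: int = 2_300) -> int:
    """Projected transistor count assuming a doubling every two years.

    The default baseline (2,300 transistors, the 1971 Intel 4004) is
    illustrative; swap in any starting chip you like.
    """
    return round(baseline * 2 ** (years_elapsed / 2))

# Fifty years of doubling every two years is 2**25 doublings' worth of growth:
print(transistors(0))                     # baseline count: 2300
print(transistors(50) // transistors(0))  # growth factor: 33554432 (~33 million-fold)
```

That 2^25 factor over 50 years is why a pocket-sized phone can outstrip a room-sized machine.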
But Moore’s Law is more than just a prediction; it has become the goal and self-fulfilling prophecy of the semiconductor industry, making the transistor fabricated on a microchip the most frequently manufactured human artifact in history.
According to David Brock, “Estimates of the number of transistors produced in a single year now match, or exceed, estimates of the total number of all the grains of sand on all the world’s beaches. With computing devices made of microchips, the price of computing has fallen over a million-fold, while the cost of electronics has fallen a billion-fold.”
Can innovation continue at this rate?