
From moorings to Moore's Law and beyond.

Updated: Jun 14, 2023

Over the last few weeks we have written a little about the technology behind the drastic speed increase in large racing yachts. Impressive as it is, it pales in comparison to the rapid rate at which computing power has accelerated. Over the coming weeks we will delve into the world of semiconductors: into Moore's Law, quantum computing and its amazing possibilities (as well as its risks), a little on the promise of AI, and then some on the future being created as all this computing power is used to design new and interesting compounds, allowing the field of material science to flourish. (And we will see how all this great stuff ties together!)

But as the headline above indicates, the super yachts would never be going as fast as they do today without the discovery of, and the advancement in, the microprocessor. That increase in speed was described by Gordon Moore in 1965 and has since been known as Moore's Law. In its popular form, the law states that roughly every two years the power of a computer doubles while the cost is halved.

First...A little Background

Silicon, the most important semiconductor material, was isolated in 1824, but it wasn't until the 1950s that scientists realized they could use semiconductors to make practical electronic devices. At first, transistors and diodes were used as switches or amplifiers in radios and televisions. Then engineers discovered how to use them as memory chips (such as RAM), which store data until it is needed again by the computer processor. Each one of these is really just a switch, a switch that directs current to another switch to another and so on. Today's microchips are made from silicon and related materials like gallium arsenide or indium phosphide; these materials are called "semiconductors" because they conduct electricity only under certain conditions. The chips can be made small enough that there are more than a billion 'switches' on a single chip!

The early years

The early years of semiconductor technology were marked by a series of innovations that led to the integrated circuit and the computer revolution. In 1947, John Bardeen, Walter Brattain and William Shockley invented the transistor at Bell Laboratories. The transistor replaced vacuum tubes in computers, making them smaller and faster than ever before. In 1958, Jack Kilby demonstrated the first integrated circuit while working at Texas Instruments (TI). This single-chip device had all its components on one tiny piece of silicon, a major step forward in reducing costs while increasing functionality.

The Silicon Age

The silicon chip, which came into its own in the late 1960s and early 1970s, was a revolutionary invention that enabled computers to become smaller and more powerful. It led to the rise of the semiconductor industry and its applications in many areas of life today.

Gordon E. Moore's Observation

In 1965, Gordon E. Moore made an empirical observation, today known as Moore's Law: the number of transistors per square inch on integrated circuits had doubled every year since their invention. He made this statement in his 1965 paper, "Cramming More Components onto Integrated Circuits".

In 1975 he revised his prediction to a doubling every two years. An Intel colleague, David House, later reckoned that because transistors were also getting faster, overall chip performance would double roughly every eighteen months. So, twice the speed every eighteen months to two years, at roughly half the price per transistor.
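For readers who like to see the arithmetic, the compounding behind Moore's Law can be sketched in a few lines of Python. The starting count of 2,300 transistors is the figure usually quoted for the Intel 4004; the two-year doubling period is an assumption taken from Moore's revised prediction, not a measurement.

```python
def transistors_after(years, start=2_300, doubling_period=2):
    """Project a transistor count after `years`, doubling every `doubling_period` years.

    The default start of 2,300 transistors is the count usually quoted
    for the Intel 4004 (1971); the two-year doubling is Moore's 1975 figure.
    """
    return start * 2 ** (years / doubling_period)

# Fifty years of doubling every two years is 25 doublings:
print(f"{transistors_after(50):,.0f}")  # roughly 77 billion
```

Twenty-five doublings turn a few thousand transistors into tens of billions, which is the right order of magnitude for the largest chips shipping today.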

Through the 1970s, silicon manufacturing improved rapidly, with larger wafers and finer features allowing electronic devices to be made smaller and more efficient than ever before. Moore's Law was pushed onwards through these manufacturing advances.

Then in 1971, Intel released the 4004, the first commercially available microprocessor: a complete central processing unit on a single chip. Microprocessors are small computer processors that execute a computer's instructions. They can be used in computers, smartphones and other electronic devices.

Microcontrollers are programmable integrated circuits that combine a microprocessor core with internal memory and peripherals (e.g., timers/counters, serial interfaces). They were initially designed for embedded systems such as industrial automation or appliances; today, however, they are common even inside other computing devices, such as video game consoles and personal computers, where they handle dedicated tasks like keyboards, power management and disk control. Moore's Law marched forward.

In 1981, IBM released its first personal computer. The IBM PC was a huge success and brought mass production to personal computing. It featured an Intel 8088 CPU operating at 4.77 MHz with as little as 16 KB of RAM (random access memory), expandable to 640 KB by installing additional memory boards on the mainboard. And Moore's Law continued.

In 1985, Intel released the 80386 processor, which brought powerful 32-bit processing to desktop computers for the first time. It was a massive step forward in computing power and opened up new possibilities for users. Twenty years after his prediction, Moore's Law was just as true as the day he penned it.

The Semiconductor Industry Today

The semiconductor industry today is a complex and dynamic one. Today's semiconductor companies are some of the biggest players in business; they employ hundreds of thousands of people worldwide and produce hundreds of billions of dollars worth of products every year. These companies include Intel, Samsung Electronics, Toshiba, NVIDIA, AMD and a whole host of others. In 2022, global semiconductor sales reached, by some estimates, over 460 billion U.S. dollars, a rise of more than 30 percent in just two years. Estimates for the next five years show anywhere from 6 to 10 percent a year in total sales growth, with chips continuing to get faster and cheaper along the way. McKinsey analysis believes the industry will top 1 trillion dollars by 2030.
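As a back-of-the-envelope check on those estimates, the 2022 figure can be compounded at the quoted growth rates in a few lines of Python. The starting value and growth rates are the estimates cited above, not firm data.

```python
def project_sales(start_billion, annual_growth, years):
    """Compound sales of `start_billion` at `annual_growth` (0.08 = 8%) for `years` years."""
    return start_billion * (1 + annual_growth) ** years

# From roughly $460 billion in 2022, eight years out (2030):
low = project_sales(460, 0.06, 8)   # about $733 billion
high = project_sales(460, 0.10, 8)  # about $986 billion
print(f"${low:,.0f}B to ${high:,.0f}B")
```

At the top of the quoted range, compounding lands just shy of 1 trillion dollars by 2030, so the McKinsey figure is consistent with sustained growth near 10 percent a year.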

The Future

The future of semiconductors will be shaped by new materials and emerging technologies like artificial intelligence (AI) and 5G networks, as well as continued demand for more powerful computers and mobile devices. Connected healthcare, self-driving cars, cities that direct the flow of traffic to maximize speed while minimizing pollution, and robot helpers are no longer the stuff of science fiction. New materials are sure to be found and invented, perhaps by a computer programmed to design a new and faster material. In fact, computers will begin to think for themselves in ways we can't predict today. New and improved modeling of current weather patterns, along with more efficient modes of transportation, will reduce carbon emissions, helping with our impact on the planet and climate change. No one knows whether Moore's Law is truly immutable, but if we can continue to add more and more power to our computers, one thing is for sure: an America's Cup boat traveling at over 100 mph is just around the corner.

Information contained herein does not involve the rendering of personalized investment advice but is limited to the dissemination of general information.

A professional adviser should be consulted before implementing any of the strategies or options presented.
