-
Chronological chart of semiconductor-related core technology trends
-
1904
1904: Two-electrode vacuum tube is invented
Thomas Alva Edison / John Ambrose Fleming
Thomas Alva Edison, the inventor of the first practical incandescent light bulb, had also noticed that an electric current flowed from the heated metal filament in the bulb to a second electrode only when the latter was held at a positive voltage. John Ambrose Fleming used this effect (known as the Edison effect) to invent the two-electrode vacuum tube rectifier, which soon came to play an important role in electronic circuits.
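As a toy illustration of this one-way conduction (an idealized diode, not a model of the actual Fleming valve), half-wave rectification simply clips the negative half of an alternating signal:

```python
# Toy illustration of one-way conduction: an idealized diode passes current
# only when the anode is positive, turning AC into pulsed DC (half-wave
# rectification). This is a simplification, not a model of a real tube.
import math

def ideal_diode(voltage: float) -> float:
    """Conduct only while the applied voltage is positive."""
    return max(0.0, voltage)

ac_input = [math.sin(2 * math.pi * t / 20) for t in range(40)]
rectified = [ideal_diode(v) for v in ac_input]
print([round(v, 2) for v in rectified[:10]])  # negative half-cycles are clipped to 0
```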
* The photo is for illustration purposes only
-
1946
1946: World's first general-purpose computer ENIAC is announced
ENIAC
ENIAC was the largest electronic machine ever made, equipped with some 18,000 vacuum tubes and weighing about 30 tons. It occupied a 160-square-meter room.
The computer consisted of a total of about 110,000 electronic circuit devices.
In contrast, a modern microprocessor integrates billions of transistors and yet is smaller than the palm of your hand. This astonishing decrease in size began when solid-state semiconductors replaced vacuum tubes.
-
1948
1948: Junction-type transistor is invented
Transistor / William Bradford Shockley
Toward the end of 1947, John Bardeen and Walter Brattain at AT&T Bell Labs developed the point-contact transistor, the first transistor ever constructed. William Shockley and his team continued this research and announced the invention of the mechanically robust junction-type transistor in June 1948.
Subsequently, integrated circuits (ICs) were invented that packed large numbers of transistors onto a small chip, followed by large-scale integrated circuits (LSIs) integrating 1,000 to 100,000 or more components on a chip.
* The photo is for illustration purposes only
-
1955
1955: Japan's first transistor radio is released
Transistor Radio / Sony
Early broadcast radio receivers used vacuum tubes. By replacing vacuum tubes with semiconductor transistors, radios became smaller, lighter, and less power-hungry. Tokyo Tsushin Kogyo (Tokyo Telecommunications Engineering Corporation; now Sony) released Japan's first transistor radio, the TR-55, in 1955. While attempting to improve the yield of a transistor manufacturing process, the company's research team discovered the quantum mechanical effect known as tunneling.
* The photo is for illustration purposes only
-
1957
1957: Esaki diode is invented
Esaki Diode / Leona (Leo) Esaki
While pursuing research on faster transistors, Leo Esaki made an artificial structure consisting of two layers of different semiconductor materials separated by a thin dielectric layer of 10 nm or less, and discovered that electron tunneling occurs within this structure.
The first electronic device that applied this effect was the Esaki diode, which became a major springboard for the subsequent development of LSIs.
* The photo is for illustration purposes only
-
1959
1959: Kilby patent is filed
Jack St. Clair Kilby
The Kilby patent refers to the patent on the basic concept of the integrated circuit, developed by Jack Kilby at Texas Instruments (TI). The far-reaching patent yielded huge royalties for TI. Because the grant of this patent in Japan was delayed for many years, it did not expire there until 2001. As a result, Japanese semiconductor manufacturers were charged large licensing fees for some of their products.
* The photo is for illustration purposes only
-
1965
1965: Moore's law is announced
Gordon Moore
In 1965, Gordon Moore (who later co-founded Intel) predicted that the integration density of LSIs would double every 18 months, implying that it would quadruple in three years and grow roughly 1,000-fold in 15 years. The prediction, known as Moore's law, was based on an analysis of historical trends in semiconductor manufacturing. The proposition was widely accepted in the semiconductor and computer industries, and reality turned out to be almost exactly as Moore had foreseen.
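The arithmetic behind those figures is simple exponential growth; here is a minimal sketch, assuming the 18-month doubling period stated above:

```python
# Minimal sketch of the arithmetic behind Moore's law as stated above,
# assuming a doubling of integration density every 18 months (1.5 years).
def density_factor(years: float, doubling_period_years: float = 1.5) -> float:
    """Predicted growth factor in transistor density after `years`."""
    return 2 ** (years / doubling_period_years)

for years in (3, 15):
    print(f"{years:>2} years -> x{density_factor(years):,.0f}")
# 3 years -> x4; 15 years -> x1,024 (the roughly "1,000-fold" in the text)
```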
-
1970
1970: Practical application of semiconductor laser diodes
Semiconductor Laser Diodes
Around 1970, Izuo Hayashi and Morton B. Panish at Bell Labs, along with Zhores Alferov's team in the Soviet Union, succeeded in achieving continuous oscillation of semiconductor laser diodes at room temperature. Until then, lasers would either break down shortly after starting to oscillate or operate only when cooled. The laser diodes that achieved continuous room-temperature operation were based on the double heterostructure, a concept proposed by Herbert Kroemer. In 2000, Alferov and Kroemer were awarded the Nobel Prize in Physics. Laser diodes were initially used in optical communications and later became common in daily life, serving as light sources in CD-ROM drives and DVD players, automotive lighting, and laser pointers.
* The photo is for illustration purposes only
-
1971
1971: Invention of the microprocessor and memory by Intel
i4004 / Intel
At Intel, a team led by Federico Faggin, working with Ted Hoff and Stan Mazor, turned the microprocessor concept into a commercial product. Masatoshi Shima, then an engineer at the Japanese calculator maker Busicom, had initially designed a chipset of 12 chips for the Busicom 141-PF calculator; Faggin's team realized the same functionality with just four chips. Intel secured the rights to sell the chip and released it as the 4-bit i4004 CPU. Because the architecture could be adapted to different calculators simply by changing the software, a single chip design could support various models. The microprocessor began as a 4-bit device and has since grown into the 64-bit architectures at the heart of today's personal computers.
* Photo credit: Dentaku Museum
-
1977
1977: World's first personal computer Apple II is released
Personal Computer / Apple
Although many consider the Altair 8800 to be the world's first personal computer, its manufacturer called it a minicomputer along with other designations such as microcomputer and home computer. In contrast, Apple's Steve Jobs used the term "personal computer," which took hold with the commercial success of the Apple II.
* The photo is for illustration purposes only
-
1978
1978: The VLSI design textbook is published
Introduction to VLSI Systems / Carver Mead & Lynn Conway
The textbook Introduction to VLSI Systems, co-authored by Professor Carver Mead of the California Institute of Technology (Caltech) and Lynn Conway of Xerox PARC (Palo Alto Research Center), was adopted by numerous universities across the United States. Students who studied from this textbook went on to work in industry and applied their knowledge to semiconductor design. The book explained the fundamentals of logic LSI design, including the correspondence between logic circuits and circuit patterns, as well as signal lines and power lines. The strength of American fabless semiconductor companies is often said to owe much to this textbook.
Professor emeritus Carver Mead of Caltech
(2002 photo by Norman Seeff)
Credit: Norman Seeff, Wikimedia Commons, CC BY-SA 4.0
-
1980
1980: Flash memory is invented
Flash Memory / Fujio Masuoka
Flash memory is a rewritable semiconductor memory that is non-volatile (i.e., it retains information when the power is turned off). Unlike earlier rewritable memories, which erased information one byte at a time, flash memory was designed to erase an entire block of data at once, reducing the cost per bit by 75% or more.
Flash memory was invented in 1980 by Fujio Masuoka (then a Toshiba employee). Today, flash memory is found in personal computers, mobile phones, digital cameras, IC cards, and many other applications that we use every day.
* The photo is for illustration purposes only
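To make the block-wise erasure idea concrete, here is a minimal Python sketch; the class, block size, and method names are hypothetical, not any real device's interface:

```python
# Hypothetical sketch of flash-style semantics, for illustration only:
# programming can only flip bits 1 -> 0, and erasure works on whole blocks.
class FlashBlockSketch:
    def __init__(self, block_size: int = 4096):
        self.block_size = block_size
        self.cells = [0xFF] * block_size  # erased state is all ones

    def program(self, offset: int, data: bytes) -> None:
        # A program operation can only clear bits (AND), never set them.
        for i, byte in enumerate(data):
            self.cells[offset + i] &= byte

    def erase(self) -> None:
        # Erasure is all-or-nothing: the entire block returns to 0xFF.
        self.cells = [0xFF] * self.block_size

block = FlashBlockSketch()
block.program(0, b"\x12\x34")
block.erase()  # rewriting any byte requires erasing the whole block first
```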
-
1983
1983: Family Computer is released
Family Computer / Nintendo
The advancement in semiconductor technology has revolutionized family entertainment as well. In 1983, Nintendo released a video game console fitted with an 8-bit CPU. The Family Computer (Famicom) became a smash hit on the back of the popularity of Super Mario Bros., a game developed by Nintendo for this platform. Encouraged by Famicom's success, the company released a 16-bit game console known as Super Famicom in 1990.
The video game market thrived as palm-sized mobile platforms and other variations also entered the fray, and the consoles have become more advanced than ever.
* The photo is for illustration purposes only
-
1985
1985: A concrete model for quantum computers is proposed
Quantum Computers
Richard Feynman, a Nobel Prize winning physicist (1965), is said to have proposed the basic concept of the quantum computer as an application of quantum mechanics.
In 1985, Professor David Deutsch of Oxford University formulated the first concrete model of a quantum computer, the quantum Turing machine. He is often referred to as the "father of quantum computing."
The first solid-state qubit hardware was created by Yasunobu Nakamura and Jaw-Shen Tsai at NEC in 1999.
A quantum computer's fundamental unit is the qubit, which can exist in a superposition of 1 and 0 simultaneously. Another defining feature is quantum entanglement, whereby the states of two qubits remain correlated even over long distances.
It is said that these two properties can reduce computation times from years to minutes. Today, IBM has commercialized quantum computers with over 100 qubits.
* The photo is for illustration purposes only
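Below is a minimal sketch, using plain NumPy state vectors (not any vendor's quantum SDK), of the two properties just described: a Hadamard gate creating a superposition, and a CNOT gate creating an entangled Bell pair.

```python
# Minimal state-vector sketch of superposition and entanglement with NumPy.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                # controlled-NOT gate

zero = np.array([1.0, 0.0])                    # single qubit in state |0>
plus = H @ zero                                # superposition of |0> and |1>
print(plus)                                    # [0.707..., 0.707...]

# Entangle two qubits: Hadamard on the first, then CNOT -> Bell state
two_qubits = np.kron(plus, zero)               # joint state |+0>
bell = CNOT @ two_qubits
print(bell)                                    # [0.707, 0, 0, 0.707]: measuring one
                                               # qubit fixes the other's outcome
```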
-
1991
1991: Carbon nanotube is discovered
Carbon Nanotube / Sumio Iijima
The evolution of the semiconductor can no longer be discussed without referring to the advancement in nanotechnology. The discovery of the carbon nanotube by Sumio Iijima in 1991 was a pivotal event in this field.
Carbon nanotubes have many excellent properties, including an electric current density over 1,000 times greater than copper's and room-temperature thermal conductivity about 10 times better. That makes carbon nanotubes a promising successor to silicon as semiconductor scaling continues.
* The photo is for illustration purposes only
-
1993
1993: First practical blue LED developed
GaN LED / Shuji Nakamura
Light-emitting diodes (LEDs) have long been considered ideal for display panels, as they consume very little power to produce light. Full-color displays require the three primary colors (red, green, and blue), however, and no practical technology for manufacturing blue LEDs existed until 1993, when Shuji Nakamura at Nichia Corporation developed one. His achievement made full-color LED displays possible, along with the blue laser technology that enabled advanced DVD recorders and other applications.
* The photo is for illustration purposes only
-
1995
1995: Sharp releases liquid crystal display TV "Window"
Flat Panel TV / Sharp
In 1982, Epson released the world's first LCD television. It was a digital watch with a 1.2-inch reflective LCD that doubled as a TV. Epson went on to release a pocket color TV with a 1.2-inch transmissive TFT LCD in 1984.
No LCD TV with a reasonably large screen was available until 1995, when Sharp introduced a 10.4-inch TFT model as part of the Window series.
* The photo is for illustration purposes only
-
2002
2002: Earth Simulator records world's fastest processing speed of 35.86 TFLOPS
Super Computer / Earth Simulator
The Earth Simulator was a highly parallel vector supercomputer built under the auspices of the then Science and Technology Agency. Designed to study global climate change and the mechanisms of crustal movement, the Earth Simulator set a world-record performance of 35.86 TFLOPS on the LINPACK benchmark in June 2002. That performance, equivalent to about 36 trillion floating-point operations per second, was almost five times that of the runner-up, the IBM ASCI White system. The Earth Simulator held the top position on the TOP500 list for about three years.
The defeat was dubbed the "Computnik" crisis in the U.S., as it was reminiscent of the shock caused by the Soviet satellite Sputnik. Determined to push back, the U.S. government set up a national supercomputer project and regained the top spot in three years.
* Screen shot of an article on the TOP500 website
-
2004
2004: Graphene creation experiments prove successful
Graphene / Andre Geim & Konstantin Novoselov
Graphene, an atomic-scale sheet of carbon atoms, is attracting attention as a high-performance semiconductor material that could one day replace silicon. Its existence was predicted as early as the 1940s, but it was not isolated until much later. In 2004, Andre Geim and Konstantin Novoselov applied clear adhesive tape to a block of graphite, cleaving off a thin layer of graphene.
Graphene is expected to enable highly efficient solar cells, very sensitive touch panels, and many other useful applications.
* The photo is for illustration purposes only
-
2007
2007: Apple announces the iPhone
Smartphone / Steve Jobs
The announcement of the iPhone by Steve Jobs on January 9, 2007 completely changed the landscape of the global mobile phone industry. The iPhone was designed around what Jobs called "the best pointing device in the world": our fingers. Featuring a 3.5-inch widescreen multi-touch display, the device achieved diverse functionality and ease of use at the same time.
* The photo is for illustration purposes only
-
2010
2010: Apple announces the iPad
iPad / Steve Jobs
On January 27, 2010, Steve Jobs introduced Apple's fourth major product line after the Macintosh, iPod, and iPhone. The iPad, in his words, was a multi-touch information device that would sit between laptops and smartphones. The iPad was central to Jobs' vision of media integration and became key to Apple's consumer market strategy.
* The photo is for illustration purposes only
-
2011
November 2011: K computer achieves world's fastest processing speed of 10 PFLOPS
Super Computer / K Computer
The K computer, Japan's next-generation supercomputer developed under the guidance of the Ministry of Education, Culture, Sports, Science and Technology, recorded a performance of 8.16 PFLOPS on the LINPACK benchmark in June 2011. The feat returned Japan to the top spot on the TOP500 list, years after the Earth Simulator last held the honor. In November 2011, the K computer again claimed the top spot with a performance of 10 PFLOPS (10 quadrillion floating-point operations per second), and it still led the list as of March 2012.
LINPACK is a software library for performing numerical linear algebra on computers. It is used to measure the performance of supercomputers around the world and to determine the TOP500 ranking.
* Screen shot of an article on the TOP500 website
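As a rough illustration of how such figures are obtained (this is not the official HPL benchmark used for TOP500 runs), one can time a dense linear solve and convert it to FLOPS with the conventional LINPACK operation count:

```python
# Rough, illustrative LINPACK-style measurement (not the official HPL
# benchmark): solve a dense system Ax = b and convert the elapsed time
# to FLOPS using the conventional operation count of 2n^3/3 + 2n^2.
import time
import numpy as np

n = 2000
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

start = time.perf_counter()
x = np.linalg.solve(A, b)          # LU factorization + triangular solves
elapsed = time.perf_counter() - start

flops = (2 / 3) * n**3 + 2 * n**2
print(f"{flops / elapsed / 1e9:.2f} GFLOPS on this machine")
```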
-
2012
2012: Breakthrough in AI deep learning
AI Deep Learning
The catalyst for the current AI boom was a breakthrough in an image recognition contest. A group of students under Professor Geoffrey Hinton at the University of Toronto used NVIDIA GPUs and CUDA software to train a deep neural network for object recognition; applying the trained model for inference, they dramatically reduced the image-recognition error rate and won the contest despite being first-time participants. Hinton later received the 2024 Nobel Prize in Physics. Soon afterward, companies like Microsoft and Google replicated the experiments and reduced the error rate to single-digit levels. The AI boom began at this point, and NVIDIA CEO Jensen Huang later referred to this year as the “Big Bang of AI.”
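As a toy illustration of the train-then-infer workflow described above (real 2012-era image recognition used deep convolutional networks on GPUs; the data and model here are deliberately minimal):

```python
# Toy illustration of the "train, then infer" workflow: a minimal logistic
# classifier on synthetic 2-D data, trained by gradient descent.
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)       # synthetic labels

w, bias = np.zeros(2), 0.0
for _ in range(500):                            # training loop
    p = 1 / (1 + np.exp(-(X @ w + bias)))       # sigmoid output
    grad = p - y                                # gradient of log loss
    w -= 0.1 * (X.T @ grad) / len(y)
    bias -= 0.1 * grad.mean()

preds = (1 / (1 + np.exp(-(X @ w + bias)))) > 0.5   # inference
print(f"error rate: {np.mean(preds != y):.1%}")
```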
* The photo is for illustration purposes only
-
2013
2013: Google Glass beta test starts
VR (Virtual Reality) / AR (Augmented Reality)
VR and AR are technologies for making computer-generated environments perceptible to humans in the same way they perceive real-world information. VR generates purely artificial environments by means of computer graphics, whereas AR overlays computer-generated images on real-world scenes to extend or augment reality. With the proliferation of personal computers and smartphones, VR and AR came to be applied to consumer services in the first decade of the 2000s. Google began beta testing its wearable AR device Google Glass in 2013, setting in motion the trend of applying AR to mobile and wearable computers such as smartphones, smart glasses, and smart watches.
Credit: Peppinuzzo / Shutterstock.com
-
2015
2015: Ken Sakamura receives ITU 150 Award
IoT (Internet of Things)
The idea of connecting everyday objects embedded with microprocessors and sensors via the Internet has been entertained since as early as the 1980s. Computer scientist Ken Sakamura, one of the pioneers in the field, dubbed this concept "ubiquitous computing" and put forward an open computer system architecture known as TRON. It was much later, around 2014, that the concept came to be popularly known as the Internet of Things (IoT). Today, IoT is expected to significantly improve industrial efficiency and expand the potential of businesses. In 2015, the International Telecommunication Union (ITU) honored Sakamura with its ITU 150 Award, marking the union's 150th anniversary.
* The photo is for illustration purposes only
-
2016
2016: AlphaGo defeats a human Go master
AI (Artificial Intelligence)
Artificial intelligence (AI) was first established as a research discipline at a 1956 conference held at Dartmouth College. With the introduction of commercial databases in the 1980s, investment in AI research began to gain momentum. As the processing power of computers has dramatically increased, AI applications capable of deep learning have emerged that can uncover hidden relationships and patterns in big data in real time. In 2016, the AlphaGo program developed by DeepMind defeated a human world champion in the game of Go.
* The photo is for illustration purposes only
-
2019
2019: Practical implementation of EUV begins with TSMC’s 7nm process
EUV lithography
On October 7, 2019, TSMC announced that the first practical implementation of EUV lithography had begun with its N7+ process. Compared with the N7 process, known as the 7-nanometer node, the N7+ process increases integration density by 15-20% and correspondingly reduces power consumption.
EUV lithography is an exposure technology that uses light with a wavelength of 13.5 nanometers, generated when high-powered laser pulses strike droplets of molten tin (Sn). Because this wavelength is closer to X-rays than to visible light and is absorbed by conventional lenses, reflective masks are used instead of transmissive masks, and the reflected light is focused by concave mirrors onto the wafer.
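For context, the textbook Rayleigh criterion relates minimum feature size to wavelength and numerical aperture (NA); the k1 and NA values below are typical illustrative figures, not numbers from the TSMC announcement:

```python
# Textbook Rayleigh resolution criterion, CD = k1 * wavelength / NA, shown
# for illustration; the k1 and NA values are typical figures, not TSMC's.
def min_feature_nm(wavelength_nm: float, na: float, k1: float = 0.35) -> float:
    return k1 * wavelength_nm / na

print(f"ArF immersion (193 nm, NA 1.35): ~{min_feature_nm(193, 1.35):.0f} nm")
print(f"EUV          (13.5 nm, NA 0.33): ~{min_feature_nm(13.5, 0.33):.0f} nm")
# The ~3.5x jump in resolution is why 13.5 nm EUV light matters for 7 nm-class nodes.
```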
ASML of the Netherlands is the only company that manufactures and sells EUV lithography systems, but Japanese companies are also competitive in EUV mask inspection systems and mask blank inspection tools.
Intel’s High-NA EUV lithography tool installed at Fab D1X, Hillsboro, Oregon
Credit: Intel Corporation
-
2022
2022: Generative AI tool ChatGPT is announced
ChatGPT / OpenAI
OpenAI announced a generative AI service called ChatGPT (Chat Generative Pre-trained Transformer). Generative AI, trained on a wide range of information, is capable not only of answering questions in text but also of generating images, video, and audio. It processes large-scale text data using large language models to perform natural language processing, allowing the system to grasp the meaning of entire texts and the relationships between them. Even when automatically generated text appears natural, its content may lack factual accuracy, and there is potential for malicious use or invasion of privacy. Accordingly, establishing ethical guidelines and strengthening monitoring systems have become necessary.
* The photo is for illustration purposes only
-
2022
2022: Japan’s first pure-play foundry company is established
Japan’s first semiconductor foundry
In August 2022, Rapidus, Japan’s first full-fledged pure-play semiconductor foundry, was established. The company has since succeeded in prototyping a 2-nanometer process using gate-all-around (GAA) transistors. Until now, transistor scaling progressed through the 16 nm, 7 nm, 5 nm, and 3 nm nodes by using the FinFET, a three-dimensional transistor structure, to increase integration density. Japanese domestic production, however, had stopped at the planar 40 nm node. To catch up with the global leaders, Rapidus plans to skip the FinFET generation and leap directly to the GAA process. By April 2025, installation of over 200 pieces of equipment, including EUV lithography machines, had been completed, and by June the pilot line was finished.
* The photo is for illustration purposes only
The ever-accelerating evolution of semiconductors
The key inventions that laid the foundation of today's information technology (IT) had already emerged by the end of the 19th century, including the light bulb by Thomas Alva Edison, the telephone by Alexander Graham Bell, wireless telegraphy by Guglielmo Marconi, and the cathode ray tube by Karl Ferdinand Braun. These basic technologies have made remarkable progress since then, due in large part to the development of semiconductors.
Let us trace the evolution of the semiconductor through the history of epoch-making discoveries and inventions, starting with the birth of the vacuum tube—a distant ancestor of semiconductor devices.