It was back in 1958 that the first integrated circuit was developed. The IC was invented by Jack S. Kilby, a Texas Instruments employee. The company was instrumental in making transistors, and Kilby's task was to design micromodules for the military. He did this by connecting a number of germanium wafers in a stack, with connections made by running wires up the wafers' sides.


Kilby found this entire process cumbersome, so he simplified matters by engineering a single piece of germanium so that it could act as several components at the same time. This is how the very first IC came to be.

In 1959, Fairchild Semiconductor developed the modern silicon diffusion process, known today as the planar process. IC design continued to evolve over the following years, moving to computer-aided design by 1967.

In 1968, Robert Noyce, Gordon Moore and Andrew Grove, who all worked for Fairchild (Noyce was a co-founder), resigned from the company and founded their own: Intel. They left because Fairchild's parent company wasn't sold on the future of the integrated circuit; the engineers of its semiconductor subsidiary resented management, believing the invention had a great deal to offer the industry.


The Production of the Integrated CPU

A company called Busicom was planning a range of next-generation programmable calculators. It had designed a set of 12 chips and wanted Intel to manufacture them because of Intel's chip-making experience. Marcian "Ted" Hoff Jr. was put in charge of the project, but decided that the 12-chip design was too complex for a calculator.

Hoff contrasted the design with DEC's PDP-8 (Programmed Data Processor-8), which had a simple architecture yet was able to perform high-level operations. He determined that a general-purpose processor would have a simpler design while being capable of handling more tasks. That is how the MCS-4 chipset and the 4004 integrated CPU came to be.

In 1969, Busicom selected Intel's design over its own 12-chip design. A contract between the two companies gave Busicom exclusive rights to purchase the new chip set. In 1971, Intel and Busicom agreed that Intel would gain full marketing rights to sell the chips freely in exchange for lower chip prices.


Design of the 8086

In 1969, Computer Terminal Corporation (CTC) asked Intel to develop an LSI chip for an intelligent terminal it was building. Hoff, along with Stan Mazor, who had helped develop the 4004, proposed putting the entire processor on one chip: the 8008, an 8-bit processor. Work started right away, but CTC rejected the chip because it required many support chips and wasn't fast enough.

The chip was nonetheless a success, and eventually became an industry standard. The Intel 8080 came out in 1974 and featured 4,500 transistors, roughly twice as many as the 4004. Its speed was predominantly the result of its NMOS (n-channel, electron-doped) technology.

In 1978, Intel produced the 8086, its first 16-bit processor. It was source-compatible with the 8080 and 8085, and it used overlapping segment registers to address 1 megabyte of memory.
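The segmented scheme can be illustrated with a short sketch (the values are hypothetical): a 16-bit segment register is shifted left four bits and added to a 16-bit offset, producing a 20-bit physical address. Because consecutive segment values are only 16 bytes apart, segments overlap heavily, and the total reach is 2^20 bytes, i.e. 1 megabyte.

```python
def physical_address(segment: int, offset: int) -> int:
    """8086 real-mode addressing: 16-bit segment shifted left 4 bits,
    plus a 16-bit offset, yields a 20-bit physical address."""
    return ((segment & 0xFFFF) << 4) + (offset & 0xFFFF)

# Two different segment:offset pairs can name the same physical byte,
# because successive segment values overlap every 16 bytes.
assert physical_address(0x1000, 0x0010) == physical_address(0x1001, 0x0000) == 0x10010

# The top of the 1 MB range: F000:FFFF reaches the last byte, 0xFFFFF.
assert physical_address(0xF000, 0xFFFF) == 0xFFFFF == 2**20 - 1
```

The overlap is why the article describes the registers as "overlapping": a 20-bit address space is reached with two 16-bit quantities, at the cost of many aliases for the same location.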


The Early Years of the Microprocessor

Intel may have invented the microprocessor and grown into an industrial powerhouse by 1981, but it wasn't the only firm developing microprocessors. By 1976, there were 54 companies making them. The second-ever processor, Rockwell's PPS-4, a 4-bit design, was developed in 1972. Texas Instruments introduced another 4-bit processor in 1974, the TMS 1000, which was also the first microcontroller, with its own RAM and ROM on a single chip.

The TMS 1000 became much more affordable by the late 1970s and so became the processor of choice for consumer electronics. More than 40 variants of the chip were available by then, and hundreds of millions were sold.


Microprocessor Clones

The success of Intel's early processors led Zilog and Motorola to produce competing chips. The 8008 impressed Motorola, which launched the 6800 in 1974, a processor aimed at the same market as the 8080. While the 6800 was popular, it wasn't Intel-compatible. In 1975, MOS Technology came up with its own take on the 6800: the 6502.

The 6502 owed much of its success to its simple design. It was also affordable and ran from a single power supply, which made it a favorite among the small home-computer businesses. It competed well against the later-designed Z80, and it had an influence on Reduced Instruction Set Computing (RISC).

The Zilog Z80 was important because it was compatible with the 8080 while adding around 80 new instructions. It was a powerful processor with on-chip memory-refresh circuitry, which let system designers build computers without much extra circuitry and at an affordable price.

One year after Intel developed its first 16-bit processor, Motorola came out with an influential chip of its own, the 68000. It could address 16 megabytes of memory and behaved like a 32-bit processor internally. It was widely used in personal computers such as the Apple Macintosh, Commodore Amiga and Atari ST.


The New RISC

While RISC is often considered a modern concept of the 1990s, it can really be traced back to the mid-1960s and Seymour Cray's CDC 6600. RISC designs were simple, which enabled sophisticated architectural techniques to be used to boost instruction speed. While the term RISC wasn't yet in use, IBM formalized the principles in its IBM 801.

RISC grew popular in the late 1980s. IBM had attempted to market the design as the Research OPD Micro Processor (ROMP), but this was not successful; eventually, the chip became the nucleus of the I/O processor for the IBM 3090.

RISC initially came from university research projects in the US, including Berkeley RISC I, started by David A. Patterson and his colleagues in 1980. Patterson had been thinking about the challenges of designing a CPU with the VAX architecture. He submitted a paper about it to Computer, but it was rejected, and he came to see such complex designs as a poor use of silicon. It was at this point that Patterson went down the RISC path.

The RISC I, RISC II and SPARC families share a distinctive feature: register windows. A CPU exposes only a small set of registers to the programmer, but this set can be exchanged for another as the program requires. This gave a low subroutine-call overhead thanks to quick context switches. By the mid-1980s, RISC had become a buzzword, and Intel applied it to its 80486 processor.

At approximately the same time as the Berkeley results were published, Acorn, a UK home-computer firm, was in search of a processor to replace the 6502. After reviewing the commercial microprocessors available, it determined that none was advanced enough. In 1983, Acorn started its own project and designed a RISC microprocessor, the result being the ARM, which is likely the closest to an authentic RISC of any processor.


The Transputer

In 1979, Inmos was set up by the UK government to create state-of-the-art silicon products for the world market. In 1980, Inmos began work on its first microprocessor, but two of the engineers, David May and Robert Milne, held inflexible positions about its architecture. Milne believed that the Inmos chip, named the Transputer, should be the first chip in the world to run Ada.

This contrasted with the views of May and Tony Hoare, an academic at Warwick who had worked with May and agreed with the simple approach to the Transputer design. Iann Barron, who was also with Inmos, grew fed up with the disagreements and imposed his own ideas on the team. He sided more with May, but he also envisaged a multiplicity of individual processors all working concurrently.

The Transputer hit the market in 1985 as the T-212, a 16-bit version with a RISC-like instruction set. Every chip had four serial links that allowed the microprocessor to connect to other Transputers in a network. The T-9000 was launched in 1994, with a design optimized for parallel computers arranged in a systolic array.
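The Transputer's links supported the message-passing style of Hoare's Communicating Sequential Processes (CSP), the model behind its occam programming language. As a rough analogy only, here is that style sketched with Python threads standing in for Transputers and a bounded queue standing in for one serial link:

```python
# Rough sketch of Transputer-style message passing over a point-to-point link.
# (An analogy only: real Transputers used hardware serial links and occam.)
import threading
import queue

link = queue.Queue(maxsize=1)   # one point-to-point channel, like a serial link
results = []

def producer():
    """First 'Transputer': computes values and sends them down the link."""
    for n in range(3):
        link.put(n * n)         # send blocks until the receiver is ready
    link.put(None)              # sentinel marking end of stream

def consumer():
    """Second 'Transputer': receives and processes each message."""
    while (value := link.get()) is not None:
        results.append(value)

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
assert results == [0, 1, 4]
```

With four links per chip, larger topologies such as grids and the systolic arrays mentioned above were built simply by wiring more such channels between neighboring processors.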


The Advanced RISCs

In 1988, DEC began developing a new architecture, having moved from the PDP-11 to the VAX architecture a decade earlier. It became known as the Alpha AXP architecture, with over 30 engineering groups in 10 countries working on it.


The team saw the architecture as one that would revolutionize the computer industry. Looking back over the previous 25 years, the engineers noted a 1,000-fold increase in computing power; expecting the same over the next 25 years, they concluded that their designs would have to run at ten times the clock speed. They placed memory management, interrupts and exception handling into code called PALcode, which had access to the CPU hardware, so that the processor could run many operating systems effectively. This kept the Alpha from being biased toward any particular computing style.

The Alpha architecture was RISC, but it focused on pipelining and multiple instruction issue rather than on conventional RISC traits. Released in 1992, the chip was considered the fastest single-chip microprocessor in the world.

The PowerPC from IBM and Motorola had a different origin from the Alpha: it came about from IBM's and Motorola's need for a high-performance workstation CPU. The PowerPC evolved from IBM's POWER (Performance Optimization With Enhanced RISC) multichip processors.



The microprocessor has come a long way and is now an integral part of computing. From humble beginnings in the 1950s, it had changed dramatically by the mid-1980s. Initially designed for calculators, it is now used in a vast range of designs.