Microprocessor
microprocessor, any of a type of miniature electronic device that contains the arithmetic, logic, and control circuitry necessary to perform the functions of a digital computer’s central processing unit. In effect, this kind of integrated circuit can interpret and execute program instructions as well as handle arithmetic operations.
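The instruction-handling behaviour described above can be sketched as a fetch-decode-execute loop. The short C program below is a minimal illustration only: the four-opcode instruction set, the small register file, and the sample program are assumptions made for the example and do not correspond to any real microprocessor's design.

```c
/* Minimal sketch of a fetch-decode-execute cycle on a hypothetical
 * 4-instruction machine (illustrative assumption, not a real ISA). */
#include <stdio.h>
#include <stdint.h>

enum { OP_LOAD = 0, OP_ADD = 1, OP_PRINT = 2, OP_HALT = 3 };

int main(void) {
    /* Each instruction: {opcode, register index, operand}. */
    uint8_t program[][3] = {
        {OP_LOAD,  0, 7},   /* r0 = 7        */
        {OP_LOAD,  1, 5},   /* r1 = 5        */
        {OP_ADD,   0, 1},   /* r0 = r0 + r1  */
        {OP_PRINT, 0, 0},   /* output r0     */
        {OP_HALT,  0, 0},
    };
    uint8_t reg[4] = {0};   /* small register file */
    size_t pc = 0;          /* program counter     */

    for (;;) {
        uint8_t *insn = program[pc++];      /* fetch  */
        switch (insn[0]) {                  /* decode */
        case OP_LOAD:                       /* execute */
            reg[insn[1]] = insn[2];
            break;
        case OP_ADD:
            reg[insn[1]] += reg[insn[2]];
            break;
        case OP_PRINT:
            printf("r%u = %u\n", (unsigned)insn[1], (unsigned)reg[insn[1]]);
            break;
        case OP_HALT:
            return 0;
        }
    }
}
```

Running the sketch prints "r0 = 12", mirroring in software the arithmetic, logic, and control roles a microprocessor performs in hardware.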
In the early 1970s the introduction of large-scale integration (LSI)—which made it possible to pack thousands of transistors, diodes, and resistors onto a silicon chip less than 0.2 inch (5 mm) square—led to the development of the microprocessor. The first microprocessor was the Intel 4004, introduced in 1971. During the early 1980s very large-scale integration (VLSI) vastly increased the circuit density of microprocessors. By the 2010s a single VLSI circuit could hold billions of electronic components on a chip the same size as the earlier LSI circuit. (For more about the history of microprocessors, see computer: The microprocessor.)
The production of inexpensive microprocessors enabled computer engineers to develop microcomputers. Such computer systems are small but have enough computing power to perform many business, industrial, and scientific tasks. The microprocessor also permitted the development of so-called intelligent terminals, such as automated teller machines and point-of-sale terminals employed in retail stores. Microprocessors likewise provide automatic control of industrial robots, surveying instruments, and various kinds of hospital equipment, and they have brought about the computerization of a wide array of consumer products, including programmable microwave ovens, television sets, and electronic games. In addition, some automobiles feature microprocessor-controlled ignition and fuel systems designed to improve performance and fuel economy.