Microcomputers are small computers that use a microprocessor as their CPU. They are the backbone of modern computing, and almost all computers today are microcomputers. They became technologically possible in the mid-1970s with the release of the Intel 8080 and slowly broke into the mainstream in the late 1970s and 1980s with machines like the Apple II. Today there are over a billion computers in use worldwide.
“Microcomputer” is a somewhat old-fashioned term for a computer that uses a microprocessor (an integrated circuit) as its central processing unit (CPU). The term also implies a machine small enough to fit on a desk; larger computers of the era were generally called “minicomputers.” Microprocessor-based computers are the backbone of the modern computing age and are often described as “fourth generation” machines, succeeding the vacuum-tube, transistor, and integrated-circuit computers that were common before the microprocessor was developed. Today almost all computers are microcomputers.
Computers in the 1940s, 1950s, and 1960s were relatively slow and expensive, often consuming large amounts of power and filling entire rooms. Even a machine the size of a refrigerator could be called a “minicomputer,” thanks to its comparatively small size. At that point, computers were available only to governments, universities, and large corporations, and were typically shared among many users through time-sharing. In 1958, however, Jack Kilby invented the integrated circuit, which opened up the possibility of much smaller computers.
In the late 1960s and early 1970s, minicomputers dominated the scene; they were built from integrated circuits but were too large to be called microcomputers. As early as 1956, Isaac Asimov wrote of the possibility of small personal computers, and by the mid-1970s they had become technologically possible. In 1974, Intel released the Intel 8080, which has been called the first truly usable microprocessor. This chip was later installed in machines such as the Altair 8800, which were among the first true microcomputers. Early adopters included Bill Gates and Paul Allen, who went on to found software giant Microsoft.
In the early and mid-1980s, microcomputers slowly began to break out of the hobbyist niche and into the mainstream. The Apple II, which made Apple Computer famous, had been released in 1977, and more and more people began to recognize its usefulness for business and education. Throughout the 1980s, Apple released ever smaller and more powerful machines, broadening the appeal of personal computers. Many competitors emerged, running operating systems such as DOS and Windows®. Today there are over a billion computers in use worldwide.