What’s a digital computer?

Digital computers store and manipulate data in numerical form. They are electronic and programmable, while analog computers use one physical phenomenon to model another. The first digital computers date back to the 19th century, and electronic computers emerged from World War II. In the 21st century, computers rely on integrated circuits, and nanotechnology may lead to new varieties of mechanical computation.

A digital computer is a machine that stores data in a numerical format and performs operations on that data using mathematical manipulation. This type of computer typically includes some sort of device for storing information, a means of inputting and outputting data, and components that perform mathematical operations on the stored data. Digital computers are almost always electronic, but they do not necessarily have to be.

There are two main methods of modeling the world with a computing machine. Analog computers use a physical phenomenon, such as electric voltage, to model a different phenomenon and perform operations by directly modifying the stored data. A digital computer, however, stores all data as numbers and performs operations on that data arithmetically. Most computers use binary numbers to store data, since the ones and zeros that make up these numbers are easily represented with simple on-off electrical states.
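
The idea that ones and zeros can carry any kind of data is easy to see in a few lines of code. The short Python sketch below (an illustration only; the values are arbitrary) shows a number and a character reduced to the binary patterns a digital computer would actually store.

    # A number and a character, reduced to patterns of ones and zeros.
    number = 42
    letter = "A"

    # Integers map directly to base-2 digits: 42 -> 101010
    print(format(number, "b"))

    # Text is stored the same way: the character code for "A" is 65 -> 1000001
    print(format(ord(letter), "b"))

    # Arithmetic simply produces another bit pattern: 42 + 1 -> 101011
    print(format(number + 1, "b"))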

Computers based on analog principles have advantages in some specialized areas, such as the ability to model an equation continuously. A digital computer, however, has the advantage of being easily programmable: it can carry out many different sets of instructions without being physically reconfigured.
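
What "programmable" means in practice can also be sketched in a few lines. In this toy Python example (an illustration, not how real hardware is built), a single fixed loop plays the role of the machine, and swapping the instruction list changes what it computes without touching the machine itself.

    # A toy "machine": one fixed loop that executes whatever program it is given.
    def run(program, value):
        for op, operand in program:
            if op == "add":
                value += operand
            elif op == "mul":
                value *= operand
        return value

    # Two different programs, no physical reconfiguration required.
    double_then_add = [("mul", 2), ("add", 3)]
    add_then_double = [("add", 3), ("mul", 2)]

    print(run(double_then_add, 5))   # prints 13
    print(run(add_then_double, 5))   # prints 16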

The first digital computers date back to the 19th century. A prime example is the analytical engine theorized by Charles Babbage. This machine would have stored and processed data mechanically, but the data would not have been held in analog form; it would have been stored as a series of digits represented by discrete physical states. The machine would also have been programmable, a first in computing.

Digital computing came into widespread use during the 20th century. The pressures of war led to great advances in the field, and electronic digital computers emerged from World War II. These machines generally used arrays of vacuum tubes to hold information for active use in computation, with paper tape or punched cards for long-term storage. Keyboard input and monitors emerged later in the century.

In the early 21st century, computers rely on integrated circuits rather than vacuum tubes. They still use active memory, long-term storage, and central processing units. Input and output devices have multiplied dramatically, but still perform the same basic functions.

As of 2011, computers are starting to push the limits of conventional circuitry. Circuit paths in a digital computer can now be printed so close together that effects such as electron tunneling must be taken into account. Work on digital optical computers, which process and store data using light and lenses, may help overcome this limitation.

Nanotechnology may lead to an entirely new variety of mechanical computation, with data stored and digitally processed at the level of single molecules or small groups of molecules. An enormous number of molecular computing elements would fit into a relatively small space, which could greatly increase the speed and power of digital computers.



