
What is Gbps?


A gigabit is a unit of measurement equal to one billion bits of data, with eight bits forming a byte. Storage capacity is usually indicated in bytes, while bits are used to describe data transfer rates. The decimal and binary systems are used with computers, with the former being more common. Gigabit is not often used in data rates, except for fiber optic cables. Abbreviations can cause confusion, but uppercase letters usually refer to bytes. As technology advances, gigabit will become more familiar to the average computer user.

A gigabit is a unit of measurement used in computers, equal to one billion bits of data. A bit is the smallest unit of data. Eight bits are required to form or store a single text character. These 8-bit units are known as bytes. Thus, the difference between a gigabit and a gigabyte is that the latter is 8 times greater, or eight billion bits.
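To make the eight-to-one relationship concrete, here is a minimal Python sketch of the arithmetic (the variable names are illustrative only):

```python
# Bits, bytes, and the gigabit/gigabyte relationship described above.
BITS_PER_BYTE = 8

gigabit = 1_000_000_000             # one billion bits
gigabyte = gigabit * BITS_PER_BYTE  # eight billion bits

print(f"1 gigabit  = {gigabit:,} bits")
print(f"1 gigabyte = {gigabyte:,} bits ({gigabyte // gigabit} times greater)")
```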

Storage capacity is normally indicated in bytes rather than bits. You probably won’t hear anyone describe a 200 gigabyte drive as 1,600 gigabits. Instead, bits are typically used to describe data transfer rates (DTRs), or how fast bits of information can move between devices, such as modems, FireWire, or Universal Serial Bus (USB) ports.
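As a rough sketch of why transfer rates are quoted in bits, the snippet below estimates how long it would take to move a 200 gigabyte drive’s worth of data over a 480 Mbps link (the nominal USB 2.0 signaling rate), assuming an idealized connection with no protocol overhead:

```python
def transfer_time_seconds(size_bytes: float, rate_bits_per_sec: float) -> float:
    """Idealized transfer time: total bits divided by the link rate."""
    return (size_bytes * 8) / rate_bits_per_sec

size = 200 * 1000**3   # 200 gigabytes (decimal), in bytes
rate = 480 * 1000**2   # 480 megabits per second, in bits per second

print(f"{transfer_time_seconds(size, rate) / 60:.1f} minutes")  # about 55.6 minutes
```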

Two types of number systems are used with computers: the decimal system and the binary system. The decimal system counts a kilo as 1,000, while the binary system counts a kilo as 1,024. This is because computers work in powers of two, and 2^10 = 1,024 is the power of two closest to 1,000. For simplicity, and when referring to data transfer rates, the most typical designation is the decimal system, as follows:

1,000 bits = 1 kilobit

1,000 kilobits = 1 megabit

1,000 megabits = 1 gigabit

1,000 bytes = 1 kilobyte

1,000 kilobytes = 1 megabyte

1,000 megabytes = 1 gigabyte

Incidentally, the binary system, which counts by 1,024 instead, also uses different terminology. The kilobit becomes the kibibit; the megabit, the mebibit; and the gigabit, the gibibit.
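The gap between the two prefix systems grows at each step, as this small Python comparison shows (the values follow directly from the definitions above):

```python
# Decimal (SI) prefixes count by 1,000; binary (IEC) prefixes count by 1,024.
decimal = {"kilobit": 1000**1, "megabit": 1000**2, "gigabit": 1000**3}
binary  = {"kibibit": 1024**1, "mebibit": 1024**2, "gibibit": 1024**3}

for (d_name, d_val), (b_name, b_val) in zip(decimal.items(), binary.items()):
    pct = (b_val - d_val) / d_val * 100
    print(f"1 {d_name} = {d_val:>13,} bits | 1 {b_name} = {b_val:>13,} bits "
          f"({pct:.1f}% larger)")
```

The kibibit is only 2.4% larger than the kilobit, but the gibibit is about 7.4% larger than the gigabit, which is why the distinction matters more as capacities grow.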
Going back to our more familiar decimal designations, abbreviations can often cause confusion. For example, an Internet provider might advertise speeds of 1500 kbps, while a potential customer might assume the abbreviation refers to kilobytes. Typically, byte measurements are written with an uppercase B, as in “kBps” or “KBps”. If the letters are all lowercase, the reference should be to bits. However, kilobit, megabit and gigabit can also be abbreviated as Kbit, Mbit and Gbit.
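Here is a quick sketch of the difference that single letter makes, using the 1500 kbps figure from the example above:

```python
advertised_kbps = 1500                 # kilobits per second (lowercase b = bits)

kilobytes_per_sec = advertised_kbps / 8
print(f"{advertised_kbps} kbps = {kilobytes_per_sec} kBps")  # 1500 kbps = 187.5 kBps
```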
Gigabit is not often used in everyday data rates, as most devices send information at slower rates measured in kilobits and megabits. One notable exception is fiber optic cable. In a November 19, 2007 press release, Verizon announced that it had successfully transmitted video over a fiber-optic cable at a whopping 100 gigabits per second (Gbps). For comparison, the Fast Ethernet networks common at the time had a maximum throughput of 100 megabits per second (Mbps). As of winter 2007, Verizon is deploying fiber optic service (FiOS) in the United States to provide television, digital telephone and Internet services.
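To put that thousand-fold gap in perspective, here is a minimal sketch comparing the two rates; the 50 gigabyte video file is a hypothetical example, and both links are assumed to run at their full rated speed with no overhead:

```python
file_bits = 50 * 1000**3 * 8            # hypothetical 50 GB (decimal) file, in bits

links = {"100 Mbps Ethernet": 100 * 1000**2,
         "100 Gbps fiber":    100 * 1000**3}

for name, rate_bps in links.items():
    print(f"{name}: {file_bits / rate_bps:,.1f} seconds")  # 4,000.0 s vs 4.0 s
```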
As technology advances and data rates increase, the average computer user will no doubt become familiar with the gigabit. Until then, most of us will be stuck at kilobit and megabit speeds, eager to leap over the next hurdle.
