Radiometric Dating: What is it?

Radiometric dating uses the known decay rates of radioactive atoms to determine the age of materials such as rocks. Ernest Rutherford pioneered the technique, which relies on the half-lives of radioactive isotopes. Carbon dating is a commonly used form, though contamination is a concern. Geochemist Clair Patterson applied radiometric dating to meteorites to estimate the age of the Earth at 4.5 billion years.

Radiometric dating is a method by which the age of materials such as rocks can be determined. The process relies on the fact that unstable atoms decay into other atoms at a known, measurable rate, so a sample's age can be established from how much of the original material has decayed. The invention of radiometric dating was a crucial step in determining the age of the Earth, a question that had troubled scientists for centuries before a widely accepted answer was finally reached in the 20th century.

The discovery of radiometric dating is largely credited to Ernest Rutherford, a New Zealand-born British physicist who became interested in the study of radioactivity in the late 19th century. Radioactivity had only recently been introduced to the scientific community, mainly through the work of Marie and Pierre Curie. Rutherford, along with several collaborators, discovered that radioactive isotopes, versions of an element whose nuclei contain an unstable combination of protons and neutrons, decay from an unstable form into a stable one. The time it takes for half of the atoms in a sample to decay became known as the half-life. Because the half-life of each isotope is fixed, comparing the amount of the original isotope remaining in a sample to the amount of its decay product reveals how long the material has been decaying, and this measurement forms the basis of radiometric dating.
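To make that arithmetic concrete, here is a minimal Python sketch of the standard decay equation. It is not from the article; the function name is mine, and it assumes a closed system with no decay product present when the material formed:

```python
import math

def radiometric_age(parent_atoms, daughter_atoms, half_life_years):
    """Age of a sample from the ratio of remaining parent isotope
    to its accumulated decay product.

    Assumes a closed system: no atoms entered or left the sample
    after it formed, and no daughter atoms were present at the start.
    """
    original_atoms = parent_atoms + daughter_atoms
    # Decay law: N = N0 * (1/2)^(t / half_life)
    # Solving for t: t = half_life * log2(N0 / N)
    return half_life_years * math.log2(original_atoms / parent_atoms)
```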

Radiometric dating is sometimes referred to as carbon dating, because one of the most commonly used forms of the technique relies on the decay of carbon-14, an isotope of carbon with six protons and eight neutrons. Carbon dating, however, only works on once-living material, such as bone, wood, and other organic remains, and only for samples less than about 50,000 years old. For older samples, half-life calculations are done with a variety of other isotopes, including those of potassium and uranium.
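As an illustrative use of the sketch above, carbon-14's half-life of about 5,730 years explains the method's ceiling: after roughly nine half-lives (about 50,000 years), too little carbon-14 remains to measure reliably. The sample numbers below are hypothetical:

```python
# Hypothetical sample in which 25% of the original carbon-14 remains,
# i.e. exactly two half-lives have elapsed.
age = radiometric_age(parent_atoms=25, daughter_atoms=75,
                      half_life_years=5730)
print(round(age))  # 11460 years
```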

One of the biggest concerns in this dating method is contamination. For a sample to be measured accurately, neither the unstable parent isotope nor its stable daughter product can have entered or left the sample after the material originally formed. Because contamination is such a common problem, it is standard practice to test many different samples of a material and report an age range, as sketched below.
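As a simple illustration of that practice, one might combine several independent measurements into a central value and a spread; the ages below are invented for the example:

```python
from statistics import mean, stdev

# Hypothetical ages (in millions of years) measured from
# five separate samples of the same rock unit.
sample_ages = [448.2, 451.7, 449.9, 450.4, 452.1]

print(f"Age: {mean(sample_ages):.1f} +/- {stdev(sample_ages):.1f} million years")
```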

The first truly accurate measurement of the age of the Earth was made by a geochemist named Clair Patterson, beginning in the late 1940s. Patterson's genius was in realizing that the best possible estimate of the age of the Earth could be made by applying radiometric dating to meteorites, since meteorites date back to the formation of the solar system and therefore formed at about the same time as the Earth. By measuring the lead isotopes produced by the decay of uranium in meteorite samples, Patterson arrived at an estimate of 4.5 billion years in the 1950s, a figure that remains the most widely accepted in the 21st century.
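Patterson's actual analysis compared lead isotope ratios across several meteorites, which is more involved, but the same decay arithmetic applies. As a simplified, hypothetical illustration using uranium-238 (half-life roughly 4.47 billion years) and assuming no lead-206 was present when the sample formed:

```python
# Hypothetical sample containing roughly equal amounts of uranium-238
# and its end product lead-206, meaning about one half-life has elapsed.
age = radiometric_age(parent_atoms=50, daughter_atoms=50,
                      half_life_years=4.47e9)
print(f"{age / 1e9:.2f} billion years")  # 4.47
```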



