Radiocarbon Dating Becoming Unreliable
Your T-shirt could soon contain the same carbon-14 as William the Conqueror’s robe from 1066 AD, scientists warn.
An article in PhysOrg claims that radiocarbon dating is becoming more unreliable as carbon emissions increase. Why would that be?
Fossil fuel emissions could soon make it impossible for radiocarbon dating to distinguish new materials from artefacts that are hundreds of years old.
Carbon released by burning fossil fuels is diluting radioactive carbon-14 and artificially raising the radiocarbon ‘age’ of the atmosphere….
But carbon-14 is carbon-14, isn’t it? Why should fossil fuels have any effect?
Fossil fuels like coal and oil are so old that they contain no carbon-14. When their emissions mix with the modern atmosphere, they flood it with non-radioactive carbon.
In radiocarbon dating terms this makes the atmosphere appear older, an effect reflected in the tissues of plants taking in CO2 during photosynthesis, and in their products, such as cotton.
At the rate fossil fuel emissions are currently increasing, by 2050 a new T-shirt would have the same radiocarbon date as a robe worn by William the Conqueror a thousand years earlier.
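The dilution effect described above can be sketched with the standard conventional-age formula (the formula is the labs' convention, not something given in the article; the 11% dilution figure below is an illustrative assumption chosen to match the thousand-year claim):

```python
import math

# The Libby mean-life, 5568 yr / ln(2), used by convention in radiocarbon labs.
LIBBY_MEAN_LIFE = 8033  # years

def apparent_age(c14_fraction):
    """Conventional radiocarbon age of a sample whose C-14/C ratio
    is c14_fraction times the modern standard."""
    return -LIBBY_MEAN_LIFE * math.log(c14_fraction)

# A robe from 1066 AD (~950 years old today) has decayed to roughly:
robe_fraction = math.exp(-950 / LIBBY_MEAN_LIFE)   # about 0.89 of modern

# A brand-new T-shirt grown from plants in an atmosphere diluted by ~11%
# fossil (C-14-free) carbon starts at that same fraction, so it is
# assigned the same radiocarbon "age":
print(round(apparent_age(robe_fraction)))          # -> 950
```

The point of the sketch: decay over 950 years and an 11% dilution of the starting C-14 fraction are indistinguishable to the method, since both only change the measured ratio.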
Radiocarbon dating is not yet seriously affected, they claim, but it will be by 2020 if emissions continue. Incidentally, lab techs already have to correct for another historical contingency:
The fraction of carbon-14 in the atmosphere decreased after the Industrial Revolution with the rise of fossil fuel combustion. But in the 1950s and 60s, nuclear weapons testing caused a sharp increase. Since then atmospheric observations show the levels have been dropping, and are now close to the pre-industrial proportions.
We’re glad to see their agreement that coal “should” be radiocarbon dead. If it is not, then that is prima facie evidence that coal is not millions of years old.
Evolutionists routinely dismiss claims of radiocarbon in coal, diamonds and dinosaur bones, because they already “know” from secularism’s moyboy dogma that these substances “are” millions of years old. Therefore, no radiocarbon can be present; if any is found, it must come from contamination—no matter how carefully the specimens were checked. (This is one of several ways they escape falsification.)
If coal and these other substances do contain radiocarbon, though, burning them into the atmosphere should not affect radiocarbon dates (at least not to the degree they are concerned about). The article does, however, point out some of the assumptions that go into any dating method. The daters have to calibrate their method by deciding what the standard fraction of C14 must have been before man messed it up. Obviously nobody was measuring C14 fractions during the Industrial Revolution; Libby didn’t invent the method until the late 1940s. By then, nuclear tests were already altering the atmospheric proportions of carbon isotopes.
If daters have to infer the “natural” C14 fraction and correct for anomalies like nuclear testing and fossil fuel burning (assuming fossil fuels contain no C14), then what other corrections are not being made? Nobody knows if C14 production has always been in a steady state, yet that is required knowledge to make the method reliable. What are the unknown unknowns?
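The sensitivity to the steady-state assumption is easy to quantify. A short sketch (the formula follows from the same conventional-age convention; the 10% figure is a hypothetical value for illustration, not a measured one):

```python
import math

# Libby mean-life, 5568 yr / ln(2), the standard dating convention.
LIBBY_MEAN_LIFE = 8033  # years

def age_error(assumed_initial, actual_initial):
    """Years by which a computed radiocarbon age is off when the
    atmosphere's C-14 fraction at the sample's formation differed
    from the value the lab assumes (ratios relative to modern)."""
    return LIBBY_MEAN_LIFE * math.log(assumed_initial / actual_initial)

# If the real starting fraction were 10% below the assumed value,
# every sample from that era would date too old by:
print(round(age_error(1.0, 0.9)))  # -> 846
```

Because the error grows with the logarithm of the ratio, even modest uncertainty in past C14 production feeds directly into century-scale date errors.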
For dates that can be cross-checked from historical records, like the date of Hezekiah’s Tunnel (9/10/03), the method is probably fairly reliable. But what of dates before written records were kept? The Tunguska impact of 1908 flattened many square miles of forest. Did they correct for that? (see paper). There are coal beds that have been burning for thousands of years from lightning strikes. How do they know a meteor didn’t hit a coal seam in the unobserved past? How do they know volcanic eruptions didn’t release fossil fuels? Did a nearby supernova cause a spike in atmospheric production of C14? The older the age claim, the more it becomes a matter of worldview, not science.