Dating Methods

What are the methods used by scientists to date archaeological finds? And do those methods tell the true age of buried organisms?

The method used by scientists to determine the age of archaeological finds is called radiometric dating. It involves measuring how much of a radioactive element has decayed and, by extrapolating backward in time, determining the age of an organism.

One element commonly used, in what's referred to as "radiocarbon dating" or "carbon dating," is C-14, a radioactive isotope of carbon, which is formed in the atmosphere by cosmic rays. All living organisms absorb an equilibrium concentration of this radioactive carbon. When an organism dies, its C-14 decays and is not replaced. Since we know the concentration of radioactive carbon in the atmosphere, and we also know that it takes 5,730 years for half of a sample's C-14 to decay (a period called its "half-life"), and another 5,730 years for half of what's left to decay, and so on, by measuring the remaining concentration of radiocarbon we can tell how long ago an organism died.
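To make that back-extrapolation concrete, here is a minimal sketch (the function name and sample values are mine, for illustration) that solves the half-life relation N/N0 = (1/2)^(t/5730) for t:

```python
import math

C14_HALF_LIFE_YEARS = 5730  # half-life of carbon-14, as stated above

def age_from_fraction(remaining_fraction, half_life=C14_HALF_LIFE_YEARS):
    """Infer an age from the fraction of the original isotope remaining.

    Solves N/N0 = (1/2)**(t / half_life) for t.
    """
    return half_life * math.log(1 / remaining_fraction, 2)

# A sample retaining half of its original C-14 dates to one half-life:
print(age_from_fraction(0.5))   # 5730.0 years
# A sample retaining one quarter dates to two half-lives:
print(age_from_fraction(0.25))  # 11460.0 years
```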

Since C-14 can only give dates in the thousands of years, elements with longer half-lives (such as samarium-147, rubidium-87, rhenium-187, and lutetium-176, to name a few, with half-lives in the billions of years) are used to date what are believed to be older archaeological finds. The procedure is roughly the same: the amount of decay is measured against the initial amount of radioactive material, giving the object's supposed age.
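Reusing the sketch above with a longer-lived isotope shows why such elements are reserved for supposedly older finds; the half-life below is rubidium-87's published figure of roughly 49 billion years, and the 0.1% depletion is an assumed measurement for illustration:

```python
RB87_HALF_LIFE_YEARS = 4.9e10  # rubidium-87, roughly 49 billion years

# Even a barely measurable 0.1% depletion implies an enormous age:
print(age_from_fraction(0.999, half_life=RB87_HALF_LIFE_YEARS))
# ~7.1e7, i.e. about 71 million years
```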

One obvious flaw in this technique is that we don't really know the level of radioactive concentration acquired by an organism that lived before recorded history. Scientists make a bold assumption that the atmospheric concentration of the radioactive material -- carbon or any other element -- being measured was the same when the organism lived as it is today.

Another bold assumption made by scientists is that the rate of radioactive decay has remained constant throughout history.

Are these valid assumptions?

Hardly.

In 1994 Otto Reifenschweiler, a scientist at the Philips Research Laboratories in the Netherlands, showed that the radioactivity of tritium could be reduced by 40 percent at temperatures between 115 and 275 degrees Celsius. That is, under certain conditions, the environment can affect radioactive decay.

In 2006 Professor Claus Rolfs, leader of a group of scientists at Ruhr University in Bochum, Germany, in an effort to reduce nuclear waste radioactivity, came up with a technique to greatly speed up radioactive decay. Rolfs: "We are currently investigating radium-226, a hazardous component of spent nuclear fuel with a half-life of 1600 years. I calculate that using this technique could reduce the half-life to 100 years. At best, I have calculated that it could be reduced to as little as two years ... We are working on testing the hypothesis with a number of radioactive nuclei at the moment and early results are promising ... I don't think there will be any insurmountable technical barriers."

Reducing 1600 years to two years is a phenomenal 99.875 percent reduction -- a factor of 800. This means that an archaeological find that has gone through environmental conditions similar to those in the lab could appear to be roughly 4.8 million years old when in fact it's only six thousand years old.

What's more, if scientists, with relatively limited resources, can speed up radioactive decay 800-fold, the violent upheavals of earth's history could certainly have sped it up by far greater factors. Thus, if radioactive decay increased, say, 1 million fold, an organism thought to be 4 billion years old, based on today's rate of radioactive decay, would be no more than 4,000 years old.
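The scaling behind these figures is simple proportionality; a minimal sketch of the arithmetic (the names are mine, for illustration):

```python
def apparent_age(actual_age_years, decay_speedup):
    """If decay ran `decay_speedup` times faster than today's rate,
    the accumulated decay would be misread as this many years
    when interpreted at today's rate."""
    return actual_age_years * decay_speedup

print(apparent_age(6_000, 800))        # 4,800,000 years
print(apparent_age(4_000, 1_000_000))  # 4,000,000,000 years
```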

What's interesting is that earth's history of cataclysmic events is not questioned by anyone -- neither scientist nor Biblical scholar. They may differ in their accounts of what occurred, but not necessarily in the severity of the events.


The Bible's account of The Flood, of course, would have been the mother of all catastrophes. It entailed heat, pressure, and an unimaginable mixture of elements. This would certainly have far exceeded any extreme conditions created by scientists in a lab.

The scientific account of earth's formation and development is no less catastrophic:

Earth formed from the debris flung off by the sun's violent formation about 4.5 billion years ago. Being a molten planet in its initial stages, earth's dense materials of molten nickel and iron flowed to the center, and its lighter materials, such as molten silicon, rose to the top. Eventually, earth cooled and solidified into a core, mantle and crust.

Earth's original atmosphere consisted of hydrogen and helium. This atmosphere was subsequently heated to escape velocity by solar radiation and escaped into space. It took about 2 billion years for oxygen to appear in earth's atmosphere, eventually resulting in an atmosphere consisting of 78% nitrogen and 21% oxygen.

Our planet has been pounded by meteorites throughout history. One such impact, in Mexico, an alleged 65 million years ago, was so intense that it resulted in mass extinctions, including the extinction of the dinosaurs.

Earth has gone through several ice ages. The last one ended around 10,000 years ago, after lasting roughly 60,000 years. At one point 97% of Canada was covered in ice.

The fact is we're detecting natural variations in the rate of radioactive decay even today, in a relatively calm period of global and cosmological history. "Recent reports of periodic fluctuations in nuclear decay data of certain isotopes have led to the suggestion that nuclear decay rates are being influenced by the Sun ... " reported a paper posted on July 20, 2010 to the arXiv preprint server, hosted by Cornell University (arxiv.org/abs/1007.3318).

And they're not alone.


* The Atlantic - TheAtlantic.com

(August 25, 2010) "Radioactive elements on Earth are like geological watches. A radioactive isotope of carbon is used to date human civilizations, among other things, because we know that its half-life is precisely 5,730 years; count how much of the carbon 14 has decayed and you can get a pretty accurate measure of how old something is. (If half of the expected amount is left, you'd say, 'This thing is likely 5,730 years old.')

"But what if the rate of radioactive decay -- the watch -- was not constant? One minute, the second hand is moving at one speed, and the next it has sped up or slowed down. And what if what changed that rate of decay was solar activity on the sun, 93 million miles away?

"That's what recent research at Purdue University suggests. In a slate of recent papers, physicists Ephraim Fischbach and Jere Jenkins argue that measured differences in the decay rates of radioactive isotopes cannot be explained by experimental errors. Instead, they seem to vary with the earth's distance from the sun and periodic changes in solar activity."


Ephraim Fischbach is a professor of physics at Purdue, with a B.A. in Physics from Columbia University and an M.S. and Ph.D. in Physics from the University of Pennsylvania. Jere Jenkins is Director of the Radiation Laboratories at Purdue's School of Nuclear Engineering.

* AstroEngine - AstroEngine.com

(September 26, 2008) "The paper entitled 'Evidence for Correlations Between Nuclear Decay Rates and Earth-Sun Distance' by Jenkins et al. studied the link between nuclear decay rates of several independent silicon and radium isotopes. Decay data was accumulated over many years and a strange pattern emerged; radioactive decay rates fluctuated with the annual variation of Earth's distance from the Sun (throughout Earth's 365 day orbit, our planet fluctuates approximately 0.98 AU to 1.02 AU from the Sun)." [1 AU (Astronomical Unit) is approximately 93 million miles, the average distance from earth to the sun.]
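As a toy illustration of such an annual effect (this is not the paper's own analysis, and the 0.1% modulation amplitude is an assumption for illustration), one can model a decay constant that varies sinusoidally over the year and compute the seasonal swing in measured activity:

```python
import math

def modulated_decay_constant(t_days, lambda0, amplitude=0.001, period_days=365.25):
    """Toy model: a decay constant modulated annually by a sinusoid.

    Depletion of the sample is ignored, which is reasonable when the
    half-life is far longer than the observation window.
    """
    return lambda0 * (1 + amplitude * math.sin(2 * math.pi * t_days / period_days))

lam0 = math.log(2) / (1600 * 365.25)            # radium-226's ~1600-year half-life, per day
peak = modulated_decay_constant(91.3, lam0)     # near the sinusoid's maximum
trough = modulated_decay_constant(274.0, lam0)  # near its minimum
print((peak - trough) / lam0)                   # ~0.002, a 0.2% peak-to-trough swing
```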


Further studies of radioactive material on board spacecraft, as they moved away from the sun, showed that distance from the sun alone is not the culprit, and the cause of the radioactive variations remains a mystery.

* Stanford University - news.stanford.edu

"It's a mystery that presented itself unexpectedly: The radioactive decay of some elements sitting quietly in laboratories on Earth seemed to be influenced by activities inside the sun, 93 million miles away.

"Is this possible?

"Researchers from Stanford and Purdue University believe it is. But their explanation of how it happens opens the door to yet another mystery.

"There is even an outside chance that this unexpected effect is brought about by a previously unknown particle emitted by the sun. 'That would be truly remarkable,' said Peter Sturrock, Stanford professor emeritus of applied physics and an expert on the inner workings of the sun. 'It's an effect that no one yet understands. Theorists are starting to say, "What's going on?" But that's what the evidence points to. It's a challenge for the physicists and a challenge for the solar people too.'"

Consequently, with a varying radioactive decay rate, there's no way to tell what the initial radioactive concentration of any substance or organism was years ago, or how long it took for that radioactivity to decay, rendering current dating methods inaccurate and unreliable.