Posca, the Roman vinegar-based wonder-drink, is a bit of a mystery, because as much as people keep mentioning it, it is oddly absent from ancient literature. Posca appears in books and articles, being sipped by soldiers and passed around by pals, yet we don’t even have a recipe for it!
Back in 1976, when I got an M.S. in Statistics from Stanford, the dominant interpretation of probability and statistics was the Frequentist view. The alternative Bayesian interpretation was definitely a minority position.
In recent decades the Bayesian view has been gaining ground, especially after the spectacular success of one of its practitioners, Nate Silver, in predicting the results of the 2012 U.S. Presidential election. Silver has written an excellent book about forecasting, The Signal and the Noise: Why So Many Predictions Fail - but Some Don't, and gives some vivid examples of Bayesian methods.
The main point of Silver's book is right there in the title: real-world data is full of noise. All too often people see a random fluctuation in the data and think it represents a real pattern. Silver gives examples of this from many fields, including sports, the stock market, earthquakes, politics, and economics. In other cases, e.g. weather forecasting and climate change, there is a discernible signal in all of the noise. Silver neatly debunks some of the bad statistical methods used by the deniers of global warming.
Another good book about Bayesian probability is From Cosmos to Chaos: The Science of Unpredictability, by Peter Coles. Coles assumes a little more comfort with mathematical notation than Silver does, but the actual arguments require nothing beyond algebra. While discussing the history of probability theory from its roots in gambling, he concentrates on physics and astronomy, which also contributed significantly to the development of statistics. He is a strong advocate of Bayesian probability and suggests that the Bayesian view avoids some nasty issues in the interpretation of statistical mechanics and quantum mechanics; notably, in the latter subject it removes any need for the Many Worlds Interpretation. Incidentally, he has also argued that the conventional interpretation of Sherlock Holmes is wrong. See The Return of the Inductive Detective.
The Frequentist vs. Bayesian debate has even made it into xkcd. The implication is that at some level we are all Bayesians, even if we don't admit it.
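The whole debate turns on Bayes' theorem, which is simple enough to show in a few lines of code. A minimal sketch in Python; the numbers here (a 1% base rate, a 99%-sensitive and 95%-specific test) are invented for illustration and don't come from either book:

```python
# Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)
def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Update a prior probability of hypothesis H given evidence E."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Illustrative numbers (assumed): a condition with 1% prevalence,
# a test that is 99% sensitive and 95% specific.  A positive test
# is far from a sure thing when the prior is small:
p = posterior(prior=0.01, p_e_given_h=0.99, p_e_given_not_h=0.05)
print(round(p, 3))   # about 0.167
```

The point the xkcd strip makes is exactly this one: a striking observation means little until it is weighed against the prior.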
I have a degree in Statistics (M.S., Stanford, 1976). I really enjoyed reading all of this.
If ever there was an iron-clad case to be made for math literacy, it's what happened over the last few weeks with the New York Times' star statistician Nate Silver and his 538 blog (named after the 538 votes in the electoral college).
For the practical implications: How Conservative Media Lost to the MSM and Failed the Rank and File
Also this Tweet:
Karl Rove forms new SuperPAC to run negative ads against “the scourge of math and statistics that’s ruining America”.
It’s a mystery that presented itself unexpectedly: The radioactive decay of some elements sitting quietly in laboratories on Earth seemed to be influenced by activities inside the sun, 93 million miles away.
Is this possible?
Researchers from Stanford and Purdue universities believe it is. But their explanation of how it happens opens the door to yet another mystery.
Also discussed by Jeff Duntemann.
The article mentioned the decays of Silicon-32 and Radium-226, as reported in Evidence for Correlations Between Nuclear Decay Rates and Earth-Sun Distance back in 2008. Similarly, Perturbation of Nuclear Decay Rates During the Solar Flare of 13 December 2006 indicated a solar influence on the decay of Manganese-54. The suggestion that radioactive decay rates might in some way depend on the sun is quite extraordinary, and has prompted the reanalysis of a lot of data. Evidence against correlations between nuclear decay rates and Earth-Sun distance looked at the decay of six other radioisotopes without seeing any such dependence. There is no obvious dependence on atomic weight or other systematic difference between the elements.
There are different types of radioactive decay. Radium-226 decays by emitting an α particle (a Helium nucleus: 2 protons and 2 neutrons), while Silicon-32 is a case of β decay (emission of an electron). Manganese-54 decays by electron capture, which is essentially time-reversed β decay. α decay is a manifestation of the strong nuclear force, while β decay is a weak interaction. If the solar effect is real, then it affects two different fundamental forces of nature.
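All of these modes are governed, at least to the accuracy now in question, by the same exponential decay law, N(t) = N₀e^(−λt) with λ = ln 2 / t½. A quick Python sketch; the roughly 312-day half-life of Manganese-54 is quoted from memory and is only illustrative:

```python
import math

def remaining_fraction(t, half_life):
    """Fraction of nuclei surviving after time t (same units as half_life)."""
    lam = math.log(2) / half_life   # decay constant, lambda = ln 2 / half-life
    return math.exp(-lam * t)

# Manganese-54: half-life of roughly 312 days (from memory; illustrative).
print(round(remaining_fraction(312.0, 312.0), 4))   # 0.5 after one half-life
print(round(remaining_fraction(936.0, 312.0), 4))   # 0.125 after three
```

The claimed solar effect is a tiny periodic modulation of λ, which is why detecting it is a statistics problem rather than something visible to the naked eye.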
John G. Cramer, in Radioactive Decay and the Earth-Sun Distance, suggested that
…the Earth’s orbit has a very small eccentricity, so the annual variations in R [the Earth-Sun distance] are small. A better way of testing whether radioactive decay rates depend directly on 1/R² would be to monitor a radioactive decay process within a space vehicle in a long elliptic orbit with a large eccentricity, so that R has a very large variation. As it happens, NASA has a number of space probes that match this description, because many space probes, particularly those that venture into the outer reaches of the Solar System, are powered by radioisotope-driven thermoelectric power sources containing a strong radioactive decay source that produces enough energy as heat to power the vehicle. The power levels of such thermoelectric generators are carefully monitored because they constitute the principal power source of the vehicle.
This has been done. According to Peter Cooper, in Searching for modifications to the exponential radioactive decay law with the Cassini spacecraft:
Data from the power output of the radioisotope thermoelectric generators aboard the Cassini spacecraft are used to test the conjecture that small deviations observed in terrestrial measurements of the exponential radioactive decay law are correlated with the Earth-Sun distance. No significant deviations from exponential decay are observed over a range of 0.7 – 1.6 A.U. A 90% CL upper limit of 0.84 × 10⁻⁴ is set on a term in the decay rate of Pu-238 proportional to 1/R² and 0.99 × 10⁻⁴ for a term proportional to 1/R.
Deep-space probes usually generate power from the heat emitted by a chunk of radioactive material, plutonium-238 in the case of the Cassini spacecraft. Cassini journeyed as close to the sun as Venus and then out to Saturn, spanning a much wider range of distances from the sun than Earth does during its yearly orbit. If the sun had an effect on plutonium decay, the fluctuations would have been much larger than those seen in Earth-bound experiments. As a result, Cooper reasoned, Cassini should have measured substantial changes in its generator's output. It didn't.
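The reason Cassini is so much more sensitive a test is easy to quantify. Earth's orbital eccentricity is only about 0.017, so a hypothetical 1/R² effect would modulate by a few percent over a year, while Cassini's trajectory changes 1/R² by a factor of about five. A back-of-the-envelope comparison (the Earth perihelion and aphelion distances below are round numbers, not precise values):

```python
def inverse_square_ratio(r_min, r_max):
    """Ratio of a hypothetical 1/R^2 effect at closest vs. farthest distance."""
    return (r_max / r_min) ** 2

# Earth: R varies between roughly 0.983 and 1.017 AU over a year.
print(round(inverse_square_ratio(0.983, 1.017), 3))   # about 1.07, a few percent
# Cassini: 0.7 to 1.6 AU, the range quoted in Cooper's analysis.
print(round(inverse_square_ratio(0.7, 1.6), 2))       # about 5.22
```

A signal that amounts to a few parts in 10³ on Earth would have been a large, unmistakable swing in Cassini's generator output.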
The Stanford/Symmetry article included something new. Peter Sturrock, Professor Emeritus of Applied Physics at Stanford, suggested that some of the variation in the Radium-226 and Silicon-32 decay rates is related to solar rotation. From Evidence for Solar Influences on Nuclear Decay Rates:
Recent reports of periodic fluctuations in nuclear decay data of certain isotopes have led to the suggestion that nuclear decay rates are being influenced by the Sun, perhaps via neutrinos. Here we present evidence for the existence of an additional periodicity that appears to be related to the Rieger periodicity well known in solar physics.
Links to the research reports can be found at Variability of Nuclear Decay rates. Search for “Research papers: PERIODIC VARIATIONS: SCALE OF DAYS OR YEARS” and “Research papers: NON-PERIODIC VARIATIONS:”, subheading “Of Cosmic Origin”. Thanks to arXiv.org, information about current research in physics is easily accessible.
Peter Sturrock was my first course advisor when I was a graduate student in his department at Stanford, 1972-1975. At one point there I wanted to take a course in mathematical statistics. I was a little hesitant about this, since the subject is somewhat off the main direction of graduate study in physics. To my surprise, Professor Sturrock strongly encouraged me to do so. Whatever the conclusion about the relationship between the sun and radioactive decay may be, there will be a lot of statistical analysis along the way.
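Indeed, even deciding whether a periodicity is really present in noisy count data is a statistical exercise. A toy sketch of the idea, on synthetic data rather than real decay measurements: project the data onto a sinusoid of a trial period and read off its amplitude.

```python
import math, random

def sinusoid_amplitude(times, values, period):
    """Least-squares amplitude of a sinusoid with the given trial period,
    for evenly sampled data spanning many periods."""
    n = len(times)
    mean = sum(values) / n
    w = 2 * math.pi / period
    s = sum((v - mean) * math.sin(w * t) for t, v in zip(times, values))
    c = sum((v - mean) * math.cos(w * t) for t, v in zip(times, values))
    return 2 * math.hypot(s, c) / n

random.seed(1)
# Synthetic "decay rate" data: constant rate, a 1% annual wobble, plus noise.
days = list(range(3650))                          # ten years, daily samples
rate = [1.0 + 0.01 * math.sin(2 * math.pi * d / 365.25)
        + random.gauss(0, 0.02) for d in days]
amp = sinusoid_amplitude(days, rate, 365.25)
print(round(amp, 3))   # close to the injected 0.01 amplitude
```

Real analyses use more careful tools (the papers above discuss power spectra and significance tests), but the principle is the same: a periodic signal far smaller than the point-to-point noise can be pulled out by averaging over many cycles.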
High-Temperature Superconductivity is something I worked on at Stanford back in 1973-75. The world has changed. Now “high-temperature” means “above the boiling point of liquid Nitrogen” (77 K / -321 °F). Back then it meant “above the boiling point of liquid Hydrogen” (20 K / -423 °F), and it was still an unattained goal. We had to use liquid Helium. Still, some things remain the same:
- Dimensionality matters. Back then I was working with thin films, thin enough to act in some ways like two-dimensional objects. This is still important.
- The Ginzburg-Landau macroscopic theory still seems to work. The more detailed BCS Theory goes beyond G-L to give a microscopic description of superconductivity and is very successful at liquid Helium temperatures. However, it appears to be in trouble at higher temperatures.
Fun fact: Decades from now, with school a distant memory, you’ll still be having this dream.
It is true for me: I left grad school in 1976. Allowing for trivial variations, this is still one of my two most frequent nightmares.