Tag Archives: stanford

A cold case solved

I was in a discussion about church safety this afternoon, and I recalled a nasty murder at Stanford University’s Memorial Church in 1974. I was a grad student at Stanford then and often attended Memorial Church (“Mem Chu”). I had friends who were very active there. Afterwards I discovered that the murder was solved — in 2018.

UFOs? OK. Alien Spacecraft? Not so fast.

I found How Washington Got Hooked on Flying Saucers to be fascinating, if somewhat depressing. “There is nothing new under the sun.”

This is a subject I have been watching from a safe distance for well over half a century, ever since I first read Martin Gardner’s Fads and Fallacies in the Name of Science (few books have influenced me more than that one).

I met J. Allen Hynek in my last year of high school, 1967-68, when I was a student in the Astro-Science Workshop (still around, although in a different format) at Chicago’s Adler Planetarium. This program was organized and run by Hynek. He was a pleasant and interesting speaker, and a good teacher, but he never spoke to us kids about UFOs.


The Lifetimes of Programming Languages

I started programming computers in February of 1967, when I was a junior in high school. After dropping out of grad school I began a 41-year career in information technology in January of 1977. I have seen a lot of computer languages come and go, so I read 5 Programming Languages You Won’t Likely Be Using by 2030 with some interest. The only one on the list I had ever used was Perl. It was kind of fun, but I did not get very attached to it 🙂

Meanwhile, some much older languages live on. Last year COVID-19 demonstrated how much the financial world still depends on COBOL.


Bayesian Probability

Back in 1976, when I got an M.S. in Statistics from Stanford, the dominant interpretation of probability and statistics was the Frequentist view. The alternative Bayesian interpretation was definitely a minority position.

In recent decades the Bayesian view has been gaining ground, especially after the spectacular success of one of its practitioners, Nate Silver, in predicting the results of the 2012 U.S. Presidential election. Silver has written an excellent book about forecasting, The Signal and the Noise: Why So Many Predictions Fail—but Some Don’t. He gives some vivid examples of Bayesian methods.
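
Here is a minimal sketch of what a Bayesian update looks like in practice. It is a made-up coin-flipping example of my own, not one of Silver’s; the 10% prior and the assumed 75% bias are arbitrary numbers chosen purely for illustration.

# Toy Bayesian update: how belief that a coin is biased toward heads
# changes as flips are observed, via Bayes' theorem.

def bayes_update(prior, likelihood_biased, likelihood_fair):
    """Return P(biased | evidence) given P(biased) and the two likelihoods."""
    numerator = prior * likelihood_biased
    return numerator / (numerator + (1 - prior) * likelihood_fair)

p_biased = 0.1            # prior belief: 10% chance the coin is biased
p_heads_if_biased = 0.75  # assumed behavior of a biased coin
p_heads_if_fair = 0.5

for flip in "HHTHHHHTHH":  # hypothetical sequence of observed flips
    if flip == "H":
        p_biased = bayes_update(p_biased, p_heads_if_biased, p_heads_if_fair)
    else:
        p_biased = bayes_update(p_biased, 1 - p_heads_if_biased, 1 - p_heads_if_fair)
    print(f"after {flip}: P(biased) = {p_biased:.3f}")

Each flip revises the probability a little; the evidence accumulates gradually instead of forcing an all-or-nothing verdict, which is the essence of the Bayesian approach.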

The main point of Silver’s book is quite clear in the title: real-world data is full of noise. All too often people see some random fluctuation in the data and think that it represents a real pattern. Silver gives examples showing this from many fields, including sports, the stock market, earthquakes, politics, and economics. In other cases, e.g. weather forecasting and climate change, there is a discernible signal in all of the noise. Silver neatly debunks some of the bad statistical methods used by the deniers of global warming.
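
A small simulation makes the point concrete. This is my own toy example, not one from the book: let a thousand hypothetical analysts each make ten 50/50 calls, and a few of them will compile impressive-looking records by chance alone.

import random

random.seed(42)
n_analysts, n_calls = 1000, 10

# Each "analyst" just guesses; any single call is correct with probability 0.5.
records = [sum(random.random() < 0.5 for _ in range(n_calls))
           for _ in range(n_analysts)]

print("analysts with 9 or more correct calls:", sum(r >= 9 for r in records))
print("analysts with a perfect record:       ", sum(r == n_calls for r in records))
# On average about 11 of the 1,000 will be right at least 9 times out of 10,
# and about one will be perfect -- "skill" that is entirely an artifact of noise.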

Another good book about Bayesian probability is From Cosmos to Chaos: The Science of Unpredictability, by Peter Coles. Coles assumes a little more comfort with mathematical notation than Silver does, but the actual arguments do not require more than algebra. While tracing the history of probability theory from its roots in gambling, he concentrates on physics and astronomy, which also contributed significantly to the development of statistics. He is a strong advocate of Bayesian probability and suggests that the Bayesian view avoids some nasty issues in the interpretation of statistical mechanics and quantum mechanics, notably that in the latter there is no need for the Many Worlds Interpretation. Incidentally, he has also argued that the conventional interpretation of Sherlock Holmes is wrong. See The Return of the Inductive Detective.

The Frequentist vs. Bayesian debate has also made it into xkcd. The implication is that at some level we are all Bayesians, even if we don’t admit it.

On another issue in reasoning with probabilities, Ethan Siegel discusses the inverse gambler’s fallacy in The Last Refuge of a Science-Denying Scoundrel.