The Signal and the Noise (2012). One of the best books read in 2014. |
This page gives some notes on Nate Silver's 'The Signal and the Noise', read in 2014. The book is a reminder to think probabilistically, to understand that data are imprecise, noisy, and subject to the biases of the collector and the analyst, and to keep learning from data collected over time.
The signal is the truth. The noise is what distracts us from the truth. This is a book about the signal and the noise. [p. 17] |
Considers the failure of prediction surrounding the early 21st-century house-price bubble and the subsequent financial crisis and recession.
Considers the failure of prediction in politics.
Considers the comparative success of prediction in baseball.
Considers the dynamic behavior of the earth's atmosphere, which brings about weather.
Considers the dynamic behavior of the earth's tectonic plates, which can cause earthquakes.
Considers the dynamic behavior and the possible biases in economic systems:
[E]conomists have for a long time been much too confident in their ability to predict
the direction of the economy. . . .
A 90 percent prediction interval, for instance, is supposed to cover 90 percent of the possible
real-world outcomes, leaving only the 10 percent of outlying cases as the tail ends of the distribution.
If the economists' forecasts were as accurate as they claimed, we'd expect the actual value
for GDP to fall within their prediction interval nine times out of ten, or all but about twice
in eighteen years.
In fact the actual value for GDP fell outside the economists' prediction interval six times in eighteen years, or fully one-third of the time [for 1993-2010]. Another study, which ran these numbers back to . . . 1968, found even worse results: the actual figure for GDP fell outside the predicted interval almost half the time. [p. 181-2]
[W]hile the notion that aggregate forecasts beat individual ones is an important empirical regularity, it is sometimes used as a cop-out when forecasts might be improved. [p. 199]
[T]he amount of confidence someone expresses in a prediction is not a good indication of its accuracy — to the contrary, these qualities are often inversely correlated. Danger lurks, in the economy and elsewhere, when we discourage forecasters from making a full and explicit account of the risks inherent in the world around us. [p. 203] |
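A rough way to see how damning the 1993-2010 record is: if the 90 percent intervals had been honest, the number of misses in eighteen annual forecasts would follow a Binomial(18, 0.1) distribution, with about 1.8 misses expected. The sketch below (Python; not from the book, using only the figures quoted above) computes how unlikely six or more misses would be for a genuinely calibrated forecaster.

```python
# Calibration check for 90% prediction intervals, assuming one GDP forecast per
# year for 18 years (the 1993-2010 record cited above).
from math import comb

n, p = 18, 0.10          # 18 forecasts, 10% nominal chance of missing the interval
observed_misses = 6      # actual GDP fell outside the stated interval 6 times

expected = n * p         # about 1.8 misses expected from a calibrated forecaster
# P(6 or more misses) under Binomial(18, 0.1)
tail = sum(comb(n, k) * p**k * (1 - p)**(n - k)
           for k in range(observed_misses, n + 1))

print(f"expected misses: {expected:.1f}")
print(f"chance of 6+ misses if the intervals were truly 90%: {tail:.4f}")
```

The printed probability comes out well under one percent, which is the book's point: the stated intervals were far narrower than the forecasters' real uncertainty.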
Considers the dynamic behavior of the spread of infectious diseases.
Bayes' Theorem. In the 18th century, the English minister Thomas Bayes found a simple formula for revising probabilities in the light of new data.
Let x = your initial estimate of the probability that a hypothesis is true. This is the prior probability.
An event occurs: this is the new data.
Let y = the probability of the event occurring if the hypothesis is true.
Let z = the probability of the event occurring if the hypothesis is false.
The posterior probability is the revised estimate of the probability that the hypothesis is true:
xy / (xy + z(1 - x)) |
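A minimal sketch (Python; not from the book) of that update, keeping the same x, y, z names. The numbers in the example are purely illustrative.

```python
def bayes_update(x: float, y: float, z: float) -> float:
    """Posterior probability of the hypothesis after the event is observed.

    x -- prior probability that the hypothesis is true
    y -- probability of the event if the hypothesis is true
    z -- probability of the event if the hypothesis is false
    """
    return (y * x) / (y * x + z * (1 - x))

# Illustrative values: a hypothesis given a 1.4% prior chance, an event that is
# 75% likely if the hypothesis is true and 10% likely if it is false.
print(bayes_update(x=0.014, y=0.75, z=0.10))   # about 0.096, i.e. roughly 10%
```

Note how fairly telling evidence still leaves the posterior near 10 percent when the prior is small: the prior matters.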
Chess and the application of Bayes' Theorem.
Application of Bayes' Theorem to poker.
Application of Bayes' Theorem to the stock market.
Watch out for mislabelling on Fig 11-9 (p.357).
Application of Bayes' Theorem to global warming.
[U]nder Bayes' [T]heorem, no theory is perfect. Rather it is a work in progress, always subject to further refinement and testing. This is what scientific skepticism is all about. [p. 411] |
Application of Bayes' Theorem to terrorism attacks.
Quotes Nobel Prize-winner Thomas Schelling:
"There is a tendency in our planning to confuse the unfamiliar with the improbable. The contingency we have not considered seriously looks strange; what looks strange is though improbable; with is improbable need not be considered seriously." [p. 419] |
See his pp. 430-431, with log-log graphs of fatalities as a function of the number of attacks, before and after the 9/11/01 attacks. With the 9/11/01 toll included, the probability of a 10,000-casualty attack in a 30-year period appears to be perhaps 30%. But the extrapolation was not zero before that: for the prior 22-year period it was about 10%. Normalizing for the different spans of years, the revised likelihood is more accurately seen as roughly double what it was before the attacks.
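The kind of extrapolation those graphs describe amounts to fitting a straight line to log(frequency) versus log(fatalities) and reading the fitted line out at sizes beyond the observed data. A hedged sketch in Python, with made-up numbers purely for illustration:

```python
# Power-law (log-log) extrapolation sketch. The data points are invented for
# illustration only; they are not the figures from Silver's pp. 430-431 graphs.
import numpy as np

fatalities = np.array([10, 30, 100, 300, 1000, 3000])   # attack size (deaths)
count = np.array([200.0, 60.0, 15.0, 4.0, 1.0, 0.3])    # attacks at least that large per period

# Fit a straight line in log-log space: log10(count) = intercept + slope * log10(fatalities)
slope, intercept = np.polyfit(np.log10(fatalities), np.log10(count), 1)

# Read the fitted line out at a 10,000-fatality attack.
implied = 10 ** (intercept + slope * np.log10(10_000))
print(f"fitted exponent: {slope:.2f}")
print(f"implied count of 10,000-fatality attacks per period: {implied:.3f}")
```

The book's point is that such distributions are fat-tailed: a straight line on a log-log plot implies that very large attacks are rare but far from impossible, and a single extreme observation noticeably shifts the extrapolated tail.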
The problem comes when we mistake the approximation for the reality. ... leave out all the messy bits that make life real and predictions more accurate. [p. 450] |
Bayes' [T]heorem requires us to state — explicitly — how likely we believe an event is to occur before we begin to weigh the evidence. It calls this estimate a prior belief. . . . What isn't acceptable under Bayes' [T]heorem is to pretend that you don't have any prior beliefs. You should work to reduce your biases, but to say you have none is a sign that you have many. [p. 451] |
Our bias is to think we are better at prediction than we really are. [p. 454] |
Copyright © 2014-2016 by J. Zimmerman. |