This week's 5 links on evidence-based decision making.
Hear me talk about communicating clearly with data. Webinar October 14: Register here.
1. Good judgment → Accurate forecasts → Better decisions Jason Zweig (@jasonzweigwsj) believes Superforecasting: The Art and Science of Prediction is the "most important book on decision-making since Daniel Kahneman's Thinking, Fast and Slow." Kahneman is equally enthusiastic, saying "This book shows that under the right conditions regular people are capable of improving their judgment enough to beat the professionals at their own game." The author, Philip Tetlock, leads the Good Judgment Project, where amateurs and experts compete to make forecasts - and the amateurs routinely win. Tetlock notes that particularly good forecasters regard their views as hypotheses to be tested, not treasures to be guarded. The project emphasizes transparency, urging participants to explain why they believe what they do. Are you a Superforecaster? Find out by joining the project at GJOpen.com.
2. Better evidence → Better access → Better health CADTH (@CADTH_ACMTS), a non-profit that provides evidence to Canada's healthcare decision makers, is accepting abstract proposals for its 2016 Symposium, Evidence for Everyone.
3. Coin flip study → Surprising results → Hot hand debate The hot hand is making a comeback. After a noteworthy smackdown by Thomas Gilovich and colleagues, new evidence suggests there may be such a thing after all. Ben Cohen explains in The 'Hot Hand' May Actually Be Real - evidently it's got something to do with coin flips. However the debate shakes out, everyone should read (or reread) Gilovich's fantastic book, How We Know What Isn't So.
4. Less junk science → Better evidence → Better world The American Council on Science and Health has a mission to "provide an evidence-based counterpoint to the wave of anti-science claims". @ACSHorg presents its views with refreshingly snappy writing, covering a wide variety of topics including public policy, vaccination, fracking, chemicals, and nutrition.
5. Difference of differences → Misunderstanding → Bad evidence Ben Goldacre (@bengoldacre) of Bad Science fame writes in The Guardian that the same statistical error - comparing two effects without directly testing the difference between them, the difference of differences - keeps appearing throughout the most prestigious journals in neuroscience.