Evidence Soup
How to find, use, and explain evidence.

Wednesday, 19 August 2015

Science of criminal sentencing, pharma formulary decisions, and real astrology?

This week's 5 links on evidence-based decision making.

1. Criminal patterns → Risk assessment → Science of sentencing The Marshall Project describes the new science of sentencing, where courts use statistically derived risk assessments to inform their decisions about which prisoners should be released on parole, and how bail should be set. (Thanks to Gregory Piatetsky, @kdnuggets.)

2. Clinical & cost effectiveness → Evidence base → Pharma formulary Express Scripts, a large U.S. pharmacy benefits manager, has released its 2016 formulary outlining which drugs will be covered, and which will not. @BioPharmaDive explains the decision process: An independent group of physicians reviews the evidence on clinical and cost effectiveness of each candidate. "Me-too" products aren't making the cut.

3. March birthday → Atrial fibrillation → Real astrology? The Journal of the American Medical Informatics Association has findings from a retrospective population study that systematically explored (with a phenome-wide method) the connection between birth month and disease risk for 1,688 conditions. Authors claim that for 55 diseases, "seasonally dependent early developmental mechanisms may play a role in increasing lifetime risk."

4. Data-driven → Fewer middle managers → Nimble decision processes Data-driven management processes need careful driving, says Ed Burns. Benefits include transparent and objective decisions, and more nimble ones when analytics can eliminate middle managers. However, some efforts have backfired. More in this podcast by @EdBurnsTT. What are your tips for putting in place data-driven management strategies?

5. Aggregated economic data → Positive trends → Data-driven optimism Economist Max Roser is an optimist. Jeff Rothfeder writes in @stratandbiz about Roser's analysis of disparate data covering "everything from African development to violent death rates", and his conclusions that evidence unambiguously shows a world evolving for the better.

Tuesday, 11 August 2015

Book of Bad Arguments, evidence-based social services, and a fresh hell of confusing numbers.

This week's 5 links on evidence-based decision making.

1. Bad logic → Bad arguments → Bad decisions The Book of Bad Arguments is a simple explanation of common logical flaws / barriers to successful, evidence-based decisions. This beautifully illustrated work by Ali Almossawi (@AliAlmossawi) should be on everyone's bookshelf. Now available in several languages.

2. Home visits for children → Lifelong benefits → Better society Susan Urahn, EVP of Pew Charitable Trusts, says evidence-based policymaking can guide better use of taxpayer dollars. Her @GOVERNING piece explains how measuring effectiveness will transform delivery of social services. Since research confirms that early childhood affects lifelong behavior and health, states are investing in family support and coaching programs. [Note: Want to participate in an online collaboration on the evidence base for home visits? Let us know: tracy@uglyresearch.com.]

3. Words+Numbers → Statistics → Fresh Hell Hilda Bastian (@hildabast) reminds us to keep our sense of humor: "The comedic possibilities of clinical epidemiology are known to be limitless." Her wonderful blog is Statistically Funny. "OMG that spider is HUGE!" "Where? What - that little thing?" Words mean different things to different people, but so do numbers. Her latest post explains the evidence on why we struggle to get our heads around - and describe - big numbers and relative risks: We do okay with "18 out of 20" but not so much with "18,000 out of 20,000".

4. Data → Analytics → Decision skills You've got the data and the analytics. Now what? One finance executive found a way to improve his team's data literacy with data-driven decision making bootcamps.

5. Climate change evidence → Energy crystal ball → Economic predictions Kudos to @McKinsey for taking a hard look at the energy predictions they made 8 years ago. Their 2007 energy crystal ball was spot-on in several areas, but missed the mark in a few.

Tuesday, 28 July 2015

10 Years After Ioannidis, speedy decision habits, and the peril of whether or not.

1. Much has happened in the 10 years since Why Most Published Research Findings Are False, the much-discussed PLOS essay by John P. A. Ioannidis offering evidence that "false findings may be the majority or even the vast majority of published research claims...." Why are so many findings never replicated? Ioannidis listed study power and bias, the number of studies, and the ratio of true to no relationships among those probed in that scientific field. Also, "the convenient, yet ill-founded strategy of claiming conclusive research findings solely on... formal statistical significance, typically for a p-value less than 0.05."
Now numerous initiatives address the false-findings problem with innovative publishing models, prohibition of p-values, or study design standards. Ioannidis followed up with 2014's How to Make More Published Research True, noting improvements in credibility and efficiency in specific fields via "large-scale collaborative research; replication culture; registration; sharing; reproducibility practices; better statistical methods;... reporting and dissemination of research, and training of the scientific workforce."
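
Ioannidis's core arithmetic is easy to reproduce. The simulation below is a hedged sketch with made-up parameters (not numbers from the essay): when only a small share of tested hypotheses are true and studies are underpowered, most p < 0.05 findings are false positives.

```python
import random

random.seed(42)

def positive_predictive_value(n_studies=100_000, prior_true=0.05,
                              power=0.3, alpha=0.05):
    """Fraction of 'significant' findings that reflect a real effect.

    prior_true: share of tested hypotheses that are actually true
    power:      chance a real effect reaches significance
    alpha:      chance a null effect reaches significance anyway
    """
    true_pos = false_pos = 0
    for _ in range(n_studies):
        if random.random() < prior_true:      # a real relationship
            if random.random() < power:
                true_pos += 1
        else:                                 # no relationship at all
            if random.random() < alpha:
                false_pos += 1
    return true_pos / (true_pos + false_pos)

ppv = positive_predictive_value()
# Expected PPV = (0.05*0.3) / (0.05*0.3 + 0.95*0.05) ~= 0.24,
# i.e., roughly three out of four 'discoveries' are false.
print(f"PPV: {ppv:.2f}")
```

Raising power, studying likelier hypotheses, or tightening the significance threshold pushes the PPV up, which is what these reform initiatives aim to do.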

2. Speedy decision habits → Fastest in market → Winning Dave Girouard, CEO of personal finance startup Upstart and ex-Google apps head, believes speedy decision-making is essential to competing, in product development and other organizational functions. He explains how people can develop speed as a healthy habit. Relatively little is "written about how to develop the institutional and employee muscle necessary to make speed a serious competitive advantage." Key tip: Deciding *when* a decision will be made from the start is a profound, powerful change that speeds everything up.

3. Busy, a new book by Tony Crabbe (@tonycrabbe), considers why people feel overwhelmed and dissatisfied - and suggests steps for improving their personal & work lives. It translates psychological and business research into practical tools and skills. The book covers a range of perspectives; one worth noting is "The Perils of Whether or Not" (page 31): Crabbe cites classic decision research demonstrating the benefits of choosing from multiple options, vs. continuously (and busily) grinding through one alternative at a time. BUSY: How to Thrive in a World of Too Much, Grand Central Publishing, $28.

4. Better lucky than smart? Eric McNulty reminds us of a costly, and all-too-common, decision making flaw: Outcome bias, when we evaluate the quality of a decision based on its final result. His strategy+business article explains we should be objectively assessing whether an outcome was achieved by chance or through a sound process - but it's easy to fall into the trap of positively judging only those efforts with happy endings (@stratandbiz).

5. Fish vs. Frog: It's about values, not just data. Great reminder from Denis Cuff @DenisCuff of @insidebayarea that the data won't always tell you where to place value. One SF Bay Area environmental effort to save a fish might be endangering a frog species.

Monday, 20 July 2015

The Cardinal Sin of data science, Evidence for Action $, and your biases in 5 easy steps.

My 5 weekly links on evidence-based decision making.

1. Confusing correlation with causation is not the Cardinal Sin of data science, say Gregory Piatetsky (@kdnuggets) and Anmol Rajpurohit (@hey_anmol): It's overfitting. Oftentimes, researchers "test numerous hypotheses without proper statistical control, until they happen to find something interesting and report it. Not surprisingly, next time the effect, which was (at least partly) due to chance, will be much smaller or absent." This explains why it's often difficult to replicate prior findings. "Overfitting is not the same as another major data science mistake - confusing correlation and causation. The difference is that overfitting finds something where there is nothing. In case of correlation and causation, researchers can find a genuine novel correlation and only discover a cause much later."
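
The trap they describe is easy to demonstrate. In this toy sketch (my own made-up data, not the authors' example), scanning 200 pure-noise "features" for the strongest correlation with a pure-noise outcome reliably turns up an impressive-looking effect that is absent on fresh data:

```python
import random

random.seed(7)

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def noise(n):
    return [random.gauss(0, 1) for _ in range(n)]

n_subjects, n_features = 30, 200
outcome = noise(n_subjects)
features = [noise(n_subjects) for _ in range(n_features)]

# 'Discovery': test every feature, report only the strongest correlation.
best = max(features, key=lambda f: abs(pearson(f, outcome)))
discovered_r = pearson(best, outcome)

# 'Replication': the same kind of (null) relationship, on new data.
replication_r = pearson(noise(n_subjects), noise(n_subjects))

print(f"discovered r = {discovered_r:+.2f}, replication r = {replication_r:+.2f}")
```

With 200 uncorrected comparisons, a correlation of |r| > 0.4 or so is nearly guaranteed by chance alone; the replication draw reveals the effect's true size.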

2. On July 22, RWJF (@RWJF) will host a webinar explaining its Evidence for Action program, granting $2.2M USD annually for Investigator-Initiated Research to Build a Culture of Health. "The program aims to provide individuals, organizations, communities, policymakers, and researchers with the empirical evidence needed to address the key determinants of health encompassed in the Culture of Health Action Framework. In addition, Evidence for Action will also support efforts to assess outcomes and set priorities for action. It will do this by encouraging and supporting creative, rigorous research on the impact of innovative programs, policies and partnerships on health and well-being, and on novel approaches to measuring health determinants and outcomes."

3. Your biases, in 5 tidy categories. We've heard it before, but this bears repeating: Our biases (confirmation, sunk cost, etc.) prevent us from making more equitable, efficient, and successful decisions. In strategy+business, Heidi Grant Halvorson and David Rock (@stratandbiz) present the SEEDS™ model, grouping the "150 or so known common biases into five categories, based on their underlying cognitive nature: similarity, expedience, experience, distance, and safety". Unfortunately, most established remedies and training don't overcome bias. But organizations/groups can apply correctional strategies more reliably than we can as individuals.

4. PricewaterhouseCoopers (@PwC_LLP) explains how four key stakeholders are pressuring pharma in 21st Century Pharmaceutical Collaboration: The Value Convergence. These four - government agencies, emboldened insurers, patient advocates, and new entrants bringing new evidence - are substantially shifting how medicine is developed and delivered. "Consumers are ready to abandon traditional modes of care for new ones, suggesting billions in healthcare revenue are up for grabs now. New entrants are bringing biosensor technology and digital tools to healthcare to help biopharmaceutical companies better understand the lives of patients, and how they change in response to drug intervention." These range from home diagnostic kits to algorithms that check symptoms and recommend treatments.

5. Remember 'Emotional Intelligence'? A 20-year retrospective study, funded by the Robert Wood Johnson Foundation (@RWJF) and appearing in July's American Journal of Public Health, suggests that kindergarten students who are more inclined to exhibit "social competence" traits - such as sharing, cooperating, or helping other kids - may be more likely to attain higher education and well-paying jobs. In contrast, students who exhibit weaker social competency skills may be more likely to drop out of high school, abuse drugs and alcohol, and need government assistance.

Tuesday, 14 July 2015

Data-driven organizations, machine learning for C-Suite, and healthcare success story.

1. Great stuff on data-driven decision making in a new O'Reilly book by Carl Anderson (@LeapingLlamas), Creating the Data-Driven Organization. Very impressive overview of the many things that need to happen, and best practices for making them happen. Runs the gamut from getting & analyzing the data, to creating the right culture, to the psychology of decision-making. Ugly Research is delighted to be referenced (pages 187-188 and Figure 9-7).

2. Healthcare success story. "Data-driven decision making has improved patient outcomes in Intermountain's cardiovascular medicine, endocrinology, surgery, obstetrics and care processes — while saving millions of dollars in procurement and in its supply chain."

3. 1) description, 2) prediction, 3) prescription. What the C-Suite needs to understand about applied machine learning. McKinsey's executive guide to machine learning 1.0, 2.0, and 3.0.

4. Place = Opportunity. Where kids grow up has a big impact on what they earn as adults; new evidence on patterns of upward mobility. Recap by @UrbanInstitute's Margery Austin Turner (@maturner).

5. Open innovation improves the odds of biotech product survival. Analysis by Deloitte's Ralph Marcello shows the value of working together, sharing R&D data.

Tuesday, 07 July 2015

Randomistas fight poverty, nurses fight child abuse, and decision support systems struggle.

1. Jason Zweig tells the story of randomistas, who use randomized, controlled trials to pinpoint what helps people become self-sufficient around the globe. The Anti-Poverty Experiment describes several successful, data-driven programs, ranging from financial counseling to grants of livestock.

2. Can an early childhood program prevent child abuse and neglect? Yes, says the Nurse-Family Partnership, which introduces vulnerable first-time parents to maternal and child-health nurses. NFP (@NFP_nursefamily) refines its methodology with randomized, controlled trial evidence satisfying the Coalition for Evidence-Based Policy’s “Top Tier”, and producing a positive return on investment.

3. Do recommendations from decision support technology improve the appropriateness of a physician's imaging orders? Not necessarily. JAMA provides evidence of the limitations of algorithmic medicine. An observational study shows it's difficult to attribute improvements to clinical decision support.

4. Is the "data-driven decision" a fallacy? Yes, says Stefan Conrady, arguing that the good alliteration is a bad motto. He explains on the BayesiaLab blog that the concept doesn't adequately encompass causal models, necessary for anticipating "the consequences of actions we have not yet taken". Good point.

5. A BMJ analysis says the knowledge system underpinning healthcare is not fit for purpose and must change. Ian Roberts says poor-quality, published studies are damaging systematic reviews, and that the Cochrane system needs improvement. Richard Lehman and others will soon respond on BMJ.

Friday, 05 June 2015

Evidence-based executives and insight integrators.

Five for Friday.

1. Lori C. Bieda of SAS is spot on, describing how analytics professionals can grow into roles as trusted advisors for senior executives. In The Translation Layer: The Role of Analytic Talent, she explains that "Analytics teams... need to evolve from data providers into insight integrators." Lots of detailed observations and recommendations in this white paper: Highly recommended.

2. Very cool. An executive-level course in Improving Decision-Making Through Evidence-Based Management. June 30-July 2 (UK). Led by Rob Briner of Bath University (@Rob_Briner) and Eric Barends of the Center for Evidence-Based Management.

3. More on the insight economy. “The rise of insights as a service represents a seismic shift in the business world. We believe it will create a huge new market segment.” IBM's Joel Cawley says data is a commodity, and analytics tools are often too complicated to use. So the insight service approach clears both these hurdles.

4. Fresh voice: Eugene Wei writes about Supposedly Irrelevant Factors, pondering why some insist on believing in rational homo economicus - and how behavioral economics repeatedly debunks this. "Hand waving is required because there is nothing in the workings of markets that turns otherwise normal human beings into Econs."

5. Even more on the insight economy. Krista Schnell (Accenture) explains how companies are monetizing their data further up the value chain, closer to the insight/decision step, and further from raw data. Synopsis, plus a link to the slides from her Strata presentation. (@KristaSchnell)

Friday, 19 December 2014

Fun-with-Evidence Friday: Super food facts and egg nog decision making.

Chocclaims

Super facts. Recently I attended a chocolate & cheese party, and took along some bars from Vosges. They're tasty, yes, but I was surprised by the back label, boasting super facts about this super dark, super food. Health benefits are said to include detoxification and general wellness. Nutritional boosters include anti-inflammatory properties.

Would you drink that egg nog if you knew its calorie count? The U.S. FDA is requiring menu labeling for alcoholic drinks. Advocates claim the "Centers for Disease Control and Prevention found that nearly 60 percent of adults use calorie information on menus to decide what to order". Not really: The CDC study includes the caveat that "data were not available to determine whether frequent or moderate [menu label] users choose more healthful foods than nonusers." So, evidently a 'label user' is someone who *says* they read the label. Will this regulatory intervention have any beneficial impact on public health? (Fun fact: In a previous life, I wrote a book about U.S. Nutrition Facts labeling requirements.)

Wishing everyone a delightful and prosperous 2015. 

Friday, 05 December 2014

"Big E" vs. "little e" evidence, the hero's journey, and provisional truth.

1. It's tempting to think there's a hierarchy for data: That evidence from high-quality experiments is on top at Level 1, and other research findings follow thereafter. But even in healthcare - the gold standard for the "gold standard" - it's not that simple, says NICE in The NICE Way: Lessons for Social Policy and Practice from the National Institute for Health and Care Excellence. The report, produced with the Alliance for Useful Evidence (@A4UEvidence), cautions against "slavish adherence" to hierarchies of research. "The appropriateness of the evidence to the question is more important, not its place in any such hierarchy." NICE articulates how its decision model works, and argues why it's also relevant outside healthcare.

2. Don't let "little e" evidence kill off good Big Ideas. Take note, lean startups + anyone new to the ways of validating ideas with evidence. In their should-be-considered-a-classic, Is Decision-Based Evidence Making Necessarily Bad?, Tingling and Brydon examine the different uses of evidence in decision-making (MIT Sloan Management Review). As a predictive tool, sometimes it's flat wrong: Those iconic Aeron chairs in tech offices everywhere? Utterly rejected by Herman Miller's market-research focus groups. It's good to have a culture where "small" evidence isn't just an excuse to avoid risk-taking. But it's also good to look at "Big E" Evidence, assessing what research is predictive over time, and replace older methods (focus groups, perhaps).

3. 10+ years ago, Billy Beane famously discovered powerful analytic insights for managing the Oakland A's baseball* team, and as a reward was portrayed by Brad Pitt in Moneyball. Now a bipartisan group of U.S. federal leaders and advisors has published Moneyball for Government, intending to encourage use of data, evidence, and evaluation in policy and funding decisions. On Twitter at @Moneyball4gov. *The A's play not far from Ugly Research HQ, and much to our dismay, the Moneyball competitive advantage has long since played out. But Billy is still a great analyst; for the moment, we're just holding our breath.

4. We've barely scratched the surface on figuring out how to present data to decision makers. All Analytics did a web series this week, The Results Are In: Think Presentation From the Start (recordings and slide decks available). One of the highlights was a comparison to Joseph Campbell's hero's journey, by James Haight of Blue Hill Research (on Twitter @James_Haight).

5. We're wired to seek certainty. But Ted Cadsby argues the world is too complex for our simplified conclusions. He suggests probabilistic thinking to arrive at a "provisional truth" that we can test over time in Closing the Mind Gap: Making Smarter Decisions in a Hypercomplex World.

Five for Friday: 5-Dec-2014.

Monday, 17 November 2014

Cognitive analytics, satisficing, sabermetrics, and other cool ways of deciding.

My weekly five links on evidence & decision-making. 

1. I was delighted when a friend sent a link to 10 Things I Believe About Baseball Without Evidence. Ken Arneson (@kenarneson) looks at sabermetrics with things like the linguistic relativity principle and the science of memory-based prediction. Warning to Oakland A's fans: Includes yet more reminders of the epic 2014 collapse.

2. The concept of satisficing came up on one of my current projects, so I reread some overviews. Herbert Simon coined this term to explain decision making under circumstances where an optimal solution can't be determined. (Simon sure has staying power: During grad school I wanted a copy of Administrative Behavior (1957), and there it was in paperback on the shelf at the Tattered Cover in Denver.)
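
As a loose sketch (my own toy illustration, not Simon's formulation), satisficing can be contrasted with optimizing: examine options in order and stop at the first one that clears an aspiration level, rather than scoring them all.

```python
def satisfice(options, score, aspiration):
    """Return the first option whose score meets the aspiration level.

    Stops as soon as something is 'good enough' - useful when
    enumerating and scoring every option is impossible or too costly.
    Returns None if nothing clears the bar.
    """
    for option in options:
        if score(option) >= aspiration:
            return option
    return None

# Toy example: hiring. Interview candidates in order and stop at the
# first one who meets the bar, instead of ranking the whole pool.
candidates = [("Ada", 6), ("Ben", 9), ("Cy", 10)]
hire = satisfice(candidates, lambda c: c[1], aspiration=8)
print(hire)  # ('Ben', 9) - Cy scores higher, but we never looked
```

The trade-off is explicit: the result depends on the ordering of options and the aspiration level, not on any guarantee of optimality.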

3. Eric Topol (Medscape) interviewed Margaret Hamburg, US FDA commissioner, on weighing risks/benefits in complex agency decision-making. She talked specifically about the newly restrictive policy on opiate painkiller prescriptions (see page 3). I'd like to see FDA's analysis of unintended side effects, such as the concurrent rise of U.S. heroin use/overdoses.

4. Organizations are challenged with finding the pony in all their data, and marketing-spin hijinks have ensued. Seth Grimes (@sethgrimes) has a great discussion of efforts to go beyond the fundamental volume, velocity, etc. in Avoid Wanna-V Confusion.

5. Steve Ardire (@SArdire) authored a Dataversity paper on cognitive computing, including survey results and some choice comments, both pro and con, about the prospects for this emerging area. For creating business value, 'business intelligence/cognitive analytics' looks promising. Cognitive Computing: An Emerging Hub in IT Ecosystems.

Curated by Tracy Altman of Ugly Research