Evidence Soup
How to find, use, and explain evidence.

Tuesday, 28 July 2015

10 Years After Ioannidis, speedy decision habits, and the peril of whether or not.

1. Much has happened in the 10 years since Why Most Published Research Findings Are False, the much-discussed PLOS essay by John P. A. Ioannidis offering evidence that "false findings may be the majority or even the vast majority of published research claims...." Why are so many findings never replicated? Ioannidis pointed to study power and bias, the number of studies, and the ratio of true to no relationships among those probed in a given scientific field. He also blamed "the convenient, yet ill-founded strategy of claiming conclusive research findings solely on... formal statistical significance, typically for a p-value less than 0.05."
Now numerous initiatives address the false-findings problem with innovative publishing models, prohibition of p-values, or study design standards. Ioannidis followed up with 2014's How to Make More Published Research True, noting improvements in credibility and efficiency in specific fields via "large-scale collaborative research; replication culture; registration; sharing; reproducibility practices; better statistical methods;... reporting and dissemination of research, and training of the scientific workforce."
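
For the quantitatively curious, the arithmetic behind "most findings false" is the essay's positive predictive value (PPV) formula: the probability that a statistically significant finding is actually true, given the pre-study odds R, study power 1 - beta, and significance threshold alpha. Here's a minimal sketch; the formula is from the Ioannidis essay, but the illustrative inputs are my assumptions.

```python
# Positive predictive value of a "significant" finding, per Ioannidis (2005):
# PPV = (1 - beta) * R / (R - beta * R + alpha), where R is the pre-study odds
# that a probed relationship is true, (1 - beta) is power, and alpha is the
# significance threshold. The example inputs below are assumed for illustration.

def ppv(R, power, alpha=0.05):
    """Probability that a statistically significant finding is actually true."""
    beta = 1 - power
    return (power * R) / (R - beta * R + alpha)

# A speculative field (1 true relationship per 10 probed), with weak studies:
print(round(ppv(R=0.1, power=0.2), 2))  # 0.29 -- most "findings" are false
print(round(ppv(R=0.1, power=0.8), 2))  # 0.62 -- more power helps, but is no cure
```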

2. Speedy decision habits -> Fastest in market -> Winning. Dave Girouard, CEO of personal finance startup Upstart and former Google apps head, believes speedy decision-making is essential to competing, in product development and every other organizational function. He explains how people can develop speed as a healthy habit, since relatively little is "written about how to develop the institutional and employee muscle necessary to make speed a serious competitive advantage." Key tip: Deciding *when* a decision will be made from the start is a profound, powerful change that speeds everything up.

3. Busy, a new book by Tony Crabbe (@tonycrabbe), considers why people feel overwhelmed and dissatisfied - and suggests steps for improving their personal & work lives. Psychological and business research are translated into practical tools and skills. The book covers a range of perspectives; one worth noting is "The Perils of Whether or Not" (page 31): Crabbe cites classic decision research demonstrating the benefits of choosing from multiple options, vs. continuously (and busily) grinding through one alternative at a time. BUSY: How to Thrive in a World of Too Much, Grand Central Publishing, $28.

4. Better lucky than smart? Eric McNulty reminds us of a costly, and all-too-common, decision making flaw: Outcome bias, when we evaluate the quality of a decision based on its final result. His strategy+business article explains we should be objectively assessing whether an outcome was achieved by chance or through a sound process - but it's easy to fall into the trap of positively judging only those efforts with happy endings (@stratandbiz).

5. Fish vs. Frog: It's about values, not just data. Great reminder from Denis Cuff @DenisCuff of @insidebayarea that the data won't always tell you where to place value. One SF Bay Area environmental effort to save a fish might be endangering a frog species.

Monday, 20 July 2015

The Cardinal Sin of data science, Evidence for Action $, and your biases in 5 easy steps.

My 5 weekly links on evidence-based decision making.

1. Confusing correlation with causation is not the Cardinal Sin of data science, say Gregory Piatetsky (@kdnuggets) and Anmol Rajpurohit (@hey_anmol): It's overfitting. Oftentimes, researchers "test numerous hypotheses without proper statistical control, until they happen to find something interesting and report it. Not surprisingly, next time the effect, which was (at least partly) due to chance, will be much smaller or absent." This explains why it's often difficult to replicate prior findings. "Overfitting is not the same as another major data science mistake - confusing correlation and causation. The difference is that overfitting finds something where there is nothing. In case of correlation and causation, researchers can find a genuine novel correlation and only discover a cause much later."
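
To see that trap in action, here's a minimal simulation of testing many hypotheses against pure noise; the 100-variable, 200-observation setup is an assumption for illustration.

```python
# Simulate "testing numerous hypotheses without proper statistical control":
# correlate 100 random, unrelated variables with a random outcome and count
# how many look "significant" at p < 0.05. Everything here is pure noise.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
outcome = rng.normal(size=200)            # 200 observations, no real signal
candidates = rng.normal(size=(200, 100))  # 100 unrelated candidate predictors

p_values = [stats.pearsonr(candidates[:, i], outcome)[1] for i in range(100)]
hits = sum(p < 0.05 for p in p_values)
print(f"{hits} of 100 correlations are 'significant' in random noise")
# Roughly 5 false positives are expected; report only those, and the "effect"
# shrinks or vanishes on replication -- exactly the pattern described above.
```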

2. July 22, RWJF (@RWJF) will host a webinar explaining its Evidence for Action program, granting $2.2M USD annually for Investigator-Initiated Research to Build a Culture of Health. "The program aims to provide individuals, organizations, communities, policymakers, and researchers with the empirical evidence needed to address the key determinants of health encompassed in the Culture of Health Action Framework. In addition, Evidence for Action will also support efforts to assess outcomes and set priorities for action. It will do this by encouraging and supporting creative, rigorous research on the impact of innovative programs, policies and partnerships on health and well-being, and on novel approaches to measuring health determinants and outcomes."

3. Your biases, in 5 tidy categories. We've heard it before, but this bears repeating: Our biases (confirmation, sunk cost, etc.) prevent us from making more equitable, efficient, and successful decisions. In strategy+business, Heidi Grant Halvorson and David Rock (@stratandbiz) present the SEEDS™ model, grouping the "150 or so known common biases into five categories, based on their underlying cognitive nature: similarity, expedience, experience, distance, and safety". Unfortunately, most established remedies and training don't overcome bias. But organizations/groups can apply correctional strategies more reliably than we can as individuals.

4. PricewaterhouseCoopers (@PwC_LLP) explains how four key stakeholders are pressuring pharma in 21st Century Pharmaceutical Collaboration: The Value Convergence. These four - government agencies, emboldened insurers, patient advocates, and new entrants bringing new evidence - are substantially shifting how medicine is developed and delivered. "Consumers are ready to abandon traditional modes of care for new ones, suggesting billions in healthcare revenue are up for grabs now. New entrants are bringing biosensor technology and digital tools to healthcare to help biopharmaceutical companies better understand the lives of patients, and how they change in response to drug intervention." These range from home diagnostic kits to algorithms that check symptoms and recommend treatments.

5. Remember 'Emotional Intelligence'? A 20-year retrospective study, funded by the Robert Wood Johnson Foundation (@RWJF) and appearing in July's American Journal of Public Health, suggests that "kindergarten students who are more inclined to exhibit 'social competence' traits - such as sharing, cooperating, or helping other kids - may be more likely to attain higher education and well-paying jobs. In contrast, students who exhibit weaker social competency skills may be more likely to drop out of high school, abuse drugs and alcohol, and need government assistance."

Tuesday, 14 July 2015

Data-driven organizations, machine learning for C-Suite, and healthcare success story.

1. Great stuff on data-driven decision making in a new O'Reilly book by Carl Anderson (@LeapingLlamas), Creating a Data-Driven Organization. Very impressive overview of the many things that need to happen, and best practices for making them happen. Runs the gamut from getting and analyzing the data, to creating the right culture, to the psychology of decision-making. Ugly Research is delighted to be referenced (pages 187-188 and Figure 9-7).

2. Healthcare success story. "Data-driven decision making has improved patient outcomes in Intermountain's cardiovascular medicine, endocrinology, surgery, obstetrics and care processes — while saving millions of dollars in procurement and in its supply chain."

3. 1) Description, 2) prediction, 3) prescription: what the C-Suite needs to understand about applied machine learning. McKinsey's executive guide to machine learning 1.0, 2.0, and 3.0.

4. Place = Opportunity. Where kids grow up has a big impact on what they earn as adults; new evidence on patterns of upward mobility. Recap by @UrbanInstitute's Margery Austin Turner (@maturner).

5. Open innovation improves the odds of biotech product survival. Analysis by Deloitte's Ralph Marcello shows the value of working together, sharing R&D data.

Tuesday, 07 July 2015

Randomistas fight poverty, nurses fight child abuse, and decision support systems struggle.

1. Jason Zweig tells the story of randomistas, who use randomized, controlled trials to pinpoint what helps people become self-sufficient around the globe. The Anti-Poverty Experiment describes several successful, data-driven programs, ranging from financial counseling to grants of livestock.

2. Can an early childhood program prevent child abuse and neglect? Yes, says the Nurse-Family Partnership, which introduces vulnerable first-time parents to maternal and child-health nurses. NFP (@NFP_nursefamily) refines its methodology with randomized, controlled trial evidence satisfying the Coalition for Evidence-Based Policy’s “Top Tier”, and producing a positive return on investment.

3. Do recommendations from decision support technology improve the appropriateness of a physician's imaging orders? Not necessarily. JAMA provides evidence of the limitations of algorithmic medicine. An observational study shows it's difficult to attribute improvements to clinical decision support.

4. Is the "data-driven decision" a fallacy? Yes, says Stefan Conrady, arguing that the good alliteration is a bad motto. He explains on the BayesiaLab blog that the concept doesn't adequately encompass casual models, necessary for anticipating "the consequences of actions we have not yet taken". Good point.

5. A BMJ analysis says the knowledge system underpinning healthcare is not fit for purpose and must change. Ian Roberts says poor-quality, published studies are damaging systematic reviews, and that the Cochrane system needs improvement. Richard Lehman and others will soon respond on BMJ.

Friday, 05 June 2015

Evidence-based executives and insight integrators.

Five for Friday.

1. Lori C. Bieda of SAS is spot on, describing how analytics professionals can grow into roles as trusted advisors for senior executives. In The Translation Layer: The Role of Analytic Talent, she explains that "Analytics teams... need to evolve from data providers into insight integrators." Lots of detailed observations and recommendations in this white paper: Highly recommended.

2. Very cool. An executive-level course in Improving Decision-Making Through Evidence-Based Management. June 30-July 2 (UK). Led by Rob Briner of Bath University (@Rob_Briner) and Eric Barends of the Center for Evidence-Based Management.

3. More on the insight economy. “The rise of insights as a service represents a seismic shift in the business world. We believe it will create a huge new market segment.” IBM's Joel Cawley says data is a commodity, and analytics tools are often too complicated to use. So the insight service approach clears both these hurdles.

4. Fresh voice: Eugene Wei writes about Supposedly Irrelevant Factors, pondering why some insist on believing in rational homo economicus - and how behavioral economics repeatedly debunks this. "Hand waving is required because there is nothing in the workings of markets that turns otherwise normal human beings into Econs."

5. Even more on the insight economy. Krista Schnell (Accenture) explains how companies are monetizing their data further up the value chain, closer to the insight/decision step, and further from raw data. Synopsis, plus a link to the slides from her Strata presentation. (@KristaSchnell)

Friday, 19 December 2014

Fun-with-Evidence Friday: Super food facts and egg nog decision making.

Chocclaims

Super facts. Recently I attended a chocolate & cheese party, and took along some bars from Vosges. It's tasty, yes, but I was surprised by the back label, boasting super facts about this super dark, super food. Claimed health benefits include detoxification and general wellness; claimed nutritional boosters include anti-inflammatory properties.

Would you drink that egg nog if you knew its calorie count? The U.S. FDA is requiring menu labeling for alcoholic drinks. Advocates claim the "Centers for Disease Control and Prevention found that nearly 60 percent of adults use calorie information on menus to decide what to order". Not really: The CDC study includes the caveat that "data were not available to determine whether frequent or moderate [menu label] users choose more healthful foods than nonusers." So, evidently a 'label user' is someone who *says* they read the label. Will this regulatory intervention have any beneficial impact on public health? (Fun fact: In a previous life, I wrote a book about U.S. Nutrition Facts labeling requirements.)

Wishing everyone a delightful and prosperous 2015. 

Friday, 05 December 2014

"Big E" vs. "little e" evidence, the hero's journey, and provisional truth.

1. It's tempting to think there's a hierarchy for data: That evidence from high-quality experiments is on top at Level 1, and other research findings follow thereafter. But even in healthcare - the gold standard for the "gold standard" - it's not that simple, says NICE in The NICE Way: Lessons for Social Policy and Practice from the National Institute for Health and Care Excellence. The report, produced with the Alliance for Useful Evidence (@A4UEvidence), cautions against "slavish adherence" to hierarchies of research. "The appropriateness of the evidence to the question is more important, not its place in any such hierarchy." NICE articulates how its decision model works, and argues why it's also relevant outside healthcare.

2. Don't let "little e" evidence kill off good Big Ideas. Take note, lean startups + anyone new to the ways of validating ideas with evidence. In their should-be-considered-a-classic, Is Decision-Based Evidence Making Necessarily Bad?, Tingling and Brydon examine the different uses of evidence in decision-making (MIT Sloan Management Review). As a predictive tool, sometimes it's flat wrong: Those iconic Aeron chairs in tech offices everywhere? Utterly rejected by Herman Miller's market-research focus groups. It's good to have a culture where "small" evidence isn't just an excuse to avoid risk-taking. But it's also good to look at "Big E" Evidence, assessing which research methods are predictive over time, and replacing older ones (focus groups, perhaps).

3. 10+ years ago, Billy Beane famously discovered powerful analytic insights for managing the Oakland A's baseball* team, and as a reward was portrayed by Brad Pitt in Moneyball. Now a bipartisan group of U.S. federal leaders and advisors has published Moneyball for Government, intending to encourage use of data, evidence, and evaluation in policy and funding decisions. On Twitter at @Moneyball4gov. *The A's play not far from Ugly Research HQ, and much to our dismay, the Moneyball competitive advantage has long since played out. But Billy is still a great analyst; for the moment, we're just holding our breath.

4. We've barely scratched the surface on figuring out how to present data to decision makers. All Analytics did a web series this week, The Results Are In: Think Presentation From the Start (recordings and slide decks available). One of the highlights was a comparison to Joseph Campbell's hero's journey, by James Haight of Blue Hill Research (on Twitter @James_Haight).

5. We're wired to seek certainty. But Ted Cadsby argues the world is too complex for our simplified conclusions. He suggests probabilistic thinking to arrive at a "provisional truth" that we can test over time in Closing the Mind Gap: Making Smarter Decisions in a Hypercomplex World.
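
For what it's worth, "provisional truth" maps neatly onto Bayesian updating: hold a belief as a probability and revise it as evidence arrives. A minimal sketch, with an assumed prior and likelihoods:

```python
# Sketch of probabilistic "provisional truth": represent a belief as a
# probability and update it with each observation via Bayes' rule.
# The starting prior and the likelihoods are assumptions for illustration.

def update(prior, p_obs_if_true, p_obs_if_false):
    """Posterior probability of a hypothesis after one observation."""
    numerator = prior * p_obs_if_true
    return numerator / (numerator + (1 - prior) * p_obs_if_false)

belief = 0.5                      # start genuinely uncertain
for _ in range(3):                # three supportive observations arrive
    belief = update(belief, p_obs_if_true=0.8, p_obs_if_false=0.3)
print(round(belief, 2))           # ~0.95: a provisional truth, still revisable
```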

Five for Friday: 5-Dec-2014.

Monday, 17 November 2014

Cognitive analytics, satisficing, sabermetrics, and other cool ways of deciding.

My weekly five links on evidence & decision-making. 

1. I was delighted when a friend sent a link to 10 Things I Believe About Baseball Without Evidence. Ken Arneson (@kenarneson) looks at sabermetrics through the lens of the linguistic relativity principle and the science of memory-based prediction. Warning to Oakland A's fans: Includes yet more reminders of the epic 2014 collapse.

2. The concept of satisficing came up on one of my current projects, so I reread some overviews. Herbert Simon coined this term to explain decision making under circumstances where an optimal solution can't be determined. (Simon sure has staying power: During grad school I wanted a copy of Administrative Behavior (1957), and there it was in paperback on the shelf at the Tattered Cover in Denver.)
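
For anyone who hasn't bumped into the term, here's a minimal sketch of satisficing: accept the first option that clears an aspiration level, rather than exhaustively hunting for the optimum. The vendor names, scores, and threshold are hypothetical.

```python
# Sketch of Simon's satisficing: stop at the first option that meets an
# aspiration level instead of scoring everything to find the true optimum.
# The vendors, scores, and aspiration level below are hypothetical.

def satisfice(options, score, aspiration):
    """Return the first option whose score meets the aspiration level."""
    for option in options:
        if score(option) >= aspiration:
            return option
    return None  # nothing was good enough; relax the aspiration and retry

vendors = [("A", 6.1), ("B", 7.4), ("C", 9.8), ("D", 7.9)]
print(satisfice(vendors, score=lambda v: v[1], aspiration=7.0))
# ('B', 7.4) -- good enough, chosen without evaluating the remaining options
```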

3. Eric Topol (Medscape) interviewed Margaret Hamburg, US FDA commissioner, on weighing risks/benefits in complex agency decision-making. She talked specifically about the newly restrictive policy on opiate painkiller prescriptions (see page 3). I'd like to see FDA's analysis of unintended side effects, such as the concurrent rise of U.S. heroin use/overdoses.

4. Organizations are challenged with finding the pony in all their data, and marketing-spin hijinks have ensued. Seth Grimes (@sethgrimes) has a great discussion of efforts to go beyond the fundamental volume, velocity, etc. in Avoid Wanna-V Confusion.

5. Steve Ardire (@SArdire) authored a Dataversity paper on cognitive computing, including survey results and some choice comments, both pro and con, about the prospects for this emerging area. For creating business value, 'business intelligence/cognitive analytics' looks promising. Cognitive Computing: An Emerging Hub in IT Ecosystems.

Curated by Tracy Altman of Ugly Research

 

Friday, 07 November 2014

Data & Decision-Making: Five for Friday 7-Nov-2014.

1. "A gut is a personal, nontransferable attribute, which increases the value of a good one." This classic from Harvard Business Review recaps how policy makers have historically made big decisions. It's never just about the data. A Brief History of Decision Making.

2. James Taylor, an expert on decision management whom I admire, is coauthor (with Tom Debevoise) of a new book bringing a decision management perspective to process discovery and design. I like to think everything's better with explicitly identified decision steps, and the DMN notation makes it much easier to model, execute, and interchange business rules. MicroGuide to Process and Decision Modeling in BPMN/DMN.

3. A reminder to look for the nonobvious. This analysis examines differences in parole hearing outcomes: The usual suspects, such as crime committed, don't always explain why one prisoner is paroled and another is not. Turns out, it's best to go up first thing in the morning. [NY Times] Do You Suffer From Decision Fatigue?

4. IBM sponsored a paper by Ventana Research connecting advanced analytics and business intelligence (finally!), making the excellent point that unless BI insights "are inserted into decision-making processes," they have minimal value. (And then sadness. Ventana weakens their case with "Robust new technology enables better decision-making." How do you measure ‘robust’?) Available at Advanced Analytics Enhances Business Intelligence.

5. Doctor 'decider' skills fade after lunchtime. Bring snacks. Researchers analyzed billing and electronic health records for 20,000+ patients; about 5% more received antibiotic prescriptions later in the day. Findings in JAMA Internal Medicine. Data Shows Docs Prescribe More Antibiotics Late in Day.

 

Are you interested in creating better ways to inform people's decision making? At Ugly Research, I'm always looking for people who want to shape this idea.

Wednesday, 15 October 2014

The lean way to present evidence to decision makers.

There's a lot of advice out there about how to design presentations. But most of that fails to prepare you for delivering complex evidence to senior-level decision-makers. Here's what I do. Hope this helps.

First, ask yourself this: How might your evidence help someone better understand the steps required to reach an important goal?

  1. To develop an answer, put together what I call lean evidence, embracing lean management concepts. As explained by the Lean Enterprise Institute, "The core idea is to maximize customer value while minimizing waste." Keep this in mind when planning a presentation, writing your report, or sending that email: Focus on what's valuable, and reduce waste by stripping out nonessentials, letting value flow past the details to what's important to your audience.
  2. Skip the storytelling. Begin with "Boom! Here's my answer." You're not Steve Jobs, and this isn't a TED talk. You're delivering lean evidence to a busy executive, so think of all that buildup as waste. Stay true to lean, and get rid of it. Jeanne Tari, VP at Power Speaking, makes a similar point, saying the way to present to executives is to "bottom line it first, then have a dialogue".
  3. Go easy on the pretty pictures. Everybody loves eye candy. But the data visualization is not the point: It just helps you make your point.
  4. Connect dots that matter. Keep the focus on your insights, and how they can help the decision maker improve outcomes. (If you find that you're simply reporting results without connecting at least two important things together, then go back and re-evaluate.)
  5. Avoid the dreaded SMEs disease. Provide enough detail about your methods to establish credibility as a subject matter expert. Then stop. Pay yourself $5 for every word you delete. Andrew a/k/a @theFundingGuru gives this advice, and I agree.