Evidence Soup
How to find, use, and explain evidence.

104 posts categorized "science & research methods"

Tuesday, 02 February 2016

The new ISPOR #pharma health decision guidance: How it's like HouseHunters.

[Image: House Hunters decision checklist]

'Multiple criteria decision analysis' is a crummy name for a great concept (aren't all big decisions analyzed using multiple criteria?). MCDA means assessing alternatives while simultaneously considering several objectives. It's a useful way to look at difficult choices in healthcare, oil production, or real estate. But oftentimes, results of these analyses aren't communicated clearly, limiting their usefulness (more about that below).

The International Society for Pharmacoeconomics and Outcomes Research (ISPOR) has developed new MCDA guidance, available in the latest issue of Value in Health (paywall). To be sure, healthcare decision makers have always weighed medical, social, and economic factors: MCDA helps stakeholders bring structure and transparency to the process of evaluating outcomes research - where, as we know, controversy is always a possibility.

Anyone can use MCDA. To put it mildly, it’s difficult to balance saving lives with saving money. Fundamentally, MCDA means listing options, defining decision criteria, weighting those criteria, and then scoring each option. Some experts build complex economic models, but anyone can apply this decision technique in effective, less rigorous ways.
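
To make the mechanics concrete, here's a minimal sketch in Python of a weighted-sum MCDA - the simplest scoring approach. Every option, criterion, and number below is hypothetical, purely for illustration:

    # A minimal weighted-sum MCDA: list options, define criteria,
    # weight the criteria, then score each option. All names and
    # numbers here are hypothetical.
    options = {
        "Treatment A": {"efficacy": 9, "safety": 5, "cost": 5},
        "Treatment B": {"efficacy": 6, "safety": 9, "cost": 7},
        "Treatment C": {"efficacy": 7, "safety": 6, "cost": 6},
    }
    # Relative importance of each criterion (sums to 1.0); "cost" is
    # scored so that higher = more affordable.
    weights = {"efficacy": 0.5, "safety": 0.3, "cost": 0.2}

    def score(criteria, w):
        """Weighted sum of one option's criterion scores."""
        return sum(w[c] * v for c, v in criteria.items())

    for name in sorted(options, key=lambda o: -score(options[o], weights)):
        print(f"{name}: {score(options[name], weights):.2f}")
    # Treatment B edges out A, 7.10 to 7.00 - a close call.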

You know those checklists at the end of every HouseHunters episode where buyers weigh location and size against budget? That's essentially it: People making important decisions, applying judgment, and weighing multiple goals (raise the kids in the city or the burbs?) - and even though they start out by ranking priorities, once buyers see their actual options, deciding on a house becomes substantially more complex.

MCDA gains traction in health economics. As shown in the diagram (source: ISPOR), the analysis hinges on assigning relative weights to individual decision criteria. While this brings rationality and transparency to complex decisions, it also invites passionate discussions. Some might expect these techniques to remove human judgment from the process, but MCDA leaves it front and center.


Looking for new ways to communicate health economics research and other medical evidence? Join me and other speakers at the 2nd annual HEOR Writing workshop in March.                                      


Pros and cons. Let’s not kid ourselves: You have to optimize on something. MCDA is both beautiful and terrifying because it forces us to identify tradeoffs: Quality, quick improvement, long-term health benefits? Uncertain outcomes only complicate things further.

MCDA is a great way to bring interdisciplinary groups into a conversation. It's essential to communicate the analysis effectively, so stakeholders understand the data and why they matter - without burying them in so much detail that the audience is lost.

One of the downsides is that, upon seeing elaborate projections and models, people can become overconfident in the numbers. Uncertainty is never fully recognized or quantified. (Recall the Rumsfeldian unknown unknown.) Sensitivity analysis is essential to illustrate which predicted outcomes are strongly influenced by small adjustments to the inputs.
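
One way to make that concrete: perturb each weight up and down and watch whether the top-ranked option changes. A small sketch, repeating the hypothetical treatments from above so it runs on its own:

    # Sensitivity check on the hypothetical MCDA above: nudge each
    # weight, renormalize, and see whether the winner changes.
    options = {
        "Treatment A": {"efficacy": 9, "safety": 5, "cost": 5},
        "Treatment B": {"efficacy": 6, "safety": 9, "cost": 7},
        "Treatment C": {"efficacy": 7, "safety": 6, "cost": 6},
    }
    weights = {"efficacy": 0.5, "safety": 0.3, "cost": 0.2}

    def score(criteria, w):
        return sum(w[c] * v for c, v in criteria.items())

    def winner(w):
        return max(options, key=lambda o: score(options[o], w))

    baseline = winner(weights)
    for criterion in weights:
        for delta in (-0.1, 0.1):
            w = dict(weights)
            w[criterion] = max(0.0, w[criterion] + delta)
            total = sum(w.values())
            w = {c: v / total for c, v in w.items()}  # renormalize
            flag = "" if winner(w) == baseline else "  <-- ranking flips"
            print(f"{criterion} {delta:+.1f}: {winner(w)}{flag}")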

Resources to learn more. If you want to try MCDA, I strongly recommend picking up one of the classic texts, such as Smart Choices: A Practical Guide to Making Better Decisions. Additionally, ISPOR's members offer useful insights into the pluses and minuses of this methodology - see, for example, Does the Future Belong to MCDA? The level of discourse over this guidance illustrates how challenging healthcare decisions have become.  

I'm presenting at the HEOR Writing workshop on communicating value messages clearly with data. March 17-18 in Philadelphia.

Tuesday, 12 January 2016

Game theory for Jeopardy!, evidence for gun control, and causality.

This week's 5 links on evidence-based decision making.

1. Deep knowledge → Wagering strategy → Jeopardy! win
Some Jeopardy! contestants struggle with the strategic elements of the show. Rescuing us is Keith Williams (@TheFinalWager), with the definitive primer on Jeopardy! strategy, applying game theory to every episode and introducing "the fascinating world of determining the optimal approach to almost anything".
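
For flavor, here's a rough sketch of the classic two-player Final Jeopardy! wagering arithmetic that Williams builds on. The scores are invented, and the real strategy covers many more scenarios:

    # Two-player Final Jeopardy! wager arithmetic, sketched. Scores are
    # hypothetical; the full strategy (see @TheFinalWager) also covers
    # ties, three players, and stranger cases.
    def leader_wager(leader, second):
        """Smallest bet that wins when the leader answers correctly:
        cover second place doubling up, plus a dollar."""
        return max(0, 2 * second - leader + 1)

    def second_max_safe_wager(leader, second):
        """Largest bet second place can make and still win when BOTH
        players miss, assuming the leader bet the standard cover."""
        leader_after_miss = leader - leader_wager(leader, second)
        return max(0, second - leader_after_miss - 1)

    leader, second = 15000, 10000                  # hypothetical scores
    print(leader_wager(leader, second))            # 5001
    print(second_max_safe_wager(leader, second))   # 0 -> bet small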

2. Gun controls → Less violence? → Less tragedy?
Does the evidence support new US gun control proposals? In the Pacific Standard, Francie Diep cites several supporting scientific studies.

3. New data sources → Transparent methods → Health evidence
Is 'real-world' health evidence closer to the truth than data from more traditional categories? FDA staff explain in What We Mean When We Talk About Data. Thanks to @MandiBPro.

4. Data model → Cause → Effect
In Why: A Guide to Finding and Using Causes, Samantha Kleinberg aims to explain why causality is often misunderstood and misused: What is it, why is it so hard to find, and how can we do better at interpreting it? The book excerpt explains that "Understanding when our inferences are likely to be wrong is particularly important for data science, where we’re often confronted with observational data that is large and messy (rather than well-curated for research)."
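
A tiny simulation (my own, with invented probabilities) shows the classic way observational inferences go wrong: a confounder produces a strong association between X and Y even though X has no effect on Y at all.

    import random

    random.seed(0)

    # Hypothetical confounding demo: Z causes both X and Y, so X and Y
    # are associated even though X never influences Y.
    n = 100_000
    rows = []
    for _ in range(n):
        z = random.random() < 0.5                    # confounder
        x = random.random() < (0.8 if z else 0.2)    # driven by z
        y = random.random() < (0.7 if z else 0.3)    # driven only by z
        rows.append((z, x, y))

    def p_y(x_val, z_val=None):
        sel = [y for z, x, y in rows
               if x == x_val and (z_val is None or z == z_val)]
        return sum(sel) / len(sel)

    print(round(p_y(True) - p_y(False), 3))                # ~0.24: looks causal
    print(round(p_y(True, True) - p_y(False, True), 3))    # ~0 within Z=1
    print(round(p_y(True, False) - p_y(False, False), 3))  # ~0 within Z=0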

5. Empirical results → Verification → Scientific understanding
Independent verification is essential to scientific progress. But in academia, verifying empirical results is difficult and not rewarded. This is the reason for Curate Science, a tool making it easier for researchers to independently verify each other’s evidence and award credit for doing so. Follow @CurateScience.

Join me at the HEOR writing workshop March 17 in Philadelphia. I'm speaking about communicating data, and leading an interactive session on data visualization. Save $300 before Jan 15.

Tuesday, 22 December 2015

Asthma heartbreak, cranky economists, and prediction markets.

This week's 5 links on evidence-based decision making.

1. Childhood stress → Cortisol → Asthma
Heartbreaking stories explain likely connections between difficult childhoods and asthma. Children in Detroit suffer a high incidence of attacks - regardless of allergens, air quality, and other factors. Peer-reviewed research shows excess cortisol may be to blame.

2. Prediction → Research heads up → Better evidence
Promising technique for meta-research. A prediction market was created to quantify the reproducibility of 44 studies published in prominent psychology journals, and to estimate the likelihood of hypothesis acceptance at different stages. The market outperformed individual forecasts, as described in PNAS (Proceedings of the National Academy of Sciences).
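
For intuition about the mechanics, here's a sketch of a two-outcome logarithmic market scoring rule (LMSR), a common automated market maker for prediction markets. This is my illustration, not necessarily how the study's market was implemented:

    import math

    # Two-outcome logarithmic market scoring rule (LMSR). Prices can be
    # read as the market's probability estimate. Parameters invented.
    class LMSRMarket:
        def __init__(self, b=100.0):
            self.b = b               # liquidity parameter
            self.q = [0.0, 0.0]      # shares sold: [replicates, fails]

        def cost(self, q):
            return self.b * math.log(sum(math.exp(v / self.b) for v in q))

        def price(self, outcome):
            """Current price, interpretable as market probability."""
            denom = sum(math.exp(v / self.b) for v in self.q)
            return math.exp(self.q[outcome] / self.b) / denom

        def buy(self, outcome, shares):
            """Charge a trader for buying `shares` of `outcome`."""
            new_q = list(self.q)
            new_q[outcome] += shares
            charge = self.cost(new_q) - self.cost(self.q)
            self.q = new_q
            return charge

    market = LMSRMarket()
    print(round(market.price(0), 2))   # 0.5 before any trades
    market.buy(0, 40)                  # traders back "will replicate"
    print(round(market.price(0), 2))   # ~0.6 after the buying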

3. Fuzzy evidence → Wage debate → Policy fail
More fuel for the minimum-wage fire. Depending on who you ask, a high minimum wage either bolsters the security of hourly workers or destroys the jobs they depend on. Recent example: David Neumark's claims about unfavorable evidence.

4. Decision tools → Flexible analysis → Value-based medicine
Drug Abacus is an interactive tool for understanding drug pricing. This very interesting project, led by Peter Bach at Memorial Sloan Kettering, compares the price of a drug (US$) with its "worth", based on outcomes, toxicity, and other factors. Hopefully @drugabacus signals the future for health technology assessment and value-based medicine.

5. Cognitive therapy → Depression relief → Fewer side effects
A BMJ systematic review and meta-analysis show that depression can be treated with cognitive behavior therapy, possibly with outcomes equivalent to antidepressants. Consistent CBT treatment is a challenge, however. AHRQ reports similar findings from comparative effectiveness research; the CER study illustrates how to employ expert panels to transparently select research questions and parameters.

Tuesday, 17 November 2015

ROI from evidence-based government, milking data for cows, and flu shot benefits diminishing.

This week's 5 links on evidence-based decision making.

1. Evidence standards → Knowing what works → Pay for success
Susan Urahn says we've reached a Tipping Point on Evidence-Based Policymaking. She explains in @Governing that 24 US governments have directed $152M to programs with an estimated $521M ROI: "an innovative and rigorous approach to policymaking: Create an inventory of currently funded programs; review which ones work based on research; use a customized benefit-cost model to compare programs based on their return on investment; and use the results to inform budget and policy decisions."

2. Sensors → Analytics → Farming profits
Precision dairy farming uses RFID tags, sensors, and analytics to track the health of cows. Brian T. Horowitz (@bthorowitz) writes on TechCrunch about how farmers are milking big data for insight. Literally. Thanks to @ShellySwanback.

3. Public acceptance → Annual flu shots → Weaker response?
Yikes. Now that flu shot programs are gaining acceptance, there's preliminary evidence suggesting that repeated annual shots can gradually reduce their effectiveness under some circumstances. Scientists at the Marshfield Clinic Research Foundation recently reported that "children who had been vaccinated annually over a number of years were more likely to contract the flu than kids who were only vaccinated in the season in which they were studied." Helen Branswell explains on STAT.

4. PCSK9 → Cholesterol control → Premium increases
Ezekiel J. Emanuel says in a New York Times Op-Ed, I Am Paying for Your Expensive Medicine. PCSK9 inhibitors newly approved by the US FDA can effectively lower bad cholesterol, though the data aren't definitive on whether this actually reduces heart attacks, strokes, and deaths from heart disease. This new drug category comes at a high cost. Based on projected usage levels, some analysts predict insurance premiums could rise by more than $100 for everyone in an affected plan.
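
The back-of-envelope arithmetic behind such estimates is straightforward. Every number below is a hypothetical placeholder, not a figure from the Op-Ed or the analysts it cites:

    # Back-of-envelope premium impact of a costly new drug class.
    # Every number is a hypothetical placeholder.
    plan_members = 1_000_000
    eligible     = 10_000       # members projected to take the drug
    annual_cost  = 14_000.0     # list price per patient-year ($)
    rebate_share = 0.30         # assumed discounts/rebates

    net_spend = eligible * annual_cost * (1 - rebate_share)
    print(f"premium impact: ${net_spend / plan_members:,.0f} "
          f"per member per year")   # $98 under these assumptions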

5. Opportunistic experiments → Efficient evidence → Informed family policy
New guidance details how researchers and program administrators can recognize opportunities for experiments and carry them out. This lets people discover the effects of planned initiatives, as opposed to analyzing interventions developed specifically for research studies. Advancing Evidence-Based Decision Making: A Toolkit on Recognizing and Conducting Opportunistic Experiments in the Family Self-Sufficiency and Stability Policy Area.

Tuesday, 10 November 2015

Working with quantitative people, evidence-based management, and NFL ref bias.

This week's 5 links on evidence-based decision making.

1. Understand quantitative people → See what's possible → Succeed with analytics
Tom Davenport outlines an excellent list of 5 Essential Principles for Understanding Analytics. He explains in the Harvard Business Review that an essential ingredient for effective data use is managers' understanding of what is possible. To get there, it's important that they establish a close working relationship with quantitative people.

2. Systematic review → Leverage research → Reduce waste
This sounds bad: One study found that published reports of trials cited fewer than 25% of previous similar trials. @PaulGlasziou and @iainchalmersTTi explain on @bmj_latest how systematic reviews can reduce waste in research. Thanks to @CebmOxford.

3. Organizational context → Fit for decision maker → Evidence-based management
A British Journal of Management article explores the role of 'fit' between the decision-maker and the organizational context in enabling an evidence-based process and develops insights for EBM theory and practice: Evidence-based Management in Practice: Opening up the Decision Process, Decision-maker and Context by April Wright et al. Thanks to @Rob_Briner.

4. Historical data → Statistical model → Prescriptive analytics
Prescriptive analytics is finally going mainstream for inventories, equipment status, and trades. Jose Morey explains on the Experfy blog that the key advance has been the use of statistical models with historical data.
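
As a flavor of "statistical models with historical data" in an inventory setting, here's a minimal sketch: estimate demand from history, then prescribe a reorder point. All numbers are invented:

    import statistics
    from math import sqrt

    # Fit a simple demand model to history, then prescribe a reorder
    # point. Demand history and service level are invented.
    history = [42, 38, 51, 47, 40, 44, 55, 39, 46, 48]   # units/day
    lead_time_days = 5
    z_95 = 1.645                      # ~95% service level

    mu = statistics.mean(history)
    sigma = statistics.stdev(history)
    reorder_point = lead_time_days * mu + z_95 * sigma * sqrt(lead_time_days)
    print(f"reorder when stock falls below {reorder_point:.0f} units")  # ~245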

5. Sports data → Study of bias → NFL evidence
Are NFL officials biased with their ball placement? Joey Faulkner at Gutterstats got his hands on a spreadsheet containing every NFL play run from 2000 to 2014 (500,000 in all). Thanks to @TreyCausey.

Bonus! In The Scientific Reason Why Bullets Are Bad for Presentations, Leslie Belknap recaps a 2014 study concluding that "Subjects who were exposed to a graphic representation of the strategy paid significantly more attention to, agreed more with, and better recalled the strategy than did subjects who saw a (textually identical) bulleted list version."

Tuesday, 03 November 2015

Watson isn't thinking, business skills for data scientists, and zombie clickbait.

This week's 5 links on evidence-based decision making.

1. Evidence scoring → Cognitive computing → Thinking?
Fantastic article comparing Sherlock Holmes to Dr. Watson - and smart analysis to cognitive computing. This must-read by Paul Levy asks if scoring evidence and ranking hypotheses are the same as thinking.

2. Data science understanding → Business relevance → Career success
In HBR, Michael Li describes three crucial abilities for data scientists: 1) Articulate the business value of their work (defining success with metrics such as attrition); 2) Give the right level of technical detail (effectively telling the story behind the data); 3) Get visualizations right (tell a clean story with diagrams).

3. Long clinical trials → Patient expectations → Big placebo effect
The placebo effect is wreaking havoc in painkiller trials. Nature News explains that "responses to [placebo] treatments have become stronger over time, making it harder to prove a drug’s advantage." The trend is US-specific, possibly because big, expensive trials "may be enhancing participants’ expectations of their effectiveness".

4. Find patterns → Design feature set → Automate predictions
Ahem. MIT researchers aim to take the human element out of big-data analysis, with a system that searches for patterns *and* designs the feature set. In testing, it outperformed 615 of 906 human teams. Thanks to @kdnuggets.

5. Recurrent neural nets → Autogenerated clickbait → Unemployed Buzzfeed writers?
A clickbait website has been built entirely by recurrent neural nets. Click-o-Tron has the latest and greatest stories on the web, as hallucinated by an algorithm. Thanks to @leapingllamas.

Bonus! Sitting studies debunked? Cory Doctorow explains it's not the sitting that will kill you - it's the lack of exercise.

Tuesday, 13 October 2015

Decision science, NFL prediction, and recycling numbers don't add up.

This week's 5 links on evidence-based decision making.

Hear me talk October 14 on communicating messages clearly with data. Part of the HEOR Writing webinar series: Register here.

1. Data science → Decision science → Institutionalize data-driven decisions
Deepinder Dhingra at @MuSigmaInc explains why data science misses half the equation, and that companies instead need decision science to achieve a balanced creation, translation, and consumption of insights. Requisite decision science skills include "quantitative and intellectual horsepower; the right curiosity quotient; ability to think from first principles; and business synthesis."

2. Statistical model → Machine learning → Good prediction
Microsoft is quite good at predicting American Idol winners - and football scores. Tim Stenovec writes about the Bing Predicts project's impressive record of correctly forecasting World Cup, NFL, reality TV, and election outcomes. The @Bing team begins with a traditional statistical model and supplements it with query data, text analytics, and machine learning.

3. Environmental concern → Good feelings → Bad recycling ROI
From a data-driven perspective, it's difficult to justify the high costs of US recycling programs. John Tierney explains in the New York Times that people's good motives and concerns about environmental damage have driven us to the point of recovering every slip of paper, half-eaten pizza, water bottle, and aluminum can - when the majority of value is derived from those cans and other metals.

4. Prescriptive analytics → Prescribe actions → Grow the business
Business intelligence provides tools for describing and visualizing what's happening in the company right now, but BI's value for identifying opportunities is often questioned. More sophisticated predictive analytics can forecast the future. But Nick Swanson of River Logic says the path forward will be through prescriptive analytics: Using methods such as stochastic optimization, analysts can prescribe specific actions for decision makers.
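
A classic miniature of that stochastic-optimization idea is the newsvendor problem: choose an order quantity against uncertain demand. A sketch with invented parameters:

    import random

    random.seed(1)

    # Newsvendor sketch: choose the order quantity that maximizes
    # expected profit under uncertain demand. Prices and the demand
    # distribution are invented for illustration.
    price, cost = 10.0, 4.0
    demand_samples = [max(0.0, random.gauss(100, 20)) for _ in range(2_000)]

    def expected_profit(q):
        total = 0.0
        for d in demand_samples:
            total += price * min(q, d) - cost * q
        return total / len(demand_samples)

    best_q = max(range(50, 151), key=expected_profit)
    print(f"order {best_q} units "
          f"(expected profit ${expected_profit(best_q):,.0f})")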

5. Graph data → Data lineage → Confidence & trust
Understanding the provenance of a data set is essential, but often tricky: Who collected it, and whose hands has it passed through? Jean Villedieu of @Linkurious explains how a graph database - rather than a traditional data store - can facilitate the tracking of data lineage.
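
To illustrate the idea, here's a toy lineage graph in Python using networkx. The dataset names are my inventions, and a production system would use a persistent graph database rather than an in-memory library:

    import networkx as nx

    # Toy data-lineage graph: nodes are datasets, edges record how one
    # was derived from another and by whom. All names are invented.
    G = nx.DiGraph()
    G.add_edge("raw_survey_2015.csv", "cleaned_survey.parquet",
               step="cleaning", by="analyst_a")
    G.add_edge("cleaned_survey.parquet", "regional_rollup.parquet",
               step="aggregation", by="etl_job_7")
    G.add_edge("regional_rollup.parquet", "quarterly_report.pdf",
               step="reporting", by="analyst_b")

    # Provenance question: where did the report come from, and whose
    # hands has it passed through?
    report = "quarterly_report.pdf"
    print(nx.ancestors(G, report))      # all upstream datasets
    for u, v, attrs in G.edges(data=True):
        print(f"{u} -> {v} ({attrs['step']} by {attrs['by']})")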

Tuesday, 06 October 2015

Superforecasting, hot hand redux, and junk science.

This week's 5 links on evidence-based decision making.

Hear me talk on communicating messages clearly with data. Webinar October 14: Register here.

1. Good judgment → Accurate forecasts → Better decisions
Jason Zweig (@jasonzweigwsj) believes Superforecasting: The Art and Science of Prediction is the "most important book on decision-making since Daniel Kahneman's Thinking, Fast and Slow." Kahneman is equally enthusiastic, saying "This book shows that under the right conditions regular people are capable of improving their judgment enough to beat the professionals at their own game." The author, Philip Tetlock, leads the Good Judgment Project, where amateurs and experts compete to make forecasts - and the amateurs routinely win. Tetlock notes that particularly good forecasters regard their views as hypotheses to be tested, not treasures to be guarded. The project emphasizes transparency, urging people to explain why they believe what they do. Are you a Superforecaster? Find out by joining the project at GJOpen.com.
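
Forecast accuracy in tournaments like this is typically scored with the Brier score - mean squared error between probability forecasts and what actually happened (lower is better). A quick sketch with made-up forecasts:

    # Brier score sketch: the standard accuracy metric in forecasting
    # tournaments. Forecasts and outcomes below are made up.
    def brier(forecasts, outcomes):
        """Mean squared error between probabilities and 0/1 outcomes."""
        return sum((f - o) ** 2
                   for f, o in zip(forecasts, outcomes)) / len(forecasts)

    superforecaster = [0.9, 0.2, 0.7, 0.1]
    hedger          = [0.5, 0.5, 0.5, 0.5]
    outcomes        = [1,   0,   1,   0]

    print(brier(superforecaster, outcomes))   # 0.0375 - lower is better
    print(brier(hedger, outcomes))            # 0.25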

2. Better evidence → Better access → Better health
CADTH (@CADTH_ACMTS), a non-profit that provides evidence to Canada's healthcare decision makers, is accepting abstract proposals for its 2016 Symposium, Evidence for Everyone.

3. Coin flip study → Surprising results → Hot hand debate
The hot hand is making a comeback. After a noteworthy smackdown by Tom Gilovich, some evidence suggests there is such a thing. Ben Cohen explains in The 'Hot Hand' May Actually Be Real - evidently it's got something to do with coin flips. Regardless of how this works out, everyone should read (or reread) Gilovich's fantastic book, How We Know What Isn't So.
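
The coin-flip twist is easy to simulate: in short sequences of fair-coin flips, the average within-sequence share of heads that follow a head comes out below 50%, so earlier hot-hand analyses were biased toward finding nothing. A sketch:

    import random

    random.seed(42)

    # For each short sequence of fair-coin flips, compute the share of
    # heads that immediately follow a head, then average across
    # sequences. The answer is ~0.40, not the intuitive 0.50.
    def share_heads_after_heads(seq):
        follows = [seq[i + 1] for i in range(len(seq) - 1) if seq[i] == 1]
        return sum(follows) / len(follows) if follows else None

    shares = []
    for _ in range(100_000):
        seq = [random.randint(0, 1) for _ in range(4)]   # 4 flips each
        s = share_heads_after_heads(seq)
        if s is not None:        # skip sequences with no usable head
            shares.append(s)

    print(round(sum(shares) / len(shares), 3))   # ~0.405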

4. Less junk science → Better evidence → Better world
The American Council on Science and Health has a mission to "provide an evidence-based counterpoint to the wave of anti-science claims". @ACSHorg presents its views with refreshingly snappy writing, covering a wide variety of topics including public policy, vaccination, fracking, chemicals, and nutrition.

5. Difference of differences → Misunderstanding → Bad evidence
Ben Goldacre (@bengoldacre) of Bad Science fame writes in The Guardian that the same statistical errors - namely, ignoring the difference in differences - are appearing throughout the most prestigious journals in neuroscience.
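
The error is easy to reproduce in simulation: effect A clears the significance bar, effect B doesn't, and the paper concludes A and B differ - without ever testing the difference itself. A sketch with invented effect sizes:

    import random, statistics

    random.seed(7)

    # The error in question: A is "significant," B is not, and the
    # difference A - B is never tested directly. Simulated change
    # scores with invented effect sizes.
    def sample(effect, n=30):
        return [random.gauss(effect, 1.0) for _ in range(n)]

    def one_sample_t(x):
        return statistics.mean(x) / (statistics.stdev(x) / len(x) ** 0.5)

    def welch_t(x, y):
        se = (statistics.variance(x) / len(x)
              + statistics.variance(y) / len(y)) ** 0.5
        return (statistics.mean(x) - statistics.mean(y)) / se

    a = sample(0.6)   # e.g., treated group
    b = sample(0.3)   # e.g., control group

    print(f"A vs 0: t = {one_sample_t(a):.2f}")   # likely |t| > 2
    print(f"B vs 0: t = {one_sample_t(b):.2f}")   # likely |t| < 2
    print(f"A vs B: t = {welch_t(a, b):.2f}")     # the test that gets skipped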

Tuesday, 29 September 2015

Data blindness, measuring policy impact, and informing healthcare with baseball analytics.

This week's 5 links on evidence-based decision making.

Hear me talk October 14 on communicating messages clearly with data. Part of the HealthEconomics.com "Effective HEOR Writing" webinar series: Register here.

1. Creative statistics → Valuable insights → Reinvented baseball business
Exciting baseball geek news: Bill James and Billy Beane appeared together for the first time. Interviewed in the Wall Street Journal at a NetSuite conference on business model disruption, Beane said new opportunities include predicting/avoiding player injuries - so there's an interesting overlap with healthcare analytics. (Good example from Baseball Prospectus: "no one really has any idea whether letting [a pitcher] pitch so much after coming back from Tommy John surgery has any effect on his health going forward.")

2. Crowdsourcing → Machine learning → Micro, macro policy evidence
Premise uses a clever combination of machine learning and street-level human intelligence; their economic data helps organizations measure the impact of policy decisions at a micro and macro level. @premisedata recently closed a $50M US funding round.

3. Data blindness → Unfocused analytics → Poor decisions
Data blindness prevents us from seeing what the numbers are trying to tell us. In a Read/Write guest post, OnCorps CEO (@OnCorpsHQ) Bob Suh recommends focusing on the decisions that need to be made, rather than on big data and analytics technology. OnCorps offers an intriguing app called Sales Sabermetrics.

4. Purpose and focus → Overcome analytics barriers → Create business value
David Meer of PwC's Strategy& (@strategyand) talks about why companies continue to struggle with big data [video].

5. Health analytics → Evidence in the cloud → Collaboration & learning
Evidera announces Evalytica, a SaaS platform promising fast, transparent analysis of healthcare data. This cloud-based engine from @evideraglobal supports analyses of real-world evidence sources, including claims, EMR, and registry data.

Tuesday, 08 September 2015

'What Works' toolkit, the insight-driven organization, and peer-review identity fraud.

This week's 5 links on evidence-based decision making.

1. Abundant evidence → Clever synthesis → Informed crime-prevention decisions
The What Works Crime Toolkit beautifully synthesizes - on a single screen - the evidence on crime-prevention techniques. This project by the UK's @CollegeofPolice provides quick answers to what works (the car breathalyzer) and what doesn't (the infamous "Scared Straight" programs). Includes easy-to-use filters for evidence quality and type of crime. Just outstanding.

2. Insights → Strategic reuse → Data-driven decision making
Tom Davenport explains why simply generating a bunch of insights is insufficient: "Perhaps the overarching challenge is that very few organizations think about insights as a process; they have been idiosyncratic and personal." A truly insight-driven organization must carefully frame, create, market, consume, and store insights for reuse. Via @DeloitteBA.

3. Sloppy science → Weak replication → Psychology myths
Of 100 studies published in top-ranking journals in 2008, 75% of social psychology experiments and half of cognitive studies failed the replication test. @iansample delivers grim news in The Guardian: The psych research/publication process is seriously flawed. Thanks to @Rob_Briner.

4. Flawed policy → Ozone overreach → Burden on business
Tony Cox writes in the Wall Street Journal that the U.S. EPA lacks causal evidence to support restrictions on ground-level ozone. The agency is connecting this pollutant to higher incidence of asthma, but Cox says new rules won't improve health outcomes, and will create a substantial economic burden on business.

5. Opaque process → Peer-review fraud → Bad evidence
More grim news for science publishing. Springer has retracted 64 papers from 10 journals after discovering the peer reviews were linked to fake email addresses. The Washington Post story explains that only nine months ago, BioMed Central - a Springer imprint - retracted 43 studies. @RetractionWatch says this wasn't even a thing before 2012.