Evidence Soup
How to find, use, and explain evidence.


Tuesday, 12 April 2016

Better evidence for patients, and geeking out on baseball.


1. SPOTLIGHT: Redefining how patients get health evidence.

How can people truly understand evidence and the tradeoffs associated with health treatments? How can the medical community lead them through decision-making that's shared - but also evidence-based?

Hoping for cures, patients and their families anxiously Google medical research. Meanwhile, the quantified-self crowd is gathering data at breakneck speed. Neither will solve the problem on its own. However, the entire April 2016 issue of Health Affairs focuses on consumer uses of evidence and highlights promising ideas.

  • Translating medical evidence. Most evidence syntheses and guidelines are targeted at healthcare professionals, not civilians. Knowledge translation has become an essential piece, although it doesn't always involve patients at early stages. The Boot Camp Translation process is changing that. The method enables leaders to engage patients and develop healthcare language that is accessible and understandable. Topics include colon cancer, asthma, and blood pressure management.
  • Truly patient-centered medicine. Patient engagement is a buzzword, but capturing patient-reported outcomes in the clinical environment is a real thing that might make a big difference. Danielle Lavallee led an investigation into how patients and providers can find more common ground for communicating.
  • Meaningful insight from wearables. These are early days, so it's probably not fair to take shots at the gizmos out there. It will be a beautiful thing when sensors and other devices can deliver more than alerts and reports - and make valuable recommendations in a consumable way. And of course these wearables can play a role in routine collection of patient-reported outcomes.



2. Roll your own analytics for fantasy baseball.
For some of us, it's that special time of year when we come to the realization that our favorite baseball team is likely going home early again this season. There's always fantasy baseball, and it's getting easier to geek out with analytics to improve your results.

3. AI engine emerges after 30 years.
No one ever said machine learning was easy. Cyc is an AI engine that reflects 30 years of building a knowledge base. Now its creator, Doug Lenat, says it's ready for prime time. Lucid is commercializing the technology. Personal assistants and healthcare applications are in the works.

Photo credit: fitbit one by Tatsuo Yamashita on Flickr.

Tuesday, 05 April 2016

$15 minimum wage, evidence-based HR, and manmade earthquakes.


Photo by Fightfor15.org

1. SPOTLIGHT: Will $15 wages destroy California jobs?
California is moving toward a $15/hour minimum wage (slowly, stepping up through 2023). Will employers be forced to eliminate jobs under the added financial pressure? As with all things economic, it depends on whom you ask. Lots of numbers have been thrown around during the recent push for higher pay. Fightfor15.org says 6.5 million workers are getting raises in California, and that two-thirds of New Yorkers support a similar increase. But small businesses, restaurants in particular, are concerned they'll have to trim menus and staff - they can charge only so much for a sandwich.

Moody's Analytics economist Adam Ozimek says it's not just about food service or home healthcare. Writing on The Dismal Scientist blog, he notes: "[I]n past work I showed that California has 600,000 manufacturing workers who currently make $15 an hour or less. The massive job losses in manufacturing over the last few decades has shown that it is an intensely globally competitive industry where uncompetitive wages are not sustainable."

It's not all so grim. Ozimek also shows that early reports of steep job losses after Seattle's minimum-wage hike relied on employment data that have since been revised strongly upward. However, finding "the right comparison group is getting complicated."



2. Manmade events sharply increase earthquake risk.
Holy smokes. New USGS maps show north-central Oklahoma at high earthquake risk. The United States Geological Survey now includes potential ground-shaking hazards from both 'human-induced' and natural earthquakes, substantially changing its risk assessment for several areas. Oklahoma recorded 907 earthquakes of magnitude 3 or higher last year. Disposal of industrial wastewater has emerged as a substantial factor.

3. Evidence-based HR redefines leadership roles.
Applying evidence-based principles to talent management can boost strategic impact, but requires a different approach to leadership. The book Transformative HR: How Great Companies Use Evidence-Based Change for Sustainable Advantage (Jossey-Bass) describes practical uses of evidence to improve people management. John Boudreau and Ravin Jesuthasan suggest principles for evidence-based change, including logic-driven analytics: for instance, establishing appropriate metrics for each sphere of your business rather than blanket adoption of measures like employee engagement and turnover.

4. Why we're not better at investing.
Gary Belsky does a great job of explaining why we think we're better investors than we are. By now our decision biases have been well documented by behavioral economists. Plus we really hate to lose - yet we're overconfident, somehow thinking we can compete with Warren Buffett.

Wednesday, 24 February 2016

How to show your evidence is reliable, repeatable.


When presenting findings, it’s essential to show their reliability and relevance. This post explains how to demonstrate that evidence is reproducible; next week in Part 2, we’ll cover how to show it’s relevant.

Show your evidence is reproducible. With complexity on the rise, there’s no shortage of quality problems with traditional research: People are finding it impossible to replicate everything from peer-reviewed, published findings to Amy Cuddy's power pose study. A recent examination of psychology evidence was particularly painful.

In a corporate setting, the problem is no less difficult. How do you know a data scientist’s results can be replicated?* How can you be sure an analyst’s Excel model is flawless? Much confusion could be avoided if people produced documentation to add transparency.

Demystify, demystify, demystify. To establish credibility, the audience needs to believe your numbers and your methods are reliable and reproducible. Numerous efforts are bringing transparency to academic research (@figshare, #openscience). Technologies such as self-serve business intelligence and data visualization have added traceability to corporate analyses. Data scientists are coming to grips with the need for replication, evidenced by the Johns Hopkins/Coursera class on reproducible research. At presentation time, include highlights of data collection and analysis so the audience clearly understands the source of your evidence.

Make a list: What would you need to know? Imagine a colleague will be auditing or replicating your work - whether it’s a straightforward business analysis, data science, or scientific research. Put together a list of the things they would need to do, and the data they would access, to arrive at your result. Work with your team to set expectations for how projects are completed and documented. No doubt this can be a burdensome task, but the more good habits people develop (e.g., no one-off spreadsheet tweaking), the less pain they’ll experience when defending their insights.
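One lightweight way to do this: save a machine-readable manifest alongside your results. Here's a minimal Python sketch - the file names, steps, and fields are hypothetical - showing what such a record might capture:

```python
# A minimal reproducibility manifest - illustrative only; the file
# names and analysis steps below are hypothetical.
import hashlib
import json
import platform
import random
from datetime import datetime, timezone

def file_sha256(path):
    """Hash an input file so an auditor can confirm they have the same data."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

SEED = 42
random.seed(SEED)  # fix randomness so results can be regenerated exactly

manifest = {
    "created": datetime.now(timezone.utc).isoformat(),
    "python_version": platform.python_version(),
    "random_seed": SEED,
    # assumes sales_2015.csv exists in the working directory
    "inputs": {"sales_2015.csv": file_sha256("sales_2015.csv")},
    "steps": [
        "load sales_2015.csv",
        "drop rows with missing region",
        "fit linear model: revenue ~ ad_spend",
    ],
}

with open("manifest.json", "w") as f:
    json.dump(manifest, f, indent=2)
```

A colleague handed this file knows exactly which data, which seed, and which steps produced the result - which is most of the audit list above.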

*What is a “reproducible” finding, anyway? Does this mean literally replicated, as in producing essentially the exact same result? Or does it mean a concept or research theory is supported? Is a finding replicated if effect size is different, but direction is the same? Sanjay Srivastava has an excellent explanation of the differences as they apply to psychology in What counts as a successful or failed replication?

image source: Barney Moss (creative commons)

Tuesday, 02 February 2016

The new ISPOR #pharma health decision guidance: How it's like HouseHunters.


'Multiple criteria decision analysis' is a crummy name for a great concept (aren't all big decisions analyzed using multiple criteria?). MCDA means assessing alternatives while simultaneously considering several objectives. It's a useful way to look at difficult choices in healthcare, oil production, or real estate. But oftentimes, results of these analyses aren't communicated clearly, limiting their usefulness (more about that below).

The International Society for Pharmacoeconomics and Outcomes Research (ISPOR) has developed new MCDA guidance, available in the latest issue of Value in Health (paywall). To be sure, healthcare decision makers have always weighed medical, social, and economic factors: MCDA helps stakeholders bring concrete choices and transparency to the process of evaluating outcomes research - where, as we know, controversy is always a possibility.

Anyone can use MCDA. To put it mildly, it’s difficult to balance saving lives with saving money. Fundamentally, MCDA means listing options, defining decision criteria, weighting those criteria, and then scoring each option. Some experts build complex economic models, but anyone can apply this decision technique in effective, less rigorous ways.

You know those checklists at the end of every HouseHunters episode where buyers weigh location and size against budget? That's essentially it: People making important decisions, applying judgment, and weighing multiple goals (raise the kids in the city or the burbs?) - and even though they start out by ranking priorities, once buyers see their actual options, deciding on a house becomes substantially more complex.
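To make the mechanics concrete, here's a minimal weighted-sum sketch in Python; the options, criteria, weights, and scores are all invented for illustration:

```python
# Weighted-sum MCDA: score each option against weighted criteria.
# All criteria, weights, and scores below are invented for illustration.
weights = {"location": 0.5, "size": 0.3, "budget_fit": 0.2}  # sums to 1

options = {
    "city condo":     {"location": 9, "size": 4, "budget_fit": 6},
    "suburban house": {"location": 5, "size": 8, "budget_fit": 7},
    "fixer-upper":    {"location": 7, "size": 7, "budget_fit": 3},
}

def mcda_score(scores, weights):
    """Weighted sum of criterion scores (each on a 1-10 scale)."""
    return sum(weights[c] * s for c, s in scores.items())

for name, scores in sorted(options.items(),
                           key=lambda kv: -mcda_score(kv[1], weights)):
    print(f"{name}: {mcda_score(scores, weights):.2f}")
```

Shift enough weight from location to size and the suburban house overtakes the condo - surfacing exactly that kind of tradeoff is the point.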

MCDA gains traction in health economics. As shown in the diagram (source: ISPOR), the analysis hinges on assigning relative weights to individual decision criteria. While this brings rationality and transparency to complex decisions, it also invites passionate discussions. Some might expect these techniques to remove human judgment from the process, but MCDA leaves it front and center.


Looking for new ways to communicate health economics research and other medical evidence? Join me and other speakers at the 2nd annual HEOR Writing workshop in March.                                      


Pros and cons. Let’s not kid ourselves: You have to optimize on something. MCDA is both beautiful and terrifying because it forces us to identify tradeoffs: Quality, quick improvement, long-term health benefits? Uncertain outcomes only complicate things further.

MCDA is a great way to bring interdisciplinary groups into a conversation. It's essential to communicate the analysis effectively, so stakeholders understand the data and why they matter - without burying them in so much detail that the audience is lost.

One of the downsides is that, upon seeing elaborate projections and models, people can become overconfident in the numbers. Uncertainty is never fully recognized or quantified. (Recall the Rumsfeldian unknown unknown.) Sensitivity analysis is essential for illustrating which predicted outcomes are strongly influenced by small adjustments to inputs.
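Continuing the hypothetical house-hunting sketch above (this reuses that example's weights, options, and mcda_score), a crude sensitivity check can be just a few lines:

```python
# Nudge each criterion weight and check whether the top-ranked option changes.
def top_option(weights, options):
    return max(options, key=lambda name: mcda_score(options[name], weights))

baseline = top_option(weights, options)
for criterion in weights:
    bumped = dict(weights)
    bumped[criterion] += 0.1           # small, arbitrary perturbation
    total = sum(bumped.values())
    bumped = {c: w / total for c, w in bumped.items()}  # renormalize to 1
    winner = top_option(bumped, options)
    note = "" if winner == baseline else "  <- ranking flips"
    print(f"+0.1 on {criterion}: best option = {winner}{note}")
```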

Resources to learn more. If you want to try MCDA, I strongly recommend picking up one of the classic texts, such as Smart Choices: A Practical Guide to Making Better Decisions. Additionally, ISPOR's members offer useful insights into the pluses and minuses of this methodology - see, for example, Does the Future Belong to MCDA? The level of discourse over this guidance illustrates how challenging healthcare decisions have become.  

I'm presenting at the HEOR Writing workshop on communicating value messages clearly with data. March 17-18 in Philadelphia.

Tuesday, 12 January 2016

Game theory for Jeopardy!, evidence for gun control, and causality.

This week's 5 links on evidence-based decision making.

1. Deep knowledge → Wagering strategy → Jeopardy! win
Some Jeopardy! contestants struggle with the strategic elements of the show. Rescuing us is Keith Williams (@TheFinalWager), with the definitive primer on Jeopardy! strategy, applying game theory to every episode and introducing "the fascinating world of determining the optimal approach to almost anything".
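The canonical example is the leader's Final Jeopardy! wager: bet just enough to stay ahead even if second place doubles up. A stripped-down sketch (real strategy, as Williams shows, also handles ties, wrong answers, and three-player math):

```python
def leader_wager(leader: int, second: int) -> int:
    """Smallest Final Jeopardy! bet that wins if the leader answers correctly:
    cover second place doubling their score, plus one dollar."""
    return max(0, 2 * second - leader + 1)

# Example: leader has $20,000, second place has $13,000.
print(leader_wager(20000, 13000))  # -> 6001 ($26,001 beats $26,000)
```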

2. Gun controls → Less violence? → Less tragedy?
Does the evidence support new US gun control proposals? In the Pacific Standard, Francie Diep cites several supporting scientific studies.

3. New data sources → Transparent methods → Health evidence
Is 'real-world' health evidence closer to the truth than data from more traditional categories? FDA staff explain in What We Mean When We Talk About Data. Thanks to @MandiBPro.

4. Data model → Cause → Effect
In Why: A Guide to Finding and Using Causes, Samantha Kleinberg aims to explain why causality is often misunderstood and misused: What is it, why is it so hard to find, and how can we do better at interpreting it? The book excerpt explains that "Understanding when our inferences are likely to be wrong is particularly important for data science, where we’re often confronted with observational data that is large and messy (rather than well-curated for research)."

5. Empirical results → Verification → Scientific understanding
Independent verification is essential to scientific progress. But in academia, verifying empirical results is difficult and not rewarded. This is the reason for Curate Science, a tool making it easier for researchers to independently verify each other’s evidence and award credit for doing so. Follow @CurateScience.

Join me at the HEOR writing workshop March 17 in Philadelphia. I'm speaking about communicating data, and leading an interactive session on data visualization. Save $300 before Jan 15.

Tuesday, 22 December 2015

Asthma heartbreak, cranky economists, and prediction markets.

This week's 5 links on evidence-based decision making.

1. Childhood stress → Cortisol → Asthma
Heartbreaking stories explain likely connections between difficult childhoods and asthma. Children in Detroit suffer a high incidence of attacks - regardless of allergens, air quality, and other factors. Peer-reviewed research shows excess cortisol may be to blame.

2. Prediction → Research heads up → Better evidence
Promising technique for meta-research. A prediction market was created to quantify the reproducibility of 44 studies published in prominent psychology journals, and to estimate the likelihood of hypothesis acceptance at different stages. The market outperformed individual forecasts, as described in PNAS (Proceedings of the National Academy of Sciences).

3. Fuzzy evidence → Wage debate → Policy fail
More fuel for the minimum-wage fire. Depending on who you ask, a high minimum wage either bolsters the security of hourly workers or destroys the jobs they depend on. Recent example: David Neumark's claims about unfavorable evidence.

4. Decision tools → Flexible analysis → Value-based medicine
Drug Abacus is an interactive tool for understanding drug pricing. This very interesting project, led by Peter Bach at Memorial Sloan Kettering, compares the price of a drug (US$) with its "worth", based on outcomes, toxicity, and other factors. Hopefully @drugabacus signals the future for health technology assessment and value-based medicine.

5. Cognitive therapy → Depression relief → Fewer side effects
A BMJ systematic review and meta-analysis show that depression can be treated with cognitive behavior therapy, possibly with outcomes equivalent to antidepressants. Consistent CBT treatment is a challenge, however. AHRQ reports similar findings from comparative effectiveness research; the CER study illustrates how to employ expert panels to transparently select research questions and parameters.

Tuesday, 17 November 2015

ROI from evidence-based government, milking data for cows, and flu shot benefits diminishing.

This week's 5 links on evidence-based decision making.

1. Evidence standards → Knowing what works → Pay for success
Susan Urahn says we've reached a Tipping Point on Evidence-Based Policymaking. She explains in @Governing that 24 US governments have directed $152M to programs with an estimated $521M ROI: "an innovative and rigorous approach to policymaking: Create an inventory of currently funded programs; review which ones work based on research; use a customized benefit-cost model to compare programs based on their return on investment; and use the results to inform budget and policy decisions."

2. Sensors → Analytics → Farming profits
Precision dairy farming uses RFID tags, sensors, and analytics to track the health of cows. Brian T. Horowitz (@bthorowitz) writes on TechCrunch about how farmers are milking big data for insight. Literally. Thanks to @ShellySwanback.

3. Public acceptance → Annual flu shots → Weaker response?
Yikes. Now that flu shot programs are gaining acceptance, there's preliminary evidence suggesting that repeated annual shots can gradually reduce their effectiveness under some circumstances. Scientists at the Marshfield Clinic Research Foundation recently reported that "children who had been vaccinated annually over a number of years were more likely to contract the flu than kids who were only vaccinated in the season in which they were studied." Helen Branswell explains on STAT.

4. PCSK9 → Cholesterol control → Premium increases
Ezekiel J. Emanuel says in a New York Times op-ed, I Am Paying for Your Expensive Medicine. PCSK9 inhibitors newly approved by the US FDA can effectively lower bad cholesterol, though data aren't definitive on whether this actually reduces heart attacks, strokes, and deaths from heart disease. This new drug category comes at a high cost: based on projected usage levels, some analysts predict insurance premiums could rise by more than $100 for everyone in an affected plan.

5. Opportunistic experiments → Efficient evidence → Informed family policy
New guidance details how researchers and program administrators can recognize opportunities for experiments and carry them out. This lets people discover the effects of already-planned initiatives, as opposed to developing interventions specifically for research studies. See Advancing Evidence-Based Decision Making: A Toolkit on Recognizing and Conducting Opportunistic Experiments in the Family Self-Sufficiency and Stability Policy Area.

Tuesday, 10 November 2015

Working with quantitative people, evidence-based management, and NFL ref bias.

This week's 5 links on evidence-based decision making.

1. Understand quantitative people → See what's possible → Succeed with analytics
Tom Davenport outlines an excellent list of 5 Essential Principles for Understanding Analytics. He explains in the Harvard Business Review that an essential ingredient for effective data use is managers' understanding of what is possible. To build that understanding, it's really important that they establish close working relationships with quantitative people.

2. Systematic review → Leverage research → Reduce waste
This sounds bad: One study found that published reports of trials cited fewer than 25% of previous similar trials. @PaulGlasziou and @iainchalmersTTi explain on @bmj_latest how systematic reviews can reduce waste in research. Thanks to @CebmOxford.

3. Organizational context → Fit for decision maker → Evidence-based management
A British Journal of Management article explores the role of ‘fit’ between the decision-maker and the organizational context in enabling an evidence-based process and develops insights for EBM theory and practice. Evidence-based Management in Practice: Opening up the Decision Process, Decision-maker and Context by April Wright et al. Thanks to @Rob_Briner.

4. Historical data → Statistical model → Prescriptive analytics
Prescriptive analytics is finally going mainstream for inventories, equipment status, and trades. Jose Morey explains on the Experfy blog that the key advance has been the use of statistical models with historical data.

5. Sports data → Study of bias → NFL evidence
Are NFL officials biased with their ball placement? Joey Faulkner at Gutterstats got his hands on a spreadsheet containing every NFL play run from 2000 through 2014 (500,000 in all). Thanks to @TreyCausey.

Bonus! In The Scientific Reason Why Bullets Are Bad for Presentations, Leslie Belknap recaps a 2014 study concluding that "Subjects who were exposed to a graphic representation of the strategy paid significantly more attention to, agreed more with, and better recalled the strategy than did subjects who saw a (textually identical) bulleted list version."

Tuesday, 03 November 2015

Watson isn't thinking, business skills for data scientists, and zombie clickbait.

This week's 5 links on evidence-based decision making.

1. Evidence scoring → Cognitive computing → Thinking?
Fantastic article comparing Sherlock Holmes to Dr. Watson - and smart analysis to cognitive computing. This must-read by Paul Levy asks if scoring evidence and ranking hypotheses are the same as thinking.

2. Data science understanding → Business relevance → Career success
In HBR, Michael Li describes three crucial abilities for data scientists: 1) Articulate the business value of their work (defining success with metrics such as attrition); 2) Give the right level of technical detail (effectively telling the story behind the data); 3) Get visualizations right (tell a clean story with diagrams).

3. Long clinical trials → Patient expectations → Big placebo effect
The placebo effect is wreaking havoc in painkiller trials. Nature News explains that "responses to [placebo] treatments have become stronger over time, making it harder to prove a drug’s advantage." The trend is US-specific, possibly because big, expensive trials "may be enhancing participants’ expectations of their effectiveness".

4. Find patterns → Design feature set → Automate predictions
Ahem. MIT researchers aim to take the human element out of big-data analysis, with a system that searches for patterns *and* designs the feature set. In testing, it outperformed 615 of 906 human teams. Thanks to @kdnuggets.

5. Recurrent neural nets → Autogenerated clickbait → Unemployed Buzzfeed writers?
A clickbait website has been built entirely by recurrent neural nets. Click-o-Tron has the latest and greatest stories on the web, as hallucinated by an algorithm. Thanks to @leapingllamas.

Bonus! Sitting studies debunked? Cory Doctorow explains that it's not the sitting that will kill you - it's the lack of exercise.

Tuesday, 13 October 2015

Decision science, NFL prediction, and recycling numbers don't add up.

This week's 5 links on evidence-based decision making.

Hear me talk October 14 on communicating messages clearly with data. Part of the HEOR Writing webinar series: Register here.

1. Data science → Decision science → Institutionalize data-driven decisions
Deepinder Dhingra at @MuSigmaInc explains why data science misses half the equation, and that companies instead need decision science to achieve a balanced creation, translation, and consumption of insights. Requisite decision science skills include "quantitative and intellectual horsepower; the right curiosity quotient; ability to think from first principles; and business synthesis."

2. Statistical model → Machine learning → Good prediction
Microsoft is quite good at predicting American Idol winners - and football scores. Tim Stenovec writes about the Bing Predicts project's impressive record of correctly forecasting World Cup, NFL, reality TV, and election outcomes. The @Bing team begins with a traditional statistical model and supplements it with query data, text analytics, and machine learning.

3. Environmental concern → Good feelings → Bad recycling ROI
From a data-driven perspective, it's difficult to justify the high costs of US recycling programs. John Tierney explains in the New York Times that people's good motives and concerns about environmental damage have driven us to the point of recovering every slip of paper, half-eaten pizza, water bottle, and aluminum can - when the majority of value is derived from those cans and other metals.

4. Prescriptive analytics → Prescribe actions → Grow the business
Business intelligence provides tools for describing and visualizing what's happening in the company right now, but BI's value for identifying opportunities is often questioned. More sophisticated predictive analytics can forecast the future. But Nick Swanson of River Logic says the path forward will be through prescriptive analytics: Using methods such as stochastic optimization, analysts can prescribe specific actions for decision makers.
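As a toy illustration of prescriptive thinking - not River Logic's actual method - here's a newsvendor-style sketch in Python: simulate demand from a statistical model fitted to history, then prescribe the order quantity that maximizes expected profit.

```python
import random

random.seed(1)  # reproducible simulation
PRICE, COST = 10.0, 4.0  # sell at $10, buy at $4; unsold stock is worthless

def expected_profit(order_qty, demand_samples):
    """Average profit across simulated demand scenarios."""
    total = 0.0
    for demand in demand_samples:
        sold = min(order_qty, demand)
        total += PRICE * sold - COST * order_qty
    return total / len(demand_samples)

# Stand-in for a statistical model fitted to historical demand data.
demand_samples = [max(0, int(random.gauss(100, 25))) for _ in range(10_000)]

# The "prescription": the order quantity with the highest expected profit.
best_qty = max(range(201), key=lambda q: expected_profit(q, demand_samples))
print(best_qty, round(expected_profit(best_qty, demand_samples), 2))
```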

5. Graph data → Data lineage → Confidence & trust
Understanding the provenance of a data set is essential, but often tricky: Who collected it, and whose hands has it passed through? Jean Villedieu of @Linkurious explains how a graph database - rather than a traditional data store - can facilitate the tracking of data lineage.
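To make the idea concrete, here's a minimal Python sketch of lineage as a graph, with hypothetical dataset names; a production system would store and query this in a graph database rather than in application code.

```python
# Lineage as a directed graph: an edge (a, b) means "b was derived from a".
# Dataset names are hypothetical.
edges = [
    ("vendor_feed.csv", "cleaned_sales"),
    ("cleaned_sales", "q3_revenue_model"),
    ("fx_rates_api", "q3_revenue_model"),
    ("q3_revenue_model", "board_dashboard"),
]

parents = {}
for src, dst in edges:
    parents.setdefault(dst, []).append(src)

def provenance(node, depth=0):
    """Walk upstream and print every source that feeds a given artifact."""
    print("  " * depth + node)
    for p in parents.get(node, []):
        provenance(p, depth + 1)

provenance("board_dashboard")  # whose hands has this data passed through?
```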