Evidence Soup
How to find, use, and explain evidence.

Tuesday, 26 April 2016

Baseball decisions, actuaries, and streaming analytics.

Cutters from Breaking Away movie

1. SPOTLIGHT: How are innovations in baseball analytics like data science?
Last week, I spoke at Nerd Nite SF about recent developments in baseball analytics. Highlights from my talk:

- Data science and baseball analytics are following similar trajectories. There's more and more data, but people struggle to find predictive value. Oftentimes, executives are less familiar with technical details, so analysts must communicate findings and recommendations so they're palatable to decision makers. The role of analysts, and the challenges they face, are described beautifully by Adam Guttridge and David Ogren of NEIFI.

- 'Inside baseball' is full of outsiders with fresh ideas. Bill James is the obvious/glorious example - and Billy Beane (Moneyball) applied great outsider thinking. Analytics experts joining front offices today are also outsiders, but valued because they understand prediction; the same goes for anyone seeking to transform a corporate culture toward evidence-based decision making.

Tracy Altman @ Nerd Nite SF
- Defensive shifts may number 30,000 this season, up from 2,300 five years ago (John Dewan prediction). On-the-spot decisions are powered by pop-up iPad spray charts with shift recommendations for each opposing batter. And defensive stats are finally becoming a reality.

- Statcast creates fantastic descriptive stats for TV viewers; potential value for team management is TBD. Fielder fly-ball stats are new to baseball and sort of irresistible, especially the 'route efficiency' calculation.

- Graph databases, relatively new to the field, lend themselves well to analyzing relationships - and supplement what's available from a conventional row/column database. Learn more at FanGraphs.com. And topological maps (Ayasdi and Baseball Prospectus) are a powerful way to understand player similarity. High-dimensional data are grouped into nodes, which are connected when they share a common data point - this produces a topo map grouping players with high similarity. A simplified sketch of the idea appears below.
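For readers who want to see the idea in code, here is a simplified sketch of building a player-similarity graph with pandas and networkx. It illustrates the general approach only - it is not the Ayasdi or Baseball Prospectus pipeline - and the players, stats, and distance threshold are invented for the example.

```python
# Simplified sketch: group players into a similarity graph.
# Illustrative only - not the Ayasdi / Baseball Prospectus method.
# Players, stats, and the threshold below are invented assumptions.
import itertools

import networkx as nx
import pandas as pd

players = pd.DataFrame(
    {"k_rate":  [0.31, 0.29, 0.12, 0.14],   # strikeout rate
     "iso":     [0.28, 0.26, 0.09, 0.11],   # isolated power
     "bb_rate": [0.12, 0.11, 0.05, 0.06]},  # walk rate
    index=["Player A", "Player B", "Player C", "Player D"],
)

# Normalize each stat to 0-1 so no single dimension dominates.
norm = (players - players.min()) / (players.max() - players.min())

G = nx.Graph()
G.add_nodes_from(norm.index)

# Connect two players when their stat profiles are close (Euclidean distance).
threshold = 0.5
for a, b in itertools.combinations(norm.index, 2):
    dist = ((norm.loc[a] - norm.loc[b]) ** 2).sum() ** 0.5
    if dist < threshold:
        G.add_edge(a, b, distance=round(dist, 3))

# Connected components approximate the clusters a topological map would show.
for group in nx.connected_components(G):
    print(sorted(group))
```

Running this groups the two power hitters together and the two contact hitters together - a toy version of the "players with high similarity" clusters described above.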

2. Will AI replace insurance actuaries?
10+ years ago, a friend of Ugly Research joined a startup offering technology to assist actuaries making insurance policy decisions. It didn't go all that well - those were early days, and it was difficult for people to trust an 'assistant' who was essentially a black box model. Skip ahead to today, when #fintech competes in a world ready to accept AI solutions, whether they augment or replace highly paid human beings. In Could #InsurTech AI machines replace Insurance Actuaries?, the excellent @DailyFintech blog handicaps several tech startups leading this effort, including Atidot, Quantemplate, Analyze Re, FitSense, and Wunelli.

3. The blind leading the blind in risk communication.
On the BMJ blog, Glyn Elwyn contemplates the difficulty of shared health decision-making, given people's inadequacy at understanding and communicating risk. Thanks to BMJ_ClinicalEvidence (@BMJ_CE).

4. You may know more than you think.
Maybe it's okay to hear voices. Evidence suggests the crowd in your head can improve your decisions. Thanks to Andrew Munro (@AndrewPMunro).

5. 'True' streaming analytics apps.
Mike Gualtieri of Forrester (@mgualtieri) put together a nice list of apps that stream real-time analytics. Thanks to Mark van Rijmenam (@VanRijmenam).

Wednesday, 20 April 2016

How to lead people through evidence-based decisions.

Decision Quality book

There's no shortage of books on strategy and decision-making - and many of them can seem out of touch. This one is worth reading: Decision Quality: Value Creation from Better Business Decisions by Carl Spetzler, Hannah Winter, and Jennifer Meyer (Wiley 2016).

The authors are decision analysis experts with the well-known, Palo Alto-based Strategic Decisions Group. Instead of presenting schemes or templates for making decisions, they get to the heart of the matter: decision quality, whether you're making big decisions or smaller choices. How will you decide? How will you teach your team to make high-quality decisions? And how will you define 'high quality'?

For example, for a healthcare formulary decision, outline in advance what findings will be considered. Cost-effectiveness modeling? Real-world evidence? How will evidence be weighted - possibly using multi-criteria decision analysis? How will uncertainty be factored in?
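As a rough illustration of what weighting evidence with multi-criteria decision analysis can look like, here is a minimal sketch of a weighted-sum score. The criteria, weights, and scores are invented for the example; a real formulary committee would define its own, and would probably add a sensitivity analysis on the weights to handle uncertainty.

```python
# Minimal sketch of a weighted-sum multi-criteria decision analysis (MCDA)
# for a hypothetical formulary decision. Criteria, weights, and scores are
# invented for illustration; they are not recommendations.

# Weights agreed in advance, summing to 1.0.
weights = {
    "cost_effectiveness": 0.4,
    "real_world_evidence": 0.3,
    "safety": 0.2,
    "ease_of_administration": 0.1,
}

# Each treatment scored 0-10 on each criterion (hypothetical numbers).
scores = {
    "Drug X": {"cost_effectiveness": 7, "real_world_evidence": 5,
               "safety": 8, "ease_of_administration": 6},
    "Drug Y": {"cost_effectiveness": 5, "real_world_evidence": 8,
               "safety": 7, "ease_of_administration": 9},
}

def weighted_score(criterion_scores, weights):
    """Weighted sum across criteria."""
    return sum(weights[c] * s for c, s in criterion_scores.items())

for drug, criterion_scores in scores.items():
    print(drug, round(weighted_score(criterion_scores, weights), 2))
```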

 

“If you want to change the culture of an organization, change the way people make decisions.” -Vincent Barabba

Key takeaways from this book:

- You can lead meaningful change by helping people fully understand that it's the decision process, not the outcome, that is under their control.

- Teach your team to make high-quality decisions. Build organizational capability so people use similar language and methods to assess evidence and analyze decisions.

- Get more buy-in with a better process, from initial concept to execution. Judge the quality of a decision as you go along.

 

Tuesday, 12 April 2016

Better evidence for patients, and geeking out on baseball.

Health tech wearables

1. SPOTLIGHT: Redefining how patients get health evidence.

How can people truly understand evidence and the tradeoffs associated with health treatments? How can the medical community lead them through decision-making that's shared - but also evidence-based?

Hoping for cures, patients and their families anxiously Google medical research. Meanwhile, the quantified selves are gathering data at breakneck speed. These won't solve the problem. However, this month's entire Health Affairs issue (April 2016) focuses on consumer uses of evidence and highlights promising ideas.

  • Translating medical evidence. Much of the evidence synthesis, and many of the guidelines, are written for healthcare professionals, not civilians. Knowledge translation has become an essential piece, although it doesn't always involve patients at early stages. The Boot Camp Translation process is changing that. The method enables leaders to engage patients and develop healthcare language that is accessible and understandable. Topics include colon cancer, asthma, and blood pressure management.
  • Truly patient-centered medicine. Patient engagement is a buzzword, but capturing patient-reported outcomes in the clinical environment is a real thing that might make a big difference. Danielle Lavallee led an investigation into how patients and providers can find more common ground for communicating.
  • Meaningful insight from wearables. These are early days, so it's probably not fair to take shots at the gizmos out there. It will be a beautiful thing when sensors and other devices can deliver more than alerts and reports - and make valuable recommendations in a consumable way. And of course these wearables can play a role in routine collection of patient-reported outcomes.


Statcast

2. Roll your own analytics for fantasy baseball.
For some of us, it's that special time of year when we come to the realization that our favorite baseball team is likely going home early again this season. There's always fantasy baseball, and it's getting easier to geek out with analytics to improve your results.

3. AI engine emerges after 30 years.
No one ever said machine learning was easy. Cyc is an AI engine that reflects 30 years of building a knowledge base. Now its creator, Doug Lenat, says it's ready for prime time. Lucid is commercializing the technology. Personal assistants and healthcare applications are in the works.

Photo credit: fitbit one by Tatsuo Yamashita on Flickr.

Tuesday, 05 April 2016

$15 minimum wage, evidence-based HR, and manmade earthquakes.

Fightfor15.org

Photo by Fightfor15.org

1. SPOTLIGHT: Will $15 wages destroy California jobs?
California is moving toward a $15/hour minimum wage (slowly, stepping up through 2023). Will employers be forced to eliminate jobs under the added financial pressure? As with all things economic, it depends who you ask. Lots of numbers have been thrown around during the recent push for higher pay. Fightfor15.org says 6.5 million workers are getting raises in California, and that 2/3 of New Yorkers support a similar increase. But small businesses, restaurants in particular, are concerned they'll have to trim menus and staff - they can charge only so much for a sandwich.

Moody's Analytics economist Adam Ozimek says it's not just about food service or home healthcare. Writing on The Dismal Scientist blog, he notes: "[I]n past work I showed that California has 600,000 manufacturing workers who currently make $15 an hour or less. The massive job losses in manufacturing over the last few decades has shown that it is an intensely globally competitive industry where uncompetitive wages are not sustainable."

It's not all so grim. Ozimek shows that early reports of steep job losses after Seattle's minimum-wage hike have been revised strongly upward. However, finding "the right comparison group is getting complicated."


USGS map: chance of earthquake

2. Manmade events sharply increase earthquake risk.
Holy smokes. New USGS maps show north-central Oklahoma at high earthquake risk. The United States Geological Survey now includes potential ground-shaking hazards from both 'human-induced' and natural earthquakes, substantially changing their risk assessment for several areas. Oklahoma recorded 907 earthquakes last year at magnitude 3 or higher. Disposal of industrial wastewater has emerged as a substantial factor.

3. Evidence-based HR redefines leadership roles.
Applying evidence-based principles to talent management can boost strategic impact, but requires a different approach to leadership. The book Transformative HR: How Great Companies Use Evidence-Based Change for Sustainable Advantage (Jossey-Bass) describes practical uses of evidence to improve people management. John Boudreau and Ravin Jesuthasan suggest principles for evidence-based change, including logic-driven analytics - for instance, establishing appropriate metrics for each sphere of your business rather than blanket adoption of measures like employee engagement and turnover.

4. Why we're not better at investing.
Gary Belsky does a great job of explaining why we think we're better investors than we are. By now our decision biases have been well documented by behavioral economists. Plus we really hate to lose - yet we're overconfident, somehow thinking we can compete with Warren Buffett.

Tuesday, 29 March 2016

Rapid is the new black, how to ask for money, and should research articles be free?

Digital Health Breakthrough Network

1. #rapidisthenewblack

The need for speed is paramount, so it's crucial that we test ideas and synthesize evidence quickly without losing necessary rigor. Examples of people working hard to get it right:

  • The Digital Health Breakthrough Network is a very cool idea, supported by an A-list team. They (@AskDHBN) seek New York City-based startups who want to test technology in rigorous pilot studies. The goal is rapid validation of early-stage startups with real end users. Apply here.
  • The UK's fantastic Alliance for Useful Evidence (@A4UEvidence) asks Rapid Evidence Assessments: A bright idea or a false dawn? "Research synthesis will be at the heart of the government’s new What Works centres" - equally true in the US. The idea is "seductive: the rigour of a systematic review, but one that is cheaper and quicker to complete." Much depends on whether the review maps easily onto an existing field of study.
  • Jon Brassey of the Trip database is exploring methods for rapid reviews of health evidence. See Rapid-Reviews.info or @rapidreviews_i.
  • Miles McNall and Pennie G. Foster-Fishman of Michigan State (ouch, still can't get over that bracket-busting March Madness loss) present methods and case studies for rapid evaluations and assessments. In the American Journal of Evaluation, they caution that the central issue is balancing speed and trustworthiness.

2. The science of asking for donations: Unit asking method.
How much would you give to help one person in need? How much would you give to help 20 people? This is the concept behind the unit asking method, a way to make philanthropic fund-raising more successful.

3. Should all research papers be free? 
Good stuff from the New York Times on the conflict between scholarly journal paywalls and Sci-Hub.

4. Now your spreadsheet can tell you what's going on.
Savvy generates a narrative for business intelligence charts in Qlik or Excel.

Tuesday, 22 March 2016

Who should prepare evidence for human consumption? A possible fix to the hiring dilemma.

Pasta being sliced

Health economists, financial analysts, and policy advisers, along with newly minted data scientists, face diverse challenges: Besides data-gathering and modeling, they analyze findings, demonstrate value, and advocate to others. No wonder these positions are so difficult to fill.

At some point, new evidence - economics, health, marketing - will have to be explained for human consumption. Who should do the explaining?

Writing in Tech Republic, Matt Asay (@mjasay) reminds us data science falls into two categories, depending on whether it's intended for human or machine consumption - the same can be said for all sorts of analytical activities. But eventually, most people who develop complex models, or create algorithms, or report findings will need to describe their work to someone making budget and strategy decisions.

That's some skill set. Delivering evidence for human consumption requires talent above and beyond complex technical or scientific expertise; communication skills are essential. In a recent Harvard Business Review article, Michael Li explains three key presentation capabilities:

  1. Articulating business value (defining success with metrics).
  2. Giving the right level of technical detail (the story behind the data).
  3. Getting visualizations right (telling a clean story with diagrams).

How many hats do I have to wear? Many organizations want their strong technical and scientific talent to be skillful presenters, capable of articulating insights to decision makers. Some are also expected to function as influencers, and champion analytical methodologies to business units or partners across the organization. But not everyone aspires - or is able - to wear all these hats.

Consider a recent job posting for an Audience Insights Manager at a consumer-facing tech firm. The company is seeking someone with a “proven understanding of consumers and an ability to distill data into compelling, actionable insights”. But that's not all: “The ideal candidate will be a subject matter expert on analytics and have a strong track record of building and managing high-performing teams. You will be required to articulate the vision of the team to sales and marketing leadership and be a strong advocate for insights. You must be a data expert, but also know how to articulate data into insights.” (Is this realistic?)

Why not pair up? Would it make more sense to pair people - say, partnering an analytics expert with an internal influencer whose talent is synthesizing evidence and articulating value to executive decision makers? Or a pharma investigator with an 'insight interpreter'? We've all seen settings where a subject matter expert handed work product to a writer who created deliverables. I'm thinking of an integrated team of peers, so each individual can contribute as much value as possible.

I welcome your thoughts.

Photo credit: Pasta Being Sliced by Dldrlks on Flickr.

Wednesday, 16 March 2016

Rethinking Abstracts & Bibliographies: Three Examples.

In healthcare, communication is long overdue for innovation. Here are three approaches that change how research evidence and health economics are presented to decision makers and other stakeholders.

1. From Taylor & Francis, cartoon abstracts are innovative introductions to journal articles. T&F says cartoons have already generated 11,000+ extra downloads for papers in science, technology, and math. "With the authors represented through characters in the cartoon strip, they’re also a useful networking tool amongst peers."

Taylor & Francis cartoon abstract

 

2. PepperSlice is an evidence synthesis format and technology I created with my team at Ugly Research. Key conclusions are presented in a brief narrative, and each supporting citation appears in a graphical format that highlights the data and how it was evaluated. (To get your name on the PepperSlice beta list, email me at tracy@uglyresearch.com.)

PepperSlice evidence synthesis format

 

3. Many online Elsevier journals supplement the print articles with interactive graphics. This effort began as the Article of the Future project.

Elsevier article of the future

 

Let me know what innovations you're developing, or would like to see.

Tuesday, 15 March 2016

Analytics disillusionment, evidence-based presentation style, and network analysis.

Polinode network visualization: new layout algorithm

1. Visualizing networks.
@Polinode builds innovative tools for network analysis. One nifty feature allows creation of column charts using a set of nodes. A recent post explains how to use calculated network metrics such as centrality or betweenness.
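Polinode computes these metrics in its web tools; for readers who want to experiment locally, here is a minimal sketch of the same kind of calculation using the open-source networkx library. The example network is made up.

```python
# Minimal sketch: compute network metrics like those mentioned above.
# Uses the open-source networkx library; the example edges are made up.
import networkx as nx

edges = [
    ("Ana", "Bo"), ("Ana", "Cy"), ("Bo", "Cy"),
    ("Cy", "Di"),                      # Cy bridges the two clusters
    ("Di", "Ed"), ("Di", "Flo"), ("Ed", "Flo"),
]
G = nx.Graph(edges)

degree = nx.degree_centrality(G)            # share of direct connections
betweenness = nx.betweenness_centrality(G)  # how often a node sits on shortest paths

for node in G.nodes:
    print(f"{node:>4}  degree={degree[node]:.2f}  betweenness={betweenness[node]:.2f}")
```

Values like these are exactly what could feed the column charts described above, one bar per node.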

2. Analytics are disconnected from strategic decisions.
An extensive study suggests analytics sponsors are in the trough of disillusionment. The new MIT Sloan-SAS report, Beyond the hype: The hard work behind analytics success, finds that competitive advantage from analytics is declining. How can data do more to improve outcomes?

Analytics insights MIT-SAS report

The @mitsmr article notes several difficulties, including failure to drive strategic decisions with analytics. "Over the years, access to useful data has continued to increase, but the ability to apply analytical insights to strategy has declined." Dissemination of insights to executives and other decision makers is also a problem. The full report is available from SAS (@SASBestPractice).

3. Evidence shows graphics better than bullets.
There's new empirical evidence on communicating business strategy. In the study, 76 managers saw a presentation by the financial services branch of an auto manufacturer, supported by one of three visual formats: a bulleted list, a visual metaphor, or a temporal diagram. Those who saw a graphical representation paid significantly more attention to, agreed more with, and better recalled the strategy than subjects who saw a (textually identical) bulleted list. However, no significant difference was found in *understanding* of the strategy. Also, presenters who used graphical representations were perceived more positively than those who presented bulleted lists.

4. Linking customer experience with value.
McKinsey's Joel Maynes and Alex Rawson offer concrete advice on how to explicitly link customer experience initiatives to value. "Develop a hypothesis about customer outcomes that matter. Start by identifying the specific customer behavior and outcomes that underpin value in your industry. The next step is to link what customers say in satisfaction surveys with their behavior over time."

5. Never mind on that reproducibility study.
Slate explains how Psychologists Call Out the Study That Called Out the Field of Psychology. In a comment published by Science, reviewers conclude that "A paper from the Open Science Collaboration... attempting to replicate 100 published studies suggests that the reproducibility of psychological science is surprisingly low. We show that this article contains three statistical errors and provides no support for such a conclusion. Indeed, the data are consistent with the opposite conclusion, namely, that the reproducibility of psychological science is quite high." Evidently, OSC frequently used study populations that differed substantially from the original ones - and each replication attempt was done only once.

Tuesday, 08 March 2016

NBA heat maps, FICO vs Facebook, and peer review.

Steph Curry shooting heat maps (Basketball-Reference.com)

1. Resistance is futile. You must watch Steph Curry.
The Golden State Warriors grow more irresistible every year, in large part because of Curry’s shooting. With sports data analytics from Basketball-Reference.com, these heat maps illustrate his shift to 3-pointers (and leave no doubt why Curry was called the Babyfaced Assassin; now of course he’s simply MVP).
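If you want to roll your own version of a chart like this, here is a minimal sketch using matplotlib's 2-D histogram. The shot coordinates are randomly generated stand-ins, not Basketball-Reference.com data, and the court dimensions are rough assumptions.

```python
# Minimal sketch: a shot-location heat map with matplotlib.
# Coordinates are random stand-ins, not real Basketball-Reference.com data.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

# Fake shot locations in feet (x: sideline to sideline, y: distance from baseline).
x = rng.normal(loc=0, scale=10, size=1000)
y = np.abs(rng.normal(loc=14, scale=8, size=1000))

fig, ax = plt.subplots(figsize=(6, 5))
heat = ax.hist2d(x, y, bins=30, cmap="hot")
fig.colorbar(heat[3], ax=ax, label="shot attempts")
ax.set_xlabel("court x (ft)")
ax.set_ylabel("court y (ft)")
ax.set_title("Shot-location heat map (simulated data)")
plt.show()
```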

2. Facebook vs FICO.

Conventional (FICO) vs. creepy (Facebook): What's the future of consumer lending decisions?

Fintech startups are rethinking how people access their money and borrow from individuals or institutions. Peer-to-peer lending (Lending Club, Prosper, and others) is one of the innovative business models being explored. NerdWallet is adding speed and transparency to choosing financial products.

What signals a reasonable credit risk? Another big idea is replacing traditional credit scores with rankings derived from social media profiles and other data. Just 3 months ago, Affirm and others were touted in Fortune’s piece Why Facebook Profiles are Replacing Credit Scores.

Not so fast. But now the Wall Street Journal says those decisions are falling out of favor, in Facebook Isn’t So Good at Judging Your Credit After All. Turns out, regulatory restrictions and limited data-sharing policies are interfering. Plus, executives with startups like ZestFinance find social-media lending “creepy”. 

Naturally, a lot more goes into a consumer lending decision than a simple evaluation of a score, whether it's traditional FICO or an experimental Facebook metric. ZestFinance is doing some interesting work, using machine learning to discover meaningful patterns in people's credit reports. The path to better #fintech will likely be a bumpy ride, but it's fascinating to watch.

3. How to fix science journals.
Harvard Med School’s Jeffrey Flier wrote an excellent op-ed for the Wall Street Journal, How to Keep Bad Science from Getting into Print [paywall]. Key issues: anonymous peer reviewers, and lack of transparent post-publishing dialogue with authors (@PubPeer being a notable exception). Flier says we need a science about how to publish science. Amen to that.

4. Longing for civil, evidence-based discourse?
ProCon.org publishes balanced coverage of controversial issues, presenting side-by-side pros and cons supported by evidence. The nonprofit’s site is ideal for schoolteachers, or anyone wanting a quick glance at important findings.

Tuesday, 01 March 2016

How to show your evidence is relevant.

Penguin navel-gazing

To be inspired, your audience needs to see how findings are reliable and relevant. In Part 1, I talked about creating practical checklists to ensure data-driven research is reproducible. This post describes how to deliver results that resonate with your audience.

It’s nice when people review analytical findings, think "Hmmm, interesting," and add the link to bitly. It’s exponentially nicer when they say “Holy smokes, let’s get started!” Certainly there are big differences between publishing a report, populating an executive dashboard, and presenting face-to-face. But these three techniques can be applied in many settings.

1. Avoid navel-gazing. Regardless of how elegant the analytics are, if your audience doesn’t understand what they might do with them, your efforts won’t have impact. All of us must resist the urge to overemphasize our expertise and hard work, and focus on helping others achieve more. Ask yourself which insights can help someone grow their business, improve team performance, or create social good.

2. Show relationships explicitly. Now more than ever, organizations urgently need *actionable insights* rather than findings. Of course you won’t always know people’s potential actions or decisions; addressing them directly can make you sound presumptuous, or just plain wrong. But you should know the subject matter well enough to anticipate objectives, values, or priorities. Be sure to connect to outcomes that are meaningful: Whenever possible, include a simple illustration, so people see key relationships at a glance.

Example: Before. Writeup of results (paragraphs or bullet points). “Patient engagement enables substantial provider cost savings. In a recent RCT, interactive, web-based patient engagement cut sedation needs 18% and procedure time 14% for first-time colonoscopy patients.”

Example: After. Use a simple illustration of associations, cause-effect, or before-after data relationships.
Interactive colonoscopy education → 14% faster procedures

EMMI offers an excellent example in this writeup of patient engagement research. Note how they name-drop respected medical centers doing a randomized, controlled trial - but quickly shift to simple, powerful visuals and descriptions of the business problem, evidence, and value message. (Bonus points for this Vimeo.)

Line Chart example

3. Build a better dashboard.
Data visualizations on dashboards effectively show what’s happening now, or what already happened. But when you are in a position to specifically advise decision makers, more is required. This spot-on observation by James Taylor (@jamet123) at Decision Management Solutions says it well: “Dashboards are decision support systems, but paradoxically, their design does not usually consider decisions explicitly.”

Example: Before. Graphics can indeed be worth 1,000 words, but simple information feeds and routine forecasts are a commodity.

Example: After. Decision makers need predictions and recommendations/prescriptive analytics. Most powerful are insights into the expected outcomes from untried, hopefully curve-bending or needle-moving activities. Some innovative variations on the standard dashboard are:

- List of specific decisions that could influence the numbers being predicted.

- List of actions that have influenced the numbers being displayed. 

- Environment where people can do what-ifs and think through different operational decisions (a minimal sketch follows below). Formalizing this in a dashboard isn't realistic for every organization. Accenture provides excellent perspective on industrializing the insight-action-outcome sequence.
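To make that last item concrete, here is a minimal sketch of the kind of what-if calculation a dashboard could expose. The simple linear model, its coefficients, and the scenarios are invented for illustration; a real implementation would sit on top of an actual forecasting model.

```python
# Minimal sketch of a dashboard "what-if": vary decision inputs and see
# the projected outcome. The toy model and its coefficients are invented
# for illustration only.

BASELINE_MONTHLY_REVENUE = 500_000  # hypothetical baseline

def projected_revenue(discount_pct: float, ad_spend: float) -> float:
    """Toy model: discounts lift volume but cut margin; ads lift demand."""
    volume_lift = 1 + 0.02 * discount_pct + 0.00001 * ad_spend
    margin_factor = 1 - discount_pct / 100
    return BASELINE_MONTHLY_REVENUE * volume_lift * margin_factor

# Think through a few operational decisions side by side.
scenarios = {
    "status quo":     dict(discount_pct=0, ad_spend=0),
    "5% discount":    dict(discount_pct=5, ad_spend=0),
    "ad push":        dict(discount_pct=0, ad_spend=50_000),
    "discount + ads": dict(discount_pct=5, ad_spend=50_000),
}

for name, inputs in scenarios.items():
    print(f"{name:15} -> ${projected_revenue(**inputs):,.0f}")
```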