Evidence Soup
How to find, use, and explain evidence.

Friday, 05 June 2015

Evidence-based executives and insight integrators.

Five for Friday.

1. Lori C. Bieda of SAS is spot on, describing how analytics professionals can grow into roles as trusted advisors for senior executives. In The Translation Layer: The Role of Analytic Talent, she explains that "Analytics teams... need to evolve from data providers into insight integrators." Lots of detailed observations and recommendations in this white paper: Highly recommended.

2. Very cool. An executive-level course in Improving Decision-Making Through Evidence-Based Management. June 30-July 2 (UK). Led by Rob Briner of Bath University (@Rob_Briner) and Eric Barends of the Center for Evidence-Based Management.

3. More on the insight economy. “The rise of insights as a service represents a seismic shift in the business world. We believe it will create a huge new market segment.” IBM's Joel Cawley says data is a commodity, and analytics tools are often too complicated to use. So the insight service approach clears both these hurdles.

4. Fresh voice: Eugene Wei writes about Supposedly Irrelevant Factors, pondering why some insist on believing in rational homo economicus - and how behavioral economics repeatedly debunks this. "Hand waving is required because there is nothing in the workings of markets that turns otherwise normal human beings into Econs."

5. Even more on the insight economy. Krista Schnell (Accenture) explains how companies are monetizing their data further up the value chain, closer to the insight/decision step, and further from raw data. Synopsis, plus a link to the slides from her Strata presentation. (@KristaSchnell)

Friday, 19 December 2014

Fun-with-Evidence Friday: Super food facts and egg nog decision making.


Super facts. Recently I attended a chocolate & cheese party, and took along some bars from Vosges. They're tasty, yes, but I was surprised by the back label, boasting super facts about this super dark, super food. Health benefits are said to include detoxification and general wellness; nutritional boosters include anti-inflammatory properties.

Would you drink that egg nog if you knew its calorie count? The U.S. FDA is requiring menu labeling for alcoholic drinks. Advocates claim the "Centers for Disease Control and Prevention found that nearly 60 percent of adults use calorie information on menus to decide what to order". Not really: The CDC study includes the caveat that "data were not available to determine whether frequent or moderate [menu label] users choose more healthful foods than nonusers." So, evidently a 'label user' is someone who *says* they read the label. Will this regulatory intervention have any beneficial impact on public health? (Fun fact: In a previous life, I wrote a book about U.S. Nutrition Facts labeling requirements.)

Wishing everyone a delightful and prosperous 2015. 

Friday, 05 December 2014

"Big E" vs. "little e" evidence, the hero's journey, and provisional truth.

1. It's tempting to think there's a hierarchy for data: That evidence from high-quality experiments is on top at Level 1, and other research findings follow thereafter. But even in healthcare - the gold standard for the "gold standard" - it's not that simple, says NICE in The NICE Way: Lessons for Social Policy and Practice from the National Institute for Health and Care Excellence. The report, produced with the Alliance for Useful Evidence (@A4UEvidence), cautions against "slavish adherence" to hierarchies of research. "The appropriateness of the evidence to the question is more important, not its place in any such hierarchy." NICE articulates how its decision model works, and argues why it's also relevant outside healthcare.

2. Don't let "little e" evidence kill off good Big Ideas. Take note, lean startups + anyone new to the ways of validating ideas with evidence. In their should-be-considered-a-classic, Is Decision-Based Evidence Making Necessarily Bad?, Tingling and Brydon examine the different uses of evidence in decision-making (MIT Sloan Management Review). As a predictive tool, sometimes it's flat wrong: Those iconic Aeron chairs in tech offices everywhere? Utterly rejected by Herman Miller's market-research focus groups. It's good to have a culture where "small" evidence isn't just an excuse to avoid risk-taking. But it's also good to look at "Big E" Evidence, assessing what research is predictive over time, and replace older methods (focus groups, perhaps).

3. 10+ years ago, Billy Beane famously discovered powerful analytic insights for managing the Oakland A's baseball* team, and as a reward was portrayed by Brad Pitt in Moneyball. Now a bipartisan group of U.S. federal leaders and advisors has published Moneyball for Government, intending to encourage use of data, evidence, and evaluation in policy and funding decisions. On Twitter at @Moneyball4gov. *The A's play not far from Ugly Research HQ, and much to our dismay, the Moneyball competitive advantage has long since played out. But Billy is still a great analyst; for the moment, we're just holding our breath.

4. We've barely scratched the surface on figuring out how to present data to decision makers. All Analytics did a web series this week, The Results Are In: Think Presentation From the Start (recordings and slide decks available). One of the highlights was a comparison to Joseph Campbell's hero's journey, by James Haight of Blue Hill Research (on Twitter @James_Haight).

5. We're wired to seek certainty. But Ted Cadsby argues the world is too complex for our simplified conclusions. He suggests probabilistic thinking to arrive at a "provisional truth" that we can test over time in Closing the Mind Gap: Making Smarter Decisions in a Hypercomplex World.

Five for Friday: 5-Dec-2014.

Monday, 17 November 2014

Cognitive analytics, satisficing, sabermetrics, and other cool ways of deciding.

My weekly five links on evidence & decision-making. 

1. I was delighted when a friend sent a link to 10 Things I Believe About Baseball Without Evidence. Ken Arneson (@kenarneson) looks at sabermetrics through things like the linguistic relativity principle and the science of memory-based prediction. Warning to Oakland A's fans: Includes yet more reminders of the epic 2014 collapse.

2. The concept of satisficing came up on one of my current projects, so I reread some overviews. Herbert Simon coined this term to explain decision making under circumstances where an optimal solution can't be determined. (Simon sure has staying power: During grad school I wanted a copy of Administrative Behavior (1957), and there it was in paperback on the shelf at the Tattered Cover in Denver.)

3. Eric Topol (Medscape) interviewed Margaret Hamburg, US FDA commissioner, on weighing risks/benefits in complex agency decision-making. She talked specifically about the newly restrictive policy on opiate painkiller prescriptions (see page 3). I'd like to see FDA's analysis of unintended side effects, such as the concurrent rise of U.S. heroin use/overdoses.

4. Organizations are challenged with finding the pony in all their data, and marketing-spin hijinks have ensued. Seth Grimes (@sethgrimes) has a great discussion of efforts to go beyond the fundamental volume, velocity, etc. in Avoid Wanna-V Confusion.

5. Steve Ardire (@SArdire) authored a Dataversity paper on cognitive computing, including survey results and some choice comments, both pro and con, about the prospects for this emerging area. For creating business value, 'business intelligence/cognitive analytics' looks promising. Cognitive Computing: An Emerging Hub in IT Ecosystems.
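The satisficing idea in #2 above is mechanical enough to sketch in code: instead of ranking every option to find the best, a satisficer accepts the first option that clears an aspiration level. A toy illustration (the vendors, scores, and threshold are invented for the example):

```python
# Satisficing vs. optimizing: a toy sketch of Herbert Simon's idea.
# Vendors and scores below are invented for illustration.

def satisfice(options, aspiration):
    """Return the first option meeting the aspiration level, or None."""
    for name, score in options:
        if score >= aspiration:
            return name  # good enough -- stop searching
    return None

def optimize(options):
    """Return the best option, which requires examining every one."""
    return max(options, key=lambda opt: opt[1])[0]

vendors = [("Acme", 6.0), ("Birch", 8.5), ("Cedar", 9.9)]

print(satisfice(vendors, aspiration=8.0))  # Birch: first to clear the bar
print(optimize(vendors))                   # Cedar: best, but a full search
```

The satisficer stops as soon as the aspiration level is met, which is exactly Simon's point: when the optimum can't be determined (or isn't worth the search cost), "good enough" is a rational stopping rule.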

Curated by Tracy Altman of Ugly Research


Friday, 07 November 2014

Data & Decision-Making: Five for Friday 7-Nov-2014.

1. "A gut is a personal, nontransferable attribute, which increases the value of a good one." This classic from Harvard Business Review recaps how policy makers have historically made big decisions. It's never just about the data. A Brief History of Decision Making.

2. James Taylor, an expert on decision management whom I admire, is coauthor (with Tom Debevoise) of a new book bringing a decision management perspective to process discovery and design. I like to think everything's better with explicitly identified decision steps. The DMN notation makes it much easier to model, execute, and interchange business rules. MicroGuide to Process and Decision Modeling in BPMN/DMN.

3. A reminder to look for the nonobvious. This analysis examines differences in parole hearing outcomes: The usual suspects, such as crime committed, don't always explain why one prisoner is paroled and another is not. Turns out, it's best to go up first thing in the morning. [NY Times] Do You Suffer From Decision Fatigue?

4. IBM sponsored a paper by Ventana Research connecting advanced analytics and business intelligence (finally!), making the excellent point that unless BI insights "are inserted into decision-making processes," they have minimal value. (And then sadness. Ventana weakens their case with "Robust new technology enables better decision-making." How do you measure ‘robust’?) Available at Advanced Analytics Enhances Business Intelligence.

5. Doctor 'decider' skills fade after lunchtime. Bring snacks. Researchers analyzed billing and electronic health records for 20,000+ patients. About 5% more got antibiotics prescriptions later in the day. Findings in JAMA Internal Medicine. Data Shows Docs Prescribe More Antibiotics Late in Day.


Are you interested in creating better ways to inform people's decision making? At Ugly Research, I'm always looking for people who want to shape this idea.

Wednesday, 15 October 2014

The lean way to present evidence to decision makers.

There's a lot of advice out there about how to design presentations. But most of that fails to prepare you for delivering complex evidence to senior-level decision-makers. Here's what I do. Hope this helps.

First, ask yourself this: How might your evidence help someone better understand the steps required to reach an important goal?

  1. To develop an answer, put together what I call lean evidence, embracing lean management concepts. As explained by the Lean Enterprise Institute, "The core idea is to maximize customer value while minimizing waste." Keep this in mind when planning a presentation, writing your report, or sending that email: Focus on what's valuable, and reduce waste by stripping out nonessentials. Let value flow past the details to what's important to your audience.
  2. Skip the storytelling. Begin with "Boom! Here's my answer." You're not Steve Jobs, and this isn't a TED talk. You're delivering lean evidence to a busy executive, so think of all that buildup as waste. Stay true to lean, and get rid of it. Jeanne Tari, VP at Power Speaking, makes a similar point, saying the way to present to executives is to "bottom line it first, then have a dialogue".
  3. Go easy on the pretty pictures. Everybody loves eye candy. But the data visualization is not the point: It just helps you make your point.
  4. Connect dots that matter. Keep the focus on your insights, and how they can help the decision maker improve outcomes. (If you find that you're simply reporting results without connecting at least two important things together, then go back and re-evaluate.)
  5. Avoid the dreaded SMEs disease. Provide enough detail about your methods to establish credibility as a subject matter expert. Then stop. Pay yourself $5 for every word you delete. Andrew a/k/a @theFundingGuru gives this advice, and I agree.

Monday, 18 August 2014

How to tell if someone really is a unicorn.


This Sunday is Unicorn Backpack giveaway day at the Oakland A's game. Given the current mythology about good data scientists a/k/a unicorns, Billy Beane of baseball analytics fame (G.M. of the Athletics) comes to mind.

Unicorn verification process. I'm not minimizing the difficulty of policy research, analytics, data science, and other efforts to find meaningful patterns in data. But communication skills and business savvy dramatically influence people's ability to succeed. As part of an engagement or hiring process, I suggest asking a potential unicorn these questions:

1. What evidence have you worked with that can potentially improve outcomes? Where might it be applicable?
2. How do you translate a complex analysis into plain English for executive decision makers?
3. What visuals are most effective for connecting findings to important business objectives?

Can you talk the talk and walk the walk? While Mr. Beane brilliantly recognized the value of OBP and other underappreciated baseball stats, that's not what made him a unicorn. His ability to explain his findings and advocate for nonobvious, risky, high-stakes management decisions - and to later demonstrate a payoff from those decisions - is what made him a unicorn.

A colleague of mine worked at a successful, publicly traded telecom company. As a PhD economist, he managed a group of 25 economists. And he says the reason he led the team, and did most of the interacting with senior executives, was that he could explain their economic modeling in business terms appropriate for the audience. 

Connect to what matters. Accenture’s extensive research of analytics ROI has found that “most organizations measure too many things that don’t matter, and don’t put sufficient focus on those things that do, establishing a large set of metrics, but often lacking a causal mapping of the key drivers of their business.”

It's a common theme: Translate geek to English. SAP’s chief data scientist, David Ginsberg, says a key player on his big-data team is someone “who can translate PhD to English. Those are the hardest people to find”. Kerem Tomak, who manages 35 retail analysts, explained to Information Week that “A common weakness with data analytics candidates is they’re happy with just getting the answer, but don’t communicate it”. "The inability to communicate with business decision-makers is not just a negative, it's a roadblock," says Jeanne Harris, global managing director of IT research at Accenture and author of two books on analytics. 

Will Mr. Beane be wearing a unicorn backpack at the game on Sunday? I sure hope so. 

Wednesday, 13 August 2014

Interview Wednesday: James Taylor on decision management, analytics, and evidence.

For Interview Wednesday, today we hear from James Taylor, CEO of Decision Management Solutions in Palo Alto, California. Email him james@decisionmanagementsolutons.com, or follow on Twitter @jamet123. James' work epitomizes the mature use of evidence: Developing decision processes, figuring out ahead of time what evidence is required for a particular type of decision, then continually refining that to improve outcomes. I'm fond of saying "create a decision culture, not a data culture" and decision management is the fundamental step toward that. One of the interesting things he does is show people how to apply decision modeling. Of course we can't always do this, because our decisions aren't routine/repeatable enough, and we lack evidence - although I believe we could achieve something more meaningful in the middle ground, somewhere between establishing hard business rules and handling every strategic decision as a one-off process. But enough about me, let's hear from James.

#1. How did you get into the decision-making field, and what types of decisions do you help people make?
I started off in software as a product manager. Having worked on several products that needed to embed decision-making using business rules, I decided to join a company whose product was a business rules management system or BRMS. While there are many things you can do with business rules and with a BRMS, automating decision-making is where they really shine. That got me started but then we were acquired by HNC and then FICO – companies with a long history of using advanced predictive analytics as well as business rules to automate and manage credit risk decisions.

That brought me squarely into the analytics space and led me to the realization that Decision Management – the identification, automation, and management of high volume operational decisions – was a specific and very high-value way to apply analytics. That was about 11 or 12 years ago and I have been working in Decision Management and helping people build Decision Management Systems ever since. The specific kinds of decisions that are my primary focus are repeatable, operational decisions made at the front line of an organization in very high volume. These decisions, often about a single customer or a single transaction, are generally wholly or partly automated as you would expect. They range from decisions about credit or delivery risk to fraud detection, from approvals and eligibility decisions to next best action and pricing decisions. Often our focus is not so much on helping a person MAKE these decisions as helping them manage the SYSTEM that makes them.

#2. There's no shortage of barriers to better decision-making (problems with data, process, technology, critical thinking, feedback, and so on). Where does your work contribute most to breaking down these barriers?
I think there are really three areas – the focus on operational decisions, the use of business rules and predictive analytics as a pair, and the use of decision modeling. The first is simply identifying that these decisions matter and that analytics and other technology can be applied to automate, manage, and improve them. Many organizations think that only executives or managers make decisions and so neglect to improve the decision-making of their call center staff, their retail staff, their website, their mobile application etc.

The ROI on improving these decisions is high because although each decision is small, the cumulative effect is large because these decisions are made so often. The second is in recognizing business rules and predictive analytics as a pair of technologies. Business rules allow policies, regulations, and best practices to be applied rigorously while maintaining the agility to change them when necessary. They also act as a great platform for using predictive analytics, allowing you to define what to DO based on the prediction. Decision Management focuses on using them together to be analytically prescriptive. The third is in using decision modeling as a way to specify decision requirements. This helps identify the analytics, business rules and data required to make and improve the decision. It allows organizations to be explicit about the decision making they need and to create a framework for continuous improvement and innovation.
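The pairing James describes, where a predictive score feeds business rules that decide what to DO with the prediction, can be sketched in a few lines. This is a hypothetical illustration, not his implementation; the scoring logic, thresholds, and actions are all invented:

```python
# Hypothetical sketch of the rules + analytics pairing: a predictive
# model scores a transaction, then business rules turn the score
# (plus policy) into an action. All thresholds/actions are invented.

def predict_fraud_risk(txn):
    """Stand-in for a trained model: returns a risk score in [0, 1]."""
    score = 0.2
    if txn["amount"] > 1000:
        score += 0.4
    if txn["country"] != txn["card_country"]:
        score += 0.3
    return min(score, 1.0)

def decide(txn):
    """Business rules: policy decides what to DO with the prediction."""
    risk = predict_fraud_risk(txn)
    if risk >= 0.7:
        return "decline"
    if risk >= 0.4:
        return "manual_review"
    return "approve"

txn = {"amount": 1500, "country": "FR", "card_country": "US"}
print(decide(txn))  # decline: high amount plus country mismatch
```

Note the separation: the model only predicts, while the rules encode policy. That split is what lets an organization change the thresholds (an agility requirement) without retraining the model, and vice versa.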

#3. One challenge with delivering findings is that people don't always see how or where they might be applied. In your experience, what are some effective ways to encourage the use of data in decision making?
Two things really seem to help. The first is mapping decisions to metrics, showing executives and managers that these low-level decisions contribute to hitting key metrics and performance indicators. If, for instance, I care about my level of customer retention, then all sorts of decisions made at the front line (what retention offer to make, what renewal price to offer, service configuration recommendations) make a difference. If I don't manage those decisions then I am not managing that metric. Once they make this connection they are very keen to improve the quality of these decisions and that leads naturally to using analytics to improve them.

Unlike executive or management decisions there is a sense that the "decision makers" for these decisions don't have much experience and it is easier therefore to present analytics as a critical tool for better decisions. The second is to model out the decision. Working with teams to define how the decision is made, and how they would like it to be made, often makes them realize how poorly defined it has been historically and how little control they have over it. Understanding the decision and then asking "if only" questions – "we could make a better decision if only we knew…" – make the role of analytics clear and the value of data flows naturally from this. Metrics are influenced by decisions, decisions are improved by analytics, those analytics require data. This creates a chain of business value.

#4. On a scale of 1 to 10, how would you rate the "state of the evidence" in your field? Where 1 = weak (data aren't available to show what works and what doesn't), and 10 = mature (people have comprehensive evidence to inform their decision-making, in the right place at the right time).
This is a tricky question to answer. In some areas, like credit risk, the approach is very mature. Few decisions about approving a loan or extending a credit card limit are made without analytics and evidence. Elsewhere it is pretty nascent. I would say 10 in credit risk and fraud, 5-6 in marketing and customer treatment, and mostly 1-2 elsewhere.

#5. What do you want your legacy to be?
From a work perspective I would like people to take their operational decisions as seriously as they take their strategic, tactical, executive, and managerial decisions.

Thanks, James. Great explanation of creating value with evidence by stepping back and designing a mature decision process.

Tuesday, 05 August 2014

Tech needs to embrace diversity in more ways than one.

In the U.S. there’s a push for more opportunity and diversity in the tech industry (for good reason, judging from recent statistics). Diversity is an important social goal. Where I live in Oakland, California, good work is being done to foster inclusion in tech. But I see another, related problem: We need more diversity of thought to stop producing the same types of data for the same types of audiences. Here’s my evidence.

Sheep herd photo by Linda Tanner on Flickr

Data isn't enough. It seems that business intelligence technology, big data startups, and analytics are everywhere. Nothing wrong with people becoming more productive and making more evidence-based decisions. Of course, many technologies seek to replace people with algorithms: Nothing wrong with that either, in some cases (I explore this in my recent report, Data is Easy, Deciding is Hard.)

But while we're collecting data, why don't we do more for the human decision maker? Tech vendors are producing lots of impressive dashboard and visualization functionality, but not enough tools for synthesizing complex evidence, evaluating difficult situations, and overcoming our bad decision-making habits. Tech is producing too many nicely displayed facts without explanation: Lots of 'what' and not enough 'why'. 

Data viz isn't enough. With more diverse thinking, we could build practical tools that visualize decisions, show causal mappings, and capture a whole story from numerous sources of evidence. Consider the new 8.2 release from Tableau, a very successful maker of data visualization tools. The company says it's "obsessed with data. Connecting to data, analyzing data, and communicating with data." Their new Story Point feature is nice. But, as you can see in their Austin Teacher Turnover example, the 'story' is long on facts and short on real story: We don't see the specifics of the Reach program, we don't know which Austin groups supported it and which ones didn't, or why it failed. And we don't see what decisions were made by school officials implementing the program, or which actions are connected to which outcomes. Rather than yet another data viz, why don't the smart, capable people at Tableau think differently and produce something more comprehensive and innovative?

Austin Teacher Turnover visualization by Tableau Story Points

BI getting bigger, not better. I'm not alone in questioning the value of some of the new data tools. Business intelligence usage is flat: A widely cited 2014 survey by TDWI reported a 6% decline in respondents seeing significant impact, down to only 28%.

We're not connecting action to outcome. One of the best critiques / analyses I've seen is Accenture’s extensive study Analytics in Action: Breakthroughs and Barriers on the Journey to ROI. Their research shows that “most organizations measure too many things that don’t matter, and don’t put sufficient focus on those things that do, establishing a large set of metrics, but often lacking a causal mapping of the key drivers of their business.” Accenture underscores the “need to industrialize the insight-action-outcome sequence”. Highlighting the absence of tools designed for decision-making, they conclude that most companies “fail to embed analytical insights in key decision processes so that analytics capabilities are linked to business outcomes.”

Frank Bien of Looker tells the hard truth: “The common view of the past five years is that users are stupid and that data needs to be spoon-fed to them via pretty pictures…. It’s time to strike a new balance: to join ‘big data’ to business data in such a way that it serves the business - and doesn’t just grow a big data repository.” 

What can be done? Hiring people with diverse experience, and engaging a diverse set of customers, is a good first step toward finding better problems to solve. Diversity of investment - in the public, private, and third sectors - is another needed step, and that's being recognized. Christopher Mims wrote recently that "The entire Bay Area appears to have given up on solving anything but its own problems: those afflicting the same 20-somethings who are building these startups." Of course they don't do this all by themselves: Venture capitalists are being accused of "focusing exclusively on the first-world segment of twentysomething yuppies".

Yes, we need more hiring diversity, but please don't take away the 20-somethings. As a startup founder in the Bay Area, I benefit from several of their clever, disruptive, well-executed solutions, particularly Lyft, Munchery, Caviar, and Instacart.

Adorable sheep photo, Why I Was Late for Church Today, by Linda Tanner / CC BY.

#divtech #dataviz #diversity #siliconvalley #decisionmaking


Monday, 21 July 2014

The Data-Driven vs. Gut Feel hyperbole needs to stop.

Smart decision-making is more complicated than becoming ‘data-driven’, whatever that means exactly. We know people can make better decisions if they consider relevant evidence, and that process is getting easier with more data available. But too often I hear tech advocates suggest that people’s decisions are just based on gut feel, as if data will save us from ourselves.


We need to put an end to the false dichotomy of 'data-driven' vs. 'human intuition'. Consider the challenge of augmenting the performance of a highly skilled professional, such as a medical doctor. Investor Vinod Khosla claims technology will replace 80%+ of physicians’ role in the decision-making process. “Human judgment simply cannot compete against machine-learning systems that derive predictions from millions of data points”. Perhaps so, but it’s really tricky to blend evidence into patient care processes: Research in BMJ reveals mixed results from clinical decision support technology, particularly systems that deliver alerts to doctors who are writing prescriptions.

Data+People=Better. One tech enthusiast compares IBM’s Watson to a hospital CEO. Ron Shinkman asks if it could “be programmed to pore over business cases, news clippings, algorithms and spreadsheets to make the same recommendations?” Actually, that’s what Watson does. But Shinkman overlooks the real opportunity: To supplement, not replace, a CEO’s analytical skills. (Note: This is an excerpt from a research paper I recently wrote at Ugly Research.)

Why IT Fumbles Analytics. In an excellent Harvard Business Review analysis of how decision makers assimilate data, Donald Marchand and Joe Peppard explain that managers' use of information lacks "structure. Even when an organization tries to capture their information needs, it can take only a snapshot, which in no way reflects the messiness of their jobs. At one moment a manager will need data to support a specific, bounded decision; at another he'll be looking for patterns that suggest new business opportunities or reveal problems."

Here's another example of a claim that new technology will replace human intuition with fact-driven decision-making.


Source: Business analytics and optimization for the intelligent enterprise (IBM).

You’re not the boss of me. There’s a right time and a wrong time to look at data. As Peter Kafka explains, Netflix executives enthusiastically use data to market TV shows, but not to create them. Others agree data can interrupt the creative process. In The United States of Metrics, Bruce Feiler observes that data is often presented as if it contains all the answers. But “metrics rob individuals of the sense that they can choose their own path.”

However, people could do better. Of course decision makers frequently should ignore their instincts. Andrew McAfee gives examples of algorithms that outperform human experts, and explains why our intuition is uneven (we need cues and rapid feedback).

The Economist Intelligence Unit asked managers “When taking a decision, if the available data contradicted your gut feeling, what would you do?” Most preferred to crunch some more numbers. Only 10% said they would follow the action suggested by the data. The sponsors of Decisive action: How businesses make decisions and how they could do it better concluded that while “many business leaders know they need to make better use of data, it’s clear that they don’t always know how best to do so, or which data they should select from the enormous quantity available to them. They are constrained by their ability to analyse data, rather than their access to it.”

How do you challenge a decision maker? When data is available to improve a result, it must be communicated so it challenges people to apply it, not deny it. One way is to provide initial recommendations, and then require anyone who takes exception to enter notes explaining their rationale. Examples: Extending offers to customers on the telephone, or prescribing medical treatments.
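That challenge-not-deny pattern is simple to make concrete: the system records its recommendation, and an override is accepted only with a written rationale. A minimal sketch under that assumption (the function, cases, and field names are invented for illustration):

```python
# Sketch of "recommend, then require a rationale to override".
# Function, case IDs, and field names are invented for illustration.

decision_log = []

def record_decision(case_id, recommendation, chosen, rationale=None):
    """Accept the recommendation, or an override with a written rationale."""
    if chosen != recommendation and not rationale:
        raise ValueError("Overriding the recommendation requires a rationale.")
    decision_log.append({
        "case": case_id,
        "recommended": recommendation,
        "chosen": chosen,
        "rationale": rationale,
    })

# Following the recommendation needs no explanation...
record_decision("A-101", recommendation="offer_discount", chosen="offer_discount")

# ...but taking exception does, which creates an audit trail of rationales.
record_decision("A-102", recommendation="offer_discount", chosen="no_offer",
                rationale="Customer already on promotional pricing.")
```

The side benefit is the log itself: over time, the recorded rationales become evidence about when the recommendations are wrong, which feeds the next round of improvement.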

Excerpted from: Data is Easy. Deciding is Hard.