For today's Interview Wednesday, we talk with Terri Griffith, a Professor of Management in the Leavey School of Business at Santa Clara University. I've known Terri for several years, and admire how she strives to maintain high standards for evidence-based decision-making, while also staying realistic (as she says below, we can confidently explain about 67% of the things we study). She writes a blog called Technology and Organizations; you can follow her on Twitter @terrigriffith.
The Five Questions.
#1. What got you interested in evidence?
"My interest in the hard sciences carried over to my career in the social sciences. I give a great deal of credit to the faculty at UC Berkeley and Carnegie Mellon for helping me see the power of methods and good measurement. It probably also helps that I do most of my work in science, engineering, & software companies. They expect to make decisions with evidence - though most of the time I think they are surprised that I agree."
What types of evidence do you work with most often (medical, business research, statistics, social science, etc.)?
"Social science and business research."
What is your involvement with evidence: applying it, advocating its use, researching/developing it, synthesizing/explaining/translating it, communicating it?
"All of the above. In my teaching I try to instill an understanding of the value of evidence and some basics around how to collect it. I’ll admit it’s one of the toughest issues to convey in a general management class - but it’s also an opportunity to demonstrate how different business courses give you different approaches for collecting and evaluating evidence.
"My colleagues and I do research based on social science techniques. The underlying effort and care is sometimes a surprise to our enterprise collaborators. I recall a pharma company being taken aback that we needed contact with hundreds of teams to effectively answer a particular question. Yes, it would have been a lot of work, but so would have been the resulting value and quality of the answer. Unfortunately the project wasn’t being driven at a high enough level to go through. I have high hopes that a culture of evidence is growing. Companies like Google and Yahoo! are doing a lot to demonstrate the value of large scale studies before making a decision. I especially love Marissa Mayer’s (Google) 'Data is Apolitical.'
"...and my colleagues and I often have to develop custom assessment tools before beginning a new project. Right now we’re working on an assessment tool for Systems Savvy."
Where do you go looking for evidence, and what types of sources do you prefer? (formally published stuff such as journals, or something less formalized?)
"Top tier peer-reviewed journals tend to be my first stop. I’m typically looking for something very specific and am likely to know the people involved and their approach to the work. I love thinking about how I used to do that in 1980 and how I do it now (a combination of Google Scholar and our University library’s resources). Gartner and Forrester are also excellent resources for demonstrating a trend or business perceptions around a topic. They have access to sources that most academics can’t parallel."
#2. On a scale of 1 to 10, where 10 = ‘It’s crystal clear.’ and 1 = ‘We have no idea why things are happening.’, how would you describe the overall “state of the evidence” in your primary field?
"5 -- As I tell my students, I can explain with great confidence about 67% of behavior."
Which of these situations is most common in your field?
a) Much of the evidence we need doesn’t yet exist.
b) People don't know about the evidence that is available.
c) People don't understand the available evidence well enough to apply it successfully.
d) People don’t follow the evidence because “evidence-based” is not the expectation.
"Answering in terms of how practitioners would apply my research:
a) Much of the evidence we need doesn’t yet exist. Again, hoping that we are developing a 'culture of evidence' in organizations. I look forward to the day when it is common for organizations to approach the business school to answer burning questions. This will require a combination of my colleagues and me being able to move more quickly and greater value being placed on the answers we find.
"also d) People don’t follow the evidence because 'evidence-based' is not the expectation. Social science research methods and results are not well known. The translation from a statistical finding to a dollar amount is often neither direct nor expected."
#3. Imagine a world where people can get the evidence they need, and exchange it easily and transparently. What barriers do you believe are preventing that world from becoming a reality?
"A combination of the above. If we just take the case of academic research and think about the underlying data, we see that compatibility and standards are a big issue. The National Science Foundation is working on this one, both in how it holds researchers to data transparency and in its funding of repositories and systems for sharing data during the research process. Barriers are coming down, just not quickly."
Where do you see technology making things better?
"I'm hoping for a 'semantic web' of data. Something where variable names wouldn’t have to be identical for data to be matched. Google is trying to help: Google Fusion Tables."
#4. How do you prefer to share evidence with people, and explain it to them? Do you have a systematic way of doing it, or is there a format that you follow?
"Depends on the audience, and for other academics, it depends on the setting. A formal presentation looks a lot like a journal article in terms of the flow. That’s the language that people are expecting. If it’s a shorter event, then it begins to look more like a public presentation. There I go with the story and then drill down where proof is needed. I love what Decker Communication calls 'human scale.' One example had to do with the cost of a bottle of water versus tap. They presented the number in terms of how many years you could drink a bottle of tap water a day before you would hit the cost of a single bottle of brand-name water."
What mistakes do you see people making when they explain evidence?
"How about the mistakes I make...? I assume my business students have the same implicit understanding of modeling that I do. I’ll start drawing a box-and-arrow diagram and talking about statistical significance, and then I realize I’ve lost most of them. I think I get excited about the idea and switch into a language that researchers take years to develop. No reason to expect the audience to speak that language."
#5. What do you want your legacy to be?
"I’d like to be remembered for making systems savvy a valued organizational skill. Systems savvy is the ability to weave technology tools, organizational practices, and human capabilities together for consistently powerful organizational performance. I see systems savvy as being important on the same scale as goal setting. Most managers have a reasonable knowledge of goal setting. We have a strong base of evidence around goal setting’s value, why it works, and how to teach it. I hope to be involved in taking systems savvy to the same place. The result would hopefully be a vast improvement in organizational performance and a similar reduction in human frustration."
#6. What question do you wish we'd asked?
Terri: "When should evidence-based thinking be introduced in education?"
"In kindergarten. From math classes to current events, evidence-based thinking should be a foundation."
Chime in. Would you like to be interviewed, or do you have someone to recommend? Drop me a note at tracy AT evidencesoup DOT com.