Evidence Soup
How to find, use, and explain evidence.


Wednesday, 08 December 2010

Interview Wednesday: Denise Rousseau, enthusiastic champion of evidence-based management (and university professor).

Today we talk with Denise Rousseau, a professor at Carnegie Mellon University, and self-described inter-galactic champion of evidence-based management. She's active in the Academy of Management and is one of the organizers behind Evidencebased-Management.com. (I've written before about Denise's efforts: In 2007, I discussed her article Is There Such a Thing as Evidence-Based Management? [get the pdf here], and later wrote about meetups of the Evidence-based Management Collaborative.)

The Five Questions.
#1. What got you interested in evidence?
After 30 years of research and teaching MBAs, I came to realize how little our own MBAs really understood about management research findings and, even worse, how seldom they used those findings in their decision making. Well, that's not quite right: my finance colleagues have students and alums who use financial research findings, and ditto for my operations research colleagues. It is management students (HR, strategy, organizational behavior) who don't know and don't use research findings.

Then, horror of horrors, as Academy of Management president, I started getting emails from AOM members who were less academic research-oriented, complaining about how our journals were no help in their teaching or consulting. My first reaction was to feel bad for them, since AOM wasn't being useful to them in terms of professional updating. And then I stopped and thought about it further. In my own teaching, I periodically have to update and alter what I teach, because new findings make the old obsolete. Why weren't other people feeling the same need? Then it hit me: What if our educators and consultants aren't using the evidence in their work? Inquiry led me to the realization that both in the classroom and professional practice, lots of people with doctorates in fields related to management were ignoring current evidence, or acting on hunches and impressions that bore no connection to the huge, rich evidence base we have in social science and management. (Long answer, but that's how I got started.)

What types of evidence do you work with most often (medical, business research, statistics, social science, etc.)?
Business and social science research, quantitative, qualitative, meta-analyses, etc. Depends on what the issue or the problem requires.

What is your involvement with evidence: applying it, advocating its use, researching/developing it, synthesizing/explaining/translating it, communicating it?
I try to do all of the above. Applying it in my own life: I teach from an evidence-based perspective, and consult and advise based on evidence from science as well as the facts of the organizational setting itself. Research is still my great love, but if it is seldom used, even the most wonderful research has less value than it would otherwise.

I now do more work trying to integrate different literatures - to understand what the core findings are - and am writing more translation pieces. The latter is the absolute hardest, as writing for colleagues relies on an in-group language, and plain English is not my forte... but I am working on it!

Where do you go looking for evidence, and what types of sources do you prefer? (formally published stuff such as journals, or something less formalized?)
I am a big library search-engine database user - sometimes I think I’m a reference librarian at heart. Once I start searching, a) I always find useful things related to the question I am pursuing, and b) I come across other fascinating things I want to know about in the process (sometimes a huge distraction, but otherwise a very fulfilling discovery)! So I like ABI/Inform best, along with psychological research databases.

#2. On a scale of 1 to 10, where 10 = “It’s crystal clear” and 1 = “We have no idea why things are happening”, how would you describe the overall “state of the evidence” in your primary field?
I am trained as an industrial psychologist, and I think that in what we have studied, for the most part, we are an "8".
The field is cumulative, and we know a lot about a) motivation, b) selection, c) performance (especially individual), and d) employment relationships, including the psychological contract.

Which of these situations is most common in your field?
a) Much of the evidence we need doesn’t yet exist.
b) People don't know about the evidence that is available.
c) People don't understand the evidence well enough to apply it.
d) People don’t follow the evidence because it's not the expectation.

"B". People just don't know - and for the most part, academics don't teach behavioral evidence to management students. Instead we use cases, pop theory, and give them little opportunity to practice applying research findings.

#3. Imagine a world where people can get the evidence they need, and exchange it easily and transparently. What barriers do you believe are preventing that world from becoming a reality?
Access is a huge problem, complicated by journals and proprietary databases seeking to profit by restricting access. In my ideal world, not only would a literate, college-educated person know about using evidence in their professional work, but they also would have been trained in accessing it while in university, so that upon graduation they could stay updated (and their university would give them electronic library privileges as alums, so they could easily do this).

Alums would form communities of practice to help each other keep up-to-date, and more attention would be given to professional and exec updating on what the science says as pertains to professional practice. It would be a normal part of developing expertise over the course of a career. Now we are producing too many confident amateurs who don't know what they don't know, and they have the degree to prove it.

Where do you see technology making things better?
Accessibility, improved search strategies, relational databases, and update alerts when new content in an area is put on the web.

#4. How do you prefer to share evidence with people, and explain it to them? Do you have a systematic way of doing it, or is there a format that you follow?
In all honesty, I follow the template of Julia Child's approach to mastering the art of French cooking: State the general principle (what is sauce?), the forms it comes in (Hollandaise, Béarnaise...), what you should watch for when you make it, and what to do if a problem turns up. Julia was a master behaviorist, I am convinced.

So in terms of evidence, I’ve found it works best to focus on core principles: a simple statement of what the finding is (set specific goals to improve performance). Then discuss conditions of use (procedural knowledge, like how many goals can be set at one time: five or fewer, per cognitive limits on attention) and contingencies (What if it doesn't work? Well, the effect of goals depends on whether people have accepted them. Did you set goals in a way that is legitimate and credible to the people involved?).

What mistakes do you see people making when they explain evidence?
Too much backstory (the five theories that came before this one, the theoretical controversies that remain) - and not enough about what's in it for the end user.

#5. What do you want your legacy to be?
I’d like to be remembered for helping managers and people in organizations make better, more thoughtful, more ethical decisions.

What makes you hopeful?
I have more and more students and alums who just assume that evidence is what a good manager incorporates into his or her thinking and actions. Like, if you aren't using evidence, can you really be a responsible decision maker/professional?

Thanks, Denise.


____________________________

Chime in. Would you like to be interviewed, or do you have someone to recommend? Drop me a note at tracy AT evidencesoup DOT com.

Wednesday, 01 December 2010

Interview Wednesday: Ashley Welde, Director of Evidence-Based Communications at Burson-Marsteller.

Today I'm delighted to introduce Ashley Welde, Director of Evidence-Based Communications at Burson-Marsteller (New York, New York).

I asked Ashley where we can learn more about her, and she provided these links: "I blog and tweet under the Burson-Marsteller name. If you don’t mind receiving other company news, you can hear about the research I do @BMGlobalNews. Some studies I published this year which may be of interest include The PR Effect, which provides evidence for the effectiveness of PR programs; The Message Gap Analysis, which explores the 'gap' between a company’s messages and what ultimately appears in the media; and Fortune 100 Global Social Media Study, which uncovers evidence about how Fortune 100 companies are using social media."

The Five Questions.
#1. What got you interested in evidence?
I’m a natural-born psychologist who is always trying to understand why people think what they think, and why they do what they do. Thus, I’m compelled to uncover the root cause of people’s beliefs and actions, which instinctively leads to gathering evidence. I’m also interested in business, so I have an MBA and I’ve spent the last 15 years in marketing/PR, gathering evidence to understand what drives consumers’ brand perceptions, what influences their purchases, what drives corporate reputation, etc.
My father was also a psychology professor who did extensive experimenting on the chemistry of the human brain, so research about people and their motivations must run in my blood.

What types of evidence do you work with most often (medical, business research, statistics, social science, etc.)?
My job title, “Director of Evidence-Based Communications,” is a kind of fancy name for market researcher, which is what I actually do. The evidence I collect as a researcher is a combination of social science and business, and the evidence drives two specific goals: 1) to develop PR programs that will influence stakeholders’ perceptions and behaviors and 2) to measure the impact of the PR programs.

For example, maybe I have a client who wants to encourage citizens to recycle. For the first goal, I would collect evidence to understand why citizens are not recycling (e.g., maybe they don’t know which items are recyclable, maybe they don’t know where to bring their recyclables, maybe they’re just lazy). Then we use that evidence to develop a PR program targeted at the issues preventing citizens from recycling. For the second goal, I identify measurement techniques to collect evidence about whether or not citizens’ recycling behavior has been influenced by the program.

What is your involvement with evidence: applying it, advocating its use, researching/developing it, synthesizing/explaining/translating it, communicating it?
I do all of the above. In addition to applying evidence to client work as I described above, I also design thought leadership studies to promote Burson-Marsteller’s communications services. Within the company, I am responsible for educating my colleagues about what evidence is available and for helping them apply it to their client work. I also communicate and promote the Evidence-Based thought leadership we do throughout the PR community.

Where do you go looking for evidence, and what types of sources do you prefer? (formally published stuff such as journals, or something less formalized?)
The evidence for my industry is published in PR/marketing industry trades/websites such as PR Week, Advertising Age, the Center for Media Research and eMarketer. I always try to understand the methodologies used – and often email the researcher if I can to get a better sense of how the research was conducted – because the quality of research varies greatly. I also read some academic research about consumer behavior, which is often more methodologically sound but unfortunately cannot always be applied to real-life PR in a practical way.

#2. On a scale of 1 to 10, where 10 = “It’s crystal clear” and 1 = “We have no idea why things are happening”, how would you describe the overall “state of the evidence” in your primary field?
Sorry to give a cop-out answer, but I would say “5.” (As a researcher, I always prefer 1s and 10s!) Traditionally, “evidence for success” in PR meant getting coverage for a client in a major newspaper, rather than evidence of whether you were really changing people’s perceptions and behaviors. Public relations is genuinely hard to measure, which I could explain in more depth except it would consume this entire interview. However, clients are starting to demand more evidence to prove that PR programs have an impact on the business’s bottom line, so people like me are developing tools to do just that. And, because so much of PR is becoming digital (including social media), it is getting easier and easier to gather evidence to demonstrate impact.

Which of these situations is most common in your field?
a) Much of the evidence we need doesn’t yet exist.
b) People don't know about the evidence that is available.
c) People don't understand the evidence well enough to apply it.
d) People don’t follow the evidence because it's not the expectation.

“C.” I just described above that PR has a problem with “A” because there is a lack of evidence to prove the impact of PR, but I believe that a lack of understanding about how to apply evidence is a bigger problem. PR professionals are extremely creative and excellent at coming up with big ideas, but they are not trained to apply and generate evidence. This is quickly changing though, and it is the driving force behind the Evidence-Based Communications program at Burson-Marsteller – to help encourage professionals in the PR industry to use evidence to be more effective.

#3. Imagine a world where people can get the evidence they need, and exchange it easily and transparently. What barriers do you believe are preventing that world from becoming a reality?
I think the barriers to an evidence-based world are that people do not always want to believe the evidence. For example, 18% of Americans still believe President Obama is a Muslim despite mountains of evidence and media to prove that he is not. People tend to support evidence that is already aligned with what they want to believe, regardless of whether it is scientific evidence, social science evidence, etc. Evidence will always be open to interpretation, and new evidence will make us realize that the “old evidence” was completely wrong. I believe these are the most limiting factors.

Where do you see technology making things better?
Technology makes things better because we can collect evidence more thoroughly and accurately, and I believe this is true for both social science and medical/natural science. However, technology and free survey tools such as SurveyMonkey and Zoomerang – which are fabulous in many ways, and I find them extremely useful for my work – also enable people who are not properly trained to write biased survey questions and to collect, analyze and publish data with erroneous conclusions. And technology allows these erroneous conclusions to spread even more rapidly through digital/social media, which I find concerning. Even as a trained professional, sometimes I find it hard to determine which data I read are valid and which are not.

#4. How do you prefer to share evidence with people, and explain it to them? Do you have a systematic way of doing it, or is there a format that you follow?
I like to develop a dialogue with people so we can have a conversation about the evidence I’ve collected. When I talk with clients, colleagues, or media about evidence, they all begin the conversation with different assumptions about what the data means, and unless I get them talking about how they interpret the data, I won’t be able to correct their misinterpretations. So I always begin by presenting the data, but then I take an organic approach to learn the best way to get my message across to each unique audience.

What mistakes do you see people making when they explain evidence?
Researchers often present evidence, but they don’t offer ways to apply the evidence in an actionable way. They don’t tie all the data points together to create a story. The evidence has to be woven into a meaningful story with implications in order for someone else to make use of it.

#5. What do you want your legacy to be?
I’d like to be remembered for helping colleagues and clients recognize that evidence drives creativity. Many of my colleagues worry that data and evidence kill the creative PR process, but I want them to see that evidence inspires it.

Amen to that. Thanks for sharing your insights, Ashley.


____________________________

Chime in. Would you like to be interviewed, or do you have someone to recommend? Drop me a note at tracy AT evidencesoup DOT com.

Wednesday, 17 November 2010

Interview Wednesday: Peter West, management consultant.

Today we talk with Peter West, a management consultant in London, Ontario, Canada. To learn about Peter's work, go to ContinuousInnovation.ca. Peter frequently uses Twitter (@WestPeter) and Delicious to share citations, books, proceedings, etc. that will interest knowledge workers. And he has profiles on Facebook, LinkedIn, and SlideShare.

The Five Questions.
#1. What got you interested in evidence?
From an early age, I had an insatiable curiosity, something my parents nurtured and for which I will always be grateful. When contemplating career choices, I gravitated to the research world. In the hopes of making a difference and contributing to society, I chose to focus on the health sector. My first career exposure was in Respiratory Research. It was an incredibly formative environment. I learned how to formulate complex and highly-relevant questions; engage people, practices and resources in the pursuit of solutions; think broadly; listen deeply; and share freely. The value of creating and using evidence in innovative ways had real world implications.

A brief career history will provide context for my responses to the remaining questions. I have spent the majority of my 35-year career in the health sector, engaged in roles that include researcher (respiratory, sleep and cancer data capture and analysis), consultant (health information systems planning, procurement and implementation), manager (health informatics), economic developer (intellectual property, vendor contracts, and health data) and change agent (health reform). I am currently working as a management consultant, specializing in knowledge management – helping clients and organizations create the conditions whereby the right knowledge becomes available to the right individuals in the right situations (or contexts) so that the right actions can be taken, adding value and positively influencing outcomes.
Knowledge processes play an important role in the creation and mobilization of evidence.

What types of evidence do you work with most often (medical, business research, statistics, social science, etc.)?
All of the above. Client- and project-specific.

What is your involvement with evidence: applying it, advocating its use, researching/developing it, synthesizing/explaining/translating it, communicating it?
All of the above. I apply evidence-oriented skills across a range of domains, working with clients in the public and private sectors.
As one example, I have acquired a strong reputation for representing complex evidence in one-page infographics that help stakeholders to see issues, opportunities and themselves in new ways.

Where do you go looking for evidence, and what types of sources do you prefer? (formally published stuff such as journals, or something less formalized?)
Projects typically require me to look inside and outside the organization. Internally, awareness of the historical record associated with the project promotes deeper understanding and nurtures credibility with internal stakeholders, both of which are critical to the success of internal evidence-gathering interviews and meetings. The external evidence-gathering process consists of a detailed environmental scan supplemented with targeted interviews with subject matter experts and practitioners. Synthesis and analysis skills are used to frame the issue and expose potential solution pathways.

#2. On a scale of 1 to 10, where 10 = “It’s crystal clear” and 1 = “We have no idea why things are happening”, how would you describe the overall “state of the evidence” in your primary field?
I would rank the overall state of evidence at ‘5.’

Which of these situations is most common in your field?
a) Much of the evidence we need doesn’t yet exist.
b) People don't know about the evidence that is available.
c) People don't understand the evidence well enough to apply it.
d) People don’t follow the evidence because it's not the expectation.
For most of the organizations I am involved with, the existence of evidence, awareness of its availability, understanding of its implications and expectations for its use are best described by a normal distribution (or bell curve).

#3. Imagine a world where people can get the evidence they need, and exchange it easily and transparently. What barriers do you believe are preventing that world from becoming a reality (data incompatibility, lack of research skills, information overload, lack of standard ways of presenting evidence, lack of motivation to follow evidence-based practices, ...)? And what would you recommend as possible solutions?
All of the above [barriers], plus these:

  • Government will, commitment, continuity, execution and sustainability. Possible solution: Evidence leadership, evidence-enabling infrastructures, citizen engagement (and accountability), etc.
  • Legal obstacles, including liability, intellectual property rights, etc. Possible solution: Evidence-friendly regulatory, policy and governance standards, practices and behaviours, etc.
  • Financial constraints. Possible solution: Greater stakeholder engagement in making the difficult decisions about funding allocations.
  • Social awareness, practices and trust. Possible solution: A comprehensive suite of communication tools that enhances stakeholder engagement, dialogue and decision-making.
  • Source/media hype (How often is evidence contradicted? How, and what, are stakeholders to believe or do? Rampant stakeholder scepticism is understandable). Possible solution: Greater stakeholder awareness of the evidence lifecycle, greater source/media responsibility in framing and communicating evidence, etc.
  • Discipline-specific professional arguments regarding accepted theory and practice. Possible solution: Greater stakeholder awareness of theory and practice maturity.
  • Time (to reflect on evidence needs and interact with stakeholders). Possible solution: New participatory vehicles for framing research questions and their execution.
  • Language. Possible solution: New resources that make it easier for researchers to translate evidence into plain language that stakeholders can mobilize.
  • Boundaries (and related behaviours). Possible solution: Complex issues require multidisciplinary approaches that span boundaries; more resources need to be made available to help researchers make the transition to multidisciplinary practice.
  • Reward systems. Possible solution: The current reward system is broken; a reward structure that promotes openness, interaction and application of evidence needs to evolve.
  • Mobilization practices. Possible solution: Holding individual stakeholders accountable for evidence mobilization practices is not working; there is a pressing need for a generic evidence mobilization framework.

Where do you see technology making things better?
Technology has the capacity to enhance evidence mobilization by making evidence more social, ambient, embedded, sensed, remembered, anticipated, workflowed and applied. As a tool, technology enhances the availability of evidence and its interaction with stakeholders.

For example, in the health sector...
For researchers: A 24/7 electronic professional evidence assistant that proactively networks the researcher with other research professionals (maximizing the potential for a comprehensive multidisciplinary approach) and directly links the researcher to evidence stakeholders and beneficiaries (sensitizing the researcher to stakeholder evidence needs, and the stakeholder to the researcher’s environment). Collectively, these functions enhance the researcher’s capacity to optimize the formation and execution of research questions.

For policy-makers: A 24/7 electronic professional evidence assistant that ensures that for any given policy or regulatory issue, all relevant evidence is available in a form that is understandable and usable. Stakeholders also have access to the evidence. Greater awareness, dialogue, accountability and applicability are possible.

For practitioners: A 24/7 electronic professional evidence assistant that “listens” to practitioner-patient interactions, “reviews” electronic medical records and provides current, context-specific evidence. A more holistic encounter and targeted outcome are promoted.

For individuals: A 24/7 electronic personal evidence companion (based upon a private, secure and integrated view of your genetic, environmental and behavioural profiles – adapting its advice to every nuanced change, and shared with people, professionals and organizations in strict accordance with your pre-authorization) focused on maximizing your health and well-being.

#4. How do you prefer to share evidence with people, and explain it to them? Do you have a systematic way of doing it, or is there a format that you follow?
The process that works best for me encompasses understanding what the stakeholders need, anticipating and managing their expectations, presenting evidence in a form they can digest and apply, learning from successes and failures, and continuously innovating.

What mistakes do you see people making when they explain evidence?
Often, there is a failure to take stakeholder contexts into account: framing evidence with sensitivity to their cultural, behavioural, procedural, and decisional drivers.
Another mistake is insistence upon ‘evidence-based’ behaviour; in reality, evidence-related behaviour falls along a continuum, from ‘evidence-aware’ to ‘evidence-informed’ and finally ‘evidence-based.’

#5. What do you want your legacy to be?
I’d like to be remembered for my passion and commitment: helping individuals and organizations to leverage evidence, and to represent it in ways that attract attention, understanding, commitment, action and positive outcomes.


Many thanks for your time, Peter. These responses give me lots of food for thought.

____________________________

Programming note: I'll be taking next week off for the Thanksgiving holiday in the U.S.

Chime in. Would you like to be interviewed, or do you have someone to recommend? Drop me a note at tracy AT evidencesoup DOT com.

Wednesday, 10 November 2010

Interview Wednesday: Rick Austin, health communicator.

Today for Interview Wednesday, we talk with Rick Austin, Senior Communication Specialist for the Research Into Action project at The University of Texas School of Public Health in Houston, Texas. (View Rick's blog here or follow him on Twitter @KTExchange.)

The Five Questions.
#1. What got you interested in evidence?
I would say that this job is what really got me focused, but I had always been interested, as a layman (which I still am!), in asking “how do we know this?” Related to this, my daughter, when she was a senior in high school, took a class entitled The Theory of Knowledge, which was a year-long conversation about “how do we know what we think we know?” I would’ve loved to sit in on that all year.

What types of evidence do you work with most often (medical, business research, statistics, social science, etc.)?
Since the Research Into Action (RIA) project is based in a school of public health, that’s where we generally start, with empirical public health research.

What is your involvement with evidence: applying it, advocating its use, researching/developing it, synthesizing/explaining/translating it, communicating it?
Not quite all of the above. RIA is a knowledge translation project, so that encompasses advocacy, synthesizing, translating, communicating.

Where do you go looking for evidence, and what types of sources do you prefer? (formally published stuff such as journals, or something less formalized?)
We start with peer-reviewed journal articles as a basis for the subject we’re translating, but then we’ll go wherever the search for support and amplification takes us. We’ve generally found that a single peer-reviewed research project may be a good starting point for the conversation, but the nature of empirical research is such that any useful conclusion is going to be too narrowly focused. So, we find ourselves looking for similar research that will help support our viewpoint.
For example, we did some knowledge translation work on a study that originated here at the UT School of Public Health, looking at physical activity and academic achievement among elementary school students. The PI found some reliable correlations between increased physical activity and improved academics. However, the single study wasn’t enough to hang our hat on: The sample was fairly small, and the improvements weren’t across the board. We went looking for, and found, a slew of related studies on physical activity and improved self-esteem, improved on-task behavior, and improved classroom management, to name a few. So, we broadened our focus slightly, and had a strong argument for increased physical activity.

#2. On a scale of 1 to 10, where 10 = “It’s crystal clear” and 1 = “We have no idea why things are happening”, how would you describe the overall “state of the evidence” in your primary field?
That depends on what you define as “my field.” I think the field of public health research is very strong, inasmuch as it’s populated not only by PhD public health field practitioners, but by many who are also MDs and/or laboratory-based researchers. So, probably an 8 or 9. The field of knowledge translation/dissemination/exchange/social marketing is extremely fluid in the United States, and also siloed in many areas. For example, the NIH’s clinical translation work is heavily focused on bench-to-bedside, and they’ve got all the money. They might find it very enlightening to talk a little more freely with the social marketers and health communicators at the CDC. So, from the standpoint of knowledge translation, probably a 2 or 3.

Which of these situations is most common in your field?
a) Much of the evidence we need doesn’t yet exist.
b) People don't know about the evidence that is available.
c) People don't understand the evidence well enough to apply it.
d) People don’t follow the evidence because it's not the expectation.
Hmm, this is a hard one because the umbrella I work under is so broad. From the knowledge translation standpoint, I’ll have to go with “much of the evidence we need doesn’t yet exist.”

#3. Imagine a world where people can get the evidence they need, and exchange it easily and transparently. What barriers do you believe are preventing that world from becoming a reality?
Looking at this from the perspective of the average citizen/consumer, the two things that immediately spring to mind are information overload and lack of critical thinking skills. You and I have both blogged about all of these energy field necklace/bracelet/patch scams on the market, and they’re a perfect illustration of the general inability/unwillingness to think critically. Just a few seconds of thought (“Hmm, it’s a little piece of plastic held against my wrist by an elastic band; that’s silly, what could it possibly do?”) would restore some sanity to the consumer health discussion.

Where do you see technology making things better?
The ability to connect worldwide, the speed of dissemination (a double-edged sword), and something that nobody has even imagined yet – look at all the experimentation going on in geo-location, mobile health, new hardware platforms, ubiquitous internet connectivity – that will come as a complete surprise.

#4. How do you prefer to share evidence with people, and explain it to them? Do you have a systematic way of doing it, or is there a format that you follow?
Depends entirely on the evidence and the audience. Actually, here at RIA we’re trying to hammer out a second draft of a model of knowledge translation right now. We hope to describe a lot of the knowledge translation scenarios and weave some systemization into what is currently a very ad hoc process. We’ll post it on the discussion board at KTExchange when we’ve got it ready, and let people throw darts at it.

What mistakes do you see people making when they explain evidence?
Several of our podcast interviewees have beaten researchers about the head and shoulders on this subject (see how I put the onus on the interviewees there? Twasn’t me!). One point they’ve made is that many lab researchers have neither the motivation nor the communication skills to explain what they do to laymen. Joanne Silberner, formerly a science reporter with NPR, talks about trying to encourage a laboratory pathologist by telling him to explain something the way he would at a cocktail party. He got huffy and said, “I don’t go to cocktail parties.” Another obstacle to explaining evidence is the researcher’s instinctive aversion to definitive statements. Their research is never definitive; there’s always room for more exploration. The next-door neighbor doesn’t understand that mindset at all.

#5. What do you want your legacy to be?
Right now, I don’t see any long-term legacies in my work. It’s great fun and very challenging, but it’s my job, not my life. I’ll have to go with a cliché instead: I’d like to be remembered for being a good father and a good friend, both of which are way harder than I thought they would be.

Thanks, Rick.

____________________________

Chime in. Would you like to be interviewed, or do you have someone to recommend? Drop me a note at tracy AT evidencesoup DOT com.

Wednesday, 27 October 2010

Interview Wednesday with Terri Griffith (management professor at Santa Clara University).

For today's Interview Wednesday, we talk with Terri Griffith, a Professor of Management in the Leavey School of Business at Santa Clara University. I've known Terri for several years, and admire how she strives to maintain high standards for evidence-based decision-making, while also staying realistic (as she says below, we can confidently explain about 67% of the things we study). She writes a blog called Technology and Organizations; you can follow her on Twitter @terrigriffith.

The Five Questions.
#1. What got you interested in evidence?
"My interest in the hard sciences carried over to my career in the social sciences. I give a great deal of credit to the faculty at UC Berkeley and Carnegie Mellon for helping me see the power of methods and good measurement. It probably also helps that I do most of my work in science, engineering, & software companies. They expect to make decisions with evidence - though most of the time I think they are surprised that I agree."

What types of evidence do you work with most often (medical, business research, statistics, social science, etc.)?
"Social science and business research."

What is your involvement with evidence: applying it, advocating its use, researching/developing it, synthesizing/explaining/translating it, communicating it?
"All of the above. In my teaching I try and instill an understanding for the value of evidence and some basic around how to collect it. I’ll admit it’s one of the toughest issues to convey in a general management class - but it’s also an opportunity to demonstrate how different business courses give you different approaches for collecting and evaluating evidence.

"My colleagues and I do research based on social science techniques. The underlying effort and care is sometimes a surprise to our enterprise collaborators. I recall a pharma company being taken aback that we needed contact with hundreds of teams to effectively answer a particular question. Yes, it would have been a lot of work, but so was the resulting value and quality of the answer. Unfortunately the project wasn’t being driven at a high enough level for the project to go through. I have high hopes that a culture of evidence is growing. Companies like Google and Yahoo! are doing a lot to demonstrate the value of large scale studies before making a decision. I especially love Marissa Mayer’s (Google) 'Data is Apolitical.'

"...and my colleagues and I often have to develop custom assessment tools before beginning a new project. Right now we’re working on an assessment tool for Systems Savvy."

Where do you go looking for evidence, and what types of sources do you prefer? (formally published stuff such as journals, or something less formalized?)
"Top tier peer-reviewed journals tend to be my first stop. I’m typically looking for something very specific and am likely to know the people involved and their approach to the work. I love thinking about how I used to do that in 1980 and how I do it now (a combination of Google Scholar and our University library’s resources). Gartner and Forrester are also excellent resources for demonstrating a trend or business perceptions around a topic. They have access to sources that most academics can’t parallel."

#2. On a scale of 1 to 10, where 10 = “It’s crystal clear” and 1 = “We have no idea why things are happening”, how would you describe the overall “state of the evidence” in your primary field?
"5 -- As I tell my students, I can give them great confidence about 67% of behavior."

Which of these situations is most common in your field?
a) Much of the evidence we need doesn’t yet exist.
b) People don't know about the evidence that is available.
c) People don't understand the available evidence well enough to apply it successfully.
d) People don’t follow the evidence because “evidence-based” is not the expectation.

"Answering in terms of how practitioners would apply my research:
a) Much of the evidence we need doesn’t yet exist. Again, hoping that we are developing a 'culture of evidence' in organizations. I look forward to the day when it is common for organizations to approach the business school to answer burning questions. This will require that my colleagues and I be able to move more quickly, and that greater value be placed on the answers we find.

"also d) People don’t follow the evidence because 'evidence-based' is not the expectation. Social science research methods and results are not well known. The translation from a statistical finding to a dollar amount is often not direct and unexpected."

#3. Imagine a world where people can get the evidence they need, and exchange it easily and transparently. What barriers do you believe are preventing that world from becoming a reality?
"Combination of the above. If we just take the case of academic research and think about the underlying data, we see that compatibility and standards are a big issue.  The National Science Foundation is working on this one both in terms of how they make researchers deal with data transparency and in their funding of repositories and systems for sharing in the research process and data. Barriers are coming down, just not quickly."

Where do you see technology making things better?
"I'm hoping for a 'semantic web' of data. Something where variable names wouldn’t have to be identical for data to be matched. Google is trying to help: Google Fusion Tables."

#4. How do you prefer to share evidence with people, and explain it to them? Do you have a systematic way of doing it, or is there a format that you follow?
"Depends on the audience, and for other academics, it depends on the setting. Formal presentation, looks a lot like a journal article in terms of the flow. That’s the language that people are expecting. If it’s a shorter event, then it begins to look more like a public presentation. There I go with the story and then drill down where proof is needed. Love what Decker Communication calls 'human scale.' One example had to do with the cost of a bottle of water versus tap. They present the number in terms of how many years you could drink a bottle of tap water a day before you would hit the cost of a single bottle of brand-name water."

What mistakes do you see people making when they explain evidence?
"How about the mistakes I make...? I assume my business students have the same implicit understanding of modeling that I do. I’ll start drawing a box and arrow diagram and talking about the statistical significance, and then I realize I’ve lost most of them. I think I get excited about the idea and switch into a language that researchers take years to develop. No reason to expect the audience to speak that language."

#5. What do you want your legacy to be?
"I’d like to be remembered for making systems savvy a valued organizational skill. Systems savvy is the ability to weave technology tools, organizational practices, and human capabilities together for consistently powerful organizational performance. I see systems savvy as being important on the same scale as goal setting. Most managers have a reasonable knowledge of goal setting. We have a strong base of evidence around goal setting’s value, and why it works, and how to teach it. I hope to be involved in taking systems savvy to the same place. The result would hopefully be a vast improvement in organizational performance and similar reduction in human frustration."

#6. What question do you wish we'd asked?
Terri: "When should evidence-based thinking be introduced in education?"
"In kindergarten. From math classes to current events, evidence-based thinking should be a foundation."

Chime in. Would you like to be interviewed, or do you have someone to recommend? Drop me a note at tracy AT evidencesoup DOT com.

Wednesday, 20 October 2010

Interview Wednesday: Ben Miller (Univ of Colorado School of Medicine).

Evidence Soup is starting something new: It's called Interview Wednesday. We'll be interviewing insightful people to find out how they work with evidence, assess the "state of the evidence" in their field, and explore how things might be improved. (Would you like to be interviewed, or do you have someone to recommend? Let me know.)

First up is Ben Miller, who wants to change the game in healthcare, and who knows that relevant evidence is crucial to his efforts. (He wears many hats - and has one of the longest email signatures I've ever seen.) Ben Miller, PsyD, is Assistant Professor / Department of Family Medicine, Associate Director of Primary Care Outreach and Research, University of Colorado School of Medicine. Plus, he's Administrative Director - Collaborative Care Research Network (CCRN), and Senior Scientist - AAFP National Research Network. Follow Ben on Twitter: @miller7.

The Five Questions.
#1. What got you interested in evidence?
"Throughout my graduate training and even now, I am often the 'mental health' provider working in primary care. I have seen firsthand what happens when two systems of health care are integrated, and also witnessed the barriers in policy adopting such a comprehensive strategy to address the healthcare needs of the whole person. I watched as more and more evidence emerged onto the scene supporting the integration of mental health into primary care, but saw no policy movement to accommodate this new information.

"I began to wonder, what is it about the evidence that is not compelling enough to change or move policy? Being trained as a clinician, I had an appreciation for research and using evidence based treatments in whatever setting I was working, but it was not until I began to take a step back and look at why systems had not changed and what evidence they need to change that I realized I would likely spend the rest of my career working on collecting 'this' evidence."

What types of evidence do you work with most often (medical, business research, statistics, social science, etc.)?
"I am lucky to have the opportunity to work and learn from national leaders in practice-based research networks (PBRN). According to the Agency for Healthcare Research and Quality (AHRQ), PBRNs are based in primary care, and have been around for more than twenty years. PBRNs involve community-based clinicians and staff in specific activities designed to understand and improve primary care. 'The best PBRN efforts link relevant clinical questions with rigorous research methods in community settings to produce scientific information that is externally valid, and, in theory, assimilated more easily into everyday practice.' I love practice-based data – collected at the place where the 'magic is happening' – where it is difficult to control for all conditions and where you can really study, in a focused manner, the whole person."

What is your involvement with evidence: applying it, advocating its use, researching/developing it, synthesizing/explaining/translating it, communicating it?
"I am the co-creator and administrative director for the CCRN. I'm the principal investigator on federal two grants using the network to research models of integrating mental health and primary care.

"Building off the idea that a national network is stronger than individual practices in examining research questions (PBRN), the Collaborative Care Research Network (CCRN) was developed to implement a national, practice-based research agenda to evaluate the effectiveness of collaboration between mental health providers and primary care providers. With the emergence of the patient-centered medical home, it is clear that such integrated services are to be a prominent dimension of responding to a community, patient and families’ whole person health needs. However, there is an equally crucial need to evaluate the effectiveness of such 'collaborative care' in ways that are both useful to primary care practitioners, and answer the important questions that have and will be raised. For more info on the CCRN, go to www.aafp.org/nrn/ccrn. With the right evidence collected, we hope to influence policy so that healthcare systems can change and defragment."

Where do you go looking for evidence, and what types of sources do you prefer? 
"I prefer to see the evidence in practice. To watch transformative change on an individual level, community level or even national level, there is something very powerful that happens. Maybe it is in the measuring of it, dissecting of it, reporting of it or even the feeling of it that keeps you coming back for more. You start to look for better ways to research the idea, more novel ways to study the problem and more interesting ways to share the information with those who could benefit the most for it."

#2. On a scale of 1 to 10, where 10 = “It’s crystal clear” and 1 = “We have no idea why things are happening”, how would you describe the overall “state of the evidence” in your primary field?
"Considering we are publishing the 'research agenda' for our field, I would say 4."

Which of these situations is most common in your field?
a) Much of the evidence we need doesn’t yet exist.
b) People don't know about the evidence that is available.
c) People don't understand the available evidence well enough to apply it successfully.
d) People don’t follow the evidence because “evidence-based” is not the expectation.

#3. Imagine a world where people can get the evidence they need, and exchange it easily and transparently. What barriers do you believe are preventing that world from becoming a reality?
"Fragmentation of healthcare delivery systems and competing business models of how to pay for healthcare."

Where do you see technology making things better?
"Opportunities to aggregate data through such mechanisms like all payer database. I also run a project that is working towards extracting data from electronic medical records across the country into one aggregated data set to perform comparative effectiveness research."

#4. How do you prefer to share evidence with people, and explain it to them? Do you have a systematic way of doing it, or is there a format that you follow?
"I like to tell the story of why we are investigating a particular topic. I use 'Ms. Jones' a lot to help me. Telling the story, and whether supporting it with evidence, or justifying why you did what you did is a highly effective way for me to get my message across."

What mistakes do you see people making when they explain evidence?
"Over generalizing findings, controlling for everything but the one disease condition (who really has just one problem?); and not explaining such essential pieces of information like NNT."

#5. What do you want your legacy to be?
"I’d like to be remembered for changing the way healthcare is operated and delivered.”

Chime in. Would you like to be interviewed, or do you have someone to recommend? Drop me a note at tracy AT evidencesoup DOT com.