Evidence Soup
How to find, use, and explain evidence.



Thursday, 10 December 2009

PR consultancy announces “evidence-based communications” methodology: A scientific approach to delivering client messages.

Burson-Marsteller, a global consultancy, has adopted a research-focused approach they call 'Evidence-Based Communications' (complete announcement here). This year, the firm has rolled out a methodology for developing and measuring programs. “The media and communications landscape is changing and so have the needs of our clients,” said Burson-Marsteller CEO Mark Penn. “In a world of citizen journalism, social media, and instant information about events happening around the globe, we are investing in a more diligent and scientific approach to developing and delivering key messages.”

It's great to see marketing and media moving in this direction. Here's how the firm describes the benefits (there is, um, a bit of spin applied). "Evidence-based communications ends the guess work. All strategies are based on evidence, not speculation." (Okay. But trying something creative and new involves unknowns. Don't let evidence crowd out big ideas and experimentation.) "It is cost-effective... ensuring that each client’s communications dollars are spent on tactics and messages that will deliver results. It is measurable. By... benchmarking at the beginning of a program and measuring effectiveness at the end, clients will be able to demonstrate a positive communications return-on-investment."
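That "benchmark at the beginning, measure at the end" arithmetic is simple enough to sketch. Here's a minimal, hypothetical illustration (Python; every number is invented by me, and the idea of a dollar "value per point of awareness lift" is my assumption, not part of Burson-Marsteller's methodology):

```python
# Hypothetical before/after ROI calculation for a communications program.
# All figures are invented for illustration; none come from Burson-Marsteller.

def communications_roi(baseline_awareness, final_awareness,
                       value_per_point, program_cost):
    """Return ROI as a fraction: (estimated value gained - cost) / cost."""
    lift = final_awareness - baseline_awareness      # e.g. percentage points of awareness
    estimated_value = lift * value_per_point         # dollars attributed to the lift
    return (estimated_value - program_cost) / program_cost

# Benchmark at the start (22% aided awareness), measure at the end (31%).
roi = communications_roi(baseline_awareness=22, final_awareness=31,
                         value_per_point=50_000, program_cost=300_000)
print(f"Estimated ROI: {roi:.0%}")   # 50% on these made-up numbers
```

The hard part, of course, isn't the division - it's defending the benchmark metric and the dollar value you assign to moving it.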


They continue, saying theirs is a holistic approach: "While it is common to use some basic research to drive a communications message or to assess the reach of a program at the end, the Evidence-Based approach is a complete methodology. The approach ensures a thorough use of data and tools designed specifically to insert science into the process where appropriate. It provides proof of PR value to the organization’s C-Suite. By using an Evidence-Based approach, communications professionals can demonstrate the value that PR brings to their organization at large."

I wish them well with their evidence-based methodology.

Wednesday, 09 December 2009

NAC does an excellent job with their guide to evidence-based teaching practices for autistic students.

The National Autism Center has developed an impressive guide for educators: Evidence-Based Practice and Autism in the Schools [258-page pdf here]. They do a really nice job of translating solid evidence into hands-on, practical advice, and provide numerous forms and checklists to help teachers collect and analyze their own data. However, I believe they should have included a pullout called "Putting the Evidence into Action" (about 10-12 pages long, and heavy on charts, graphics, and step-by-step instructions) summarizing key concepts and directing people to additional information within the 258-page document. As it is, although this guide is very approachable, it's so long that it's not immediately obvious where to turn for specific details (for example, their table summarizing 'Established Treatments with Favorable Outcomes Reported' appears in the appendix, on page 222).

Setting the stage. NAC explains that they "developed this manual as a means of promoting evidence-based practice for Autism Spectrum Disorders (ASD) in the schools. Why? Because we know that evidence-based practice is in the best interest of the student and that it is most likely to produce positive outcomes with this population. The information presented herein is meant for all “front-line” interventionists who work in school settings. Although research findings are essential, they are not the only component of evidence-based practice. Evidence-based practice requires the integration of research findings with other critical factors.

"These factors include: Professional judgment and data-based decision making; Values and preferences of families, including the student on the autism spectrum whenever feasible; Capacity to accurately implement interventions. This definition of evidence-based practice is applied to school settings throughout this document. Evidence-based practice is complex and requires both ongoing communication and respectful interactions among all stakeholders. Even when a list of effective treatments is identified, collaboration is the key to achieving the best outcomes. To that end, we have provided examples involving a broad range of professionals and support staff throughout the manual to illustrate the points we make." I couldn't have said it better myself.

Three types of evidence. Chapter 2 presents 'Research Findings of the National Standards Project'. Treatment practices are grouped into three categories, based on the supporting evidence: established, emerging, and unestablished. The 'established' treatments are clearly identified and have names such as: antecedent package, behavioral package, joint attention intervention, naturalistic teaching strategies, modeling, self-management, and story-based intervention package. They've done a stand-up job translating the research into practical advice, accompanied by examples.


Teaching by example. Discussing the modeling treatment approach, they explain by example: "Consider the case of Steve, a 16-year-old student for whom you have used video modeling in the past. While video modeling works well with many students, Steve seems to have a hard time performing the modeled task unless everything in the classroom setting appears exactly the way it does in the video. Therefore, you decide that the variability that may naturally occur with live modeling may be better for Steve. You train two peers to model the target behavior (in this case, how to make plans to meet a friend at lunch). Although you used the exact same teaching procedure with both peers, you notice there is some natural variation in the way they demonstrate the target behavior."


Roll your own. Perhaps most important is Chapter 3, 'Professional Judgment and Data-based Decision Making'. The guide provides forms and recommendations for collecting and analyzing empirical data.
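The guide's forms are paper-and-pencil, but the analysis behind them is simple. As a rough sketch only (Python; the behavior counts are invented by me, not taken from the NAC manual), here's the kind of baseline-versus-intervention comparison a teacher's own data can support:

```python
# Hypothetical single-student data: incidents of a target behavior per session.
# Values are invented for illustration; the NAC guide supplies forms for collecting the real thing.
from statistics import mean

baseline = [8, 7, 9, 8, 10]        # sessions observed before the intervention
intervention = [6, 5, 4, 4, 3, 2]  # sessions after introducing, say, a self-management package

print(f"Baseline mean:     {mean(baseline):.1f} incidents/session")
print(f"Intervention mean: {mean(intervention):.1f} incidents/session")

# A quick visual-analysis check: does every intervention session fall below the baseline minimum?
if max(intervention) < min(baseline):
    print("No overlap between phases - the change will be easy to see on a graph.")
else:
    print("Phases overlap - keep collecting data before drawing conclusions.")
```

Nothing fancy, but it's exactly the kind of data-based decision making the chapter asks teachers to make routine.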


Gold star. The folks at NAC should be very proud of this guide. As I said earlier, my one criticism is that they've produced a 250+ page document that - even though it contains lots of white space and nice presentation - might seem unapproachable to a busy educator. A step-by-step quick reference, 10 or so pages long, would be the ideal supplement for this.

Monday, 07 December 2009

Experts call for evidence-based clinical psychology, more science in education and practice.

Fascinating discussion about the science of clinical psychology on Science Friday this week: A group of practitioners is recommending a new accreditation system for training programs to ensure that the methods used by clinical psychologists are backed by scientific research. Advocates of evidence-based methods also argue that practitioners need to gather evidence demonstrating that their methods work, rather than simply asserting that they provide effective treatments.

It's time for some evidence. One of the Science Friday guests was Richard McFall, the Executive Director of pcsas.org, the Psychological Clinical Science Accreditation System. The group's mission is to "provide rigorous, objective, and empirically based accreditation of Ph.D. programs in psychological clinical science.... PCSAS was created to promote superior science-centered education and training in clinical psychology, to increase the quality and quantity of clinical scientists contributing to the advancement of public health, and to enhance the scientific knowledge base for mental and behavioral health care."

McFall is also a co-author of the report Current Status and Future Prospects of Clinical Psychology: Toward a Scientifically Principled Approach to Mental and Behavioral Health Care [42-page pdf here]. "Clinical psychologists’ failure to achieve a more significant impact on clinical and public health may be traced to their deep ambivalence about the role of science and their lack of adequate science training, which leads them to value personal clinical experience over research evidence, use assessment practices that have dubious psychometric support, and not use the interventions for which there is the strongest evidence of efficacy. Clinical psychology resembles medicine at a point in its history when practitioners were operating in a largely prescientific manner."

A practical approach. In discussing the use of criteria for data-driven decision-making in mental health care, the report addresses efficacy; effectiveness and dissemination potential; costs and cost-effectiveness; and scientific plausibility. "[T]he future of clinical psychology will be dictated largely by what data show regarding the relative cost-effectiveness of psychosocial and behavioral interventions compared with other competing intervention options in mental health care. Before we can make sense of these data, however, we first must understand clearly the criteria by which such evaluative comparisons are made. Clinical psychologists must offer compelling evidence relating to these criteria if they expect their psychosocial and behavioral interventions to have a fair chance of gaining widespread support, to be adopted in the health delivery system, and to be funded via health coverage mechanisms.... However, there is considerable evidence that the data for the four classes of criteria we discuss... - efficacy, effectiveness, cost-effectiveness, and scientific plausibility - already significantly influence health care decisions."
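To make the cost-effectiveness criterion concrete, here's a minimal, hypothetical sketch (Python; the costs and outcomes are invented by me and aren't drawn from the report) of the incremental comparison payers tend to rely on:

```python
# Hypothetical incremental cost-effectiveness ratio (ICER) comparison.
# All figures are invented for illustration; the report presents no such numbers here.

def icer(cost_new, effect_new, cost_old, effect_old):
    """Extra cost per additional unit of outcome (e.g., per remission achieved)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Compare a psychosocial intervention against usual care.
# 'effect' is the hypothetical probability of remission per patient treated.
ratio = icer(cost_new=1_800, effect_new=0.55, cost_old=1_200, effect_old=0.40)
print(f"Incremental cost per additional remission: ${ratio:,.0f}")   # $4,000
```

The authors' point is that clinical psychology needs to be ready to supply the numbers that go into calculations like this one.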

The report also addresses the merits of psychosocial interventions and future prospects of clinical psychology, saying there's both good news (empirical support for psychosocial interventions) and bad news (psychology’s failure to develop as an applied science).

Dianne Chambless of the University of Pennsylvania offered great insights: She said psych students need to learn to infuse their treatments with science, becoming familiar with what's in the research literature and which methods have been shown to work. She also remarked that practitioners need to gather empirical evidence in their own practices so they can evaluate outcomes.

But one detractor said adoption of evidence-based methods doesn't make a big difference because they typically replace treatment techniques that were also effective, so the improvements are small. (Hmm... might want to study that empirically.)

Let's present the evidence more effectively. The report advocating evidence-based clinical psychology makes a very compelling case for scientifically supported, cost-effective treatments. I only wish the authors had made their information easier to dig through; they've fallen into the traditional trap of offering page after page of text, supplemented by long lists of references. The empirical evidence and examples they cite aren't easy to spot or grasp quickly. It would be great to see some reader-friendly charts, at-a-glance summaries of treatment specifics, details of training programs, and easy-to-read findings or other evidence.

Saturday, 05 December 2009

Very old evidence is both awesome and hilarious.

Wired has a nice writeup about some old scientific publications in Ground-Breaking Science: Very Old Papers Are Both Awesome and Hilarious. "...history-making papers published by the United Kingdom’s Royal Society [are] released in their entirety to celebrate the 350th birthday of the world’s oldest scientific body. The 60 papers are a testament to human curiosity, and the power of ingenuity and rigorous observation to overcome ignorance." Some examples:

Hilarity:
1666: Tryals Proposed by Mr. Boyle to Dr. Lower, to be Made by Him, for the Improvement of Transfusing Blood out of One Live Animal into Another. "...this suggested that the nature of organismal character might be revealed by swapping blood between dogs. [Wondering] if 'a fierce Dog, by being often quite new stocked with the blood of a cowardly Dog, may not become more tame'?" (For the record, my incredibly good-natured boy Petey is not amused by this discussion.)

Awesomeness:
1965: The Fit of the Continents Around the Atlantic.

Wired continues: "Even though man was about to walk on the moon, the idea that continents drifted across Earth’s surface was still controversial. In this paper, Edward Bullard showed how neatly the continents fit together, from their shape to common properties of rocks and fossils. Plate tectonics is now widely accepted, and [the paper is] an instructive reminder of how human knowledge is continually under construction."

Friday, 04 December 2009

Apophenia: The enemy of evidence-based management?

Finally, we might have a name for the phenomenon we've been combating all this time: apophenia. From Wikipedia: "Apophenia is the experience of seeing patterns or connections in random or meaningless data. The term was coined in 1958 by Klaus Conrad [1], who defined it as... a 'specific experience of an abnormal meaningfulness'."
[1] Brugger, Peter. "From Haunted Brain to Haunted Science: A Cognitive Neuroscience View of Paranormal and Pseudoscientific Thought," in Hauntings and Poltergeists: Multidisciplinary Perspectives, edited by J. Houran and R. Lange (North Carolina: McFarland & Company, Inc. Publishers, 2001). [Note: There's a 2007 edition on amazon.]

The Lostpedia on Wikia (about the TV show Lost) describes apophenia as the "perception of patterns, or connections, where in fact none exist. Most psychologists agree that this condition exists in everyone, to some degree; it is a bias of the human mind." They refer to the Wikipedia page.
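If you want to see how easily 'patterns' emerge from nothing, here's a minimal simulation (Python with numpy - my choice of tooling, nothing like this appears in the Wikipedia or Lostpedia entries): generate purely random series, test every pair, and some correlations will look impressively strong.

```python
# Apophenia in miniature: search enough random data and "patterns" appear.
import numpy as np

rng = np.random.default_rng(0)
n_series, n_points = 50, 20
data = rng.normal(size=(n_series, n_points))   # pure noise - no real relationships

best = 0.0
for i in range(n_series):
    for j in range(i + 1, n_series):
        r = np.corrcoef(data[i], data[j])[0, 1]
        best = max(best, abs(r))

n_pairs = n_series * (n_series - 1) // 2
print(f"Strongest correlation found among {n_pairs} pairs of noise series: r = {best:.2f}")
# With 1,225 pairs of 20-point noise series, the best |r| typically lands well above 0.6.
```

A pattern you went hunting for is not the same as a pattern that would have found you - which is the whole evidence-based point.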

Happy Fun-with-Evidence Friday. So is it time for a smackdown between EBEs (Evidence-Based Enthusiasts) and Apophenites?

Oops, not so fast. Not everyone uses this term pejoratively: A social media researcher named Danah Boyd describes apophenia as "making connections where none previously existed" - well, that sounds okay... isn't that what research, innovation, and discovery are all about? Boyd also has the apophenia.com domain.

And finally, on SourceForge there's Apophenia: An open statistical library for scientific computing. It has "functions on the same level as those of typical statistics packages (OLS, Probit...)".
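I haven't tried the library, but for anyone wondering what "functions on the same level as those of typical statistics packages" means in practice, here's an ordinary least squares fit in a few lines. To be clear, this is plain Python/numpy rather than the Apophenia C API, and the data points are invented:

```python
# Ordinary least squares (OLS) with numpy - the kind of routine a statistics
# library such as Apophenia packages up for you. Data points are invented.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

X = np.column_stack([np.ones_like(x), x])        # design matrix: intercept + slope
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, slope = coef
print(f"y ~= {intercept:.2f} + {slope:.2f} * x")   # about y ~= 0.05 + 1.99 * x
```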

Thursday, 03 December 2009

Introducing ExplanationScience.org: Help us figure out better ways to explain the evidence.

I'm delighted to announce a new membership organization: ExplanationScience.org. We're pioneering a new field: The 'science of explaining'. Our slogan says "Explaining is one of the most important things people do, so we're giving it the attention it deserves." We believe that people must do a better job of explaining evidence if evidence is going to make the biggest possible difference.

Case in point: Explaining mammogram guidelines. In recent hearings about the U.S. mammogram kerfuffle, the people responsible have said that the way they explained their recommendations was at the heart of the problem. (I wrote about the task force recently - the guidelines should have been a step forward for evidence-based medicine, though sadly it's not working out that way.) From the testimony of Diana B. Petitti [pdf]: "[Physician reviewers] expressed concern that the wording of the language... would be misunderstood by clinicians, patients, policy makers, and insurers. The Task Force recognizes now the wisdom of [their] advice. The communication... was poor. Our message was misunderstood. [However] the Task Force stands behind the evidence and the conclusions based on the evidence."

Join us (it's free!). These are early days, so our community site and membership signup process aren't yet complete. To be included on the preliminary ExplanationScience.org mailing list, please send your name, location, and work email address to joinus@explanationscience.org and we’ll keep you posted on new developments.

On his blog, Evidence-Based Management | Skeptical Thinking, Richard Puyt interviewed me about Explanation Science. An excerpt:

Richard: So Tracy, why did you start this community?
Me: "I started ExplanationScience.org after years of observing (and obsessing over) how people develop, communicate, and apply evidence when they want to make change happen. I realized that although evidence matters a great deal, how we explain that evidence is equally important.... Explaining is one of the most important things people do, but it hasn't received specific management attention or R&D focus. 'Explaining' can happen anywhere, and can be done by anyone: When we are problem-solving, debating/arguing, teaching, discussing results with investors or stakeholders, announcing research findings, selling products, etc.

"It's a place [to] find innovative ways to develop valid explanatory information, or find better ways to present a persuasive explanation. For several years, I've been an outspoken advocate for processes and technologies that enable evidence-based management. I see [this effort] as complementary to initiatives supporting 'evidence-based _____'. If we can help people gather better explanatory information, and help them provide better explanations, then we can help them drive wider adoption of evidence-based management methodologies."

Richard: Who participates in the community?
Me: "We welcome people from all walks of life who want to do a better job of explaining, and better understand what belongs in a ‘good’ explanation. People who are asking questions like:
- What information do we need to explain customer buying behavior?
- What techniques are best for explaining a particular health outcome or environmental impact?
- Which technologies are best for presenting explanations, or searching for them?

"We offer an opportunity to contribute to the development of a new field: The 'science of explaining'. People with fresh ideas can make a real difference by joining our group. We're providing a place to demonstrate expertise and share experiences on an important topic. We'll provide numerous ways to participate: Leading (or contributing to) online conversations and organized debates. Publishing applied research. Speaking at conferences. Contributing to a knowledge base. Showcasing technologies."

Go here to read the entire interview. Thanks, Richard.

Want to figure out better ways to explain the evidence? To be included on the preliminary ExplanationScience.org mailing list, please send your name, location, and work email to joinus@explanationscience.org and we’ll keep you posted on new developments. 

Follow us: We are @ExplanationSci on Twitter.