Evidence Soup
How to find, use, and explain evidence.


9 posts from January 2011

Friday, 28 January 2011

'Facts are for people who can't create their own truth.' Plus more insights from Bucky.

Get Fuzzy does a great job explaining the difficulties of weighing the evidence: Rob, Bucky, and Satchel each have particular strengths and weaknesses. Bucky Katt was in top form this week. 

Happy Fun-with-Evidence Friday. Recent highlights:

  • Thursday, Jan 27: "The big picture is inefficient. I just find the correct little picture and go with that."
  • Friday, Jan 28: "The truth is boring, man." and "I don't have to know a bunch of egghead 'data' to be right about something."


Wednesday, 26 January 2011

Interview Wednesday: Elizabeth Lusk, entrepreneur and evidence broker.

Today we talk with Elizabeth Lusk, Co-Founder & KT Conceptual Design Lead for gestalt collective. She's located in Toronto, Ontario, Canada (lucky her). As she explains it, the gestalt collective "specializes in the development, implementation, and evaluation of strategies that help people and organizations better manage their knowledge assets. Committed to moving knowledge into action, together we identify new opportunities to maximize knowledge flow, drive performance improvements and optimize efficiencies to help people, groups, and organizations realize their potential."

Liz is the KT Conceptual Design Lead for the Canadian Dementia Resource and Knowledge Exchange (CDRAKE), and KTE Associate and Knowledge Broker for the Seniors Health Research Transfer Network (SHRTN). She's on LinkedIn and Twitter (@gestaltKT). Also, you can follow the CDRAKE project at @knowdementia.

The Five Questions.
#1. What got you interested in evidence?
Insatiable curiosity. I like to know. I like delving deeper to see if I agree with how someone arrived at a particular piece of ‘evidence’. I like having moments where people’s perspectives blow mine out of the water and I shift and grow.

What types of evidence do you work with most often (medical, business research, statistics, social science, etc.)?
Every type I can get my hands on. I really like exploring what people are saying across fields and disciplines – to see where there is overlap, and to check out the variety of perspectives and language used to describe what are often similar challenges and phenomena. I also like seeing how people or groups approach challenges in differing contexts. I like making those connections and then sharing and discussing them with people and organizations.

If I have to pick a particular type of evidence that I work with most often it is in the realms of design and systems thinking; knowledge translation, exchange, brokering, mobilization; knowledge asset management; and communication. I am also increasingly sourcing and applying concepts from complexity science.

What is your involvement with evidence: applying it, advocating its use, researching/developing it, synthesizing/explaining/translating it, communicating it?
At present I’ve moved away from synthesis because I feel like by the time you do it properly it’s bordering on irrelevancy, or the political landscape changes swiftly and you miss opportunities. I’m finding the greatest success, at present (my ubiquitous qualifier), by telling stories and developing infographics to communicate evidence in a way that does not require expertise as it relates to the topic. Less time consuming, greater return on investment, greater impact.

That said, the landscape will change and I’ll need to continue to be creative in how I communicate evidence with the people I collaborate with. I am really into creating and / or identifying channels in support of knowledge exchange and flow as well as safe and supportive space to co-create. Helping people make connections to others who have experience with a particular evidence set is also proving to be rewarding. I like to help people, organizations, and networks think and act like knowledge brokers.

Where do you go looking for evidence, and what types of sources do you prefer? (formally published stuff such as journals, or something less formalized?)
I need and want to communicate with all the people I work with effectively. I aspire to be an evidence chameleon – someone who can discuss evidence with people using their language to enhance our capacity to connect and collaborate. For the academic in me, I read journal articles. I want to know what the ‘field’ is saying and whether there is consensus or debate. For the journalist in me, I read blogs.

I love reading individuals' unabated opinions and perspectives. I get a greater sense of the energy around a topic through non-peer reviewed evidence. I also love Twitter because you can find some great evidence curators. You can really leverage it as a continuous learning tool. I love books, too. Basically I'm always reading and enjoy lots of sources. I like reconciling the lot of them.

#2. On a scale of 1 to 10, where 10 = ‘It’s crystal clear’ and 1 = ‘We have no idea why things are happening’, how would you describe the overall “state of the evidence” in your primary field?
If I consider my primary field to be knowledge translation, mobilization, and brokering, I believe the overall ‘state of the evidence’ is about a 6. I see this field as having a few ‘camps’ that speak about similar things being driven from different perspectives and for differing purposes. That can make it difficult for people to navigate.

Which of these situations is most common in your field?
a) Much of the evidence we need doesn’t yet exist.
b) People don't know about the evidence that is available.
c) People don't understand the evidence well enough to apply it.
d) People don’t follow the evidence because it's not the expectation.

A little ‘a’ mixed with a whole lot of ‘c’.

#3. Imagine a world where people can get the evidence they need, and exchange it easily and transparently. What barriers do you believe are preventing that world from becoming a reality?
We tend to build tools and mechanisms to access ‘evidence’ using a predominantly linear and library-centric approach. We want to catalogue and codify everything – it’s all very binary in nature. People then become dependent on there being a go-to place or source where the evidence is stored. Then they want it translated for them and to their context. This approach overemphasizes access to information and does not address connecting people; people who have experience with evidence and can therefore add dimensions to evidence that a document can never capture.

While there is great value in library science, and I frequently benefit from libraries and librarians, we need to broaden our scope because that approach is limited when it comes to getting knowledge into practice. Simple access to information is not proving to be enough. What we need (in my humble opinion) is a complementary approach. We need to help people develop the skills and comfort required to work within our complex (and rich) systems for information sharing by giving them space to do so; to start trusting one another again, shake off our stereotypes regarding people’s ‘roles’, and try to connect with the person in the role.

 I think we can reach our knowledge potential if we shift our focus to the flow of knowledge rather than only capturing and storing knowledge. Archiving and codifying is good for historical purposes. We can always learn from our history. But if we could get better at exchanging knowledge by connecting with people then maybe we’ll unclog the system and re-energize people. Of course to ensure that quality evidence ‘flows’, we need to diligently help people understand and critically appraise evidence.

Where do you see technology making things better?
Technology helps us have global reach. I love technology. People can access people through multiple channels having never met them in person. It can help us broaden our perspectives and the lens through which we navigate evidence. Ultimately though, it’s the people behind the technology. Technology enhances access to people if it’s used as a learning tool - not just a marketing tool. When used as a learning tool, I believe the person on the other end is oriented to connecting rather than simply selling or pushing. A reciprocal relationship can be established.

With the variety of free online publishing platforms to choose from (e.g. Wordpress, Tumblr, Blogger, Digg, Twitter, etc.), more and more people are able to afford and indulge in curating evidence and sharing it online. That said, there is a segment of people I run into through my work who lament the need for a single place to go online, saying ‘it is all so confusing’.

My business partner and I were reflecting upon this and thought ‘imagine there was only one style and location to purchase jeans’, no matter what our body type or preference. That wouldn’t fly, right? By having quality options that address the variety of people’s information-seeking preferences, the hope is that one can find evidence tailored and organized in a way that is meaningful to them. When people connect with evidence in a way that resonates with them, they tend to apply it to their life and work. Certainly, critical appraisal skills apply here as well.

#4. How do you prefer to share evidence with people, and explain it to them? Do you have a systematic way of doing it, or is there a format that you follow?
This definitely depends on the person or group I’m sharing and discussing evidence with, as well as in what capacity I am working with them. My approach is always tailored and dynamic. For instance, if I’m partnering with a group as a knowledge broker – with the more formal people and groups – early on I mirror their communication style so we can ease into knowing one another and develop trust. I will share journal articles that are relevant to their work, and if I see something in the news I forward that along as well. With more established working relationships I share peer-reviewed articles from different disciplines, infographics, blogs, videos, and other sources of media I think may contribute to our work. Adding this dimension tends to energize our thoughts and creates space for us to innovate. I mobilize around that energy and when relevant, we begin a co-creation process.

What mistakes do you see people making when they explain evidence?
Injecting their bias, but that’s human nature, right? People also use jargon. I’m guilty of it myself. I’ll have in my head the terminology from whatever I read last that inspired me and I just really want to share it! I love language so it just slips out. I have to engage in some serious self-talk to remind myself not to do this with colleagues, mostly. I still go there though. The biggest thing I notice on the whole is that people default to speaking about evidence only in terms of the peer-reviewed literature. I respect peer-reviewed literature, however, there are a lot of people experiencing things, writing about their thoughts, and sharing through other channels. In the end, I wish for people to expand their notion of what constitutes ‘evidence’. Let’s shift our focus to helping people navigate evidence.

#5. What do you want your legacy to be?
I’d like to be remembered for being a flexible and thoughtful person. Wait – scratch that – that doesn’t sound how I meant it. Let’s try this again; an open-minded person who cares what you have to say and is willing to help.

Thanks, Elizabeth. I really enjoyed hearing your perspective - particularly on the need for navigating and appraising evidence, not just codifying and storing it.


____________________________

Chime in. Would you like to be interviewed, or do you have someone to recommend? Drop me a note at tracy AT evidencesoup DOT com.

Saturday, 22 January 2011

Evidence-based policing Hall of Fame: Call for nominations.

Excellent. There's an Evidence-Based Policing Hall of Fame, recognizing "innovative law enforcement practitioners who have been central to the implementation of a high quality research program in their affiliated agency, highlighting individual excellence in both using and conducting policing research". The project is sponsored by the Center for Evidence-Based Crime Policy at George Mason University (Fairfax, VA, USA - @cebcp on Twitter).

They've issued a call for nominations for the Hall of Fame. They're looking for people who:

  • Are central to the implementation of a documented, rigorous scientific evaluation in their affiliated agency in which a police intervention, tactic, strategy, or deployment was tested for effectiveness.
  • Demonstrate a record of incorporating evidence-based practices in their agency. For example, strategies with characteristics of interventions falling within the 'realms of effectiveness' of the Evidence-Based Policing Matrix.

Nominees must have police practitioner experience, either sworn or civilian.

Friday, 21 January 2011

Text analytics of online horoscopes reveals they pretty much all say the same thing. We knew that, but the analysis sets a great example for all of us.

On Information is Beautiful, David McCandless describes how he applied text analytics to 22,000 horoscopes. They were scraped from Yahoo! Shine (*sigh*, a site "for women looking for the latest information and advice from experts"). Findings were used to generate a meta prediction for all star signs, for every day -- here's an excerpt: "Family and friends matter. The world is life, fun, and energy. Maybe hard. Or easy." Heh.

Am I repeating myself? Evidence showed that at least 90% of the words in these horoscopes are the same. I wonder how that compares to the text of The Da Vinci Code (or to a book I'm currently reading, Counterfactuals and Causal Inference - despite what you may have heard, I know how to have a good time, people).

We all should do such a thorough job. Astrology is a lightweight topic, but the analysis sets a very good example: McCandless clearly explains his methodology, provides links to underlying data, and specifies which scripts and tools were used (including TagCrowd).
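For the curious, here's a minimal sketch (in Python, with invented placeholder horoscope text rather than McCandless's scraped data or scripts) of how a word-overlap analysis like this can be done: tokenize each sign's horoscope, count word frequencies, and measure how much of one sign's text also appears in another's.

```python
from collections import Counter
import re

def word_counts(text):
    """Lowercase the text, strip punctuation, and count word frequencies."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

def shared_fraction(counts_a, counts_b):
    """Fraction of the words in A (counting repeats) that also appear in B."""
    shared = sum(min(n, counts_b[w]) for w, n in counts_a.items())
    total = sum(counts_a.values())
    return shared / total if total else 0.0

# Invented placeholder horoscopes; the real analysis used ~22,000 scraped ones.
horoscopes = {
    "aries": "Family and friends matter. The world is life, fun, and energy.",
    "libra": "Friends and family matter today. Life is fun. Maybe hard. Or easy.",
}

counts = {sign: word_counts(text) for sign, text in horoscopes.items()}
print(shared_fraction(counts["aries"], counts["libra"]))
```

Run across every pair of signs and days, a consistently high shared fraction is exactly the "they all say the same thing" finding.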

Horoscoped, by Information is Beautiful


Happy Fun-with-Evidence Friday. McCandless spells out his process and data here: http://bit.ly/horoscoped, saying "That way it’s all balanced and you can make up your own mind. Typical Libran!"

Monday, 17 January 2011

Power Balance (of hologram fame) has purchased the naming rights for an NBA arena. (Yes, really.)

Silly me. I figured when the folks selling the Power Balance bracelet admitted there's no evidence to support their claims, that would be the end. But lo and behold, evidently they have sold so many hologram thingies that they've purchased the naming rights to the Sacramento Kings' arena for the next five years. I kid you not.

A depressing turn of events for all things evidence-based. My critique of Power Balance is well documented, including the post How do those Power Balance bracelets work? I think it's because of the 20-Hz difference between a genius and an ascending colon. Gawker, too, was taken by surprise, writing that Company Admits It's a Scam, Promptly Buys NBA Stadium Naming Rights. (I highly recommend reading the comments on that story -- so many gems, though I think my fave is "I understand that the decision was very difficult between this and 'Nigerian 419 Arena'.") As Gawker explains:

Ad Age estimates that the deal is probably worth about a million bucks a year for the Kings, and also points out that the Kings suck and they probably couldn't get anything better. Yes: this professional basketball team could do no better than a company whose marketing copy actually reads, "Power Balance is based on the idea of optimizing the body's natural energy flow, similar to concepts behind many Eastern philosophies. The hologram in Power Balance is designed to resonate with and respond to the natural energy field of the body."

After the Australia kerfuffle, the company announced "In our advertising we stated that Power Balance wristbands improved your strength, balance and flexibility. We admit that there is no credible scientific evidence that supports our claims and therefore we engaged in misleading conduct." Subsequently, the company issued this statement:

"Power Balance stands by our products," said Keith Kato, President of Power Balance, LLC. "Millions of people around the globe are wearing Power Balance products and are thrilled with the results. Dozens of high profile professional athletes swear by the results they've experienced from wearing our products." CNBC recently named Power Balance as the "Sports Product of the Year for 2010."

...That said, there has been some negative press about our products coming out of Australia recently, followed by a class action law suit filed here in the US based on those misstatements, and we wanted to set the record straight.

Contrary to recent assertions in the Australian press, Power Balance has made no claims that our product does not perform. This is simply untrue. Apparently, some previous claims in our marketing ads in Australia were not up to ACCC standards. Changes were voluntarily made immediately, approved and the issues were believed to have been resolved. We were obviously surprised to see the recent press coming out of Australia followed by a class action law suit here in the United States.

"The mission of Power Balance has always been to develop and deliver quality products that enhance people's lives," said Kato. "...Frankly, we know there will always be critics of new technologies, but our products are used by those with open minds who experience real results. Our company is absolutely committed to further evaluating the technology behind its products' performance so that we can continue to offer products that enhance people's lifestyle."

Supporting evidence. I checked other media outlets, just in case Gawker had posted the naming rights story as a joke. Nope. It's on Bloomberg and Business Insider, among other sites.

 

Wednesday, 12 January 2011

Interview Wednesday: Richard Puyt, lecturer, consultant, and researcher.

Today we talk with Richard Puyt, a freelance lecturer at the University of Applied Sciences in Amsterdam, management consultant, and researcher (he's working on his Ph.D.). Richard is located in Amersfoort, the Netherlands. He's on LinkedIn here, he blogs at Evidencebased-management.com, and on Twitter he's @richardpuyt.

The Five Questions.
#1. What got you interested in evidence?
In 2007 I read the presidential address [pdf] to the Academy of Management by Denise Rousseau for the first time [Ed. note: here's Denise's recent Evidence Soup interview]. This immediately made sense to me. I’ve worked more than ten years as a consultant in several firms (both government and commercial), and had a hard time understanding the decision-making processes. There always seems to be a sense of urgency. If you have a skeptical mindset and ask simple questions like: 'Why are we doing this?', managers tend to get uncomfortable. They prefer it if you ‘just do your job’. When pressed for an answer, responses are ‘it is necessary’ or ‘this is beneficial for our company’.

Evidence supporting these statements is rarely provided. That is not a problem in itself, but a (major) change in the organization (for example, implementing a new ERP system or a merger) rarely starts with a search for the best available evidence, let alone an appraisal of it. Especially in government settings, the initial reason (or the business case) is soon forgotten, and the moment the project is declared finished, everybody is running towards the next project. Unfortunately, evaluation and learning are never top priority. Evidence of the benefits (or the lack thereof) of the change is not established. Lessons-learned reports are not written, and nobody asks for them when starting a new project. The same mistakes are bound to happen in future projects. When accepting a new consulting job, I always start out by looking for evidence. This starts with interviewing people and desk research.

What types of evidence do you work with most often (medical, business research, statistics, social science, etc.)?
Mostly business research and evidence from social science (and sometimes medical).

What is your involvement with evidence: applying it, advocating its use, researching/developing it, synthesizing/explaining/translating it, communicating it?
On the evidence-based management collaborative blog, authors and contributors advocate the use of evidence and discuss the latest developments. We are always looking for people who want to participate in the discussions or write reviews. For my Ph.D research, I’m researching and synthesizing the use of evidence by managers in the management literature. We can learn a lot from medical science (but not copy everything they have developed).

Where do you go looking for evidence, and what types of sources do you prefer? (formally published stuff such as journals, or something less formalized?)
The answer depends on the question I need to answer. The formalized stuff is good as background information and a way to frame the question. However, the context of the problem (and the people working there) is a valuable source as well. The real skill is synthesizing the different levels of evidence, and accepting that sometimes we just don’t know.

This is actually one of the major issues in management decision-making: access to the best available evidence. The scholarly databases have restricted access, and sometimes research is proprietary and will never become publicly available. Most managers don’t have the time or the skills to phrase their problem and research it. This is why they don’t use scholarly sources. Then there is Google. My students need to be trained in appraising information from the Internet. Although Google keeps getting better, you still need to appraise the information. The same applies to managers. The idea of quick fixes and copying approaches from airport books (remember Good to Great by Jim Collins?) needs to be actively discouraged in formal education. To quote Henry Mintzberg: Management is an art, largely a craft, and not a science - but it uses science.

#2. On a scale of 1 to 10, where 10 = ‘It’s crystal clear’ and 1 = ‘We have no idea why things are happening’, how would you describe the overall “state of the evidence” in your primary field?
Hmm. In one big generalization, an 8. In some fields we do know a lot - however, in most areas we don’t. Bear in mind, management is a social science.

Regarding the notion of things appearing to be ‘crystal clear’, I recommend reading Trust in Numbers: The Pursuit of Objectivity in Science and Public Life by Theodore M. Porter, and Making Sense of the Organization by Karl Weick. Denise Rousseau is now editing the Handbook of Evidence-Based Management: Companies, Classrooms, and Research, to be published by Oxford University Press; also something to look forward to (I've read the draft already).

Which of these situations is most common in your field?
a) Much of the evidence we need doesn’t yet exist.
b) People don't know about the evidence that is available.
c) People don't understand the evidence well enough to apply it.
d) People don’t follow the evidence because it's not the expectation.

All of the above apply.

#3. Imagine a world where people can get the evidence they need, and exchange it easily and transparently. What barriers do you believe are preventing that world from becoming a reality?
Politics, and the fact that evidence is not self-explanatory.

Where do you see technology making things better?
It can improve access to the best available evidence, and evidence can be appraised and aggregated faster. Exchange of evidence is much easier.

#4. How do you prefer to share evidence with people, and explain it to them? Do you have a systematic way of doing it, or is there a format that you follow?
At present I don’t have a standard format, but a few colleagues and I are developing teaching material.

What mistakes do you see people making when they explain evidence?
Bias, self-interest, misinterpretation, selective shopping in the available evidence (interesting clip here).

#5. What do you want your legacy to be?
I’d like to be remembered for being modest, but asking the right questions at the right time.

Thanks, Richard. Lots of useful insights. I'm going to look up your excellent reading recommendations.
____________________________

Chime in. Would you like to be interviewed, or do you have someone to recommend? Drop me a note at tracy AT evidencesoup DOT com.

Tuesday, 11 January 2011

Why do we frown on health guidelines that are based on expert knowledge? Sometimes that's the current, best evidence.

Is a medical guideline useless if it's not based on randomized, controlled trials? How much is expert knowledge (or opinion) worth? I'm asking myself these questions after reading Analysis of Overall Level of Evidence Behind Infectious Diseases Society of America Practice Guidelines, a new article in the Archives of Internal Medicine. Findings revealed that more than half of the associated recommendations are based on Level III evidence (expert opinion).

The question of how to weigh recommendations is relevant not only to healthcare professionals, but also to people in business, government, and the non-profit sector. I learned about these new findings via Gary Schwitzer's excellent Health News Review, which asked "New questions about medicine's 'best practice' guidelines: evidence or opinion?" (on Twitter: @HealthNewsRevu).

What's the fuss about? According to the abstract, the study analyzed 41 Infectious Diseases Society of America (IDSA) guidelines encompassing 4,218 individual recommendations. This analysis considered both the strength of recommendation (Levels A through C) and overall quality of evidence (Levels I through III). The researchers concluded that: "More than half of the current recommendations of the IDSA are based on level III evidence only. Until more data from well-designed controlled clinical trials become available, physicians should remain cautious when using current guidelines as the sole source guiding patient care decisions." (Ahem. Shouldn't they "remain cautious" even when recommendations are based on "better" evidence? Otherwise aren't they at risk of practicing cookie-cutter medicine?)

Sometimes expert knowledge *is* the current, best evidence. The basis of evidence-based management is making decisions based on current, best evidence: Randomized, controlled experiments are great. But they're not always available -- or in many fields, even possible. We can't disregard other forms of evidence: We're much better off systematically weighing the fuzzier evidence than we are disregarding everything that doesn't fit the narrow definition of 'Level I' evidence.

Here's a recap of the study findings:

  • Quality of evidence. 14% of the recommendations were based on Level I evidence, 31% on Level II, and 55% on Level III.
  • Strength of recommendation. The investigators explain that among the Level A (strong) recommendations, 23% were based on Level I evidence (≥1 randomized controlled trial), while 37% were based on expert opinion only (Level III).

Maybe not apples & oranges, but different varieties of apples. The guideline story was widely covered in the medical media, including Fierce Health and Reuters. Reuters oversimplified the situation, saying "more than half the recommendations relied solely on expert opinion or anecdotal evidence." Okay, but that's what 'Level C' is for: To indicate a relatively weak recommendation. And since when is expert opinion not valuable?

Buyer beware. Here's what the Infectious Diseases Society says about its practice guidelines: They "are systematically developed statements to assist practitioners and patients in making decisions about appropriate health care for specific clinical circumstances [Institute of Medicine Committee to Advise the Public Health Service on Clinical Practice Guidelines, 1990]. Attributes of good guidelines include validity, reliability, reproducibility, clinical applicability, clinical flexibility, clarity, multidisciplinary process, review of evidence, and documentation. It is important to realize that guidelines cannot always account for individual variation among patients. They are not intended to supplant physician judgment with respect to particular patients or special clinical situations. IDSA considers adherence to the guidelines... to be voluntary, with the ultimate determination regarding their application to be made by the physician in the light of each patient’s individual circumstances."

 

Wednesday, 05 January 2011

Interview Wednesday: Chris Lysy, research analyst and evidence enthusiast.

For today's Interview Wednesday, we talk with Chris Lysy, a research analyst at Westat in Research Triangle, North Carolina, USA. He has extensive experience with real-world evidence about social and educational programs: How to find it, explain it, and apply it.

I highly recommend EvalCentral.com, a site Chris created to aggregate interesting evaluation content - everything from tools for finding evidence to presenting findings to clients. You can read more about Chris here, and follow him on Twitter at @clysy.

The Five Questions.
#1. What got you interested in evaluation and evidence?
I’ve always had lots of questions and they tend not to answer themselves. As it relates to my career, positions in contract research and evaluation complement my inquisitive nature. After undergrad I accepted a grunt-work role in contract research (evidence only occasionally collects itself), and then earned an MA in Sociology. After that I bounced around a couple of small research consultancies, then fell into evaluation as a data specialist with North Carolina’s statewide early childhood program. I’ve stuck with evaluation, but have returned to my original company (Westat), accepting a better role with less grunt work. I enjoy working in a field where the evidence is often directly connected to something tangible.

What types of evidence do you work with most often (medical, business research, statistics, social science, etc.)?
Currently special education, but I’ve had experience in multiple fields (including all listed examples) and often find myself immersed in evidence from other disciplines.

What is your involvement with evidence and evaluation: applying it, advocating its use, researching/developing it, synthesizing/explaining/translating it, communicating it?
Occupationally, I am an evidence collector, analyzer, and reporter. Personally, I am an advocate and communicator.

Where do you go looking for evidence, and what types of sources do you prefer? (formally published stuff such as journals, or something less formalized?)
That ultimately depends on the question I am trying to answer. I have a knack for finding evidence quickly over the web using search engines and a snowball approach, but my position also calls for the use of more formally published works. If I work with someone who knows more than me about something, they are usually my first source. If not, I turn to Google. Also, I think if you keep your eyes open, evidence finds you.

#2. On a scale of 1 to 10, where 10 = ‘It’s crystal clear’ and 1 = ‘We have no idea why things are happening’, how would you describe the overall “state of the evidence” in your primary field?
This would require me to pick a primary field... if it’s evaluation, then I would say 7, because finding evidence is kind of the point - but there is always room for improvement.

Which of these situations is most common in your field?
a) Much of the evidence we need doesn’t yet exist.
b) People don't know about the evidence that is available.
c) People don't understand the evidence well enough to apply it.
d) People don’t follow the evidence because it's not the expectation.

Again, it depends on the questions and situation, but I would say a and b. There is a constant need for new evidence (which I’m thankful for, because it keeps me employed), but there are also plenty of opportunities to better use the evidence that already exists. Government agencies and organizations at all levels are increasingly offering access to raw data. When working with local communities, having access to accurate data at the county or city level is immensely useful, but you have to know where to find it and understand how to analyze it.

#3. Imagine a world where people can get the evidence they need, and exchange it easily and transparently. What barriers do you believe are preventing that world from becoming a reality?
The biggest barriers to the type of world you describe are privacy and cost. Evidence is rarely free; it costs money to collect and store. I don’t think that always receives the consideration it deserves. If you speak with a professional in the early childhood field tasked with building a comprehensive data system, you will quickly learn about all the associated costs and challenges.

As for privacy, it’s critical to make sure that evidence is collected with the rights of the subject in mind. Privacy concerns keep agencies from the table. As an analyst I love data at the individual level, because I can always aggregate it later. You can do a lot with it, including matching the data with other sets, but in order to do that you need a link, some type of identifier, and access to the full datasets. But the more individual the data, the greater the privacy concerns. You have to show that the evidence is useful enough to justify the cost, but also make sure that it can never fall into the wrong hands. These will continue to be big issues well into the future.
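[Ed. note: here's a minimal, hypothetical sketch (in Python) of the workflow Chris describes: match individual-level records on a shared identifier, then aggregate so only summaries are reported. The datasets and the child_id field are invented for illustration, not taken from his work.]

```python
import pandas as pd

# Hypothetical individual-level datasets that share a "child_id" identifier.
enrollment = pd.DataFrame({
    "child_id": [1, 2, 3, 4],
    "county": ["Wake", "Wake", "Durham", "Durham"],
})
assessments = pd.DataFrame({
    "child_id": [1, 2, 3, 4],
    "score": [82, 90, 75, 88],
})

# Match the two datasets on the identifier, then aggregate away the
# individual level so only county-level summaries are reported.
linked = enrollment.merge(assessments, on="child_id")
county_summary = linked.groupby("county")["score"].mean()
print(county_summary)
```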

Where do you see technology making things better?
By making evidence accessible, less expensive, and easier to analyze and understand. A lot of the technology to do these things already exists; we also need to actually use it.

#4. How do you prefer to share evidence with people, and explain it to them? Do you have a systematic way of doing it, or is there a format that you follow?
No systematic way, but I’m a big fan of charts and visualizations. I prefer to share more evidence rather than less; dumbing down data to single numbers and percentages takes away the context. Context is important when presenting evidence.

What mistakes do you see people making when they explain evidence?
They don’t give the person they are presenting to enough credit. Sure, you may be keyed in on a single metric but the supporting data is also important. It’s ok to let your audience arrive at the point on their own, or even come up with a different point. Use your voice to point out trends or individual findings. Removing the context makes evidence harder to understand, not easier. Of course, don’t expect your audience to always appreciate this. The goal is for them to understand, not for you to get praise for an awesome presentation.

#5. What do you want your legacy to be?
I’d like to be remembered for helping academics, researchers, and evaluators make better use of the vast array of available web tools. Also, as a great husband and father.

Thanks, Chris, for these useful insights.


____________________________

Chime in. Would you like to be interviewed, or do you want to recommend someone? Drop me a note at tracy AT evidencesoup DOT com.

Monday, 03 January 2011

Power Balance admits there's no evidence to support their claims about the bracelets (in Australia, anyway).

As reported on Gizmodo and other media outlets, the Power Balance folks have admitted there's no scientific evidence to support their claims that the hologram bracelets improve strength, balance, and flexibility. I wrote about these things last year in How do those Power Balance bracelets work? I think it's because of the 20-Hz difference between a genius and an ascending colon. and later in Again with the Power Balance bracelets.

In Australia, the Power Balance company has published an admission in the media and is offering refunds. Their statement says "In our advertising we stated that Power Balance wristbands improved your strength, balance and flexibility. We admit that there is no credible scientific evidence that supports our claims and therefore we engaged in misleading conduct."

Power Balance statement (Australia)