Evidence Soup
How to find, use, and explain evidence.

46 posts categorized "evidence-based marketing"

Tuesday, 21 June 2016

Free beer! and the "Science of X".


1. Free beer for a year for anyone who can work perfume, velvety voice, and 'Q1 revenue goals were met' into an appropriate C-Suite presentation.
Prezi is a very nice tool enabling you to structure a visual story, without forcing a linear, slide-by-slide presentation format. The best part is you can center an entire talk around one graphic or model, and then dive into details depending on audience response. (Learn more in our writeup on How to Present Like a Boss.)

Now there's a new marketing campaign, the Science of Presentations. Prezi made a darn nice web page. And the ebook offers several useful insights into how to craft and deliver a memorable presentation (e.g., enough with the bullet points already).

But in their pursuit of click-throughs, they've gone too far. It's tempting to claim you're following the "Science of X". To some extent, Prezi provides citations to support its recommendations: The ebook links to a few studies on audience response and so forth. But that's not a "science" - the citations don't always connect to what's actually being suggested to business professionals. Example: "Numerous studies have found that metaphors and descriptive words or phrases — things like 'perfume' and 'she had a velvety voice' — trigger the sensory cortex.... On the other hand, when presented with nondescriptive information — for example, 'The marketing team reached all of its revenue goals in Q1' — the only parts of our brain that are activated are the ones responsible for understanding language. Instead of experiencing the content with which we are being presented, we are simply processing it."

Perhaps in this case "simply processing" the good news is enough experience for a busy executive. But our free beer offer still stands.

2. How should medical guidelines be communicated to patients?

And now for the 'Science of Explaining Guidelines'. It's hard enough to get healthcare professionals to agree on a medical guideline - and then follow it. But it's also hard to decide whether/how those recommendations should be communicated to patients. Many of the specifics are intended for providers' consumption, to improve their practice of medicine. Although it's essential that patients understand relevant evidence, translating a set of recommendations into lay terms is quite problematic.

Groups publish medical guidelines to capture evidence-based recommendations for addressing a particular disease. Sometimes these are widely accepted - and other times not. The poster-child example of breast cancer screening illustrates why patients, and not just providers, must be able to understand guidelines. Implementation Science recently published the first systematic review of methods for disseminating guidelines to patients.

Not surprisingly, the study found weak evidence of methods that are consistently feasible. "Key factors of success were a dissemination plan, written at the start of the recommendation development process, involvement of patients in this development process, and the use of a combination of traditional and innovative dissemination tools." (Schipper et al.)

3. Telling a story with data.
In the Stanford Social Innovation Review (SSIR), @JakePorway explains three things great data storytellers do differently [possible paywall]. Jake is with @DataKind, "harnessing the power of data science in service of humanity".


Photo credit: Christian Hornick on Flickr.

Tuesday, 15 March 2016

Analytics disillusionment, evidence-based presentation style, and network analysis.

[Image: Polinode new layout algorithm]

1. Visualizing networks.
@Polinode builds innovative tools for network analysis. One nifty feature allows creation of column charts using a set of nodes. A recent post explains how to use calculated network metrics such as centrality or betweenness.
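For the curious, betweenness can be computed by brute force on a small graph. Here's a minimal pure-Python sketch (the toy network and node names are hypothetical; tools like Polinode calculate these metrics for you):

```python
from collections import deque
from itertools import combinations

def shortest_paths(graph, s, t):
    """Enumerate all shortest paths from s to t via breadth-first search."""
    paths, best = [], None
    queue = deque([[s]])
    while queue:
        path = queue.popleft()
        if best is not None and len(path) > best:
            break  # queue is ordered by length; no more shortest paths remain
        node = path[-1]
        if node == t:
            best = len(path)
            paths.append(path)
            continue
        for nbr in graph[node]:
            if nbr not in path:  # avoid revisiting nodes
                queue.append(path + [nbr])
    return paths

def betweenness(graph):
    """Unnormalized betweenness centrality: for each node, sum over node
    pairs (s, t) of the fraction of shortest s-t paths passing through it."""
    score = {v: 0.0 for v in graph}
    for s, t in combinations(graph, 2):
        paths = shortest_paths(graph, s, t)
        if not paths:
            continue
        for v in score:
            if v not in (s, t):
                score[v] += sum(v in p for p in paths) / len(paths)
    return score

# Hypothetical toy network: 'c' bridges the a-b cluster to d and e,
# so it scores highest.
g = {'a': ['b', 'c'], 'b': ['a', 'c'], 'c': ['a', 'b', 'd'],
     'd': ['c', 'e'], 'e': ['d']}
scores = betweenness(g)
```

On this graph, the bridge node 'c' comes out on top, which is exactly the intuition behind the metric: it sits on the paths everyone else must travel.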

2. Analytics are disconnected from strategic decisions.
An extensive study suggests analytics sponsors are in the trough of disillusionment. The new MIT Sloan-SAS report, Beyond the Hype: The Hard Work Behind Analytics Success, finds that competitive advantage from analytics is declining. How can data do more to improve outcomes?

[Image: Analytics insights, MIT Sloan-SAS report]

The @mitsmr article notes several difficulties, including failure to drive strategic decisions with analytics. "Over the years, access to useful data has continued to increase, but the ability to apply analytical insights to strategy has declined." Dissemination of insights to executives and other decision makers is also a problem. The full report is available from SAS (@SASBestPractice).

3. Evidence shows graphics better than bullets.
There's new empirical evidence on communicating business strategy. Seventy-six managers saw a presentation by the financial services branch of an auto manufacturer, with one of three types of visual support: a bulleted list, a visual metaphor, or a temporal diagram. Each subject saw only one of the three formats. Those who saw a graphical representation paid significantly more attention to, agreed more with, and better recalled the strategy than did subjects who saw a (textually identical) bulleted list version. However, no significant difference was found regarding the *understanding* of the strategy. Also, presenters using graphical representations were more positively perceived than those who presented bulleted lists.

4. Linking customer experience with value.
McKinsey's Joel Maynes and Alex Rawson offer concrete advice on how to explicitly link customer experience initiatives to value. "Develop a hypothesis about customer outcomes that matter. Start by identifying the specific customer behavior and outcomes that underpin value in your industry. The next step is to link what customers say in satisfaction surveys with their behavior over time."

5. Never mind on that reproducibility study.
Slate explains how Psychologists Call Out the Study That Called Out the Field of Psychology. In a comment published by Science, reviewers conclude that "A paper from the Open Science Collaboration... attempting to replicate 100 published studies suggests that the reproducibility of psychological science is surprisingly low. We show that this article contains three statistical errors and provides no support for such a conclusion. Indeed, the data are consistent with the opposite conclusion, namely, that the reproducibility of psychological science is quite high." Evidently, OSC frequently used study populations that differed substantially from the original ones - and each replication attempt was done only once.

Tuesday, 09 February 2016

How Warby Parker created a data-driven culture.


[Image: Creating a Data-Driven Organization book cover]


What does it take to become a data-driven organization? "Far more than having big data or a crack team of unicorn data scientists, it requires establishing an effective, deeply ingrained data culture," says Carl Anderson, director of data science at the phenomenally successful Warby Parker. In his recent O'Reilly book Creating a Data-Driven Organization, he explains how to build the analytics value chain required for valuable, predictive business models: From data collection and analysis to insights and leadership that drive concrete actions. Follow Anderson @LeapingLlamas.

Practical advice, in a conversational style, is combined with references and examples from the management literature. The book is an excellent resource for real-world examples and highlights of current management research. The chapter on creating the right culture is a good reminder that leadership and transparency are must-haves.


Although the scope is quite ambitious, Anderson offers thoughtful organization, hitting the highlights without an overwhelmingly lengthy literature survey. My company, Ugly Research, is delighted to be cited in the decision-making chapter (page 196 in the hard copy, page 212 in the pdf download). As shown in the diagram, with PepperSlice we provide a way to present evidence to decision makers in the context of a specific 'action-outcome' prediction or particular decision step.

Devil's advocate point of view. Becoming 'data-driven' is context sensitive, no doubt. The author is Director of Data Science at Warby Parker, so unsurprisingly the emphasis is on technologies that enable data-gathering for consumer marketing. While the book does address several management and leadership issues, such as selling a data-driven idea internally, it primarily addresses the perspective of someone no more than two or three degrees of separation from the data; a senior executive working with an old-style C-Suite would likely need to take additional steps to fill the gaps.

The book isn't so much about how to make decisions, as about how to create an environment where decision makers are open to new ideas, and to testing those ideas with data-driven insights. Because without ideas and evidence, what's the point of a good decision process?


Tuesday, 08 December 2015

Biased hiring algorithms and Uber is not disruptive.

This week's 5 links on evidence-based decision making.

1. Unconscious bias → Biased algorithms → Less hiring diversity
On Science Friday (@SciFri), experts pointed out unintended consequences in algorithms for hiring. But even better was the discussion with the caller from Google, who wrote an algorithm predicting tech employee performance and seemed to be relying on unvalidated, self-reported variables. Talk about reinforcing unconscious bias. He seemed sadly unaware of the irony of the situation.

2. Business theory → Narrow definitions → Subtle distinctions
If Uber isn't disruptive, then what is? Clayton Christensen (@claychristensen) has chronicled important concepts about business innovation. But now his definition of ‘disruptive innovation’ tells us Uber isn't disruptive - something about entrants and incumbents, and there are charts. Do these distinctions matter? Plus, ever try to get a cab in SF circa 1999? Yet this new HBR article claims Uber didn't "primarily target nonconsumers — people who found the existing alternatives so expensive or inconvenient that they took public transit or drove themselves instead: Uber was launched in San Francisco (a well-served taxi market)".

3. Meta evidence → Research quality → Lower health cost
The fantastic Evidence Live conference posted a call for abstracts. Be sure to follow the @EvidenceLive happenings at Oxford University, June 2016. Speakers include luminaries in the movement for better meta research.

4. Mythbusting → Evidence-based HR → People performance
The UK group Science for Work is helping organizations gather evidence for HR mythbusting (@ScienceForWork).

5. Misunderstanding behavior → Misguided mandates → Food label fail
Aaron E. Carroll (@aaronecarroll), the Incidental Economist, explains on NYTimes Upshot why U.S. requirements for menu labeling don't change consumer behavior.

*** Tracy Altman will be speaking on writing about data at the HEOR and Market Access workshop March 17-18 in Philadelphia. ***

Tuesday, 20 October 2015

Evidence handbook for nonprofits, telling a value story, and Twitter makes you better.

This week's 5 links on evidence-based decision making.

1. Useful evidence → Nonprofit impact → Social good
For their upcoming handbook, the UK's Alliance for Useful Evidence (@A4UEvidence) is seeking "case studies of when, why, and how charities have used research evidence and what the impact was for them." Share your stories here.

2. Data story → Value story → Engaged audience
On Evidence Soup, Tracy Altman explains the importance of telling a value story, not a data story - and shares five steps to communicating a powerful message with data.

3. Sports analytics → Baseball preparedness → #Winning
Excellent performance Thursday night by baseball's big-data pitcher: Zach Greinke. (But there's also this: Cubs vs. Mets!)

4. Diverse network → More exposure → New ideas
"New research suggests that employees with a diverse Twitter network — one that exposes them to people and ideas they don’t already know — tend to generate better ideas." Parise et al. describe their analysis of social networks in the MIT Sloan Management magazine. (Thanks to @mluebbecke, who shared this with a reminder that 'correlation is not causation'. Amen.)

5. War on drugs → Less tax revenue → Cost to society
The Democratic debate was a reminder that the U.S. War on Drugs was a very unfortunate waste - and that many prison sentences for nonviolent drug crimes impose unacceptable costs on the convict and society. Consider this evidence from the Cato Institute (@CatoInstitute).

Monday, 17 January 2011

Power Balance (of hologram fame) has purchased the naming rights for an NBA arena. (Yes, really.)

Silly me. I figured when the folks selling the Power Balance bracelet admitted there's no evidence to support their claims, that would be the end. But lo and behold, evidently they have sold so many hologram thingies that they've purchased the naming rights to the Sacramento Kings' arena for the next five years. I kid you not.

A depressing turn of events for all things evidence-based. My critique of Power Balance is well documented, including the post How do those Power Balance bracelets work? I think it's because of the 20-Hz difference between a genius and an ascending colon. Gawker, too, was taken by surprise, writing that Company Admits It's a Scam, Promptly Buys NBA Stadium Naming Rights. (I highly recommend reading the comments on that story -- so many gems, though I think my fave is "I understand that the decision was very difficult between this and 'Nigerian 419 Arena'.") As Gawker explains:

Ad Age estimates that the deal is probably worth about a million bucks a year for the Kings, and also points out that the Kings suck and they probably couldn't get anything better. Yes: this professional basketball team could do no better than a company whose marketing copy actually reads, "Power Balance is based on the idea of optimizing the body's natural energy flow, similar to concepts behind many Eastern philosophies. The hologram in Power Balance is designed to resonate with and respond to the natural energy field of the body."

After the Australia kerfuffle, the company announced "In our advertising we stated that Power Balance wristbands improved your strength, balance and flexibility. We admit that there is no credible scientific evidence that supports our claims and therefore we engaged in misleading conduct." Subsequently, the company issued this statement:

"Power Balance stands by our products," said Keith Kato, President of Power Balance, LLC. "Millions of people around the globe are wearing Power Balance products and are thrilled with the results. Dozens of high profile professional athletes swear by the results they've experienced from wearing our products." CNBC recently named Power Balance as the "Sports Product of the Year for 2010."

...That said, there has been some negative press about our products coming out of Australia recently, followed by a class action law suit filed here in the US based on those misstatements, and we wanted to set the record straight.

Contrary to recent assertions in the Australian press, Power Balance has made no claims that our product does not perform. This is simply untrue. Apparently, some previous claims in our marketing ads in Australia were not up to ACCC standards. Changes were voluntarily made immediately, approved and the issues were believed to have been resolved. We were obviously surprised to see the recent press coming out of Australia followed by a class action law suit here in the United States.

"The mission of Power Balance has always been to develop and deliver quality products that enhance people's lives," said Kato. "...Frankly, we know there will always be critics of new technologies, but our products are used by those with open minds who experience real results. Our company is absolutely committed to further evaluating the technology behind its products' performance so that we can continue to offer products that enhance people's lifestyle."

Supporting evidence. I checked other media outlets, just in case Gawker had posted the naming rights story as a joke. Nope. It's on Bloomberg and Business Insider, among other sites.


Wednesday, 01 December 2010

Interview Wednesday: Ashley Welde, Director of Evidence-Based Communications at Burson-Marsteller.

Today I'm delighted to introduce Ashley Welde, Director of Evidence-Based Communications at Burson-Marsteller (New York, New York).

I asked Ashley where we can learn more about her, and she provided these links: "I blog and tweet under the Burson-Marsteller name. If you don’t mind receiving other company news, you can hear about the research I do @BMGlobalNews. Some studies I published this year which may be of interest include The PR Effect, which provides evidence for the effectiveness of PR programs; The Message Gap Analysis, which explores the 'gap' between a company’s messages and what ultimately appears in the media; and Fortune 100 Global Social Media Study, which uncovers evidence about how Fortune 100 companies are using social media."

The Five Questions.
#1. What got you interested in evidence?
I’m a natural-born psychologist who is always trying to understand why people think what they think, and why they do what they do. Thus, I’m compelled to uncover the root cause of people’s beliefs and actions, which instinctively leads to gathering evidence. I’m also interested in business, so I have an MBA and I’ve spent the last 15 years in marketing/PR, gathering evidence to understand what drives consumers’ brand perceptions, what influences their purchases, what drives corporate reputation, etc.
My father was also a psychology professor who did extensive experimenting on the chemistry of the human brain, so research about people and their motivations must run in my blood.

What types of evidence do you work with most often (medical, business research, statistics, social science, etc.)?
My job title, “Director of Evidence-Based Communications,” is kind of a fancy name for market researcher, which is what I actually do. The evidence I collect as a researcher is a combination of social science and business, and the evidence drives two specific goals: 1) to develop PR programs that will influence stakeholders’ perceptions and behaviors and 2) to measure the impact of the PR programs.

For example, maybe I have a client who wants to encourage citizens to recycle. For the first goal, I would collect evidence to understand why citizens are not recycling (i.e., maybe they don’t know which items are recyclable, maybe they don’t know where to bring their recyclables, maybe they’re just lazy). Then we use that evidence to develop a PR program targeted at the issues preventing citizens from recycling. For the second goal, I identify measurement techniques to collect evidence about whether or not citizens’ recycling behavior has been influenced by the program.

What is your involvement with evidence: applying it, advocating its use, researching/developing it, synthesizing/explaining/translating it, communicating it?
I do all of the above. In addition to applying evidence to client work as I described above, I also design thought leadership studies to promote Burson-Marsteller’s communications services. Within the company, I am responsible for educating my colleagues about what evidence is available and how to apply it to their client work. I also communicate and promote the Evidence-Based thought leadership we do throughout the PR community.

Where do you go looking for evidence, and what types of sources do you prefer? (formally published stuff such as journals, or something less formalized?)
The evidence for my industry is published in PR/marketing industry trades/websites such as PR Week, Advertising Age, the Center for Media Research and eMarketer. I always try to understand the methodologies used – and often email the researcher if I can to get a better sense of how the research was conducted – because the quality of research varies greatly. I also read some academic research about consumer behavior, which is often more methodologically sound but unfortunately cannot always be applied to real-life PR in a practical way.

#2. On a scale of 1 to 10, where 10= ‘It’s crystal clear.’ and 1=’We have no idea why things are happening.’, how would you describe the overall “state of the evidence” in your primary field?
Sorry to give a cop-out answer, but I would say “5.” (As a researcher, I always prefer 1s and 10s!) Traditionally, “evidence for success” in PR meant getting coverage for a client in a major newspaper, instead of whether or not you were really changing people’s perceptions and behaviors. Public relations is genuinely hard to measure, which I could explain in more depth except it would consume this entire interview. However, clients are starting to demand more evidence to prove that PR programs have an impact on the business’s bottom line, so people like me are developing tools to do just that. And, because so much of PR is becoming digital (including social media), it is getting easier and easier to gather evidence to demonstrate impact.

Which of these situations is most common in your field?
a) Much of the evidence we need doesn’t yet exist.
b) People don't know about the evidence that is available.
c) People don't understand the evidence well enough to apply it.
d) People don’t follow the evidence because it's not the expectation.

“C.” I just described above that PR has a problem with “A” because there is a lack of evidence to prove the impact of PR, but I believe that a lack of understanding about how to apply evidence is a bigger problem. PR professionals are extremely creative and excellent at coming up with big ideas, but they are not trained to apply and generate evidence. This is quickly changing though, and it is the driving force behind the Evidence-Based Communications program at Burson-Marsteller – to help encourage professionals in the PR industry to use evidence to be more effective.

#3. Imagine a world where people can get the evidence they need, and exchange it easily and transparently. What barriers do you believe are preventing that world from becoming a reality?
I think the barriers to an evidence-based world are that people do not always want to believe the evidence. For example, 18% of Americans still believe President Obama is a Muslim despite mountains of evidence and media to prove that he is not. People tend to support evidence that is already aligned with what they want to believe, regardless of whether it is scientific evidence, social science evidence, etc. Evidence will always be open to interpretation, and new evidence will make us realize that the “old evidence” was completely wrong. I believe these are the most limiting factors.

Where do you see technology making things better?
Technology makes things better because we can collect evidence more thoroughly and accurately, and I believe this is true for both social science and medical/natural science. However, technology and free survey tools such as SurveyMonkey and Zoomerang – which are fabulous in many ways, and I find them extremely useful for my work – also enable people who are not properly trained to write biased survey questions and to collect, analyze and publish data with erroneous conclusions. And, technology allows for these erroneous conclusions to spread even more rapidly through digital/social media, which I find concerning. Even as a trained professional, sometimes I find it hard to determine which data I read are valid and which are not.

#4. How do you prefer to share evidence with people, and explain it to them? Do you have a systematic way of doing it, or is there a format that you follow?
I like to develop a dialogue with people so we can have a conversation about the evidence I’ve collected. When I talk with clients, colleagues, or media about evidence, they all begin the conversation with different assumptions about what the data means, and unless I get them talking about how they interpret the data, I won’t be able to correct their misinterpretations. So I always begin by presenting the data, but then I take an organic approach to learn the best way to get my message across to each unique audience.

What mistakes do you see people making when they explain evidence?
Researchers often present evidence, but they don’t offer ways to apply the evidence in an actionable way. They don’t tie all the data points together to create a story. The evidence has to be woven into a meaningful story with implications in order for someone else to make use of it.

#5. What do you want your legacy to be?
I’d like to be remembered for helping colleagues and clients recognize that evidence drives creativity. Many of my colleagues worry that data and evidence kill the creative PR process, but I want them to see that evidence inspires it.

Amen to that. Thanks for sharing your insights, Ashley.


Chime in. Would you like to be interviewed, or do you have someone to recommend? Drop me a note at tracy AT evidencesoup DOT com.

Wednesday, 28 July 2010

More evidence about Old Spice: I stand corrected.

[Update: Marketing VOX offered an explanation of conflicting Old Spice sales figures. See the end of this post.]

Yesterday I wrote about growth in Old Spice body wash sales. My information came from Eliot Van Buskirk, writing on Wired's Epicenter. However, in my haste to post a quick writeup about it, I oversimplified the evidence.

It's the coupons, too. I'd intended to also mention the Old Spice coupon program, but neglected to do so. I received a note from Ashley Welde, who is the Director of Evidence-Based Communications at Burson-Marsteller in New York City, a firm I wrote about last year (thanks, Ashley, for keeping me on the right path -- and for doing it so gently). She wrote: "Regarding sales of Old Spice, there is controversy over whether the sales are coming from the great ads/You Tube videos or from couponing." Both Ashley and Eliot Van Buskirk referred to an Ad Age article and SymphonyIRI sales data.

Elusive evidence. Ad Age quoted P&G spokesman Mike Norton: "How much of Old Spice's recent gains... come from Mr. Mustafa's ads and how much from the coupons? It's impossible to know." Should Old Spice be measuring awareness, coupon redemption, market share, sales growth, or something else? I would certainly hope it's some of each. One would think people are more likely to clip and use a coupon if they've got more product awareness (in this case, thanks to really funny, memorable commercials and videos). But measuring all this in the real world is problematic.

Did you want evidence of top-line performance, or bottom line? Ashley Welde from Burson-Marsteller added: "I think it would be very interesting if it turns out that coupons are driving the sales, because couponing overall is becoming more and more effective, even though it’s such an old school marketing tactic. We’ll see what the evidence shows in the long run!" Though she pointed out that coupons can dampen financial results, quoting Ad Age: "Nor is it clear how much Old Spice's 106% gain will disappear from P&G's top line when coupon redemptions, which don't figure into scanner data but do come off the company's top line when financial results are reported next month, figure in."

[Image: Body wash sales figures, SymphonyIRI]

Hard evidence? Today's graphic isn't as fun as yesterday's, but it tells an interesting story, too. Obviously, how you measure makes a big difference. In this case, they said: "During the four weeks ending June 13, market research firm SymphonyIRI... found that sales of the product were 106 percent higher than during the same period last year." Ad Age, citing evidence from SymphonyIRI, said: "other men's brands also have been making substantial share gains of late, including P&G sibling Gillette and Beiersdorf's Nivea. And the thing Old Spice, Gillette and Nivea have in common isn't Mr. Mustafa, but rather multiple national drops of high-value coupons.... reflecting unprecedented levels of promotional intensity in the category."
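One reading note on that figure: "106 percent higher" means sales were a bit more than double the prior-year baseline, not 1.06 times it. A tiny sketch with hypothetical numbers:

```python
# "106 percent higher" means just over double the baseline (2.06x),
# not 1.06x - a point worth spelling out when reporting growth.
def yoy_change_pct(current, prior):
    return (current - prior) * 100 / prior

# Hypothetical four-week unit sales: 103k this year vs. 50k last year.
growth = yoy_change_pct(103, 50)  # 106.0, i.e. sales more than doubled
```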

And still more evidence. Eduardo also commented on yesterday's Old Spice post, saying "Hmmm. Yahoo says the sales have *not* increased, and have in fact decreased: 'The problem: Sales are down a surprising seven percent. This situation is far from unique - mass attention does not always equal money in the bank....'"

Sigh. It's okay, Old Spice guy, we'll always have YouTube.

Update: From Marketing VOX (today): "New figures quantify the viral success of the Old Spice viral video campaign using the one metric that ultimately matters to marketers: sales.... Earlier numbers suggested the opposite: sales of "Red Zone After Hours Body Wash" - the specific Old Spice product the top spot promoted - fell by 7% in 2010, according to WARC figures. The first ad debuted at the Super Bowl in February. But those numbers were calculated before the campaign's latest push - its two day, 186-plus viral video extravaganza, in which the actor portraying the Old Spice Man, Isaiah Mustafa, took questions and responded to posts on Twitter, Facebook, Digg and Reddit. They were produced at roughly a rate of 7 minutes per video."

Tuesday, 27 July 2010

Look at your man's evidence-based marketing. Now look at mine. Old Spice campaign boosts sales substantially.

[Update 28-July-2010: There's more evidence about Old Spice. I wrote a second post here.]

I often wonder if popular marketing campaigns actually pay off (such as the chihuahua who had us all saying ¡Yo quiero Taco Bell!).

Does your man's evidence look like mine? According to Eliot Van Buskirk, writing on Wired's Epicenter, this year's Old Spice campaign on YouTube (and network TV) grew sales substantially. He says "Sales of Old Spice body wash more than doubled earlier this summer, coinciding with the rise in popularity of its social-media-friendly online ad campaign in which be-toweled former NFL wide receiver Isaiah Mustafa answered specific viewers’ questions...."

Buskirk believes irreverence really paid off. The tag line for the first video was 'We’re not saying this body wash will make your man smell into a romantic millionaire jet fighter pilot, but we are insinuating it.' Good stuff.

Tuesday, 18 May 2010

MarketResearch.com is a resource for evidence-based management & marketing.

Looking for evidence to support your product development, strategic planning, and marketing efforts? Traditional market research and analyst reports are a quick way to build the foundation of your evidence base. MarketResearch.com offers a substantial collection of reports for a wide variety of industries - from familiar names (such as IDC) and smaller firms you perhaps haven't seen before.

The site has a straightforward organization and some handy features. For instance, here's a search I did on the electronic medical records (EMR) market. You can share these results on Delicious, get an RSS feed from the site, or sign up for email notifications.


Search inside. You can search inside reports before purchasing them (similar to the search feature on Amazon). Here I searched on the keyword "meaningful," hoping to find research on the new U.S. meaningful use requirements for EMR incentive payments. The search results tell precisely how many times that phrase appears in the document, and on which pages. (They cleverly replace important numbers, dollar figures, and other findings with '##' so as not to inadvertently give away this extensive research for free. Good idea - it protects the publisher while providing more details to the potential customer.)
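The masking idea is simple to emulate. Here's a hypothetical sketch (the regular expressions and the sample sentence are illustrative, not MarketResearch.com's actual rules):

```python
import re

def mask_numbers(text):
    """Replace dollar figures and other numbers with '##' so a preview
    shows where a search term appears without revealing the findings."""
    text = re.sub(r'\$\d+(?:[.,]\d+)*', '$##', text)  # dollar amounts first
    return re.sub(r'\d+(?:[.,]\d+)*%?', '##', text)   # remaining numbers

snippet = "The EMR market grew 12.5% to $4.2 billion in 2009."
masked = mask_numbers(snippet)
# "The EMR market grew ## to $## billion in ##."
```

The reader still sees the context around the keyword - which market, which trend - while every figure a customer would pay for stays hidden.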

[Image: MarketResearch.com search features]

The results highlight the search phrase so you can see the context where it's discussed in each report. A nice emulation of what Amazon does (example here).

[Image: Search inside before you buy]