The Conversation

Turning Social Media into Consumer Insights.

January 24, 2017

What Social Listening Tools Don’t Tell You (That Conversation Research Does)

If there is one core reason the Conversation Research Institute exists, it is that social listening tools only collect and represent data. They don’t analyze it. Try as you might, you will never get an algorithm to produce 100% reliable insights from counting things. It’s the human processing of the counted things that makes the data useful.

Case in point: the topic analysis feature of social listening tools. What this feature does in most software designed to “listen” to online conversations is count keywords. Topic analyses are often presented in word clouds. We prefer to look at them as pie charts so there’s a more natural understanding of the volume of a particular topic in relation to the whole.
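To make that counting step concrete, here is a minimal sketch of what a topic-analysis tally boils down to (the posts and keyword list are invented for illustration; real platforms work from far larger data sets and their own keyword extraction):

    from collections import Counter

    # Hypothetical raw posts returned by a listening query (illustration only).
    posts = [
        "the chicken was dry and the chips were stale",
        "worst chicken sandwich I've had in a while",
        "they forgot my chips again",
    ]

    # Keywords a "topic analysis" feature would tally.
    keywords = ["chicken", "chip", "sandwich"]

    counts = Counter()
    for post in posts:
        for kw in keywords:
            if kw in post.lower():
                counts[kw] += 1

    # Express each topic as a share of the whole -- the pie-chart view
    # we prefer over a word cloud.
    total = sum(counts.values())
    for kw, n in counts.most_common():
        print(f"{kw}: {n} mentions ({n / total:.0%} of tallied topics)")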

Here’s an analysis of specific “Dislikes” around Kentucky Fried Chicken I conducted in 2012. This is very much like the topics chart a social listening platform would produce. You can see that 30% of the negative conversations mention “chicken,” 8% mention “chip,” and so on. (Note: Because this was produced from an automated topic analysis, the keywords it counted and collected are raw, exactly as they appeared in the online conversation at that point in time.)

Topic Analysis Example - KFC

But looking at this, you only know that these keywords were present or popular in those conversations. You don’t know the answer to the critical, insight-producing question: “Why?”

When you perform conversation research, even if you do it using automated tools, you dig a layer or two down to uncover those answers. So here’s a look at Olive Garden’s negative themes from that same research in 2012. We broke out the negative theme of “Taste” to show that the qualifiers, the words leading to the answer of “Why?”, include Nasty, Disgusting and Like Shit. There’s also Bland, Gross, Like Microwaved Food and Weird.

Topic Analysis - Olive Garden

So we can now apply more specific context around why people didn’t like the taste of Olive Garden. By drilling down yet another level and analyzing the mentions of “Nasty” or “Disgusting” to see whether specific menu items, specific reasons or perhaps even specific locations drove those qualifiers, we may uncover insights that inform Olive Garden’s business.
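Here is a hedged sketch of that drill-down step, assuming the negative “Taste” mentions have already been isolated (the qualifier and menu-item lists below are hypothetical, chosen only to show the mechanics):

    from collections import Counter

    # Negative "taste" mentions already pulled from the broader data set
    # (hypothetical examples).
    taste_mentions = [
        "the breadsticks were nasty and the soup tasted like microwaved food",
        "disgusting alfredo at the downtown location",
        "nasty salad, never going back",
    ]

    qualifiers = ["nasty", "disgusting"]
    menu_items = ["breadsticks", "soup", "alfredo", "salad"]

    # For mentions carrying a strong qualifier, tally which menu items co-occur.
    co_occurrence = Counter()
    for mention in taste_mentions:
        text = mention.lower()
        if any(q in text for q in qualifiers):
            for item in menu_items:
                if item in text:
                    co_occurrence[item] += 1

    print(co_occurrence.most_common())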

The point here is certainly not to pick on KFC or Olive Garden. These charts were produced in 2012 using automatic theme analysis. Chances are, the results today would be very different. But the limits of automatic theme analysis are the key point to consider. At the Conversation Research Institute, we insist on human analysis to break down the “Why” and offer more concrete insights to help your brand.

Of course, a few researchers can’t possibly analyze hundreds of thousands of conversations manually, so for larger conversation sets our process has two steps. We first isolate posts we consider to be the voice of the consumer; that definition changes slightly depending on the client and project at hand. Once we have filtered out posts that do not fit that definition, we then sample, if necessary, at rates much higher than traditional research sampling standards.
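A rough sketch of that two-step flow follows; the voice-of-consumer test and the 20% sampling rate are placeholders, since the real definition and rate vary by client and project:

    import random

    def is_voice_of_consumer(post: dict) -> bool:
        """Placeholder filter; the real definition changes per client and project."""
        return post.get("author_type") == "consumer" and not post.get("is_news", False)

    def sample_for_analysis(posts: list, rate: float = 0.20, seed: int = 42) -> list:
        """Step 1: keep only voice-of-consumer posts.
        Step 2: sample them at a rate well above traditional survey standards."""
        voc = [p for p in posts if is_voice_of_consumer(p)]
        if not voc:
            return []
        random.seed(seed)
        k = max(1, round(len(voc) * rate))
        return random.sample(voc, k)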

The bottom line is this: If you are relying on machines to spit out insights, you are being tricked into thinking charts and graphs are insights. There’s a big difference between counting (to get the what) and analyzing (to get the why).

Let us help uncover more about the voice of your consumer. Drop us a line today for an introductory discussion.

January 17, 2017

Sneak Preview: Senior Care Industry Report Shows Conversations Happen In Known Communities

Our first industry report is due out any day now. The Conversation Report: Independent Living to Nursing Homes: Understanding the Buyer Journey for Senior Care looks at online conversations over the course of a calendar year in which people discuss senior care facilities and services with some level of intent to buy. We’ve researched, indexed and analyzed over 19,000 conversations, surfaced almost 1,200 that are true voices of the consumer and have a laundry list of insights to share with those buying the report.

To ensure you get the first chance to download the executive summary and purchase the full report, be sure to join our list. The report is due out any day now.

Our exploration surfaced many insights about senior care shoppers we didn’t expect to find, as well as some we did. While I personally had not explored the conversation set in the senior care industry much before the endeavor, my experience with conversation research as a whole tells me that consumers have conversations in exactly the types of communities that social media marketing often ignores: forums and message boards. And for senior care, that is accurate.

So while we can all agree that Facebook, Twitter, LinkedIn and other social networks are the sexy, consumer-driven platforms that come to mind first for social media, as brands we should understand that consumers often turn away from them toward known, more intimate communities for recommendations, referrals and support during buying decisions. In my experience, the more personal and private the decision, the more this hypothesis proves true.

Forums and message boards make up more than 80% of the online conversation about the senior care space. Consumers there turn to trusted communities built around the topic at hand (AgingCare.com was popular), but they also turn to known communities — ones they already belong to for other reasons (WeightWatchers.com ranked high as well).

For brands, this means that to truly engage potential customers, you have to be more aware of social media than most seem to be. Facebook and Twitter alone won’t cut it. Minding your own social profiles doesn’t scratch the surface of where your audience is engaging around the topics most likely to lead to new business for your brand. It also means that investing in true community managers, ones who go beyond minding the social profiles and assimilate into existing communities as formal or informal representatives of the company, could be a smart play.

While charts like this have existed for years and the knowledge that forums and message boards play a big part in any brand’s online conversations is not new news, it is shocking how poorly brands have adapted to it. We found no instance of a brand representative responding to these forum posts.

Don’t miss more insights in the upcoming Conversation Report: Independent Living to Nursing Homes: Understanding the Buyer Journey for Senior Care. Subscribe to our updates using the form on our home page.

 

January 10, 2017

Identifying the Buyer Journey through Conversation Research

The first Conversation Report is due out any day now. Our dive into understanding the buyer journey for the senior care space, which includes nursing homes, assisted living, independent living and more, is coming in at around 15,000 words, with over 75 charts and graphs and dozens of insights we’ve synthesized from the data to help senior care brands understand their customers better.

In conducting the research, we had a peculiar challenge. Our goal was to find not only the posts produced by true customers, but those from people actively considering senior care options for themselves or a loved one. How do you isolate not just the consumer, but one who is actively looking, without first knowing who they are or where they are — both answers you get from the research?

It seems a Holmesian Catch-22.

But it’s not.

All of our broad-level research begins by trying to understand the consumer’s conversation habits first. We seek out individuals who have been through, or are going through, the buying process and interview them. But we don’t necessarily ask all the questions we hope to answer with the research. Instead, we focus on how they go about discovering information about the product or service at hand. We uncover how they talk and think about the topic in terms of lexicon and verbiage. We try to get at what they might say in an online conversation should they turn to social media and online communities to ask questions about the topic at hand.

By canvassing a small focus group on how people talk about buying or shopping for the product in question, we can then produce more accurate search variables to uncover similar conversations on the web. Consider it our social media version of Google’s Keyword Tool. While search terms also contribute to our pool of knowledge and understanding about the audience, people may search for “senior care,” but they don’t use that term in a sentence when chatting online about their search for a solution.
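As a simplified illustration of turning that interview lexicon into search variables, here is a sketch of assembling a Boolean query (the phrases are invented, and every listening platform has its own query syntax, so treat this as a shape rather than a recipe):

    # Phrases consumers actually used in interviews (hypothetical examples),
    # alongside markers that signal active shopping intent.
    consumer_phrases = [
        "assisted living", "nursing home", "memory care",
        "mom can't live alone", "place for my dad",
    ]
    intent_markers = ["looking for", "anyone recommend", "how do I choose"]

    def boolean_query(phrases, markers):
        """Assemble a simple (phrase) AND (intent) Boolean string for a listening tool."""
        phrase_block = " OR ".join(f'"{p}"' for p in phrases)
        marker_block = " OR ".join(f'"{m}"' for m in markers)
        return f"({phrase_block}) AND ({marker_block})"

    print(boolean_query(consumer_phrases, intent_markers))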

As you approach conversation research, you should consider that your assumptions about your audience and how they discuss certain topics in online conversations are biases. You need to vet them properly to get to a more accurate read on what is being said. One misstep in the search variable construction and you could eliminate thousands of relevant conversations. Or, perhaps worse, you could create thousands more to weed through that aren’t relevant at all.

This reinforces something you’ll hear us say over and over at CRI: Social listening software isn’t enough. You have to add the human element to your data gathering mechanism to make sense of all this noise.

How do you go about constructing your searches? We’d love to hear your thoughts and processes in the comments.

January 5, 2017

How Audience Index Can Produce Insights in Conversation Research

It’s one thing to know what percentage of a given audience is male or female, or falls into different ages, ethnicities and so on. It’s another to understand how that audience compares to the norm. Indexing a given set of results against a generally understood or accepted point of reference not only frames the context of that audience characteristic, but can help you elevate important insights in conversation research.

Some social listening platforms offer audience indexing in their demographic and psychographic data. This seldom-used and often misunderstood statistic is one we constantly refer to at CRI, since it can lead to a more intimate understanding of the overall makeup of a given audience.

To better understand indexing, take a look at this chart on a given audience’s ethnicity. Its primary function is to show the percentage of the audience broken down by ethnicity.

But we’ve also displayed the index compared to the general demographic profile of a commonly used site (in this case, Twitter). We know from multiple sources (Pew, Northeastern University, etc.) that, in general, Twitter’s audience parallels the U.S. population in terms of ethnicity. Even with some variations considered, at a minimum we are comparing our audience to an audience of active social media users.

Indexing audiences in Conversation Research

As you can see in this audience, Caucasians index at 1.14. That means this audience is 14% more likely to be Caucasian than the base audience of Twitter users. So it skews white. It also includes slightly more African-Americans, 19% fewer Asians, and slightly fewer American Indian or Native Islander and “other” members.

But look at the Hispanic index. An index of 0.28 means this audience is about 72 percent less likely to feature Hispanics than the base audience of Twitter users.
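The index itself is simple division: the segment’s share of your audience divided by its share of the base audience. Here is a quick sketch reproducing the two figures above; the underlying share percentages are placeholders chosen only to illustrate the math, not Twitter’s actual breakdown:

    def audience_index(segment_share: float, base_share: float) -> float:
        """Index = segment's share of your audience / its share of the base audience.
        1.0 is parity; 1.14 is 14% over-indexed; 0.28 is roughly 72% under-indexed."""
        return segment_share / base_share

    # Placeholder shares chosen only to reproduce the indexes cited above.
    print(round(audience_index(0.74, 0.65), 2))    # ~1.14 -> skews Caucasian
    print(round(audience_index(0.045, 0.16), 2))   # ~0.28 -> Hispanics under-indexed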

What does this tell us? It could tell us a few things:

  • Hispanics aren’t talking about this topic (if you’re doing conversation research) or buying this product (if you’re analyzing sales data)
  • The industry or brand in question does not appeal to Hispanics
  • The industry or brand in question ignores Hispanics

The definitive answer would require more detailed research, but seeing the huge disparity in the indexes gives us reason to investigate and perhaps an opportunity to fuel decisions to improve the business.

And keep in mind that demographics aren’t the only thing that can be compared in index form to Twitter or other data sets. You simply need known and common data points. In CRI’s research, we frequently surface indexing for age, gender, ethnicity and geography, but also social interests, professions, bio terms and more.

Indexing is a powerful statistical feature to understand as a researcher or a marketer. Understanding it could be the key to unlocking equally powerful insights for your business.

For help with understanding your audience and how they index compared to known audiences, drop us a line. We’d love to help.

December 22, 2016

How Conversation Research Supports Traditional Advertising

An advertising agency friend recently challenged me that conversation research isn’t relevant to traditional advertising. “We focus on print, radio and TV, so that online stuff isn’t a primary concern.” Yes, I laughed and performed a hearty shaking of my head.

“So when consumers see those advertisements, what do you think they do next?” I replied.

“They either buy or they don’t.”

More furious head shaking.

“No, they go online to research. They talk to friends to see if someone knows more about that brand or has experience with them. They look for validation. In fact, I would argue that the online conversation is more important to purchase consideration than your ad in the first place (though they go hand-in-hand and one isn’t likely without the other).”

So he asked me to prove conversation research would support traditional advertising. Even in a three-month-old company, I had a case study.

We were approached recently by a high-end home and lifestyle brand that had some suspicions about its advertising campaign. They didn’t think their messaging around quality and style was really resonating with consumers. Their campaign was developed on assumptions, not assertions, and they felt like they’d guessed wrong.

So we analyzed what consumers were saying about their brand — when they turned to the social web to find out more about it — and discovered the brand’s suspicions were correct. The buying decision topics that emerged were almost completely focused on price. There were no (as in zero) conversations discussing quality and style.

Now, the presence of topics (or lack thereof) doesn’t an insight make. Deeper conversation research could yield more understanding of why. Were style and quality assumptions? Did those decision points even matter? Were they simply too high priced for consumers to focus on anything else?

All of those questions made for a great follow-up research project.

The point to my friend was that conversation research supports traditional advertising in many ways. Some include:

  • Validating assumptions made without adequate consumer research
  • Confirming consumer talking points about the product to focus one’s messaging
  • Discovering tangential topics or qualities resonating with consumers that the brand isn’t aware of
  • Uncovering audience segments that fall outside the brand’s designated target, enabling better targeting

And there’s more.

So how’s your advertising doing? Are you happy with the results? Does your messaging resonate with your audience? How do you know?

If you don’t, we can help. Drop us a line and we can chat about how.

December 13, 2016

Testing Underlines the Importance of Facebook Topic Data

Facebook Topic Data is perhaps the most important, yet underestimated sea of consumer data in existence. One notable C-level executive at a social listening software company told me recently that they weren’t focusing on Facebook Topic Data much because their clients didn’t show much interest in it.

I, for one, hope that changes quickly. And the reason is simple math.

In several separate tests over the last six months, we at Conversation Research Institute have entered standard brand and topic queries into popular social listening platforms. Our testing included NetBase, Nuvi, Brandwatch, Sysomos and Mention. In each of our tests, we looked at how many conversations over a given time period surfaced, then we input the exact same search threads into a Facebook Topic Data search to compare.

Almost to a decimal place, we could predict how many conversations would surface on Facebook based on the number of conversations we found on the open web. Would you be surprised to hear that Facebook yields nearly 1.5 times as many?

That’s right. According to our testing across a dozen or so different terms, Facebook accounts for around 60% of the online conversation. In some cases, it’s even higher.

A recent search conducted for a client in the pest control industry turned up 58,000 interactions on Facebook between Nov. 22 and Dec. 12. When you use Facebook Topic Data, the validity of the posts you receive is managed by DataSift — Facebook’s exclusive data provider. What that means is they handle the disambiguation to ensure the posts you get are the posts you want, rather than ones about irrelevant topics.

Out of those 58,000 interactions, we estimate that only about 10% are irrelevant — those not caught by DataSift’s processing. (In all fairness, two articles shared in the period referred to politicians, lawyers and even a religious group as “termites,” which is difficult to eliminate without manual analysis.)

However, in that same period of time, searching the open web for the exact same Boolean thread, we found 8,690 mentions. Some 4,560 of them were on Twitter, with Reddit (352) coming in a distant second. And those numbers do not factor in disambiguation (meaning the open web search would net far fewer relevant results).
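The share arithmetic behind those comparisons is simple; here is a minimal sketch using the pest control counts above (assuming the open web total captures everything outside Facebook):

    facebook_mentions = 58_000   # Facebook Topic Data interactions, Nov. 22 - Dec. 12
    open_web_mentions = 8_690    # the exact same Boolean thread on the open web

    total = facebook_mentions + open_web_mentions
    print(f"Facebook share of this conversation: {facebook_mentions / total:.0%}")  # ~87%

    # The broader pattern from our tests: if Facebook runs ~1.5x the open web,
    # it carries 1.5 of every 2.5 conversations, or 60% of the total.
    print(f"General case: {1.5 / 2.5:.0%}")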

Twitter is the darling of the social analytics industry because it’s free and open to analyze. Facebook is a walled garden that protects its users’ posts from one-by-one cataloging and analysis by the social media software platforms of the world. While the challenge exists that Facebook Topic Data does not provide post-level data, meaning you can’t index and analyze every single post individually, the sheer volume of conversation makes it important.

But make no mistake about it: Facebook is where most online conversations happen. And Facebook Topic Data is going to be essential research fodder for anyone interested in understanding their customers.

Need help finding and analyzing Facebook Topic Data for your company? Drop us a line. We would love to help you understand the conversation.

November 21, 2016

An Example of Why Social Listening Needs Conversation Analysis

A key value proposition for the Conversation Research Institute is that we apply conversation analysis to social listening data to find the insights that can help drive your business decisions. But that’s not just a fancy set of words; there’s real reasoning behind it.

First, know that what we mean by offering analysis is that social listening tools themselves aren’t enough. Pretty charts and graphs and word clouds don’t do your business any good if you can’t explain what they mean, how the data was discovered and what insights surfaced that can help you.

Conversation analysis

No social listening software does that for you. You have to have conversation analysis – from a human being – to understand the data and surface that information manually.

Case in point: while working on a research project for an upcoming Conversation Report, we found this entry in a sea of data about the elderly care space:

“The social worker at the nursing home ~ when mom first went there ~ had to go to bat for mom and went to court to get a guardian (not my brother) for mom.”

The software in question gave this entry a neutral sentiment and picked out no sub-topics or themes for it. The software surfaced “social worker,” “nursing home” and “guardian” as word cloud entries but, again, did not attach any sentiment or context to them.

Because we are manually scoring and analyzing this data, and our perspective is to look at the voice of the consumer as it relates to the elderly care providers (nursing homes, assisted living facilities, independent living communities and other long-term care providers), we add lots more context to the analysis:

  • The sentiment is negative toward the nursing home because the patient needed an advocate
  • The sentiment is positive toward the social worker who served as advocate
  • The source is a family member
  • The theme is patient advocacy
  • A sub-theme is non-family guardianship

And that’s before we went to the original post (which has other excerpts appearing in the research) to clarify even more:

  • The brother in question filed for guardianship after lying for years about having the mother’s power of attorney
  • The social worker was advocating for the patient, but also the rest of the family
  • The author (a daughter of the patient) was considering hiring a lawyer to fight the brother’s claim for guardianship.

So family in-fighting over the burden of care, cost and decision was another important theme.
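For readers curious how that human scoring gets captured, here is a hedged sketch of an annotation record for the post above; the field names are our own illustration, not any platform’s schema:

    from dataclasses import dataclass, field

    @dataclass
    class ConversationAnnotation:
        """One manually scored post, mirroring the context a human analyst adds."""
        excerpt: str
        source: str          # who is speaking, e.g. "family member"
        sentiment: dict      # sentiment can differ per entity within one post
        theme: str
        sub_themes: list = field(default_factory=list)

    annotation = ConversationAnnotation(
        excerpt="The social worker at the nursing home ... went to court to get a guardian ...",
        source="family member",
        sentiment={"nursing home": "negative", "social worker": "positive"},
        theme="patient advocacy",
        sub_themes=["non-family guardianship", "family in-fighting over care"],
    )
    print(annotation.theme, annotation.sentiment)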

When you let a computer spit out analysis of tens of thousands, or even millions, of conversations, you get roughly one-tenth of the context and actual insight possible from truly understanding what is being said. Certainly, at that scale there’s no way to be as thorough.

But relying on automatic charts and graphs is keeping you away from the one thing you’re looking for: True consumer insight.

That’s what we surface. If you’re interested in finding it for your brand, let us know.

 

November 16, 2016

Understanding your audience with conversation research

There’s a spirits brand we’re familiar with at the Conversation Research Institute, not because they’re a client, but because they’re a favorite for us when we break for a drink at the end of the week. Their marketing is not unlike that of other spirits brands in their category. It’s focused on tradition, heritage and quality. It’s aimed at men of a particular status in life.

Honestly, you could take one of about two dozen brands in this category and put them in the same advertisements or even social media posts and, generally, the communications would work.

But we did some snooping around the conversation about the brand and found something interesting. The professions of the people who talk about the brand don’t exactly align with who the brand thinks they’re talking to.

Over the course of two months, almost one-fourth of the authors talking about the brand online listed themselves as artists. While more research certainly needs to be done to determine what type, what gender, how serious and the like, if you are targeting your messaging at male executives, does this data not give you pause?

Yes, 15 percent of the authors talking about the brand fall under the executive label. But when the labels of “artist,” “teacher” and even “journalist” add up to almost half of the online conversations about your brand, don’t you think segmenting and targeting those audiences could deliver more, bigger or better results?

Conversation research isn’t just about finding sentiment and tone. It’s about uncovering insights about your brand that help you make critical marketing and business decisions. This particular spirits brand is missing out on huge content marketing, or even paid targeting, potential if it isn’t paying attention to the data that conversation research can unearth.

More can be had for your brand. Let us know if we can help.

November 1, 2016

Can Conversation Research tell you why sales are down?

 

A large national retailer in the food and beverage industry was riding high last year. Sales were up, the brand was healthy, consumers were immersed in the experience. Years of hard work had put the brand on the top of the heap in their category.

But then they noticed that sales of certain beverages had started flatlining. They couldn’t quite figure out why. Nothing in their formulas had changed. Customers weren’t indicating why they were switching drinks or passing on them when they ordered. What was the brand to do?

They turned to online conversations and posed the question, “Are sales for these drinks flatlining because of a consumer shift or something else?” Consumers would likely tip their hand if it was the former. If the research was inconclusive, the cause likely wasn’t a consumer need, but something else.

The conversation research for the brand turned up an insight that explained it. The brand’s customers were becoming increasingly concerned about the sugar content of the drinks in question. They were interested in healthier options.

So the brand formulated a new line of fruit-based, all-natural drinks just in time for spring.

The sugary drink sales stayed flat while the new line took off, exceeding expectations and satisfying customers.

So yes. Conversation research can tell you why sales are down. It may also tell you how to make them go the other direction.

Call us to see how conversation research can help your brand.

October 24, 2016

A peek inside Conversation Research around the travel industry

Understanding how conversation research data can help your business is certainly your first step in knowing what to ask for, whom to ask it from and how you might approach discovering insights for your brand. There’s high-level data that points you in a general direction, then granular research that can point to specific insights that help you make decisions.

I recently had the honor of sharing information about conversation research with the audience at TBEX, the world’s premier travel writing and blogging conference, in Manila, Philippines. In preparation for that talk, I recorded a little video to share some of the differences between high-level and specific insights with you. I also talk a bit about a specific example of a high-level insight that led to answers at a granular level.

So, what questions do you have about your business or industry that the consumer conversation may answer? I’d be happy to tell you as a response how conversation research may be able to help. Go ahead — the comments are yours!
