About Hilary DeCamp
Hilary oversees LRW’s quantitative research methods, strategies and tactics. She especially loves to collaborate with LRW clients and staff to solve the most challenging business problems in a world of rapidly changing marketing tools and research methods. Hilary enjoys figuring out what matters to people, both in making their product choices and in determining their customer satisfaction. Prior to joining LRW, Hilary served in research roles at M/A/R/C and the Los Angeles Times. She guest lectures at USC, holds an M.M.R. from the University of Georgia and a B.S. in quantitative psychology from UCLA.
Follow Worthix on LinkedIn
Follow Worthix on Twitter: @worthix
Follow Mary Drumond on LinkedIn
Follow Mary Drumond on Twitter: @drumondmary
Tune in to the Voices of CX Podcast to hear conversations with top leaders in CX, marketing, data analytics and beyond.
Introduction (00:04): Hilary DeCamp lives and breathes research. She’s charged with overseeing all of LRW’s research methods, strategies, and tactics. She especially loves to collaborate with LRW clients and staff to solve the most challenging business problems in a world of rapidly changing marketing tools and methods. She has spent over half her life designing primary research solutions to meet the custom needs of clients. She holds a Master’s degree from the University of Georgia and a bachelor’s in quantitative psychology from UCLA.
Mary Drumond (01:16): Welcome back to one more episode of Voices of Customer Experience. We are on season five and today I am joined by Hilary DeCamp, who is chief research officer at LRW. Hi Hilary! How are you?
Hilary DeCamp (01:30): I’m great, thank you.
MD (01:32): Thanks for coming on today. I’m really excited to hear your expertise and experience on market research and surveys, which is kind of the underlying theme of this season.
HD (01:41): Okay. Happy to be here.
MD (01:43): So for our listeners’ benefit, let’s start off by you just giving a quick intro on yourself, talking about what you do, how you came to do what you do, and what really makes you feel passionate about this business.
HD (01:56): Oh, certainly. I’ve been doing marketing research for 30 years now. I started out studying quantitative psychology at UCLA when I had a summer job where I was a market research interviewer in shopping malls. You know the people with the clipboard that say, ‘do you want to do a survey?’ I thought it was just a fascinating industry. I changed my major to study how people think and how to research how people think. And then I proceeded to get a Master’s in marketing research, specifically, from the University of Georgia. They were the first program in the country at the time. There are very few opportunities to get professional training in market research in America, and they were the first groundbreaking program to do that. I then proceeded to work for the M/A/R/C Group for six years in Dallas. They were a full service market research firm, very similar to what LRW is today. Basically we are a premium priced company that you call for the hard stuff. We don’t typically get the easy assignments. People come to us to solve the problems that other people are struggling with. We really pride ourselves on our research rigor, our creative problem solving and our ability to design research that delivers understanding, and I find that fascinating, which is why I’ve continued to do it for 30 years and have really enjoyed it very, very much.
MD (03:11): That’s awesome. You did the MMR program at UGA?
HD (03:15): Yes.
MD (03:15): So I sit on the board, Worthix is a board member and it really is a fascinating school. We’ve had professor Marcus Cunha who’s the program director on this podcast as well, talking about the program. We’re huge fans and that’s where I saw you speak, especially focusing on market segmentation, which is something that you guys specialize in over at LRW, right?
HD (03:36): Yes, that’s right. Market segmentation and strategy work is a large component of what we do. We also do a lot of tracking and a lot of choice modeling and things of that nature. But segmentation is my personal passion and it’s a practice that I’ve built here over the 20 years that I’ve been at LRW.
MD (03:52): And what is it that makes it so interesting for you? Why are you so passionate about it?
HD (03:57): Well, I think that first of all, it’s foundational strategic research. So it’s going to impact how your clients do business for years to come. So it’s hugely important. It’s sometimes hard to quantify the ROI because you may not make a specific tactical decision where you can, you know, demonstrate what benefit you got from the work. But it really is the common language for everybody at the different stakeholder groups of a company to work together and think about their market in a particular way. It’s really the only way to be customer centric, is to understand the different kinds of consumers you have in your marketplace. There is no average consumer, there are different types of consumers. While a lot of people are starting to talk about, you know, segments of one and one-on-one marketing and whatnot, you still need to understand a handful of types of people so that you can build your strategies around those different types of people, and that’s what market segmentation allows you to do. To kind of take that wildly complex ocean of individuals and all of their nuances and parse it down to a level of granularity that you can keep straight and that your organization can work with. It’s four types, five types, seven types, not 7 million types.
MD (05:07): Well, one of the reasons that I wanted to focus on marketing research in this season of the podcast is because our main audience is customer experience. But customer experience is kind of a hodgepodge of everything just kind of together. Like there are people from marketing, there are people from market research, there are people from data analysis, you get people from all different walks of life, let’s say, that are working in customer experience, and sometimes the fundamentals aren’t there. So what I wanted to try to do was deliver some of the basics, and that’s why I’m calling people like you, like other specialists in, to talk about these things on a basic level and just clarify the main pillars, the foundation of this work and why it’s important to respect the science behind data collection, all of these things. So if we were to get down to the very basics of segmentation, okay, so let’s start off talking about the crucial reason that it exists in the first place.
HD (06:07): Yes. So segmentation is crucial because there really is no average consumer. I am approaching your category one way, someone else is approaching it differently. For example, when you think about a retail environment, some people want to walk in the door and be showered with customer service, to have somebody help them, make recommendations, walk around the store, find things for them, and other people are absolutely allergic to that level of attention from a customer service representative. Those are two very distinct segments, and if you understand the kinds of customers you have, you can actually customize your customer service. You can customize your call center scripts, you can customize your outbound marketing. So within a customer service CX world, just understanding that what makes some people tick is different from what makes other people tick, that what their needs are is different, allows you to customize not only your product offerings but your messaging and your customer service offerings. And that can be very powerful. Personalization boosts the effectiveness of marketing activities.
MD (07:07): So I’m not a specialist in this at all, but if I remember correctly, there are four main types of segmentation: geographic, demographic, psychographic and behavioral. Is that it?
HD (07:20): Probably. That is one way of looking at it. I think the slide you may have seen me present may have had five or six buckets, but those were just parsing out some of them a little bit. So demographically, it would be grouping people based on their personal characteristics. They’re usually observable. They’re often available in first party and third party data. So they’re easy to buy marketing and advertising against. A demographic segmentation is easy to wrap your head around because we can all picture a young, multicultural, millennial mom, something like that. So it brings it to life fairly quickly. But many categories don’t really gain much benefit from demographic segmentation because the people who prefer my brand versus your brand are all over the map, demographically speaking. Age is very predictive of technological adoption, and so in some categories age can be incredibly helpful, and there are some obvious connections to gender. But in most cases we find that the demographic differences on what people are looking for, how they want to be treated, and what the benefits are that they’re seeking from your category that would lead them to choose your service versus someone else’s service, are really very loose. There are not strong relationships to demographics. So we find that a lot of people do that as their first approach. That’s also something that’s relatively easy to do through do-it-yourself marketing engines that are out there; they’ll generate demographic segments for you. But if you then move to behavioral segmentation, that is popular these days because behaviors are being captured passively in big data streams and it’s very possible to simply group people based on the way in which they interact with your services.
If you’re an Amazon or a Netflix or even an American Express, where you have visibility into incredibly granular details about what people are doing and how they’re doing it, you can group them and treat them differently in a pretty effective way. The main drawback to behavioral segmentation is that you don’t know why they’re doing what they’re doing, and since you don’t know why they’re doing what they’re doing, it can be difficult to figure out how to change what they’re doing, how to convert them to your brand, how to get them to adopt new use occasions and so forth. So behavioral is powerful and it’s becoming popular, but it doesn’t provide all the insight necessary to really help drive your strategy, and you also may not be able to identify a person at a focus group facility or in a recruiting situation to figure out what segment they belong to, because you don’t have on-the-fly access to all of that big data stream. Then if you move into psychographics, that would be basically how I approach the world, how I think about life, what my values and motivations are. Those can be really powerful ways of segmenting in categories where the category itself is fairly commoditized or isn’t the sort of thing that people give very much thought to, such as your choice of soft drinks. Do you want Coke versus Pepsi? They’re fundamentally the same beverage with slight differences in taste preferences, but really they position themselves based on kind of like, what’s your idea of fun? You know, are you warm and wholesome and loving and bring the world together, or are you rah, rah, let’s go to a concert, let’s enjoy life, carpe diem kind of person. And so they both position themselves against your psychographics, your lifestyle, your values. They don’t position you against how much carbonation do you want, is this enough sugar, do you prefer red to blue on your can, things like that. They’re not at all functional. That’s psychographic.
Lifestyle can be very helpful if you are doing long-term innovation, because you’re trying to develop the next new cell phone. People can’t tell you what unmet needs they have. Most of the time they can tell you what their pain points are; they can tell you, wow, it’s really a hassle that it takes so long for my computer to boot up, and from that, someone like Steve Jobs can say, oh, we need to have instant-on tablet capability. But they can’t necessarily come to you and say, what I really need is a tablet. And so, knowing what someone’s lifestyle is like, are they a road warrior, are they a soccer mom, are they a dual income empty nester or are they somebody who’s in the sandwich generation or whatnot? Like the way one lives one’s life can be very informative for innovation because the R&D department can rely on, okay, I understand this person, I know what they’re going through, I know what their needs are and I know what the technology will be able to do two, three, four, five years from now. I will figure out how to develop a product for those people. The most common segmentation approach that we use at LRW, though, relies on attitudes and needs that relate specifically to the product category. When I gave the example of the shopping styles: how do you prefer to engage in a retail environment? What sorts of styles are you looking for? Are you deal-driven versus absolute price driven? Are you price driven versus quality driven? Things along those lines, where we try to make the segmentation be very specific to not only your category but to your brand. We do a lot of work in the same industry for different clients and each of those clients gets a different segmentation scheme, a different segmentation survey, because each of them has different tactical needs that set them apart. Dairy Queen and Arby’s don’t have the same fundamental competitive advantages and disadvantages. And so the structure that’s going to be useful for them is going to be somewhat different.
So the attitudes and needs-based approach is where we prefer to play most often. But we can be flexible and work in the other spaces if those are more applicable to a particular client’s needs. The behavior-driven segmentation can be particularly applicable to CX professionals, given that they often have a large stream of transactional data on their customers. Because if you’re trying to optimize your customer experience, the app, the website or the call center or the dealership, and you’re able to identify which segment you believe somebody belongs to based on the data that you have immediate and real-time access to, that can allow you to customize your treatment of those individuals in a way that should be more effective. So when thinking specifically about the CX space as it interacts with segmentation, that’s one place where the behavioral data can be particularly helpful.
MD (13:52): When it comes to geographic data, I have a question, because when you have countries like the U.S., it’s like coast to coast. There are so many differences that vary from state to state, from region to region. Not only that, it’s a country that’s composed of immigrants, so within the same nation you have people from all sorts of backgrounds, all sorts of cultures, all sorts of ethnicities, all sorts of religions even, and this must create some sort of impact when it comes to doing the research, does it not? What sort of challenges do you face on a global level? I mean, that’s almost insane for me.
HD (14:30): Absolutely.
MD (14:30): You know, the amount of data that you’d need to classify people geographically.
HD (14:35): Well, I have to say the global work is certainly much more challenging than work within just the United States. It brings in a whole lot of additional complexities. But to your point, even the U.S. is very diverse; it has a lot of complexity, a lot of cross-cultural variation. The biggest challenge you run into is a measurement challenge. With consumer surveys, we’re asking people to express their attitudes, opinions and needs, typically using rating scales or answering other closed-ended questions that we can quantify and feed into statistical algorithms. But how people answer those questions is a very personal thing. For example, when you’re thinking about your ride share app, do you automatically give somebody a five just because nothing went wrong? Or do you only give them a five because they did something special, they went above and beyond, they provided added value, they had, you know, mints in the car? I find that people use scales differently. Individuals within any culture use them differently, and more importantly, people in different parts of the world use them very differently. So there is a measurement error challenge that feeds into the analysis. So you have to be super, super careful when mixing data from different countries, just statistically speaking, so that you don’t accidentally create groups of people that are defined by how they use the scale rather than what they were trying to tell you. Or if you did a segmentation in the United States and now you want to take it to Japan or Brazil, you’re going to be at grave risk of misrepresenting those markets if you don’t take care with how you do the measurement in the first place and how you generalize the measurement to these markets where people tend to use scales very differently. So global’s quite challenging. The biggest challenge is a measurement challenge. The next largest challenge is understanding the culture. There are big differences.
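One common statistical guard against the scale-use problem Hilary describes is to standardize each respondent’s ratings against their own mean and spread (sometimes called ipsatizing or within-person z-scoring) before pooling countries. The transcript doesn’t say which correction LRW uses, so this is only an assumed illustration with invented data, showing how a lenient and a conservative scale user with the same relative pattern end up looking alike:

```python
# Illustrative sketch only: within-respondent standardization to reduce
# scale-use bias before pooling multi-country rating data.
# The respondents and values are invented for this example.
import numpy as np

# rows = respondents, columns = rating items on a 1-5 scale
ratings = np.array([
    [5, 5, 4, 5],   # an "everything is a five" responder
    [3, 3, 2, 3],   # a conservative scale user with the same pattern
], dtype=float)

def ipsatize(r):
    """Z-score each respondent's ratings against their own mean and SD."""
    mean = r.mean(axis=1, keepdims=True)
    sd = r.std(axis=1, keepdims=True)
    sd[sd == 0] = 1.0   # guard: flat-line responders get zeros, not NaNs
    return (r - mean) / sd

z = ipsatize(ratings)
# Both rows now share the same profile: item 3 sits below the others,
# so a clustering algorithm would no longer split them by scale use.
```

The trade-off is that absolute level information is discarded, which is why such corrections have to be applied thoughtfully rather than by default.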
For example, non-binary gender is a current hot topic that’s going on right now. Less than 1% of respondents in most countries will self-identify as a third gender. But in India, this has been a construct for hundreds of years and there are large numbers of people who self-identify as third gender there. I believe they call it hijra. And so that is a cultural difference manifested in a demographic trait, but there are all sorts of cultural differences. For example, in some countries, I would not want to admit that I’m trying to impress other people with the clothes I wear or the hotel I stayed at or the car I drive, because that’s just not done. Even though we all acknowledge that–
MD (17:02): It’s considered wrong.
HD (17:03): Right, and yet there are other countries where people are proud to tell you this. Absolutely, this is what I’m doing, because that’s culturally acceptable in that marketplace. So when doing global work, it’s just really important to include, if at all possible, your local partners, the local marketing teams, people with expertise on the ground who can say, you’re completely missing this concept here. This is not applicable. When laptops first became popular, they were the sort of thing that really tech savvy people used in the U.S. It was like, wow, I’m really into computing if I have a laptop. But in Japan you had a laptop because you were living in 400 square feet and you needed the smallest possible computer you could find, no matter how unsophisticated you were technologically. So if you understand what sets the markets apart from each other and make sure that you’ve captured those constructs, then you can build a global segmentation that will work well in all of your markets. You have to make sure you don’t have your U.S. lens on when you’re building your survey and executing analysis.
MD (18:03): I see that all the time and it’s one of the main things that frustrates me. So I’m really glad you’re bringing it up, because it’s a question that I have here on my notepad. How important is it for market research teams to have representation of the population that they’re surveying? So do you think that this applies across the board? Because then how could we possibly do this in a country like the U.S.? It’s so diverse. Like, there are some countries, even in South America, or Spain, I think Spain is a good example, where most of the population is composed of Spaniards or Spanish people, and their parents are, and you know, all of the generations before, because there wasn’t a lot of immigration, so it becomes easy to find a representation of that demographic on your research team. But in a country like the U.S. or the U.K. or Brazil or any other country that has a lot of diversity, it becomes so much more challenging. How important is it to have that diversity on the marketing research team?
HD (19:05): I think diversity and inclusion is particularly crucial in the marketing research industry. Your job is to understand the market, and all of the market. I spent six years in between my research vendor jobs working at the Los Angeles Times newspaper, and I was there when their first ever market segmentation was created, and they had a mandate to serve everyone. They used to have a circulation of 1.5 million on any given Sunday, and they wanted to meet the needs of everybody no matter who they were. And yet when we ran them through the typing tool, a short series of questions that figures out what segment you are, their editorial board were all members of the first two segments out of the seven segments that existed in the Los Angeles market. They literally could not relate to five out of seven types of people in their market, because everyone they knew belonged to one of their two target segments. We often say don’t rely on grandmother research, like, oh, well my grandmother was like this, and you definitely don’t want to generalize any one person, you know, to the population at large. However, if you know someone like this, I guarantee you there are more people like this. Is it 1% of the market? Is it 25% of the market? That I don’t know, and that we need to go out and quantify through research, but it’s really important to have people from all levels. Some companies are like, oh, we’re all about sustainability, or we’re all about organic, where that’s really what they stand for and they really sincerely mean it, but they attract people who also sincerely mean it, who can’t relate to the person who doesn’t care about that or doesn’t understand that. And so they don’t know how to effectively sell to them or how to effectively serve their needs.
So it helps if you have a diversity of opinions on the team, and that’s usually driven by a diversity of backgrounds, a diversity of ages and genders and ethnicities and educational backgrounds and so forth. You know, we traditionally were pretty bad about recruiting almost exclusively out of UCLA and USC, because we’re headquartered here in Los Angeles and those are the two biggest and best schools in the area. But we realized that even if we were getting a diversity of genders and ethnicities out of those schools, we were getting people who had a very similar background. And so we’ve been very actively expanding our reach to people who have lived in other countries, people who have gone to other kinds of schools, people who grew up in other circumstances, people from other parts of the country. Because you’re right, if you are all looking at this through the lens of a particular kind of person, you’re going to completely miss the bigger picture. One of the things we also do to make sure that you’re getting all those voices is, oh, we actually have a methodology that’s not survey-based at all. It’s called online anthropology, where we’re doing sort of like web scraping on steroids, where we basically access all of the conversations that are going on in blogs and forums where people tend to have actual dialogues and express opinions and ask questions. Not Twitter, not Facebook. Really the places where they get into, like, offering advice to each other and things along those lines.
MD (22:12): Like Quora and Reddit and stuff like that?
HD (22:14): Yeah, Reddit is a huge source of this sort of information. And we go in and we look at that data and we hear what people are saying. And rather than having 12 people in a focus group in Boise, Idaho, we have 120,000 people who posted a comment about this particular topic and we can review all of those comments. It’s a quasi-quant/quasi-qual way of reviewing information without introducing any interviewer bias, because you’re just looking at what people are naturally saying out there. And that’s a great way of seeing, are we missing something? We’ve found in the past that you can definitely discover, oh, I thought my product was for kids so I’ve only been talking to moms of kids, and it turns out this is really popular with young adult gamers and we weren’t talking to them at all. So you can really discover some things that you weren’t thinking to look for by listening more broadly to what everybody is saying, not just who you thought you should be talking to or what the people who are sitting around the conference room table believe.
MD (23:14): And that probably increases the sample size significantly as well because you’ve got so much more data.
HD (23:19): Oh absolutely. Yes. You’ve got hundreds of thousands of comments.
MD (23:23): So that brings me into my next question, which is about survey response rates and your opinion on that. I’ve heard a lot of people saying that that’s going down, that people are burnt out, and other people are saying that the market has always faced these challenges, that they’ve always been low, especially when it comes to surveys. So what’s your opinion on that?
HD (23:47): Response rates have always been low. They’re currently dismal, but the real issue is, are the responders and the non-responders different with respect to what you’re trying to study or understand? And if what you’re trying to understand is the role of ketchup in someone’s life, whether I am the kind of person who likes to take surveys and express my opinion and impact companies through doing so, or the kind of person who chooses to click delete when I see those invitations, probably is not going to impact our understanding of the different ways in which people use ketchup and why they use ketchup and which brands of ketchup they prefer. So the response rates are not necessarily a problem, depending on what you’re trying to understand. On the other hand, if you’re trying to understand something that’s directly correlated to the degree to which people want to offer opinions and the degree to which they think about issues or the degree to which they use the internet, then you can introduce some fairly serious bias through the fact that response rates are low. When we see how low response rates are is when we’re using a client’s sample drawn from their database; they send it out and you see just how few people actually do reply. We try to combat that by making the surveys a little more interesting, making them less difficult to complete and making them shorter, offering a variety of incentives. Some people are motivated by the process of participating. Others are motivated by the process of trying to win something or to earn points. Others want to have you donate to charities. So you know, we do what we can to boost response rates. Yes, they’re dismal, but I don’t think that the tendency to respond is correlated with the business question very often, and so I feel like we’re still getting excellent data.
MD (25:29): Do you feel like in the future we’re going to have to be a lot more creative with the way that we collect data, or do you think we still got a couple of decades left before the response rates go to zero?
HD (25:43): Yes, we are going to have to get more creative. I think we’re going to have to stop our bad behavior. Over the last 10 years, data collection has jumped from interviewer-administered telephone surveys into self-administered online recruited surveys. There has been a shift from a disciplined execution of 10 to 15 minutes worth of questions into an undisciplined execution of 25 to 45 minutes worth of questions, and that has led people to have very bad experiences when participating as a respondent. Basically what we’re doing is we are beating people into submission and we’re saying, this is not fun. This is not interesting. You don’t want to do this again, do you? And we’re driving them away. So as long as the industry together can agree to treat respondents with respect, to understand those are our consumers, to try to make the experience a positive one, make it brief, make it entertaining, make it interesting, make it easy, that will help keep response rates from eroding faster than they need to. We do need to come up with clever and new ways of doing things. There are companies out there that are doing that. People are progressively becoming more comfortable with recording selfies and selfie videos and submitting things like that, especially the younger generations that are growing up that way. I think we’ll be moving more in that direction. Companies are moving towards chat-based surveys through messaging. We need to experiment with all of these things and see what works. But the most imminent threat is the issue of mobile compatibility. If your surveys are too long or too complicated to be done on a phone, or if the formats of the questions you’re asking are not friendly to people who are doing surveys on mobile, you’re not getting a representative sample. The majority of people under 35 who come into a survey tend to do so on a mobile device. And if you try to make a survey that is not mobile compatible, you’re going to lose all of those people.
They’re not gonna run home and crack open their computer and wait for it to boot up and go to the web and find the invitation and go back in and say, oh, now I can do it. No, they’re going to go, ehh, never mind. I felt like doing a survey right now, but I can’t do it on my phone. I’m out. So you really need mobile compatible work to have a good representative sample these days, and it’s getting progressively worse. And so it’s remarkable how fast this has gone from five to 10% of completes to 50, 60, 70% of completes, depending on the study and the audience.
MD (28:13): Yeah, I think that we should make a convention of marketing research firms and anyone who does surveys, where we’re like, hey, let’s all agree that we’re ruining our market, so let’s commit to making surveys better and stop burning out our people. Let’s make surveys suck less so that in 10 years we don’t deplete all of our resources, in this case. I totally think that this is a thing that should happen, because as long as there’s somebody that’s still sending out 37 questions. You know, from a consumer perspective, myself as a consumer, at times I answer surveys because it’s part of what I do. So it’s almost research for me to answer surveys, and I’ll get some really cool ones. I’ll get companies that are doing amazing jobs with incredibly cool UX and visuals and everything, and some are, you know, like the company I work for, Worthix, we’re focusing on making it short. We’re focusing on making it intuitive, we’re focusing on making it self adaptive. But then every once in a while I’ll regret clicking on that button so much, because it’s never-ending. You just keep going, and at some point, you know what happens? At some point, I just start, “blehh,” skipping. Or writing anything, and then it’s not a qualified response anyway.
HD (29:36): But the problem is whether or not the vendor recognizes that you are not a qualified response. That’s why you are doing research. We always build in traps, quality controls, ways of identifying whether or not the person appeared to be paying attention, whether their results make sense, whether they were speeding through, and you have to time the survey in chunks, because somebody who starts out like you did, diligently trying to be thoughtful and to answer the questions well, who finally got to a point where you’re like, are you kidding? And then they start clicking through. Their total interview time may appear reasonable, but the last part of their survey they sped through. And so you need methods for catching that so that you can clean out the data of the people who either tried to lie their way into the study in the first place because they’re entirely driven by incentives. Sometimes those are click farms in other countries. Sometimes those are survey bots, where people have programmed devices to take surveys for them. And sometimes it’s just people like you and me who are like, I just can’t take it anymore. It’s sort of like the senators right now, they’re like, I can’t take it anymore, you know?
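The quality controls described here, catching speeders by timing the survey in chunks and catching click-through respondents, are usually implemented as simple per-record flags in the data-cleaning step. The transcript doesn’t reveal LRW’s actual checks or thresholds, so the function, threshold values, and example record below are all invented purely to illustrate the idea:

```python
# Illustrative sketch only: flagging speeders (implausibly fast survey
# sections) and straight-liners (long runs of identical grid answers).
# The function name, thresholds, and data are invented for this example.

def flag_record(section_seconds, grid_answers,
                min_seconds_per_section=20, straightline_max_run=10):
    """Return a list of quality flags for one respondent's record."""
    flags = []
    # speeder check: any timed section completed implausibly fast
    if any(t < min_seconds_per_section for t in section_seconds):
        flags.append("speeder")
    # straight-lining check: longest run of identical grid answers
    run = best = 1
    for prev, cur in zip(grid_answers, grid_answers[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    if best >= straightline_max_run:
        flags.append("straightliner")
    return flags

# a respondent who starts diligently, then clicks through the last grid:
# the final section is suspiciously fast and ends in a long run of 3s
flags = flag_record(
    section_seconds=[95, 80, 8],
    grid_answers=[4, 5, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3],
)
```

Flagged records can then be reviewed, discarded, or (as discussed next) partially retained, which is exactly the controversial trade-off the conversation turns to.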
MD (30:42): But as an expert, is there a way to at least preserve the data that was valid in the survey? Like, okay, Mary answered diligently all the way to question 33, but from 33 to 37 it was crap. Is it possible for companies to identify and keep the responses that were valid and filter out the ones that have to be disqualified? And at some point the company goes, "Okay, we have a 99% drop-off rate at question 33, let's make the survey shorter."
HD (31:17): Yeah. So I would say that companies vary quite a lot in how rigorously they clean your data. I know this because we have vendor partners in other countries who are always pushing back on us: "Why is this data not acceptable to you? Any of our other clients would have taken it." And we're just like, "Sorry, we've investigated it. We don't trust these data records and we're not paying for them. You need to replace them." So there's a lot of variability in how rigorous people are in their checks and how effective they are at identifying the bad data. And once they have identified the bad data, you're raising an interesting question: should you keep the part that was good, or should you jettison all of it? If you've deemed this person to be a valid, qualified respondent who tuned out eventually because the survey got out of control, should you keep the data you had? That's a very controversial topic. Depending on how the study is designed, you might be able to retain those people and jettison the rest, but it creates a lot of problems with things like weighting the data to be demographically representative, because then you would need different weighting schemes for different sections of the survey. Quite frankly, I think it's generally so much more cost-effective to simply throw out the interview. But that is a problem, because the people who quit are potentially different from those who didn't quit. We know they're different in their device types: people on smartphones are more likely to drop off, because the survey is taking longer and they have better things to do. But they also differ in their engagement in the category. Either you're less engaged in the topic, and finally you're like, "This is silly, why am I wasting my time talking about this thing I don't care about?" and you might quit.
Or you might be more engaged, because oftentimes questions are contingent on answers to previous questions, and somebody who's very engaged might get a much longer survey than someone who is less engaged. Therefore the person who's so important to you, because they do so many relevant things in your category, has a higher-than-average propensity to quit. So you really need to design your survey carefully, starting with not allowing the longest path to be outrageously long. Don't just think in averages. It's a common mistake to look at the average interview length, say 20 minutes, and be okay with that. But if the average is 20, are there some people who would be getting 40? That's completely unacceptable. You need to be thinking in terms of the longest path, not just the average path.
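The "longest path, not average path" point can be made concrete with a toy calculation. The route names, question counts, and one-minute-per-question timing below are all invented for illustration; the point is only that skip logic can hide an unacceptable worst case behind a reasonable-looking average.

```python
# Toy model of a branching survey: each route a respondent can take,
# expressed as minutes per question along that route (invented numbers).
PATHS = {
    "low_engagement":  [1] * 10,   # few follow-ups fire: ~10 minutes
    "typical":         [1] * 20,   # the "average" experience: ~20 minutes
    "high_engagement": [1] * 38,   # follow-ups fire on nearly every answer
}

def path_minutes(path: list[int]) -> int:
    """Total interview length for one route through the survey."""
    return sum(path)

average = sum(path_minutes(p) for p in PATHS.values()) / len(PATHS)
longest = max(path_minutes(p) for p in PATHS.values())

print(f"average path: {average:.1f} min, longest path: {longest} min")
```

Here the average (about 22.7 minutes) might pass review, while the most engaged, and most valuable, respondents would face 38 minutes, exactly the gap that drives them to quit.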
MD (34:09): Do you think it's possible to reduce that time by 80%? Because I would never take a 20-minute survey, and I think I speak for most people.
HD (34:26): There are some business questions that can be addressed quickly and others that cannot. In a CX environment, it's very common for surveys to be exceptionally brief. I've taken satisfaction surveys from one of the airlines (I don't want to name the wrong one) where they basically get their overall KPIs, or key performance indicators, and then say, "Hey, here are some other areas you might want to tell us about. If so, please do; if not, thank you for participating." So they give you the opportunity, if you feel you have input in those other tactical areas, to provide it, but they don't want to introduce a systematic bias through people opting out. Especially if you think about air travel: the people they care the very most about are those incredibly busy road warriors who simply don't have time to take a lengthy survey, but have plenty of time to answer three, four, five, ten questions, and want the opportunity to let you know, "Hey, that was a really great flight," or, "Guess what? That flight was the worst I've had all month." So you can meet objectives like those relatively quickly. On the other hand, if you're trying to understand the landscape of a particular consumer and what types of segments exist, you might need 15 minutes' worth of questions to do that. Generally speaking, I can't get a segmentation, the foundational study, done in less than 15 minutes; 20 is where we more often net out, and we try to avoid 25. You just have too many questions you need to ask, and that's when it comes back to response rates, right?
You might not have as representative a sample willing to do the 20-minute survey, but it gives you the information you need to understand the framework of the customer base. Then you can go out and do another really short survey with just the key questions, the typing-tool algorithm, to figure out the size of the segments in the broader market or within your customer base. I don't want your customer sample taking the 20-minute survey, and neither do you, but you would be happy to send them the three- or four-minute survey that figures out what segment they belong to and gets some additional information. So we try to protect the client sample.
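One common way a "typing tool" works, sketched here as an assumption rather than a description of LRW's actual method, is nearest-centroid classification: the long foundational study produces an average answer profile per segment, and the short follow-up survey asks only the key questions and assigns each respondent to the closest profile. Segment names, questions, and all numbers below are invented.

```python
# Minimal typing-tool sketch: assign a short-survey respondent to the
# segment whose centroid (from the foundational study) is nearest.
import math

# Hypothetical average answers per segment on three key questions
# (e.g. flights per year, leisure orientation, price tolerance, 1-5 scales).
SEGMENT_CENTROIDS = {
    "road_warrior":  [4.8, 1.5, 4.2],
    "leisure_flyer": [1.8, 4.6, 2.1],
    "occasional":    [1.2, 2.0, 3.0],
}

def assign_segment(answers: list[float]) -> str:
    """Classify a respondent by Euclidean distance to each segment centroid."""
    return min(
        SEGMENT_CENTROIDS,
        key=lambda seg: math.dist(answers, SEGMENT_CENTROIDS[seg]),
    )

print(assign_segment([4.5, 1.8, 4.0]))  # 'road_warrior'
```

Because only the few key questions are needed, this is what makes the three- or four-minute survey for the client sample possible.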
MD (36:37): Can’t you do most of the segmentation within the company database? Especially with air travel, the company has all of the information on that customer. Can’t you just get all of that out of the way beforehand, within the database, so you already know exactly who that person is, and go right to the questions that actually matter?
HD (37:02): You can do that if you care about how they use you, but if your goal is to get them to convert some of their usage from another airline to you, that may not necessarily work. For example, United thinks I’m a skier, Southwest thinks I’m a partier, and American Airlines thinks I’m a business traveler, because I go to the mountains on United, I go to Vegas or to visit my mom on Southwest, and I do most of my business travel on American. So their databases have very different opinions of who I am as a person. And if the goal of United is to steal me away from American Airlines, they’re not going to do that very effectively by continuing to promote ski trips; I can only go so many times per year. What they really need is to win my business travel, and they don’t know to do that because they’re not seeing it in their behavioral data.
MD (37:56): And do you find that, for most of the surveys you do, there has to be some sort of compensation to get people to respond, because of the length?
HD (38:04): River and panel sample, which is the general-population sample that exists out there, pretty much exclusively operates on an incentive model. The incentives differ depending on the source of the sample, and vendors offer different amounts depending on how long and difficult the interview is and on the demographic group's propensity to respond, because there's wide variability in the propensity to respond depending on what kind of person you are. So incentives are absolutely integral to general-population sampling. Client sample is where the big debates occur: will we use an incentive or not? What should that incentive be? Is this incentive going to bias the people who reply? For example, if you offer your customers a coupon on their next purchase, you're going to bias the responses toward people who actually liked the food and want to come back, and under-represent the people who didn't like the food and don't want to come back. So you're going to fool yourself into thinking people are happier than they are, because your incentive was basically a coupon against a future purchase. You need to be very careful about how you choose and deploy your incentives. There are a lot of laws around what incentives can and can't be, so you have to be truly expert in each state's legal guidelines about what you can and can't do. Raffles are not legal everywhere, for example. So you need to really know what works and what doesn't when you set up your incentive structure. Some people will simply reply because they love you. But that's the thing: if you've used your client sample and said, "Hi, I'm brand X and I want you to take my survey," and you don't offer any incentive, you will predominantly get replies from people who love you.
MD (39:46): Yeah. And then your data is going to be totally skewed and you’re not going to have the right negative feedback that you need, which is probably the most important.
HD (39:52): Precisely. And that's why, when we work with client sample, clients are sometimes willing to accept the lower response rates associated with a blinded study, meaning we identify it as a survey from LRW rather than a survey from brand X. You'll get a lower response rate, but a more representative sample.
MD (40:11): Well, Hilary, that was awesome, thank you so much. I mean, we're 40 minutes in and I didn't even see the time go by; it was so interesting. If our listeners want to hear more from you, learn more, understand more, how do they reach you?
HD (40:29): My email address is firstname.lastname@example.org, or we have a whole lot of content on our website, LRWonline.com: blogs, white papers, short opinion pieces on a wide variety of topics, and we're always adding new content. That's one of my roles, to contribute to that growing set of information. So you can either reach out to me directly with a specific question or go check out what we have on our website.
MD (40:59): Awesome. Do you do social media?
HD (41:00): I do. I'm on Twitter at @decamphilary. I don't tweet as often as I used to, but I am tweeting, trying to avoid politics right now; I'm trying to stay out of that. And I'll occasionally forward some of our nicer blog pieces and so forth on LinkedIn.
MD (41:21): That’s awesome. Thank you so much for coming on, really appreciate it. We hope to have you back in the future.
HD (41:25): Oh, you’re welcome. And I’d love to.