
Market Research -- The Secret to Profitability with Patrick Campbell

Emma Waldron
21 Jan, 2019

This post is a transcript of S2 E22 of the Voices of Customer Experience Podcast with Mary Drumond, featuring Patrick Campbell.

[00:06] Mary Drumond: You're listening to Voices of Customer Experience. I'm your host, Mary Drumond, and on this podcast we shine the spotlight on individuals who are making a difference in customer experience. We also proudly bring you the very best of customer experience, behavioral economics, data analytics, and design. Make sure to subscribe or follow us on social for updates. Voices of Customer Experience is brought to you by Worthix. Discover your worth at worthix.com.

[00:35] MD: Patrick Campbell is the co-founder and CEO of ProfitWell, the industry-standard software for helping companies like Atlassian, Autodesk, Meetup, and Lyft with their monetization and retention strategies. ProfitWell also provides a turnkey solution that powers subscription financial metrics for over 8,000 subscription companies. Prior to ProfitWell, Patrick led strategic initiatives for Boston-based Gemvara and was an economist at Google and in the US intelligence community. Thanks for coming on, Patrick. It's great to have you.

[01:05] Patrick Campbell: Hey, it's good to be here.

[01:07] MD: I've been looking through your material, watching some of the content that you post on your blog. It's called Protect the Hustle, right?

[01:16] PC: Yeah, that's the new podcast. We've been publishing lots of fun content for a while now, but that's the newest one.

[01:23] MD: Awesome. Well, for our listeners who might not be familiar with your work and what you do, can you give us a bit of a rundown of your background and what you do for a living, what your educational background is and what your mission is in changing the world? 

[01:36] PC: Yeah, definitely. So my personal background is in econometrics and math. I was born in a small town in Wisconsin, one of those towns that has more cows than people, and that definitely shaped me. Maybe that's why I went towards the numbers, because there weren't a lot of people. To kind of speed things up and not go too deep into the history: I started my career working in the intelligence community in DC, and then ended up going to work at Google, and that was both places really early in my career. So I was doing kind of grunt work around economic modeling and a little bit of sales and a little bit of fun stuff. Then I left there and worked at a company called Gemvara, which was a custom ecommerce company.

[02:23] PC: So basically you could design your perfect ring or perfect piece of jewelry with all different types of gemstones or colors of metals and things like that. It was kind of like a gemstone-focused Blue Nile, if you're familiar with the market. And that's when I got the itch around pricing and customer research, because we would focus in on making small changes to our pricing or making small changes to our stack, and then we would see these big swings in revenue. So that's kind of where the customer research, or customer development, bug bit me. And then I founded ProfitWell, which used to be called Price Intelligently, and which is essentially a suite of products that help subscription companies with their growth. We have a free product that gives you your subscription metrics, so you plug it in and you get your active usage, your MRR, all that kind of fun stuff. And then we make money by selling products that help with pricing and help with your retention and things like that. So that's kind of the quick and rambly version of where I come from and where I am now.

[03:32] MD: Well, I heard you mention, I mean, you're probably the perfect person to be on this podcast, because that's exactly what we love talking about. We love talking about customer research and how that ties into not only revenue and profit, but also, how can I say this, how it connects to all the other departments inside organizations. And one thing that we really like doing is kind of widening the funnel, so that it's not just customer experience people talking to other customer experience people, but broadening that view and taking a different angle or a different perspective on how the entire company or the entire organization can benefit from consumer research. So that's why I think you're perfect. In your most recent episode, I believe, you talked specifically about customer research and how it can be used to shorten the cycles in SaaS companies. Is that right?

[04:35] PC: Yeah, definitely. That was something we went deep on, and I think it's super crucial around feedback cycles in general. A lot of us kind of guess and check our way to success, when in reality you aren't going to know if it's gonna work or not until you truly ship it. But what you can do is hedge that guessing and checking by doing research, not only from a customer perspective but also on the market, competitors, and things like that.

[05:06] MD: You had Steve Blank on the podcast talking a little bit about that, and he's a Silicon Valley guy, right? Tell me a little bit about what he does and the message that he got across.

[05:18] PC: Yeah. So Steve Blank is kind of one of these luminaries of Silicon Valley. He's been building products for decades at this point. I don't want to age him too much, but he's kind of the father, or the godfather, of the customer development method, and he's really just harping on how important customer research is. The whole thesis there, and I kind of got into that already, is that the market is starting to move so quickly, mainly because software and technology and businesses have just gotten quote unquote easier to build, or at least relatively easier compared to 10 years ago. Because of that, essentially what's happened is that you need to start building the right things, and it's harder and harder to figure out what the right things are, and it's harder and harder to figure out who the right customer is. And so because of that, it's so important to make sure that you're validating or invalidating your hypotheses in the business that you're building. So you're doing that research, and it doesn't always have to be surveys or interviews or things like that. It can be looking at your data and understanding where people are coming from. But that's Steve's big thing, which is you've got to talk to the customer, you've got to get out of the building, as he talks about, and kind of go from there.

[06:35] MD: You know, that's interesting, because it reminds me of an episode we did this season with Dr. Nammy Vedire. She works at the Center for Deliberate Innovation at Georgia Tech, and what they do... do you know about them? Have you heard about them?

[06:50] PC: No, I haven't. That sounds like something I should check out though.

[06:52] MD: Yeah, they're awesome. And for our listeners, it's episode seven, if anyone wants to check it out. What they do is they actually created a method, or a process, to try to decrease the fail index for startups, and they do this through research. So they encourage startups to go out on the street and ask questions, and it's really interesting. I'm not going to spoil it for people, but it talks about how they created a method to keep whoever is doing the research from falling into the trap of biases. They actually have a method that they wrote out, and it's pretty neat. They based a lot of their studies on Daniel Kahneman's book Thinking, Fast and Slow and the biases that he explores in that book. So they actually created an entire method around it, and there are some pretty brilliant minds working on that. I encourage everybody to check it out. But that's something that I heard on your podcast as well, talking about how research starts with a hypothesis. And you can confirm that for me, since you're an expert: most research starts off with a hypothesis, and then you look to confirm that hypothesis through experiments, in this case through talking to people and asking questions. Is that right?

[08:22] PC: Yeah, definitely. I think, with most research, or at least in what I've found and what I've seen be successful when running some sort of research process, you're just following the scientific method, or some variation of the scientific method that you learned back in probably middle school or maybe even earlier in elementary school. It really just comes down to establishing what you think, you know, a hypothesis, and putting a framework around how we can get data to support or refute, or at least take away support from, that particular idea. And then based on that, what we can do is basically understand, okay, this decision that we were going to spend a bunch of time on is actually the right decision, or this decision we were going to spend a bunch of time on is not the right decision. Because when you're building a company and making decisions, you're normally fighting time more than anything else. Money is obviously something that can come into play, but normally time and money are almost equal in a lot of ways.

[09:31] MD: No, I absolutely agree. And I think that's probably the most traditional way it's been done. But I asked you this because we've heard from some people from academia, and other companies as well, that have begun to remove that step from the process. So they take away the whole idea of starting with a hypothesis, and they go straight to the data. And then you say, how do you do this? Because you need the questions, right? You need to form the questions. What they try to do is have extremely open-ended questions that are kind of led by the respondents themselves, and you allow people to just talk in a more unstructured way, and you then look at the data. And this is something that's only possible now because we've got big data, and we've got data analysis to a degree that didn't exist before. Right? And that way, what you're able to get is just the voice of the customer, totally void of our biases or even company biases. So instead of the customer answering the questions that we want the answers to, because they're relevant to our business, they just talk about what they like, what they think is relevant, what they think is valuable, where their expectations are with the market. And then we interpret that data, take out what's relevant, and apply that. What do you think of that process?

[11:00] PC: Yeah, that's a really good question. So that's kind of the standard, or at least a similar framework or process, that we've heard and talked about. One really important thing to point out, though, is that all data is going to have some sort of bias, right? It's going to have, say, a location bias: you only asked people around you in Boston, because that's where we're located. Or it's going to have some bias because they're not exactly in the same environment in which they're going to make that decision, or they're right after they made that decision, or a whole host of things. And I think that we get caught up a lot in bias, because bias is important, but folks who maybe aren't academic or anything like that, they kind of think, oh, this is just going to be biased, so I shouldn't do it at all.

[11:50] PC: That's what a lot of people we run into end up thinking, and the secret is more around always understanding the limits of your data. So, and I mentioned this in the podcast, if I'm going to make a $10 million decision, I probably shouldn't make that decision off of just 10 random conversations that didn't have a lot of data and were very open ended, right? I probably shouldn't have a research process that starts with me making very specific surveys in very specific environments, because I'm putting my own bias into what I think should be out there. So, long story short, I think that framework makes a ton of sense. We recommend starting with a hypothesis, and then, it doesn't always have to flow this way, but getting some qualitative information, and then, if you're not satisfied or you feel like you need some quantitative data, going out and getting it. But the biggest thing I would recommend to people listening is: your job is to understand the limits of your data, and depending on the gravity of the decision, eliminate and reduce as much bias as you can, essentially.

[12:54] MD: So that's part of the job of the data scientist who's looking at that information: being able to say, hey, this is compromised, either there are too few responses or it's too limited, or also trying to, I don't know, maybe find some value in what is there and what you have. I mean, we're talking about companies of all sizes here, in your case SaaS companies, that sometimes have tons and tons of customers and other times are just getting started. So not everyone has access to big, huge numbers of respondents, right?

[13:25] PC: Yeah, I think that's right, but I also think that there's a pretty healthy environment or ecosystem out there for you to get respondents, right? And one thing that I find when I talk to people about customer research is that if you truly cannot get respondents for something, like you truly cannot find them, you probably don't have a great market, because then that means it's going to be that much harder to go out and try to sell someone something versus, you know, buying them a coffee or getting them on the phone or getting a LinkedIn introduction to them or something like that. So overall, I think the biggest thing I would think about is where you can go to where your customers are now. There are some very natural environments.

[14:12] PC: So, you know, if you're selling a retail product, obviously go to a store and talk to them. There are more natural things for a business-to-business kind of environment, which are finding them on LinkedIn, getting introductions, starting a blog, contacting email subscribers. And then there are quicker but more expensive ways, which are to use these market panel companies, and so you can get access to anyone from, I'd say, a soccer mom or dad in the middle of Kansas all the way to a Fortune 500 CIO in Europe; one costs very differently than the other. But it's one of those things where it's relatively easy nowadays to get relatively cheap responses from people. And to me, there's no excuse for not being able to do your research. You might not be able to do the amount of research you want to do, but you should at least be able to do some, if that makes sense.

[15:08] MD: Voices of Customer Experience is brought to you by Worthix. If you're interested in customer experience, behavioral economics, or data science, follow Worthix on social media or subscribe to our blog for the best content on the web.

[15:26] MD: How about the idea that people are no longer that open or that tolerant of answering surveys? You know, my personal opinion is that maybe we had a whole generation of people who had to answer really bad surveys. They were either way too long, or the questions were irrelevant, or they were questions that the company should already know the answers to. There was an example very recently in the customer experience industry of a person who took a flight on Emirates, and the survey was 30 minutes long and just exhausting, and the person went through the entire process just to prove a point, screenshotted every single screen, and said, this is absurd. They're asking me for my email. They're asking me which class I'm in, but they know exactly which seat I'm in, so they should have access to all this information. He said, I'm a loyalty member, you have all this information, yet you took up 10 minutes of my time asking questions that were totally irrelevant and that you should already know. Do you think this is an epidemic? Do you think this is really happening, where people are like, nope, not going to answer surveys anymore, too long, waste of time?

[16:39] PC: I know exactly what you're talking about, and I love to talk about this because it goes: well, surveys don't work. Well, what's the last survey you sent? Well, it was 45 questions long, it was emailed to you, and the first question was what's your email address, right? So it's just kind of a problem where we've trained people to the fact that we're not going to respect your time, therefore why should I use my time to give you some value? And so I think that we have affected the industry, we as just operators out there looking for customer research, but I think you can retrain people pretty easily, and that's by sending, if you're not going to compensate them, and when I say compensate, I mean compensate on an individual basis, not like, hey, we're gonna give away a prize or something like that.

[17:29] PC: But if you're not going to compensate, you can do a 30-to-60-second survey and be super clear in the subject line, as well as in the actual email copy, that, hey, this is 30 to 60 seconds. The problem is if you go over 60 seconds, typically your response rate does go down, and then when you go over four minutes, and if it's not compensated, normally what ends up happening is the quality goes down, at least in the data that we've seen, and we've sent probably about 40 million of these things at this point, because one of our pieces of software is very survey based. So, long story short, I think you can retrain people to trust that you're going to respect their time, and then get them into a mode of being more than happy to share feedback. Now, there are other ways to receive feedback, even through a survey, right? You can do it on the site, you can do it in a lot of different places, but I think you hit the nail on the head: the biggest thing is you've got to respect the respondent's time.

[18:29] MD: As a follow-up to that, do you think that companies are proactively changing the way they're doing surveys, or do you think there are a lot like the airline I mentioned that just don't care, that just have all these questions they want the answers to, and maybe someone is just being lazy and they're like, eh, we're going to ask them questions anyway, we might as well put them all in there? What do you think is the tendency of the market at this stage?

[18:57] PC: Oh, I think it's pure laziness, and what ends up happening is a lot of people burn their people out, both internally and externally. I don't know. I do think the world of surveys and research is getting better, mainly because the tools are getting better. If you think about, you know, even just SurveyMonkey or Qualtrics or some of these other survey tools that are out there, they've definitely gotten much, much cleaner. But I will say that, for the most part, I think we're probably still in the same mode that we've always been in, and hopefully that is going to shift. I guess what I'm trying to say is I don't know if I have enough evidence to suggest that we are getting better. I know there are a lot of companies that are getting better, but I don't know if it's a global trend quite yet.

[19:51] MD: Right. Sometimes when speaking to people who are conducting surveys, there's a large frustration that I have, which is: if someone opens your survey and says, hey, you know what, I'm going to take the time to answer that, that shows that the customer really finds value in your brand, either because they want to give you negative feedback about something that's really bothering them that they want fixed, so they're trying to give the company an opportunity to solve the issue, or to give positive feedback, which is a lot rarer, because they feel passionate and they're an evangelist somehow and they want to stimulate or encourage the company to keep doing well. So when you get a survey response, it's from someone who actually cares about the brand, I feel, especially when they do it for free, and this is so valuable. Companies should treat this with the utmost respect, and when they don't, I feel like it backfires. It has the adverse effect where you're actually alienating your customer and decreasing their satisfaction, because you're telling them that you don't care about them as a customer and you don't care that they're taking the time to answer you and give you feedback. What do you think?

[21:11] PC: I mean, it is about that experience, right? And I think what ends up happening is we kind of feel like a survey isn't part of the experience. Well, we think that about a lot of things, right? Like, oh, our receipts aren't part of the experience, our transactional emails aren't part of the experience, a survey isn't part of the experience. We somehow believe that this experience is suspended in animation, existing just for when they're using our product, when in reality it's everything around it. So yeah, I think the survey is a perfect opportunity to kind of defend that experience, and to do that you need to make sure that, obviously, you're not attacking the wrong pieces and wasting people's time.

[21:55] MD: How valuable is customer insight for SaaS companies that have subscription plans, that depend on that retention, on building up their user base more and more?

[22:10] PC: It's hard to put a metric on it, but I will say that overall it's extremely valuable. And the reason is because the cycles, and I talked about this earlier, the cycles through which we're building products or acquiring customers online are getting shorter and shorter, and the reason they're getting shorter and shorter is because technology is no longer the barrier to building a company. It used to be this huge barrier, and if you're working on some really fancy stuff like nuclear fusion or something like that, obviously technology is a huge barrier still. I think for most of us who are building a subscription company, the product needs to be good just because there's so much other product out there now. But the swing has happened where all of a sudden the actual product isn't the limiting factor as much as finding that customer, acquiring that customer, making the customer happy.

[23:06] PC: And so you've got to understand that customer, because everything in your business, from your sales and your marketing all the way down to your finance and operations teams, is used either to drive a type of customer to a point of conversion or to justify and rationalize why that product at that price should continue to be paid for or continue to be used. And so if you don't know who that customer is, there's no way in hell you're going to know how to price, what to build, how to keep them, and ultimately how to make them satisfied with your product.

[23:39] MD: Well, not only that, but what they perceive as value in the current market, which is changing so very quickly, right? Disruption and innovation is no longer something rare. It's no longer a rare occurrence; it's actually happening on a day-to-day basis. So you can lose your customer to innovation or disruption at any given moment, and you have to be able to keep up with how their expectations are changing throughout their life cycle with you.

[24:08] PC: Absolutely. Yeah, absolutely.

[24:10] MD: Yeah. Well, let me ask you something. After deciding that you need to do research and creating your survey and asking the questions, you do need to think about methodology. You actually probably need to think about methodology first, and there are a couple of methodologies out there that are more traditional; others are more innovative. Worthix, my company, has a survey methodology as well, but among the main players in the market nowadays with the strongest following, you have methodologies like Net Promoter Score or Customer Satisfaction. You recently talked about NPS and how it correlates to retention, which is something really important in your market. Can you tell us a little bit about your research and what it found?

[24:58] PC: Yeah, absolutely. I think the big thing in our research was basically finding out that, what's interesting about NPS is that it's not an absolute god metric. What I mean by that is we found that NPS does not correlate, meaning as NPS goes up, it does not automatically mean that your retention is going to be better, or that businesses that have higher NPS are going to have higher retention. Now, there's a subtle point here, which is when you have upper-quartile, so really, really good NPS, you tend to also have really, really good retention, but that's not correlation. That's just, you know, a phenomenon that's happening. And so it's a subtlety for folks who maybe aren't as data inclined, but basically what it means is that there isn't a one-to-one correlation, or even close to one, as your NPS goes up. And yeah, we kind of hit a bit of a hornet's nest with that study, mainly because NPS has become this dogma within a lot of organizations.

[26:14] PC: And I wasn't saying that NPS isn't important. I think NPS, if used correctly, actually is exceptionally important, and it's especially useful both on an individual score basis and when you segment it, study it, and compare it over time and things like that. But a lot of people don't go to those extremes; they just kind of think, oh, we have an NPS of this, our product must be amazing, everything must be great, when in reality there are a lot of other indicators that you have to look at to understand if your product truly is great.

[26:44] MD: Well, I think it's because it's just a metric that people started throwing around to answer any questions that they couldn't actually pinpoint. They're like, oh, our NPS is this, our NPS is that. But in your research, which were the areas where you found that NPS was not actually useful, and which were the ones you found were extremely useful?

[27:07] PC: So overall, I think NPS is very useful when you track it over time, and it's very useful when there's signal: if someone gives you a low NPS score, you can reach out to them or treat them a little bit differently than if someone gives you a high NPS score. In addition to that, it's extremely useful when you segment things down. So the NPS of people who are using this feature versus the NPS of people who are using that feature. I think NPS in aggregate is pretty not useful. Like most data, any data that you look at in aggregate is typically not that useful, because you're not learning the differences and dissonance that occur within the data, and you have to segment it down to get value. So I think those are some of the use cases. But I don't know, I want to live in a world where we kind of get beyond NPS, not because NPS is necessarily bad; it's just more around the fact that NPS is one of those things where, when you look at it, it's like, oh, if we built certain models, we could do so much better.

[28:19] PC: There are so many different signals that suggest better retention or better monetization or all types of different things. And it's just that we as humans typically like one number, or one kind of blunt instrument, to measure things, when in reality building a custom model for every single company out there is hard, right? It takes work, and yeah, maybe it doesn't necessarily get us much more than NPS would. But as technology gets better and we're able to measure things on a much, much better basis, that's allowing us to basically evolve in terms of how we measure sentiment and things like that. And that's the future, I think. I don't know how close that future is, but I think that eventually, just by the nature of being able to look at other data, we'll move beyond something like NPS.

[29:14] MD: That leads me to my last question, which is regarding cold data vs. hot data. Lots of times when companies have the initiative to survey their customers, by the time they start collecting their responses or by the time they've processed the responses and actually created metrics on it, that data is no longer hot. It's cold. It reflects the past. What are some solutions to keep that data constantly hot, especially in the market that we live in, where change comes so quickly? Is there some way that we can have like a continuous cycle of fresh data?

[29:54] PC: Totally. I mean, if you use certain products, right? There are a ton of products out there, and obviously you guys know a couple of them. So I think that's really the secret: you've got to be doing this on a continual basis. It's something where, if you treat this as a project every x months, it's going to end up failing. And I think that's why it's so important to keep things moving. When I think overall about how things work inside organizations, it's got to start small. You know, if you start with, hey, we're going to solve our pricing, hey, we're going to fix this big thing, you're going to end up failing for a whole host of reasons, many of them not related to the data or anything. So I would start small. Just start. NPS has a lot of its own problems, but just start collecting NPS, and that's going to cause more questions, and then start doing exit surveys when someone churns, and then start doing a whole host of other things. But just keep things moving, essentially.

[30:59] MD: So get started, and once you start digging, you'll start uncovering new things that'll lead you to dig deeper.

[31:06] PC: Yeah, exactly. Exactly.

[31:08] MD: Awesome. Well thank you so much. I think that's our last question for today, but I really appreciate you taking the time to come on. Patrick, what are some ways that our listeners can reach you or follow you and get to know more about your insights and listen to what you have to say?

[31:26] PC: Yeah. So I'm on LinkedIn. We post a lot of our content through my LinkedIn profile, just at Patrick Campbell. You can also follow the Protect the Hustle podcast that we recently launched; it's going well. Or just email me at pc@profitwell.com. We publish a lot, and I'm always more than happy to help anyone. It might take a little bit for me to get back to you, but I am always happy to help.

[31:52] MD: Awesome.

[31:56] MD: Thank you for listening to Voices of Customer Experience. If you'd like to hear more or get a full podcast summary, visit the episode details page or go to blog.worthix.com/podcasts. This episode of Voices of Customer Experience was hosted and produced by Mary Drumond, co-hosted by James Conrad, and edited by Nic Gomez. Blog copy and summary by Emma Waldron.

 
