Wednesday Apr 13, 2022

Are you targeting the right people? The right way to test audiences. EP-024

Transcript

Blake Beus  0:00  
All right, let's do this. Let's talk about testing. We bring it up a lot, but we realized we haven't talked specifically about how to test. And testing is one of those things, at least I've noticed, that's really easy to go way overboard on and stress about every little intricacy of the test, or to just say, screw it, I'm not going to test at all. I tend to be the kind of person who, when I first started, wanted to go overboard. But I've realized that's not helpful. So let's just talk about testing. And it doesn't have to be just ads; you can test in other ways as well. So let's dive in.

Greg Marshall  0:51  
Here's a simple way that I like to test so you don't drive yourself crazy, and it's actually testing audiences. When I say testing, I tend to think: test the audience. Testing the who, to me, is more important than the what. Meaning, if I can figure out the right audience, then the offer and a lot of the other things on the back end don't have to be perfect. Without naming client names, we discussed someone I showed you earlier, before we hit record, where the ad isn't even built the quote-unquote formulaic way that all the gurus teach. It's very simple and straightforward, and because we're talking to the right person, it works. So when I say testing, I'm referring to heavily testing the audience. Because I know if I can make an audience work with a very average offer, or ad copy, or ad creative, then you just start digging deeper into that audience and go, how do I make the offer and the creative even better? Because now I know I'm talking to the right person. Does that make sense? So that's how I like to view testing. How were you testing in the past? You said you may have been overdoing it. What was your testing protocol?

Blake Beus  2:23  
Yeah, so I have a tech background, and before I left the corporate world, one of the ways we used to do testing was very scientific. We would do split testing on landing pages, or on home pages of our site. We would divert half the traffic to one experience, half the traffic to the other experience, and then we'd measure the results. And we got really good results a lot of the time. For example, on one split test we got a 20% lift for a financial institution, which, extrapolated out to a year, meant an additional $780,000 to their bottom line. But when I hopped over into the small business advertising space, trying to split up audiences to make sure there's no overlap and the test is clean from a statistical standpoint is maddening. And it's not helpful; it's not getting you really great data. Because the audiences are shifting, you're trying to hit a moving target all the time. You've probably noticed this: a test you ran last month will perform differently this month. And I think one of the reasons testing audiences specifically is the main thing you should be testing, like you're saying, is the fluidity of audiences. What I mean by that is, people are entering and leaving an audience all the time. A lot of people don't realize that; I didn't think about it. When I say, hey, I want an audience of people that are 35 to 45, live in this area, and are interested in golf, well, new people enter that age range all the time, people leave that age range all the time, new people in that age range express interest in golf, people stop expressing interest in golf. So the audience is flowing. And when I was trying to run a very scientific test and make sure there was no audience overlap and all those things, I would get results, but then those results didn't scale or didn't perform again. They weren't consistent. If I ran the test again, I would get different results.
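
To make the mechanics concrete, here's a minimal sketch of that kind of 50/50 page split test in Python. It's an illustration under assumptions, not the tooling Blake's team actually used: the hash-based bucketing, function names, and counts are all made up.

```python
import hashlib

def assign_variant(user_id: str) -> str:
    """Deterministically bucket a visitor into A or B, stable across visits."""
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

def relative_lift(conv_a: int, visitors_a: int, conv_b: int, visitors_b: int) -> float:
    """Relative lift of variant B's conversion rate over variant A's."""
    rate_a = conv_a / visitors_a
    rate_b = conv_b / visitors_b
    return (rate_b - rate_a) / rate_a

# A 20% lift like the one mentioned above, with made-up counts:
print(relative_lift(100, 10_000, 120, 10_000))  # 0.2
```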

Greg Marshall  5:02  
Well, I think one way to explain what it sounds like you were doing, and this comes from your engineering-trained mind, is: one plus one should equal two. So essentially, what you're saying is, when you were running tests, you were making the assumption that every time the ad is shown, it's always Blake and Greg. It's the same person. And that would be the correct way to test if the person stayed the same. The problem is, these audiences are changing. Is that kind of what you're saying? Because the audience is changing, today's test is Blake and Greg, and tomorrow's test may be Joe and John. Even if they fit the same exact criteria we're testing, we're not talking to the same person.

Blake Beus  5:54  
Right, they're technically, definitely different people.

Greg Marshall  5:57  
And that's what you're saying, right? You were trying to isolate something that can't be held constant, because the people keep interchanging.

Blake Beus  6:07  
And it was incredibly difficult and time consuming to ensure that the isolation was happening correctly and that the data was getting reported back correctly. And then it wasn't super actionable, because the tests weren't repeatable. You couldn't repeat the results. It didn't work.

Greg Marshall  6:24  
The other challenge I've seen, and I've tested this too, with you many times, with other clients, and myself, because I believe in testing methodologies, you should always be testing, no matter what. One of the things I noticed is that anytime I try to put too many exclusions on an audience, the ad platform seems to have a hard time actually spending the money, or it won't spend it as quickly. So it takes a longer time to get the data.

Blake Beus  6:58  
Or your CPMs go through the roof. So for the amount of money you're spending, you're getting in front of fewer people, and therefore you're getting less significant data, if that's what you're aiming for.

Greg Marshall  7:12  
The cost could be two or even three times more expensive doing it that way. Which would be okay if it was a consistent person, a consistent audience that never changes. But because it's always changing, it might not be the best approach for that particular strategy.
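
As a rough illustration of why higher CPMs hurt a test, here's the reach math, with all numbers made up for the example (CPM is the cost per 1,000 impressions):

```python
# Back-of-the-envelope reach math; budget and CPM values are illustrative.
# impressions = budget / CPM * 1000, since CPM is the cost per 1,000 impressions.
budget = 100.0  # dollars

for cpm in (20.0, 40.0, 60.0):  # baseline CPM vs. 2x vs. 3x
    impressions = budget / cpm * 1000
    print(f"CPM ${cpm:.0f}: {impressions:,.0f} impressions")

# $20 -> 5,000; $40 -> 2,500; $60 -> 1,667. The same spend reaches a third
# as many people when heavy exclusions triple the CPM.
```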

Blake Beus  7:33  
In my opinion, yeah, absolutely. And one of the things I had to realize is that when I was doing split testing on pages for the companies I worked for, the people coming to the page were literally the same people over and over again, because they would log in. It was in the financial industry, so they would log in and do some online banking and such. So it was literally the same people over and over. But when you hop over into the advertising space and you're trying to reach new people you've not dealt with before, the data is fuzzier. Instead of asking, what's working for this audience, you maybe need to shift and ask, how is this audience targeting working with the platform's algorithm? Because the people are flowing through that audience targeting. So if I use this audience targeting inside Google's ad platform, targeting A versus targeting B, how does that targeting perform algorithmically, as opposed to how do those specific people perform? Because those people are moving in and out.
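
In code terms, that shift means scoring targeting configurations rather than individual people. This is a hypothetical sketch; the rows and field names are invented, though most ad platforms export spend and conversion data in a similar shape:

```python
# Score targeting configurations, not individual people.
from collections import defaultdict

daily_results = [
    {"targeting": "A", "spend": 50.0, "conversions": 4},
    {"targeting": "A", "spend": 55.0, "conversions": 6},
    {"targeting": "B", "spend": 50.0, "conversions": 2},
    {"targeting": "B", "spend": 45.0, "conversions": 3},
]

totals = defaultdict(lambda: {"spend": 0.0, "conversions": 0})
for row in daily_results:
    totals[row["targeting"]]["spend"] += row["spend"]
    totals[row["targeting"]]["conversions"] += row["conversions"]

for name, t in sorted(totals.items()):
    cpa = t["spend"] / t["conversions"]  # cost per conversion for this targeting
    print(f"Targeting {name}: ${cpa:.2f} per conversion")
```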

Greg Marshall  8:48  
So it's like a river, just flowing. And testing audiences is what I like to test the most, to discover winning people, or at least winning indicators for the algorithm, and then try new offers once I figure out that this type of audience seems to perform. I don't try to get too deep into why; for whatever reason, these types of indicators, when I feed them to the algorithm, tend to give me the result I want. Once I see I keep getting the same result, I focus more heavily on what I'm showing them.

Blake Beus  9:31  
Yeah, I think that's a good distinction. When you're testing audiences inside an ad platform, you're actually testing how well those indicators work. And that's a much more important thing from an algorithm-based perspective. Are these indicators tagging the kind of people that resonate with this offer? I don't necessarily need to know who the people are, but is this indicator a good indicator that they're my right kind of people?

Greg Marshall  10:02  
Exactly. And I think that's an easier way, and even a long-term way, to train, you know, they say train your pixel or train your ad account. Really, just think of it as: how do I give it the right information? The analogy I give clients when I try to explain what we're doing with their ad accounts always refers back to real people. I say, look at your ad account like an employee. What you're trying to do is tell that employee what you want. And what a lot of people do is not train that employee; they don't tell them, I want purchases, or I want leads, or whatever. They don't give them any information. They send them out blind. It's like if you had a business and said, hey, I want to hire a sales rep to go sell my products, but I'm not going to give him any information; I'm just going to send him out there and hope for the best. That's how most people are utilizing their ad accounts: not trying to find the right indicators and keep communicating to the ad account that those are the right indicators, so it can actually find the perfect person.

Blake Beus  11:19  
Right, and I think that's a great analogy, because that employee is basically interpreting what you say into their own internal dialogue. And the algorithm does the same thing. You've trained people, and I've trained new employees as well, and almost every time I'll say, this is what I want, but their interpretation is different. That's not necessarily their fault; it's because I'm making a bunch of assumptions based on what they already understand. So I have to restate things. But that also helps me further understand what I'm wanting. The algorithm is doing the same thing. You're telling it, I want this particular audience, and it's reinterpreting that to say, well, here are those people, but that may not actually be what you're getting at. And so it's this communication.

Greg Marshall  12:12  
Exactly. And you know the funny thing is, the reason I made this analogy is that there are specific life situations replaying in my mind as I explain this. I worked for a big-box gym early on; it was, I think, my second job, I was 23 maybe. They would send me out to go get people to sign up for the gym, but they would give me no information about what type of person. They just wanted me to go out there. So I'm in the parking lots of Target and Walmart and Best Buy, I was that guy, literally talking to cold strangers, interrupting their day, to get them in. And because they didn't tell me who to talk to, it was highly inefficient. So when I was working there, I figured out a way to basically not do that, and I focused more on referral generation and on being more specific, so I didn't waste seven or eight hours of my day in a place with unqualified people because nobody told me who the best prospect was. So when I give that analogy, I'm literally thinking: whatever you do, do not send Greg out to random places and just say, make it happen. It's not as efficient as going out there and saying, hey, only talk to these types of people, because they would be the right fit for a gym.

Blake Beus  13:39  
Oh man, you just brought back some traumatic experiences. I used to sell auto insurance. We would literally open the phone book, check if people were on the Do Not Call list, and then call them if they weren't. I would do that for hours and hours a day, and I did it for several months. And I'm guessing you could guess how many people I got to sign up for car insurance. Probably very few. Zero. I literally got zero people to sign up for a policy. And that's what I was told to do. They gave me a script to read and everything, and it just didn't work.

Greg Marshall  14:21  
Well, and here's the interesting thing: we can apply that to how people run their ads. They gave you a script; that's the equivalent of making the ad. But you weren't talking to the right person, so the ad doesn't work. It's not that it doesn't work ever; it's that it's not going to work if I'm not giving you a list of the people you should be talking to. And that circles back around. That's basically what you're trying to do with your ad account: you're looking for indicators of audiences, or pockets of people, so you can give positive reinforcement back to the employee, the Google or YouTube or Facebook ad account. You notice when training people that when they get a result you like, you'd better remind them and tell them: I liked this result, keep going after more people just like this, because this is exactly what we want. That helps guide the person to go after more of those people, and then the likelihood of you getting more is significantly higher. You're reinforcing the behavior. That's essentially what you're trying to do in any ad platform, and that's why I recommend testing audiences, or the indicators, as you put it.

Blake Beus  15:41  
Okay. So before we wrap this up, let's talk about how you structure an ad campaign, from a high-level view, to test these audiences without driving yourself nuts like I was doing. I mean, we could talk about specific ad platforms, Google, Facebook, whatever.

Greg Marshall  16:00  
Yeah. The way I like to test audiences is very simple. I just have one campaign tied to one audience, essentially as simple as possible: one ad group or one ad set per campaign. And I label the campaign with the audience, which requires even less digging; I can quickly glance and see how it's performing. Then I don't worry about overlap, I don't worry about any of that. I just think in themes, and I run the test. My goal is really to run as many as possible; the faster I want to know the results, the more campaigns I launch at once, and I do them at low dollar spend. Obviously you're not spending like $10,000; you run a lot of low-dollar campaigns at a time to quickly see what works and what doesn't. Then I take the winning aspects of those audiences, relaunch, and have them go against each other. And I keep doing that to figure out what the right pockets are. But I don't feel like I go crazy, because I run a whole bunch of them with intent. Number one, the budgets are low, so you don't have budget anxiety while you figure this out. And number two, I basically give it three or four days before I do anything. I had to train myself, and I'd recommend anyone else do the same: once you launch them, literally pretend they don't exist for three or four days, and then look at it. That's how I keep myself sane.
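
Here's a minimal sketch of that one-campaign-per-audience structure, assuming a scripted setup. There's no real ads SDK here; the Campaign class, theme names, and budget are purely illustrative:

```python
from dataclasses import dataclass

@dataclass
class Campaign:
    name: str
    audience_theme: str
    daily_budget: float

# Hypothetical audience themes; each gets its own campaign, no overlap rules.
AUDIENCE_THEMES = ["golf-interest", "fitness-interest", "lookalike-buyers"]

def launch_audience_tests(daily_budget: float = 10.0) -> list[Campaign]:
    """One campaign tied to one audience, labeled with the audience name."""
    return [
        Campaign(
            name=f"TEST | {theme}",     # the label lets you judge at a glance
            audience_theme=theme,       # one audience per campaign
            daily_budget=daily_budget,  # low spend means no budget anxiety
        )
        for theme in AUDIENCE_THEMES
    ]

for campaign in launch_audience_tests():
    print(campaign)
# Launch them all at once, then leave them alone for three or four days.
```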

Blake Beus  17:39  
Yeah. And to contrast with what I was doing: I was adding exclusions, and Facebook now has this split test feature, which guarantees the audiences are going to be completely separate. And honestly, that's not for you. Unless you're spending hundreds of thousands of dollars a year, you're going to get worse results using the split testing feature every time. It's really only for big brands with massive budgets, so I wouldn't even worry about touching it. But that's what I was trying to do. I was adding exclusions, trying to guarantee we were testing one specific audience, creating custom audiences and excluding them from one another to make sure we weren't overlapping. And it was absolutely maddening, and the results were all over the place and not repeatable.

Greg Marshall  18:33  
And I think the thinking you're using is the right thinking, in the right setting. It's just that these algorithms, I don't think, support that thinking, because they're kind of unpredictable and they change. I think that's one of the reasons exclusions and isolating things too much can be a challenge: what works in the algorithm today may not work tomorrow.

Blake Beus  19:02  
And we've talked about this before: whenever you're dealing with an algorithm, you want to make it as easy as possible for that algorithm to do its job. When you start adding in exclusions and things like that, you're making it harder for the algorithm to identify which people fit your targeting audience. And if it's harder for the algorithm to do its job, it's not going to do as good a job getting the right people, and your CPMs are going to go up, your costs are going to go up, all of that.

Greg Marshall  19:37  
And overall, there's a trade-off. There's like a silent agreement that I make with myself, which is: am I willing to make this trade-off? I could be absolutely exact, but then drive myself crazy. Or I can be a little bit more loose, and it's not 100% exact, but I don't go crazy. That's like a silent contract I make with myself when I'm testing, so I know not to get worked up about it. In the past, if I didn't get it exact, I didn't feel at ease. And that, you know, we had this conversation before at lunch once, about marketer's anxiety. If you're not intentional about what you're doing, and about what could trigger it and what the risks are, you could literally be in pure panic mode all day long. It'll drive you insane.

Blake Beus  20:42  
Absolutely insane, especially when you're trying to scale and you're spending higher dollar amounts. That anxiety is real, and it can cause you, or your business, to kind of choke and not be able to function, because you're so worried about this one aspect of your business. It's just not worth it.

Greg Marshall  21:02  
So make the trade-off. The trade-off should be that you keep your sanity while you're doing this. Just understand it's not going to be super exact, and that's okay, as long as, from a high level, all the numbers are working and your business is growing the way you want it to. Be okay with that. So that's my advice for testing: make a healthy trade-off, just so you don't drive yourself crazy.

Blake Beus  21:32  
And the other thing I'd like to say about testing, which a lot of people don't think about, is that I think more people need to branch out across platforms. So one thing you could test right now is an audience in Facebook, and that same audience, or as close as you can get, in Google or YouTube or TikTok or somewhere else. Branch out and see. You might be surprised. I mean, you were telling me just today about one of your clients who has traditionally done nothing but Facebook, and you said, hey, let's hop over and run some YouTube ads, and they are floored with the results. Because apparently their audience is chilling over on YouTube, and not so much on Facebook right now. Higher quality

Greg Marshall  22:20  
leads, higher conversion rates, higher average order values, higher potential lifetime value.

Blake Beus  22:27  
And they've been doing stuff on Facebook for years and years, right? They've been running

Greg Marshall  22:30  
Facebook ads for four years. So it's not like they did Facebook for one month and then switched because it didn't work in one month. They've had success with Facebook ads for years. But they noticed their CPMs were getting higher and the quality of the person they were reaching was different from their target. So I suggested they make a move and at least test another platform, and he's getting great results. I think the moral of the story is, that's why you should always be testing, and never slow down on testing platforms. If one platform isn't working, maybe revisit it a couple weeks later. But you should always be testing different platforms and different audiences, so that if something changes, you're able to move and it almost doesn't impact you. It's like diversifying.

Blake Beus  23:25  
All right. And the last thing I really wanted to bring up, and this is my opinion, and I actually haven't asked you this before: I feel like when people first hop into ads, one of the very first things they want to test is something like men versus women, or age ranges, you know, 35 to 45 versus some other range. And in my opinion, unless your business is very specific to the needs of an age or a gender, that is like the worst way to test an audience. What do you think?

Greg Marshall  24:04  
I would say it depends, right? Because of the type of product. If it's a women's product, obviously you want to test women, although men might buy it as a gift, and I've tested that. But typically, whatever the product is, the way you test it, you've got to figure out: could both genders purchase it? And the other question you have to ask yourself, and this is why I sometimes test these types of things against each other, is: am I making assumptions that are incorrect? I've made assumptions before on some ads, like, I'm going to exclude other placements, or only run it on mobile, or only on desktop, or not on TV, whatever. I've found that the more I keep it open, the better, because I'm not forcing it. Exclusions are kind of the same thing: I'm not telling the employee to only do one thing. I'm giving it breathing room and trusting it, the way you should trust an employee, to figure it out. And that's what I've found: the more you can keep it open to both genders and most age ranges, the better I've seen it perform.

Blake Beus  25:24  
Yeah. So what we're talking about specifically here: interest-based audiences are a good place to start. Each platform has its own unique options. Within YouTube, you can test based on what channels people are interested in, so if there's a channel with a topic that's completely relevant to your product, you can test based on that. Facebook has other interest groups and things you can test, like targeting based on income. Facebook kind of sucks at that, but Google tends to be better.

Greg Marshall  26:03  
I feel like that's an understatement. It's significantly more accurate than Facebook.

Blake Beus  26:09  
Yeah. So literally look into those targeting options, at least get familiar with what's available on each platform, and come up with some creative ways to test those.

Greg Marshall  26:20  
So yeah, I guess to end this podcast: make sure you're testing, and test often. That's the main job of a media buyer.

Blake Beus  26:31  
All right, well, Greg, how can people get in touch with you?

Greg Marshall  26:33  
Go to my website, gregmarshall.co, and you can book a free strategy session call.

Blake Beus  26:37  
And for me, just go to blakebeus.com. The SM3 Group, for social media marketing, is a group marketing program where we help put together posts and things for you. That's the main thing.

Greg Marshall  26:50  
And be on the lookout: Blake and I have some cool plans coming for that program.

Blake Beus  26:53  
Yes, we do. So be on the lookout, and we'll let you know when those things drop. All right, we'll

Greg Marshall  26:58  
talk to you guys later. Okay. Bye.

 
