How to Leverage Machine Learning For Advertising: Working WITH the Robots

As advertisers, we interact with data-driven automation, machine learning, and artificial intelligence on a daily basis. Google Adwords, Facebook Advertising, Perfect Audience, and other major ad platforms all use advanced technology to drive results. How does it work? And how can you as an agency owner or marketer truly use these tools to your advantage? Watch the webinar and Q&A below to understand the basics of Advertising AI – no experience or background required.

Speakers:

Shiv Gupta
Founder, U of Digital
Previously: AOL, Criteo

Dr. Dan Becker
Founder, Decision.ai
Previously: Kaggle (Google), DataRobot, FTC Economist

Jalali Hartman
Director, Perfect Audience A.I. Lab
Previously: Robauto.ai, Yovia, MarketingExperiments

View the slide deck:

AI-Lab-How-to-Optimize-for-AI-Webinar-Slides

Transcription:

0:00

Hey, everybody, happy Thursday. This is Jalali Hartman, I'm the Director of the AI Lab for Perfect Audience. We're just giving everybody a second to log on here, and then we're going to get started. So, thanks to everybody for coming. Very excited about today's topic and this panel.

0:26

Just to give everyone some background on why we put this together — in pretty short order, I might add. I saw this article that was written a couple of weeks ago in AdExchanger and I was fascinated by it, and I sent it over to Jalali, who heads up our AI lab. The subject of the article was really what caught me. It's like, you know, calling BS on AI. I've got to click on that, I've got to read it.

0:51

And the problem is real, right? You know, machine learning and AI are hard things to get right, but they're extraordinarily valuable when you do. It's a mix of humans and machines that makes this thing work and gets it right. So we wanted to bring on some experts, and we've got two or three of them on the phone now. Dr. Dan Becker is a leading expert on machine learning — he has really helped innovate and educate thousands through Google and the FTC, and, you know, the list goes on.

1:34

And Dan has worked with different types of iterations on TensorFlow. Just, you know, the kind of background and the kind of mind that we get to bring to the table on these events is amazing to me. So, thank you, Dan, for joining us. And Shiv Gupta is also here — he's sort of old-school advertising. You know, he came out of AOL and Advertising.com and Criteo. So he knows the subject inside and out, has been in the industry for quite a while, and has been sort of leading the charge on some of this stuff. So, appreciate you also joining us.

2:16

So let's just start right in, let's get right into it. So, Shiv, can you tell us a little bit about why you wrote the article, why you're calling BS on AI, you know? What was the thought behind that?

2:36

Yeah, for sure. So first of all, thanks for having me on. Hi everybody, excited to be here and excited to be talking to you guys about this topic. You know, to your point, Eric, I've always been in ad tech, and I've primarily been on the sales side of ad tech — I was selling at Ad.com, at AOL, at Criteo. And frankly, when you're selling in this industry, you figure two things out pretty quickly. You figure out the things that you're talking about, and in what ways they are genuine and sincere, and in which ways maybe they're a little misguided or a little bit of misdirection. Then you very, very quickly figure out what the competitors are doing as well, in terms of misdirection, right? Because that's your job — you're selling against them all the time. After leaving Criteo, I started a company called U of Digital, and we are focused on bringing structured education to the space. In that effort, I've been able to have a much more objective viewpoint of the industry, without sitting at one company and having to sell one technology or solution. One of our missions at the company is to make digital advertising a better space through education, right? And so that's kind of what led me to write this.

3:51

I constantly feel like part of our mission is to illuminate some of the things in the industry that I think are troubling and causing inefficiencies. And one of the main things I talk about in this article is the difference between real AI and misdirection, right? So that was one of the main reasons I wrote it. I'm excited to be here in particular because, with Jalali on the line and with Dr. Dan Becker on the line, you guys are actual experts, right? I've just been in ad tech — I'm an ad tech guy. I have some thoughts and opinions that I wrote about in this article, but I'm excited to learn from you guys because you're knee-deep in the AI itself. I have a bunch of questions that I'll probably be bringing up throughout today about definitions, and the differentiation between things like neural networks, and machine learning, and AI, and deep learning, right? There are all these terms that the industry loves to throw around, and I'm excited to just learn more from you guys here today.

4:51

Awesome. And so we have Jalali on the phone, too, as many of you already know — and it looks like people keep joining the webinar. Jalali heads up our AI Lab here at Perfect Audience and has really driven the technology forward as we've been building some of these things out over the last nine months. So we're super excited to have him on board. So Jalali, you want to kick us off here?

5:23

Yeah, so, thanks for opening up. So, Dan is on here, and Dan is someone that I've known for, actually, my whole career, and he's brilliant — in particular at machine learning. He's one of the foremost experts, I would consider. I wanted to have him on here because it's not necessarily just about looking at this from an advertiser perspective, but about how this actually works, so that maybe we as a community can get a little bit more aligned in terms of understanding what we're even talking about. So, before we get going, one quote that really stuck out to me and got me started on this years ago: the CEO of Google came out and said he thought AI would be more impactful than electricity.

6:01

If you really think about that, that's a tremendous statement. What I think is that it's actually probably bigger than that. We also are kind of using it the wrong way, or thinking about it the wrong way, or have misperceptions of how it works. So with that, I'm going to turn it over to Dan. Dan's going to do some definitions, and he has a quick quiz — hopefully you had your Wheaties this morning. He's going to go through some basic terms and show, from his perspective, what is and what isn't possible.

6:26

So I know there have been a few questions about definitions. I have on the lower right here a Venn diagram of AI, machine learning, and deep learning — I'll add neural nets to this in a moment — because people just want to know what these things are. Starting on the outermost part of this: AI. Unfortunately, AI has a definition which is ambiguous, and different people have different definitions, in contrast to some of the things further into the circle, which are specific types of AI with very concrete definitions. I and many others now think of AI as just: you get a computer to do something that's pretty smart. And that's not a terribly useful definition — different people might disagree on what's smart or not smart — but it's a pretty common definition, a lot of people in research are now using it, and it's easy for us to understand. Let's get more concrete, though.

7:28

Maybe between 90 and 95% of AI applications today are using machine learning, and that is something which is really well defined. After this slide, I'm going to show you a couple of examples of what machine learning is: what are the parts of it, how do they fit together. But machine learning is really a way for us to look at patterns in historical data and use that to make predictions prospectively. I'll go into more detail about that. The other term you hear frequently is deep learning. Deep learning is a type of machine learning — it's just a very specific way of looking at the patterns in historical data and using them to make predictions.

8:11

The thing which is potentially special about deep learning — there are probably two main things that are important to know. The first is that we've tried machine learning with images and with video, really with visual information, for decades, and we've only recently become reasonably good at it with this particular type of data, images and video.

8:41

And the same is true of text and audio. We've really only become good at this in the last seven years for images, and even less than that for text. And that's because of deep learning techniques. Again, it's just using historical data to make predictions about how to classify new data. You also hear about neural networks. Neural networks — I think that's really a synonym for deep learning. Once upon a time, decades ago, when they were trying to figure out how to make this work, it was inspired by connections to how they thought the human brain worked. Now, most people who really know what they're doing think that's not a useful analogy. It's just a term that's left over from the past, and you should be careful not to get too sidetracked when people talk about it as, oh, it's like a replacement human.

9:32

It's not. It's doing something which you could almost call pattern recognition. If you go to the next slide, I'll show you the parts of how machine learning works — and deep learning is not that much different.

9:45

So this is just a table of data. You might store it in a database, you might store it in a CSV file, you could even have this type of data in a spreadsheet. And there are a few terms that are really essential to getting the right mindset for how machine learning works. First is the unit of observation — in this case, a row. It's just: how do we divide up the data when we want to make predictions? What's the unit that we're using to make predictions? In this case, we could say it's each row, and each row is the behavior of a given customer on a given date. So these first two columns are how we define the unit of observation. In deep learning, frequently it's one image — we make a prediction for one image at a time. Whenever you're making these predictions, you want to make sure that you're crisp on what that atomic unit is: every time we make a prediction, we look at one unit and ask, what are the patterns across these units?

10:57

The second piece is — I've said that this is useful for prediction, so we need something called a prediction target. In this case, because the unit is a given customer on a given date, that could be: how much is this customer going to spend on this date? Spend would be the prediction target. And the last piece is everything that is used to make that prediction. So here we've got some historical data — a bunch of customer-date pairs, and for each of them we have how much they spent — and then we've got different things that we could use to find patterns that help us predict spend. And on the next slide I'll show you the simplest algorithm that I know. It's really not the best way to make these predictions — it's one of the worst ways — but I think it gets the ideas in your head.
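
To make that concrete, here is a minimal sketch in Python of the kind of table Dan is describing — the column names and numbers are hypothetical, not taken from the webinar:

```python
# Each row is one unit of observation: one customer on one date.
import pandas as pd

history = pd.DataFrame({
    "customer_id":     [101, 101, 102, 103],             # with `date`, defines the unit of observation
    "date":            ["2020-05-01", "2020-05-02", "2020-05-01", "2020-05-03"],
    "previous_visits": [1, 2, 5, 0],                      # features: anything we can use to find patterns
    "previous_spend":  [0.0, 7.0, 40.0, 0.0],
    "spend":           [7.0, 0.0, 11.0, 0.0],             # prediction target: what we want to predict
})

features = history[["previous_visits", "previous_spend"]]  # inputs to the model
target = history["spend"]                                   # what the model learns to predict
```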

11:49

Alright, how could we use historical data to find patterns in it and then make predictions moving forward? This is something called a decision tree. Again, it's a pretty naive way of doing machine learning, but we start with 50,000 customers, and then we've got an algorithm which breaks them into different groups based on their behaviors, and does that in a way which is sensible. The first time we split our data into two groups, we say: alright, for each row in that dataset, did they visit more than two times? If so, we have a group which is on the right. If not, we have a group that's on the left. So here you have 10,000 people who, on the date of their visit in that dataset, had made at most two previous visits to that site. If we want to make a prediction for someone in that bucket of users, we just take the average value of the people in our historical data that went into that branch of the tree.

12:55

If you need to make a prediction for how much a person will spend when they come to our website today — well, they've made fewer than two visits, so they go down that branch of the tree, and our prediction is seven bucks. You can also have some nesting here, and in many cases this will nest more deeply. We say: well, we have forty thousand people who made more than two visits; within that group, what's the total of their previous purchases? And then, of people who made more than two visits but had never made a previous purchase, on average, when they came on any given day, they spent $11 on that day. And that's how we would make a prediction for someone in this group. Again, we can do things that are much savvier — we can build a bunch of these trees that split in different ways and then take averages of those. So there are better ways to make these predictions, but at its core we're just finding the patterns in the historical data that we collected.
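
As a rough illustration of the decision-tree idea Dan walks through, here is a short sketch using scikit-learn; the data is made up, and the numbers are only there to mirror the "$7" and "$11" buckets from the slide:

```python
# A shallow decision tree: it splits historical rows into buckets (e.g. "fewer
# than ~2 visits") and predicts the average spend of whichever bucket a new
# visitor falls into.
import pandas as pd
from sklearn.tree import DecisionTreeRegressor

history = pd.DataFrame({
    "previous_visits":    [1, 0, 3, 5, 2, 8],
    "previous_purchases": [0, 0, 0, 2, 1, 4],
    "spend":              [7, 7, 11, 30, 11, 55],   # illustrative numbers, not real data
})

tree = DecisionTreeRegressor(max_depth=2)            # shallow, like the slide's example
tree.fit(history[["previous_visits", "previous_purchases"]], history["spend"])

# New visitor: 1 previous visit, no purchases -> prediction is the average spend
# of similar historical visitors (roughly $7 in this toy data).
new_visitor = pd.DataFrame({"previous_visits": [1], "previous_purchases": [0]})
print(tree.predict(new_visitor))
```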

13:52

And then from that, when we need to make a prediction for someone, we look at this model and say: alright, given those patterns, what's our prediction? What's been common among people within that bucket of behavior in the past? That's how we make predictions. So I think that gives you the structure.

14:15

On the next slide, I want to just briefly make a clarification about what these models do and what the limitations are. So you saw on the last slide, really what we're doing is prediction. There are many things where you go, alright, it's really doing something quite clever, it's not obviously about prediction — but in machine learning, in around 99.9% of cases, someone has found a very clever way to use prediction for something that seems unlike prediction. Let's say a chatbot: that seems like it's not prediction, but actually it has collected historical data about how a person responded to given text, and then it responds in a similar way. So it's predicting how a human would have responded. Probably the most famous model is AlphaGo, the program that could beat the best humans at the board game Go — it was predicting, for each potential move, am I going to win if I make that move?

15:26

So it's really focused on prediction. In many cases, prediction alone is not enough to tell you what you should actually do; you need to bring in other sources of knowledge, and in many cases that's knowledge that lives in someone's head. You know, what's our markup? Maybe that's something where you could go and ask someone, but it's not in your predictive data. And so to go from "I can do something predictive" to "I can do something which is prescriptive and helps me achieve some business-level KPI" — that's technology that really is not widespread. It happens to be, you know, I founded Decision.ai, and that is exactly a tool for this: we've got some models that make predictions, but then I can say, what are the other sources of data I have, what are the other sources of knowledge I have, integrate those, and try to come up with better decisions that optimize bottom-line business KPIs.

16:23

So let's do a quick quiz. Hey, Dr. Becker, do you mind if I ask a quick question here before we get into the quiz? I think the previous slide is a really important point, right? Machine learning is predictive in nature, versus prescriptive. And one thing I was trying to get at with the article I wrote was trying to figure out where to call BS, right? Pretty much everyone in the advertising space or the ad tech industry is out there touting some sort of machine learning or AI type solution. But my personal belief is a lot of them don't really have that — it's more smoke and mirrors, and maybe there are some very, very basic algorithms behind the scenes as opposed to actual AI or machine learning. So I would love to hear from you as the expert. I think predictive versus prescriptive is really helpful, but are there any other ways — how would you specifically advise folks to make the distinction between AI and, you know, maybe just a basic algorithm or some very basic calculations that are happening in the background?

17:30

Yeah, and you know, this comes back to when I had that Venn diagram. I tend to think that we over-emphasize the importance of whether something is AI or not. You know, I've got a problem, and — if I'm driving in a nail, doing construction — someone could say "I have an AI solution" or someone could say "I have a hammer," and I actually don't care whether it's AI or a hammer, I just care: is it going to get this thing built? And so when you say, is it BS or not, I think we are better off calling BS or not BS on: does it solve this problem well?

18:16

Yeah, and whether the thing that powers it is AI or not AI — I think that could maybe be academically interesting, but I actually think it's frequently a mistake for us to even worry about it. If someone says, "I've got this thing and it's AI," that's probably not enough for me to say I should buy it — it could be AI, it could even beat the best human at Go, and that's not going to help you, you know, sell more. Probably not. Yeah, so I absolutely agree on the outcomes piece — that's the most important thing, that's what ultimately matters. And honestly, because there is so much misdirection, not just around AI but around a lot of things in this industry, I think buyers more than ever need to just be focused on: hey, does this solve my problem? My only counter, my only caveat, would be: if you're a buyer and you have a pick of, you know, 10 different solutions to solve one specific problem, and you can kind of tell that one of those solutions, or many of them, have a tendency to misdirect more than some others, I think that could be a part of a decision-making process, right?

19:30

Because I think, even though so much is automated, so much in the industry is also still relationship driven and trust driven, right? And so that's the only reason I ask. But I absolutely, 100% agree with your point that buyers should be focused on outcomes more than anything.

19:47

I was recently at a pitch competition, and every startup at the end of it kind of said, "Oh, and we are developing AI," or "we use AI," and it's that kind of thing that I think we have to be careful of, right? Because when you ask them what it does, or what it's going to do, they really have no idea. So I think that's to your point, Shiv — in advertising you do have to make these decisions, and one of the things we're really trying to do with this project in general is break it down to what really matters. That is getting a profitable conversion, right? Getting real business out of the marketing spend. So that's kind of what we want to focus on, but there's a lot of work that goes into it.

20:27

So okay, back to you, Dan — am I in the right place again? Yeah. So, Shiv sort of touched on this issue of predictive versus prescriptive. I think there are cases where good prediction is enough. Let's think about two different websites. One: I'm going to predict how much someone's going to spend today, and they're never going to come back — they're buying something, it really is a one-time purchase. If I can predict how much they're going to spend today, that's probably actually a good measure of, you know, customer lifetime value — it's the same thing. If my markup's 100%, there's not a lot of other information I need to bring in; just a prediction could be enough. Now, if I have another business and I've got a lot of different products that I sell, and they have different markups, and I'm trying to build a relationship — then the value of having someone make a purchase today is not just the value of that purchase, but the fact that now they're familiar with me. They've got some other customer lifetime value, and even the value of forming that relationship depends on the way that I interact with them in the future.

21:55

Now, this sort of thing is not easy to predict. There's no simple answer to that, and that's when you say we need to do something which embeds a lot of other knowledge, and there's no way to automate that. If a technology is being sold to you for this, well, it needs to reach inside your head and say: how are you going to change how you market to your customers in the future? Someone who says "we're going to tell you the customer lifetime value," and they're not getting that information out of you — it just can't be done; it even conceptually can't be done.

22:33

Interesting, okay. So do you want to do a quick quiz? Yeah — I think we always learn best by thinking about things hands-on. So I'm going to ask a couple of quick questions, give you three to five seconds to think about each, and then I'll just tell you if it's possible or not possible. All of these quiz questions are: do you think this can be done? So the first one: predict how much someone will spend with you in the next six months. Can this be done or not?

23:05

Predicting something like this is probably — it's not high tech, but it's probably been done for a super long time. In marketing, it's kind of the canonical use case. So it absolutely can be done. What is the way that it's done? We look at a bunch of people who made their first purchase more than six months ago. For each person we collect data and say: how much did they spend in the subsequent six months? That's our training data. We use patterns from before that period to predict, so when someone new comes to our website, we can make a prediction. So this is something everyone on this call might be familiar with — very possible, okay. Next: predict whether a user will purchase a product that you just added to your inventory for the first time. So, a totally new product — can you make predictions about who's going to purchase it?
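
For readers who want to see how that training data might be assembled, here is a hedged sketch assuming a hypothetical orders table of customer IDs, order dates, and amounts — none of this comes from the webinar itself:

```python
# Build a "spend in the six months after first purchase" label, but only for
# customers whose first purchase is old enough that the full window is observed.
import pandas as pd

orders = pd.DataFrame({
    "customer_id": [1, 1, 2, 2, 3],
    "order_date": pd.to_datetime(
        ["2019-01-05", "2019-03-10", "2019-02-01", "2019-10-01", "2020-06-15"]),
    "amount": [20.0, 35.0, 15.0, 50.0, 10.0],
})

first_purchase = orders.groupby("customer_id")["order_date"].min().rename("first_purchase")
orders = orders.join(first_purchase, on="customer_id")

# Keep customers whose first purchase was more than six months before our latest data.
cutoff = orders["order_date"].max() - pd.DateOffset(months=6)
eligible = orders[orders["first_purchase"] <= cutoff]

# Label: total spend within six months of that first purchase.
in_window = eligible[eligible["order_date"] <= eligible["first_purchase"] + pd.DateOffset(months=6)]
six_month_spend = in_window.groupby("customer_id")["amount"].sum()
print(six_month_spend)   # customer 3 is excluded: we can't see a full six months yet
```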

24:00

Alright, I'll give you a few seconds to think about it. The answer here is mostly that you cannot. Maybe you could use a stand-in and say, "this is similar to a previous product and I'm just going to ignore the differences," and then ask whether they would have purchased the previous product. The thing that Amazon does, because they're at such a large scale, is they say, "we're just going to recommend it to a million people over the next day," and now they do have historical data. But for someone who operates at a smaller scale, for a totally new offering, you just need to run experiments and see who's going to buy it. Because the historical data doesn't exist, ML or AI can't help with that.

24:45

Awesome. And we're just about to get into a Q&A, so if you have questions for any of these folks, just start posting them into the chat window. So I'm going to go through — actually, this will be one more. Last one, right? Yeah: find a description that will cause the user to tell their friends about your product. That sort of organic growth is hard to measure, and as a result it's hard to predict with AI.

25:27

I'm kind of hearing that scale is important — so you have lots of data — and a little bit of human input is important. Does that kind of summarize it? Yeah. And being able to figure out what the previous pattern is, and to make the decision about what the right prediction target is.

25:49

We can skip this one and keep moving, okay. So that's interesting — there are some questions coming in, and hopefully you guys are getting a general overview of what machine learning is in general. I want to just point out a couple of problems that I see. The first thing is that everybody's a black box: there is AI running, but you don't necessarily know what it is, right? So you're trusting that Google is driving the best possible conversions for you. Who knows, right? That's one problem. I'm not saying that's good or bad — everybody has a right to their technology — but that's one problem with this. The second thing is, we get very confused — and I think this is one of Dan's points — because you can get correlation and it's not causation. Shiv talked about this in his article, which is on AdExchanger — we'll give a link to it. The classic one that I've seen for years in stats is that ice cream sales correlate with shark attacks. Well yeah, they have similar patterns, but they're not necessarily related, right?

26:48

The other problem that I run into all the time is that it just doesn't work like it's supposed to — it doesn't work like you want it to. Here's an example. This is Google's image classifier, which is an amazing tool: you upload an image and it can tell you all kinds of stuff about it. You can see, this isn't a chart, this was a bracelet, right? So it's doing its thing, but if you're not involved in it, and you're not watching, or you don't know what you're even looking at, you can get some real false positives. So real quick, I'm going to show you an example we did — there are probably some SEO people on here. My first exposure to big data was trying to solve Google's SEO. They obviously have an algorithm, and it's fairly simple: it's based on who else thinks you're popular. Basically, I just want to show you what it looks like behind the scenes, so you don't get so mystified by all this stuff, right? This is just Python running an open-source library called NumPy, which helps you with big arrays of data, and it's running that algorithm, right? So that's what it kind of looks like on the back end. And when you put it together, you can start to see what some of these players' algorithms are — this is actually off Facebook's website.
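
To demystify what "Python running NumPy on that algorithm" might look like, here is a toy, simplified PageRank-style power iteration — an illustration of the general idea, not Google's actual ranking code:

```python
# Rank four hypothetical pages by "who else thinks you're popular".
import numpy as np

# links[i, j] = 1 means page j links to page i.
links = np.array([
    [0, 1, 1, 0],
    [1, 0, 0, 1],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
], dtype=float)

transition = links / links.sum(axis=0)   # each column becomes a probability distribution

damping = 0.85
rank = np.full(4, 0.25)                  # start every page with equal "popularity"
for _ in range(50):                      # power iteration until the ranks settle
    rank = (1 - damping) / 4 + damping * transition @ rank

print(rank)                              # pages with more/better inbound links score higher
```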

27:51

But they're basically just saying: when you go and try to create an ad, we're going to charge you based on what other people have done with similar ads, what we think the action will be, what you're bidding, and how good the ad is, right? So they have four or five variables behind that. They have all kinds of information from past ads, and they're using that, combined with how your campaign performs, to try to optimize. So at Perfect Audience — and this is not a Perfect Audience pitch by any means — we have an intelligent engine that's designed to optimize your ads. It shows ads to people that have already visited your site, and it goes out and finds new ones, and so on. The way ours works — and I just want to be as transparent as possible, so you can see this example — it starts, to Dan's point, with an advantage that we have.

28:38

We sit in the middle of hundreds of millions of shoppers every month, so we have some data, right? We know roughly who these people are; we have audience data built up, we have activity data. But then what the engine does, simply, is take out the stuff that's not working, right? It says this ad was shown here and it never did anything, so let's remove it, and it just goes through that systematic process until it gets better and better results. And the nice thing about this — and this is how it should be — is that there's a hugely complex algorithm running in the background in real time, and you don't have to know anything about it. I don't even have to know anything about it. I have to know what I'm trying to optimize for, and I have to make sure it's set up in a way where it can work, right? So if I'm optimizing for a conversion goal, for example, but I never get any conversions, maybe my conversion goal is too far down the funnel and it never fires in the early part of the campaign.

29:30

Then it can't train and learn, so that's a problem. And as a marketer, as you're thinking about it — this is a real example, a Fortune 500 customer of ours. They basically have all these campaigns running all these different ads, and for years and years they'd been trying to manually optimize and find the right combination. Then, boom, you just turn on the AI, and you can see where they're at: they were at about a 25 average CPA before, against a target of 10. You can actually see there was a system error where I turned it off, and then it started to drive it back down — and then they introduced new products, so there's a whole new funnel now, right? The model just suddenly changed, and that changes the whole thing. But that's the basic idea: it's looking for that conversion goal, in my case, and asking which impressions are most likely to tie back to it, using a mix of stuff it already knows and stuff that it's learning as it goes. The reason I show this is two things: one, we could take an advertiser off the street and get them optimized.

30:34

It works faster and better the more data you feed it, right? It just needs information, and it works better if you have somebody actually managing it and figuring out what the right target is, and so on. Here — you may have seen this ad as you were searching around. It was designed to follow people that came to our site; you can kind of see it. It had about 10 days to run — not really enough data to kick in — but the blue is the clicks and the red is the conversions, same spend, same ad. So it's starting to figure out what's working, right? If I'd run this another couple of months, or with a much bigger set of data, it would have worked a lot better.

31:10

Just as an example, you guys probably all do Google AdWords or Facebook Advertising or something like that. The title of this webinar was how to optimize, so we want to hit on that here — and we're just about ready to get into questions, so post yours. When you're running Google, this is something to think about. This is where Eric and I — and Kathleen, who's on here and in charge of marketing — are all working on this ourselves, right? We have this campaign, Google's driving traffic, and we've been moving that conversion goal further and further down the funnel. When it started, it was optimizing for leads. Then we found the leads were kind of fraudulent, a bunch of junk, right? It was like a false positive. So then you move it to sales, and now we're all the way down to: the person actually logged in to the account and used the system — that's what tells the AI, in Google's case, to optimize for that. So when you're setting up this stuff, that's something you can do.

32:02

So we're going to let you guys ask your questions now. This is all going to be available as a recording — we're going to send it around to everybody, along with the deck — but the big thing is to start with your goal, right? Start at the top of your funnel. I just got off a call with a new advertiser: they have a six-step registration process and they have no data yet. So the recommendation is, let's turn on the AI and make the conversion goal simply that they visited a key page, right? So we start to get lots of results, and then we start to test and incrementally change channels. And I think this is one thing we get a lot of questions about: attribution in advertising — Facebook got the credit, but Google drove the lead, and they saw our banner — it's a spaghetti bowl. The way you get around that is you just systematically start testing different funnels one by one.

32:49

So anyway, a couple of tips like that. I want to go ahead and open up the questions here — we've got some coming in, and you guys can just hop on these. The first one is from Brian. Brian asks: at what point do you determine a model has to be retrained? Is it from a set deviation from the accepted values? Dan, do you want to take that one?

33:12

Yeah, I mean, the most important part is that you have a way of tracking the change in model accuracy over time. It's quite common, unfortunately, for people to build a model, then deploy it, and it's getting used, but they're not able to bring the data back in about whether it's still functioning well. So the most important part is to have the infrastructure so that, if your ads are not performing as well, you are aware of that. And then the second part — alright, I can see whether it's performing worse or better — that's just a judgment call. There's no simple answer or threshold where you say above a certain number is okay and below that number is not okay, or vice versa. Okay, Eric, I know you had some questions for Dan.
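
A minimal sketch of that kind of monitoring, with made-up numbers and a threshold that is itself a judgment call rather than a rule:

```python
# Keep scoring the deployed model on fresh labeled data, and flag retraining
# when recent accuracy drifts too far below the accuracy measured at deployment.
import numpy as np

def weekly_accuracy(y_true, y_pred):
    """Fraction of correct predictions for one week of fresh data."""
    return float(np.mean(np.asarray(y_true) == np.asarray(y_pred)))

def needs_retraining(weekly_scores, baseline, max_drop=0.05):
    """True when the average of the last four weeks has dropped more than
    `max_drop` below the deployment-time accuracy `baseline`."""
    recent = np.mean(weekly_scores[-4:])
    return recent < baseline - max_drop

# Hypothetical history: accuracy at deployment was 0.81, recent weeks are slipping.
scores = [0.80, 0.79, 0.78, 0.74, 0.73, 0.72]
print(needs_retraining(scores, baseline=0.81))   # True -> time to look at retraining
```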

34:13

So an interesting thing about this is that Eric is the general manager of Perfect Audience, and he's kind of tasked with deciding how the machine learning, how this product, should evolve. So I know you had some questions while we have these guys on here. Do you want to ask those while the rest of the people are coming on? Yeah, I think I'd be remiss if I didn't ask the experts, you know, and it helps us, right? I mean, we're very deep into overhauling some of our own underlying tech on the product side. And it's interesting — earlier on, Dan, you were sort of talking about how to compare these things, and one of the things I was thinking about is that historically we — and when I say we, I mean Perfect Audience — have been sort of guilty of putting some fairly sophisticated algorithms in place, right? But nothing overly sophisticated. So we're guilty, you know, of sort of playing the AI card, and that really bothered me a little bit.

35:30

And of course, when I say we — SharpSpring, which is the parent company of Perfect Audience, acquired Perfect Audience last November; we closed November 21st. When we came in, we looked at the business and said this is a great business, there are tons of really fundamental things going on that I'm really excited about. But one of the things that we really wanted to overhaul was what was driving the conversions that our customers were getting. So we've spent a lot of time and resources and money and development hours on a newer version of the underlying engine that drives Perfect Audience — maybe Jalali will talk a little bit about that. But Dan, I guess one of my questions to you was: it's not necessarily binary, right? In other words, it's not just that either you are doing it or you're not; there are also degrees to which you are more and more effective. And over time, as other tech catches up and other competitors get better and better, you have to be able to use AI if you're going to be able to compete, right?

36:57

You have to utilize these large data sets, and I remember you and I were talking, I don't know, four or five, six months ago about this exact subject. So I was just curious what your thoughts were — what does that look like to you? What should I, as the GM of Perfect Audience, be focused on right now?

37:17

Yeah, and it's going to come back to what Shiv was talking about earlier. Maybe the distinction is not "where is there AI" — you don't want chasing AI to drive your business goals. You want to say: alright, I've got some funnel, what are the metrics that we want to move? And then AI is going to be in service to those metrics. Whether it's conversion rates or CPA, whatever it is, you're going to say: here's the decision I want to make better — maybe that's where do I place an ad. And I do think there is room at that point to say you've got all this data — I mean, you guys are in a unique position to have that amount of data — and some of it is a prediction problem: which ad and placement is going to have the highest click-through rate, or whatever it is you're optimizing for.

38:23

And so I think of it not as starting with "what AI do I want," but rather "what business goals am I trying to move," and then after that the AI should fall into place — or, you know, what is it that I need to predict, because that's where machine learning is so effective. What is it that I need to predict so that I'm moving those business goals? I love that. I think there's something, too, about being able to drink your own champagne, right? For our own campaigns we use Perfect Audience, and so figuring out what's working and what's not working, being able to iterate, and then building around the KPIs that are the most important to us. If I put my marketing hat on for a second: what are the most important things for me? And then those things translate across every other marketer.

39:20

That's great. So, Shiv, just shifting the subject real quick: you were a VP at Criteo, right, so you have years of inside knowledge of the industry, and you've seen the shifts that have occurred in the landscape over the last few years especially. So what do you think people need right now? What should people be working on right now?

39:55

Yeah, that's a big question, Eric. We could probably spend a few hours on that one, but I'll try to keep it short. We actually run workshops where we walk through trends and things that are important right now in the industry. I think top of mind for everybody right now is data privacy, right? Data privacy is obviously super important to consumers, the government is talking about it a lot, and we have some regulation now in California and in other states as well. So as data privacy continues to evolve the landscape, it's going to affect how everybody advertises online, right? That means third-party cookies are going away, that means the IDFA from Apple is going away, and it impacts all the targeting, the measurement, the planning, the way buyers buy online today. So data privacy is super top of mind. Go to any one of the big trades on any given day of the week, and the top five to ten articles are talking about something data-privacy related.

41:01

So I'd say that's number one. Adjacent to that, you also have the discussions around things like measurement and identity, right? I say adjacent because they're very much adjacent to the data privacy conversation — that conversation spawns a lot of other areas like this. So I think that's top of mind. What's important to recognize, especially for this audience here today and for you guys at Perfect Audience, is that some of it's noise, right? It's important to keep your head down and continue to execute on your vision for yourself and for your customers, and I think you can keep doing that even with everything that's going on. A lot of that comes down to what Dr. Becker just said about focusing on outcomes: if you can focus on outcomes, you're aligned to your customers and what they want to do, and at the end of the day, that's what a buyer cares about most, right? So I think it's important for you guys, as somebody that's selling a technology in the space, first and foremost to stay focused on what's important to your customers, and then also to remind your customers about what's important, too — because they're out there reading the trades every day and getting all that tangential noise in their brains. So that's what I would say.

42:25

Yeah, I think that's dead on. I think it's performance and attribution at the end of the day — you're exactly right, there's a ton of noise, there are a ton of things that you could be looking at. It's funny, just three days ago we were talking about this in a product meeting. We were talking about some of the sexy things that you can bring to the table in terms of new feature development and other things. But really, and I think rightly so, the focus for us has been: how can we get better performance for the campaigns that we're running, and stay singularly focused on that. And then that leads to so many other things. But I think attribution is way up there, right? Attribution, and really understanding where that conversion is coming from, and being able to make sure that your dollars are being spent most effectively — that's critical. Yeah, and to Jalali's point earlier about causation versus correlation: I think five, ten, fifteen years ago in this space, everyone was kind of obsessed with how granular can I get with attributing x dollars to y impression or touchpoint. And I think the industry over the last few years, partly because of some of the data privacy shifts, has realized: okay, actually we've been solving the wrong problem. We've been trying to assign credit for way too long, as opposed to seeking out true causation. And I think that's an evolution of the space that's been really positive, because the industry has pushed us in that direction as cookies go away and the walled gardens get higher.

44:13

You can't just tag everything and attribute a value to every single touchpoint. So how do we get smarter as a community? Well, okay, we need to actually zoom out and seek out causation, not correlation.

44:26

That's absolutely right. So, Jalali, we're getting a ton of questions coming in. Do you want to take us through some of them?

44:33

Yeah, sure. So this causation versus correlation question is interesting to me. I saw it described as — and I'm not picking on Facebook at all, it's a tremendous platform — somebody was describing it as handing out coupons at the cash register line, right? I'm not saying that's good or bad, but that's different than going out and finding people that would be potential buyers, right? So that leads to — Daniel actually has a great question that ties into this. He's saying that for Facebook's algorithm, you need 50 results in seven days to exit the learning phase and deliver properly, and he's asking how our platform works in comparison. So to touch on that: the 50 results in 7 days for the learning phase — I think what he's referring to is this initial period where it spins up a campaign and tries to determine how it's going to work and which placements to show. Now, I do think in Facebook's case they actually use a fair amount of historical data, so you could come into it blind and it would have a sense based on similar product categories and so on. In terms of Perfect Audience, we don't have a set amount of time — it's continuous optimization. So in the beginning you'll see your poorest performance, and depending on how many conversions you're getting as a function of impressions, it's going to start to improve, and there should be no limit to how far it can improve.

46:02

There's obviously a cap from the market, but that's a great question, Daniel. Does anybody else have anything to add to that? Let me pull up another one here. Actually, can you go back a couple of slides real quick? That's exactly what was happening here. Let's see, one more, I think — yeah, there you go. So that's exactly what's happening here: if you think about this as a time series, the algorithm starts with zero data, a clean slate, right? And over time, as the data is being ingested and figured out — which publishers, which ads are working better — it cuts out the stuff that doesn't work, methodically and as quickly as possible, and you start seeing lower and lower CPAs. So this is the graph that sort of answers that question. And honestly, from my perspective — I'm fairly experienced at optimizing these campaigns and working with data — I could never have done this without this model, right? Except for accidentally turning it off in the middle of the campaign.

47:18

Yeah, yeah — that was me. In case you're wondering what the humans actually do: they can help, and they can also get in the middle of it. So, great questions. Let's take another: how does machine learning compare to regression modeling? That's a good question. Yeah, so, alright — regression modeling, that phrase can get used in two senses. The one which you probably mean is that for a long time people were running regressions in the sense of linear regression or logistic regression. Machine learning is doing the same thing, but it makes more accurate predictions, and it does that because of the way it picks up interactions between different variables. So conceptually it's doing the same thing, just more accurately. Some people make a distinction — which I think is not how this question is being asked — between regression, which is predicting a number, and classification; machine learning can do either of those.

48:32

But I think you really mean: ten years ago people were using a piece of software called SAS or Stata or SPSS to run regressions, and machine learning is doing the same thing, more accurately. So the most basic way to describe it, Dan, is of course that you're going to see a pattern in some historical behavior, and then your machine learning is going to try to match up those patterns, right? Or find things that don't match the pattern. From a stats perspective, is that a rough way to describe it? Yeah — and especially, figure out what the pattern is. I'd really say: figure out what the pattern is, and then when we get a new data point, we just apply those patterns.
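
A small sketch of that comparison on made-up data — a classical linear regression and a tree-based machine learning model fit on the same column, both learning a pattern from history and applying it to a new data point:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
visits = rng.integers(0, 10, size=200).reshape(-1, 1)          # feature: previous visits
spend = 5 + 2 * visits.ravel() + rng.normal(0, 3, size=200)    # target with some noise

linear = LinearRegression().fit(visits, spend)                       # "SAS/Stata-style" regression
forest = RandomForestRegressor(n_estimators=100).fit(visits, spend)  # machine learning model

new_point = np.array([[7]])                    # someone with 7 previous visits
print(linear.predict(new_point), forest.predict(new_point))
```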

49:16

So in a classical regression we'd say: for every time the person had previously visited our site, the likelihood of a sale went up 2%. We've got some baseline; now someone comes along who has visited our site 10 times, so we add 20% to the baseline, and that's our predicted probability. Okay, yeah — but causation versus correlation, though, that's the thing. That's where everybody gets tripped up. We've talked a little bit about that, and Shiv, I can't remember if this was in your article or not — the correlation between the time of day, or your cell phone charge... no, that wasn't you. No, my chart was the correlation between the Cats movie coming out and the economy going to — excuse my language. That was correlation, as opposed to Covid being the issue and actually being the causation. Exactly. And then there's a similar story: the correlation between insurance underwriting and your cell phone battery charge when you fill out the application. That one, by the way, fascinates me — we could spend 20 minutes just talking about that.

50:45

But setting that aside — and Dan, maybe we ought to talk about this after the call — there's this other one that's a very real case study for Perfect Audience. There's a correlation between the time of day someone signs up and the quality of that account, the quality of that advertiser, and the quality of the campaigns they run — based on when they signed up during a 24-hour period. Even if you're cutting out international and everything else and just talking about the domestic US, there's a correlation between those two. So I think there are so many things that AI is going to be able to pick up that we would just never in a million years be able to draw the connection to. Fascinating to me — like I said, we could spend an hour on this. So anyway, sorry — the cell phone charge one gets me before I close out. Is that correlation or causation? Is it just correlated that you have a poor cell phone charge or whatever, and you happen to be higher risk? That's one that would get me, I guess. I don't know.

52:05

Yeah, I would say that's correlation — something that just happens to correlate. It's not causal. That's right. Yeah. Although I think correlation is a little bit of a grey area, right? Because things can be tangentially related, but the key is that one doesn't impact the other. Two variables can move along the same line and be related to each other in some way, but it's not that the battery being drained is causing the insurance outcome to happen, right? I think that's the key thing: there's no direct impact from one variable to the next that's causing it to move in the same direction. You know, one of the ways that data scientists think about this, which I think is quite useful, is: when we say "what does causation mean," really it's about what happens if we intervene on the things we control. So you've got this example where we see a historical relationship between ice cream sales and shark attacks — they both peak in summer. Alright: if we were to give everyone a discount on ice cream, would that change the number of shark attacks? It wouldn't — and doing some intervention like that is how we tease out the difference.

53:26

And it's super useful because in marketing, the thing you're doing is changing what people see. So is it correlation or causation? The right definition is: if we do more of this thing, are we going to sell more? That's an amazing example, and thank you for explaining that — much more eloquently than I did. Super helpful. Well, to that point, as we close out here: let's say I'm an ad network, I'm serving ads, and I can tell through my modeling that someone's about to make a purchase, so I show them an ad. Is that correlation or is that causation? Does it help that they saw an ad right before, and should I get credit as the network? To Dr. Becker's point, the way you test that is to shut that ad down, right? Run it and then don't run it, and see what happens. If you were going to sell 100 of these things without the ad, and you sold 100 of them with the ad, I don't care what they saw last — you were going to sell the same amount. That's not, I would say, a causal relationship.
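
Here is a hedged, simulated sketch of that "run it, then don't run it" test — a random holdout group that never sees the ad, with the lift number chosen arbitrarily for the simulation:

```python
# Compare conversion rates between an exposed group and a random holdout;
# the difference estimates the conversions the ad actually caused.
import numpy as np

rng = np.random.default_rng(42)
n = 20_000

saw_ad = rng.random(n) < 0.5                  # random split: exposed vs. holdout
base_rate = 0.030                             # people who would convert anyway (assumed)
true_lift = 0.005                             # extra conversions caused by the ad (assumed)
converted = rng.random(n) < base_rate + true_lift * saw_ad

exposed_rate = converted[saw_ad].mean()
holdout_rate = converted[~saw_ad].mean()
print(f"exposed: {exposed_rate:.3%}  holdout: {holdout_rate:.3%}  "
      f"estimated lift: {exposed_rate - holdout_rate:.3%}")
```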

54:33

But maybe if you're the last touch, you still might want to try to claim credit for it, right? You do want to, right? Because you're trying to show an optimal CPA and show that you contribute. I go back and forth, because we've seen things like: you run a big prospecting campaign with your banners, and the organic search goes up, right? Because people are seeing it and they're looking up the name. We do see these things. I think one of the challenges with all of this is deciding what to focus on — how do we best help you as an advertiser make more money, rather than serving ads to people who never buy, you know. I think, too, you have to look at it holistically; you can start getting lost going down different rabbit holes. We've done, I don't know, two or three different webinars over the last several months about this topic, where we talk about what are the established KPIs that are really going to move the needle for you and your business, right? Focus on those and then work your way backwards.

55:44

If you think about it in terms of a funnel, what are the most important things? That's why we talk about retargeting all the time — Jalali, I mean, we talk about that being sort of the low-hanging fruit of the advertising world, right? You start there and you start working your way back up the funnel. Just apply an 80/20 rule: get the three, four, five KPIs that really matter and drive your business, focus on those, get those right, and then you can start moving out in concentric circles. But yeah, like I said, we could probably spend a lot of time on this. So, Jalali, we've got a ton of questions. It feels like maybe you and I can do a follow-up video, maybe on Monday or Tuesday, where we go through and answer some of these questions offline and then post them. But is there anything else that you wanted to cover as we're going through the slides here? Yeah — and sorry guys, we're just about out of time. If anybody's interested — and again, we're not trying to be sales-pitchy, but I will post, and we'll follow up with, links to these gentlemen's LinkedIns and websites and so on — if anybody's interested in actually participating in the AI Lab, we're just rolling this out, and it's where we actually work with you. There's no cost to it, it's just a function of your ad spend, and we work with you to get this optimization in place.

57:24

All we're looking for are people with known conversion funnels — you have some channel that's working and you're getting leads and sales. We even give you an ad credit to kick it off. We're just looking for people who want to leverage this and work as part of a small cohort, so you can just email me; I'll also put my info in the chat. And keep an eye out — Kathleen and the team will send around the recording of all this. So, truly, thank you Dan and Shiv for taking the time to get on here. I've learned a lot, and hopefully you guys all have too. I think it's just the beginning of this — there's a huge shift happening in the industry, and part of it is just, let's all get on the same page a little bit. So thank you all for coming, by the way. Yep, thanks for having me. Yeah, this was fantastic, I had a lot of fun. Thank you. Awesome. Well, have a good rest of the week, everybody, and we'll talk to you soon. Thanks.

A.I. Lab

AI Lab Director - Perfect Audience: Serving as both AI Lab Director of Perfect Audience and CMO of Robauto, Jalali is a proven thought leader and engineer in the field of innovation and new technology. He is credited with creating breakthrough technology products, companies, and lasting IP in e-commerce, A/B testing, social media, and artificial intelligence.
