

Episode Description
Frederick Vallaeys, CEO and Co-Founder at Optmyzr, sits down with Jyll Saskin Gales, Google Ads educator, coach, and author of Inside Google Ads, to unpack how automation has changed the craft of PPC.
They cover why “guiding, not controlling” is the new marketer’s mantra, how to optimize Performance Max campaigns without illusions of control, and why bidding may now matter more than targeting. Jyll also shares real-world stories from clients, her process for scaling expertise with AI tools like Gemini and NotebookLM, and a fresh perspective on AI Max—Google’s latest automation layer.
Here’s what you’ll learn:
- From control to guidance in PPC
- Audiences: Signals vs true targeting
- What audiences actually do in PMax
- How to influence PMax without controlling it
- The 3 biggest PMax levers
- Structuring campaigns & asset groups the right way
- SMBs vs big brands: Who has it harder?
- Why small budgets struggle in 2025
- The bidding, targeting, creative triangle
- Manual vs Smart bidding (and why most get it wrong)
- How to let Smart bidding learn (without burning budget)
- Using micro-conversions to train the algorithm
- Signal duplication & campaign structure mistakes
- Strategic automation vs blind delegation
- The human role in an AI-first PPC world
- How Jyll uses Gemini Deep Research
- AI tools for marketers: Gemini, NotebookLM & “JyllBot”
- The problem with ChatGPT’s Google Ads advice
- AI makes averages, humans make winners
- AI Max: the new Google Ads experiment
Episode Takeaways
After six years at Google working with big brands, Jyll now helps everyone from solo business owners to giant agencies, and while she is widely known for her audience expertise, she is currently deep into her next big project: a full book on bidding strategy.
The conversation is really about what it means to do PPC in 2025. The old world of “I pick exact keywords, set my bids, and profit” is gone. In its place is a new job description: you are no longer trying to control every lever, you are guiding smart systems and then checking what they actually did. Jyll and Fred walk through what that looks like in practice, including how Performance Max really works, why audience signals are not the same thing as true targeting, and why manual CPC has quietly become a trap for most advertisers.
From control to guidance in PPC
Jyll talked about how Google Ads has basically moved from “I’m in control of every little thing” to “I’m guiding the machine to do the right things.”
“Those of us who’ve been doing Google Ads since it was called AdWords and even before, we are very used to the era of control—these are the exact keywords I want and these are the exact bids I want, and then step three, profit. And that’s just not the way it works anymore because consumer behavior has changed, because the SERPs have changed, because AI has changed,” says Jyll.
Today, operating at a deeper level means understanding how smart bidding thinks: how different bid strategies change which queries you show up for, how your conversion goals act like a compass, and how your data, budget, and assets shape what the algorithm learns. Your job is less “do every optimization by hand” and more “set things up so the system can optimize well.”
Essentially, modern success in PPC isn’t about how often you touch the account; it’s about how well you design the strategy, tracking, structure, and signals so the algorithm can make good decisions for you at scale.
Audiences: Signals vs true targeting
A lot of advertisers still think that choosing an audience in Google Ads means their ads will only show to that group, but in many campaign types that is not how it works anymore. In older formats like Display or Video with optimized targeting turned off, audiences really were limits: if you picked “yoga apparel shoppers,” only those people saw your ads. In Performance Max and in any campaign that uses optimized targeting, your audience selections are signals, not hard filters.
“In a Performance Max campaign, we don’t have that capability. We have audience signals, which means you can tell Google the kinds of audiences you want it to focus on, but ultimately Performance Max cares more about your conversions and your bid strategy than your audiences,” explains Jyll.
That is why Jyll says people think they have more audience control than they really do. In PMax, you might give Google a yoga-focused audience signal, but if the system discovers that winter clothing shoppers convert better, it will move your spend toward them instead. The algorithm has a simple priority stack: hit your conversion goal, use the budget efficiently, and only then consider your audience signals. If following your signals conflicts with getting more or better conversions, your signals lose.
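That priority stack can be sketched as a toy ranking function. This is purely illustrative: the field names and numbers are assumptions for the example, not Google's actual logic, but it shows why a supplied signal loses whenever it conflicts with predicted conversions.

```python
def choose_audience(candidates, signal_names):
    """Toy model of the PMax priority stack: predicted conversion value
    decides first, cheaper traffic second; matching an advertiser-supplied
    signal is only a tie-breaker, never a hard filter."""
    return max(
        candidates,
        key=lambda a: (
            a["predicted_conv_rate"] * a["predicted_value"],  # 1. hit the conversion goal
            -a["predicted_cpc"],                              # 2. spend the budget efficiently
            a["name"] in signal_names,                        # 3. only then, prefer your signals
        ),
    )

candidates = [
    {"name": "yoga shoppers",   "predicted_conv_rate": 0.02, "predicted_value": 60, "predicted_cpc": 1.2},
    {"name": "winter shoppers", "predicted_conv_rate": 0.05, "predicted_value": 60, "predicted_cpc": 1.5},
]
# Even though "yoga shoppers" is the supplied signal, the higher-converting
# audience wins: signals lose when they conflict with conversions.
print(choose_audience(candidates, {"yoga shoppers"})["name"])  # winter shoppers
```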
So what really influences who sees your ads today?
Your conversion tracking, your bid strategy, your budget, and your creative. Google pays far more attention to what is actually converting than to the audiences you suggested. Once you understand the difference between true targeting and signals, you can set better expectations, choose campaign types that match your goals, and focus your energy on the levers you actually control.
What audiences actually do in PMax
In Performance Max, audiences don’t work the way most advertisers think. Adding audience signals does not mean your ads will only show to those people. Jyll puts it simply:
“I think that an audience signal was a beautiful way for Google to rebrand optimized targeting. Instead of optimized targeting, which is ‘oh, check this box and give us leeway to show ads to whoever we want’—no thank you. But when you reframe it as ‘we’re going to show ads to whoever we want, but if you want to bring your intelligence to Google, come give us a signal,’ I can see why practitioners like that.”
This does not mean signals are useless. They can help guide early exploration and they let you document your strategic thinking. They also make conversations with clients or stakeholders easier because you can explain the logic behind your choices. But signals do not control Performance Max. If the algorithm finds better results in audiences you did not choose, it will shift in that direction every time because conversions, budget efficiency, and bid strategy always outrank the signals you provide.
The real value comes after the campaign runs, when you can review reports to see who actually converted, which searches triggered your ads, and which assets performed best. That information helps you understand what the system learned, even if it ignored your original signals. So go ahead and add audience signals, just know what they really are: suggestions, not targeting controls. The real levers in PMax are your creative, landing pages, bid strategy, and conversion data.
How to influence PMax without controlling it
With Performance Max, the mindset shift is simple: you are not driving the car, you are setting the destination and then checking the trip report afterward. You guide it up front with your inputs, and then you verify what actually happened through the reports. That is how PMax is meant to work.
Instead of trying to force control, your job is to teach the system what success looks like through clean conversion tracking, sensible budgets, and clear goals, then see how it interpreted that guidance.
The main ways you influence PMax are your conversions, your budget, your creative, and your landing pages. Whatever you define as a conversion is what the system will chase, so things like accurate values, good lead quality, and proper events matter more than any audience setting.
Your budget tells it which campaigns or products are a priority. Your assets and pages tell it who you are for and what kind of customer you want. Strong creative and landing pages that clearly match your ideal customer will shape who ends up converting far more than audience signals in a dropdown.
Once the campaign runs, the real work is in the verification step. You look at search term reports, audience insights, channel breakdowns, and asset performance to answer one question: what did PMax actually do, and does that match what we wanted? If not, you tweak the guidance, not the knobs. That might mean refining conversions, improving creative, adjusting budgets, or building a more relevant landing page. Then you let it learn again and repeat. You never fully control PMax, but you can strongly influence it by designing smart inputs and using the transparency reports to keep tightening the fit between your intent and its behavior.
The 3 biggest PMax levers
In Performance Max, there are three levers that matter more than anything else.
“My top three PMax levers would definitely be the conversion settings, the bid strategy, and the assets,” explains Jyll.
PMax can only run on conversion-based bidding, so if your conversion tracking is weak, misleading, or missing, the whole campaign is flying blind.
“Choosing the right conversion goals for the campaign at that point in time—number one—and adjusting those conversion goals can completely change what it does because that’s like the compass of the campaign,” said Jyll.
Once your conversions are solid, your next big lever is bid strategy. “Maximize conversions” will try to spend the full budget first and then get as many conversions as it can within that spend. “Target CPA” tries to hit a specific efficiency level first, and only spends as much of the budget as it can while staying close to that target. Just switching that setting can send the campaign in a very different direction, one focused on volume versus one focused on cost per conversion.
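The difference between the two strategies' priorities can be sketched as a toy spend planner. This is an illustration of the logic described above under simplifying assumptions (auctions modeled as fixed cost/conversion pairs), not how Google's bidder actually works.

```python
def plan_spend(strategy, daily_budget, auctions, target_cpa=None):
    """Toy contrast of two bid strategies' priorities.
    auctions: list of (cost, expected_conversions) pairs, cheapest
    cost-per-conversion first."""
    spend = conversions = 0.0
    for cost, conv in auctions:
        if spend + cost > daily_budget:
            break  # both strategies respect the budget ceiling
        if strategy == "target_cpa" and conv > 0 and cost / conv > target_cpa:
            continue  # Target CPA skips auctions that would blow the efficiency target
        spend += cost
        conversions += conv
    return spend, conversions

auctions = [(10, 1.0), (20, 1.0), (40, 1.0)]  # implied CPAs of $10, $20, $40
# Maximize conversions: spends the full budget, taking the $40-CPA auction too.
print(plan_spend("max_conversions", 70, auctions))  # (70.0, 3.0)
# Target CPA of $25: stops short of the budget to stay near the target.
print(plan_spend("target_cpa", 70, auctions, 25))   # (30.0, 2.0)
```

Same budget, same auctions, different setting: one output is volume-focused, the other efficiency-focused, which is why flipping that one setting can send a campaign in a very different direction.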
The third lever is your creative. In PMax, your images, headlines, and copy are the real audience signal. If your assets strongly appeal to a certain type of person and naturally repel the wrong people, the system will learn that and show your ads more often to the folks who respond. That has more influence on who you reach than audience signals or search themes in the settings. So if you want to move PMax in a meaningful way, fix your conversion setup first, choose the right bidding approach for your goal, and then invest serious effort into creative that speaks clearly to the customers you want.
Structuring campaigns and asset groups the right way
With Performance Max, campaign structure should follow your business strategy, not random “best practice” templates. Jyll’s simple rule cuts through most of the confusion: you start a new campaign when you have a different conversion goal, or a clearly different way you need the system to optimize. That can mean different goals (free trials vs. enterprise demos), different bid strategies (maximize conversions vs. target ROAS), different budgets or priorities, or very different locations. If the objective and how you measure success are the same, it probably belongs in the same campaign.
“The reason to start a different campaign is because you have a different conversion goal. Maybe some campaigns are optimizing for in-store visits and others are optimizing for phone calls—like that would be a reason to have different campaigns. Different locations. If you are doing different bid strategies or different budgets,” says Jyll.
Inside a campaign, asset groups are where you create meaningful creative variation. Use different asset groups when you have different products, different audience messages, or different angles you want to test, and give each one its own headlines, images, descriptions, and sometimes landing pages. What does not make sense is three asset groups with the exact same ads and only different audience signals or search themes. Since those signals are just suggestions, not hard targeting, all you have really done is split your data into three buckets for no good reason.
So before you add more campaigns or asset groups, ask two questions: do I truly need a different goal, bid approach, budget, or geo for this, and am I actually going to use different creative here? If the answer is no, keep things together. Simple, strategy-led structure helps PMax learn faster and perform better, while over-segmentation based on “control habits” just slows everything down.
SMBs vs big brands: Who has it harder?
It sounds backwards, but Jyll is very clear: it is usually easier to manage a million dollars a month than one thousand. Bigger budgets create more clicks, more conversions, and more data, which makes Google’s automation work better.
With a tiny budget, the system barely gets enough conversions to learn, tests take forever, and every mistake hurts. On top of that, small business owners are often learning Google Ads on the fly. They may not know how broad match really behaves, that keywords and search terms are different things, or that the default “helpful” setup they clicked into was actually a PMax campaign spending mostly on Display instead of Search.
Big brands have the opposite problem. They usually understand the tools and have plenty of data, but they struggle with the mental shift from control to guidance. People who built their careers on manually choosing keywords, bids, and structures now have to trust algorithms that they cannot fully see into, while attribution is getting fuzzier because of privacy changes and modeling. Inside large companies, even simple changes, like tweaking product feed titles, can require multiple teams and long approval cycles. So while the machines work better for big budgets, the humans around them can find the new world much harder to accept.
So who has it harder? Small businesses fight “data poverty” and lack of education, trying to make very small budgets perform in a more expensive, automated environment. Big brands fight psychology and politics: letting go of old habits, accepting imperfect tracking, and aligning large organizations around a more probabilistic, AI-driven way of working. Both sides struggle, just in very different ways.
Why small budgets struggle in 2025
In 2025, running Google Ads on a tiny budget is much harder than it used to be. Jyll sums it up simply:
“It is way harder today to take $20 a day and get results with it than it was five years ago or 10 years ago. Inflation, right? Inflation is real in Google Ads as well.”
Clicks are more expensive, competition is higher, and modern bidding strategies need a decent amount of data to work well.
On top of that, tools like Performance Max and Smart bidding work best when they have dozens of conversions to learn from each month. With a micro budget, the system may never get enough data to properly optimize, which makes performance feel random and slow to improve.
There is also a confidence problem layered on top of the economics. Many small business owners test their ads by Googling themselves, not seeing their ad, and deciding the campaign is not working.
In reality, the ads may be showing to real customers, just not often enough or not at the exact moment they check, because the budget is so limited. The path forward is not pretending $20 a day is still enough, but being honest about what a small budget can and cannot do, using hyper-focused targeting, combining ads with strong organic efforts, and understanding that in 2025 Google Ads usually requires a higher monthly spend to behave like a reliable growth channel rather than a lottery ticket.
The bidding, targeting, creative triangle
Jyll explains Google Ads through a simple triangle: bidding, targeting, and creative. But in today’s automated world, those three sides aren’t equal. If you remove targeting, campaigns can still perform well because Google’s systems are now built to find the right people automatically. Performance Max proves this—you give it a bidding strategy and strong creative, and it handles the rest. Targeting still matters when you have access to it, but it’s no longer the pillar everything depends on.
“I think targeting is the least important now because I’ll put it this way—like if AI and automation takes over and you can’t control that piece, can you still get good results? If you can’t control your targeting at all, but you can control your bidding and you can control your creative, then I believe and have proven with my clients you can still get great results. That’s literally PMax. That’s how PMax works,” says Jyll.
Bidding and creative, on the other hand, are non-negotiable. You can put the perfect ad in front of the perfect customer, but if the creative is bland or generic, nothing happens. And if your bid strategy is wrong, the campaign doesn’t just perform poorly—it behaves like an entirely different campaign.
This is why the triangle matters: your bid strategy sets the direction, your creative attracts and converts the right people, and targeting is now a “nice to have” rather than the engine of performance. If you’re deciding where to spend your optimization time, the hierarchy is clear—focus on bidding and creative first, then refine targeting where it still applies. In 2025, those two pillars do most of the heavy lifting that targeting used to do on its own.
Manual vs Smart bidding
Most people don’t struggle with bidding because they choose the wrong setting—they struggle because they’re thinking about bidding the wrong way. Jyll sees the same mistake over and over: advertisers try to “control” their bids to keep CPCs low, believing that cheaper clicks must lead to better efficiency. In reality, low CPCs usually mean low-quality traffic.
“Manual CPC is almost always a mistake nowadays. I’m not going to say always. You know, there are some strange accounts I see. But for the most part, it’s a mistake these days because high CPCs aren’t the enemy. Low-quality traffic is the enemy,” Jyll said.
Smart bidding flips the entire mindset. Yes, CPCs often jump dramatically when you switch to maximize conversions or target CPA. Jyll has seen CPCs double, triple, even jump 10x overnight.
But when that happens, conversions go up too because the algorithm is purposely bidding into the auctions that matter. A $25 click from someone ready to buy is far more valuable than five $5 clicks from people who will never convert. Smart bidding isn’t guessing—it weighs thousands of real-time signals humans can’t process and pays more only when the probability of conversion justifies it.
The hard part is that smart bidding needs time to learn. That first week or two can look ugly, which is why many advertisers bail early and convince themselves manual bidding was better. In almost every case Jyll sees, poor smart bidding performance comes down to two things: broken or weak conversion data, or impatience during the learning period.
If your conversion tracking is clean and you choose the right goal, smart bidding is almost always the better long-term option. The key is to stop chasing low CPCs and judge success by actual business results—conversions, CPA, and revenue—not by how cheap your clicks look on paper.
How to let smart bidding learn
Smart bidding can work incredibly well, but only if you give it enough data to learn. Jyll’s rule of thumb is simple: aim for about 30 conversions in 30 days. That’s the point where the algorithm has enough examples to understand what “good” traffic looks like. If you’re only getting a handful of conversions each month, the system isn’t bad—it just doesn’t have enough information yet. And when you’re working with lower budgets or lower conversion volume, the learning period can take two, three, even four months. It’s not about time passing; it’s about data being collected.
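Jyll's rule of thumb is simple enough to express as a back-of-envelope check. The function below is an illustrative sketch of that arithmetic (the 30-conversion threshold comes from the episode; everything else is assumption), turning monthly conversion volume into a rough data-collection timeline.

```python
def learning_outlook(conversions_per_month, needed=30):
    """Jyll's rule of thumb: smart bidding wants ~30 conversions in 30 days.
    Returns roughly how many months of data collection that volume implies,
    or None if there is no data for the system to learn from at all."""
    if conversions_per_month <= 0:
        return None  # no conversions, no learning
    return -(-needed // conversions_per_month)  # ceiling division

print(learning_outlook(30))  # 1 month: enough data right away
print(learning_outlook(8))   # 4 months: lower volume just means a longer wait
```

The point of the sketch: a slow learning period at low volume is not the system failing, it is simply the same 30 examples arriving more slowly.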
Jyll uses a perfect analogy here: judging a campaign after four conversions is like looking at a one-year-old and saying, “I talk to him every day—why isn’t he speaking yet?” Machine learning needs exposure and patterns before it can perform. Ending a campaign early because it didn’t “figure it out” with almost no data isn’t a sign that smart bidding failed—it’s a sign that the advertiser misunderstood how learning works.
And sometimes, even when the Google Ads dashboard looks disappointing, the business results tell a different story. Jyll shares an example where a PMax campaign looked terrible for two straight months—very few tracked purchases. But when the client checked actual sales, the products being advertised were selling far more than before. The campaign was working; attribution just hadn’t caught up yet. By month three, even Google Ads finally reflected the success. The takeaway is this: don’t evaluate smart bidding on week-one CPCs or early in-platform numbers. Evaluate based on conversion volume, business outcomes, and whether you’ve given the system enough data to learn. Patience plus the right guardrails beats panic every time.
Using micro-conversions to train the algorithm
Micro-conversions are one of the best ways to help smart bidding learn when you don’t have enough final conversions, but only if you choose the right ones.
“A micro conversion is something that happens before conversion that isn’t the final thing you want, but must happen on the journey. So in e-commerce, like someone has to start checkout before they can purchase, but someone doesn’t necessarily have to view your ‘about’ page. So like that wouldn’t make a good micro conversion,” Jyll said.
Once you start getting consistent volume, you “graduate” your conversion goals. Jyll often starts new or low-volume campaigns by optimizing for add to cart, then switches to begin checkout once there are ~30 of those conversions, and finally switches to purchase once that also hits the 30–50 range. It’s a ladder: begin with the highest-volume meaningful action, let the system learn who takes that action, then tighten your optimization as better data becomes available. And to keep the data clean, she counts only the first add-to-cart per session so one heavy shopper doesn’t accidentally create 20 “conversions” and skew the algorithm.
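The graduation ladder and the first-per-session rule can be sketched as two small helpers. This is an illustrative model of the process described above; the event names, the 30-conversion threshold, and the data shapes are assumptions for the example, not any platform API.

```python
def current_goal(counts_30d, ladder=("add_to_cart", "begin_checkout", "purchase"), threshold=30):
    """Walk the ladder from the deep end: optimize for the deepest action
    that already has ~30 conversions in the last 30 days; otherwise fall
    back toward the higher-volume micro-conversion."""
    for goal in reversed(ladder):
        if counts_30d.get(goal, 0) >= threshold:
            return goal
    return ladder[0]  # not enough data anywhere: start at the top of the funnel

def dedupe_first_per_session(events):
    """Count only the first add-to-cart per session so one heavy shopper
    doesn't create 20 'conversions' and skew the algorithm."""
    seen, kept = set(), []
    for session_id, action in events:
        if action == "add_to_cart":
            if session_id in seen:
                continue  # repeat add-to-cart in the same session: drop it
            seen.add(session_id)
        kept.append((session_id, action))
    return kept

# Purchases (9) are below threshold, begin-checkout (45) is not, so the
# campaign optimizes one rung down the ladder.
print(current_goal({"add_to_cart": 120, "begin_checkout": 45, "purchase": 9}))  # begin_checkout
print(dedupe_first_per_session([("s1", "add_to_cart"), ("s1", "add_to_cart"), ("s2", "add_to_cart")]))
```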
In extremely low-volume businesses—those that get only a few leads per month—micro-conversions may still not provide enough data. In those cases, you sometimes have to fall back on weaker signals like scroll depth or page engagement just to give the algorithm something to learn from. It’s not ideal, and it won’t be as accurate, but when volume is that low, the alternative is letting the system run blind. In these situations, you rely more on tighter targeting and more manual guardrails while using the best signals you have.
Signal duplication and campaign structure mistakes
One of the biggest structural mistakes in Google Ads is separating things that aren’t actually different. Jyll sees this constantly: multiple campaigns or ad groups that only differ by audience signals or tiny keyword variations. But if the ads, landing pages, and overall strategy are identical, that segmentation is pointless. Audience signals in PMax aren’t targeting—they’re suggestions. So creating separate groups just to assign different signals doesn’t create control, it just splits your data into smaller buckets and slows learning. It looks organized, but it performs worse.
The real question when deciding whether to split something is simple: Is the intent truly different enough to justify different creative? Jyll gives a great example: “houses for sale” vs. “new houses for sale.” Are these actually different audiences who need different messaging? If yes, great—make separate groups with distinct creative tailored to each buyer. If not, consolidate and give the algorithm more data to optimize with. Structure only works when the segments reflect real differences in behavior, motivations, or value props. If the creative would be the same for both, they belong together.
And even if you do have different creative, you need to make sure the segments don’t overlap so much that they compete with each other. When ad groups chase the same searches or the same audience pools, you’re not expanding reach—you’re cannibalizing performance and muddying the signals. The rule of thumb is simple: if splitting doesn’t unlock a different message, different landing page, or different strategy, it’s not real segmentation. It’s just self-inflicted complexity.
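The rule of thumb above reduces to a three-question checklist, sketched below. The field names are illustrative placeholders, not real platform settings; the point is only that a split with no answer of "yes" is self-inflicted complexity.

```python
def is_real_segmentation(a, b):
    """Jyll's rule of thumb as a checklist: a split is only real if it
    unlocks a different message, a different landing page, or a
    different strategy. Otherwise, consolidate and give the algorithm
    more data to learn from."""
    return (
        a["creative"] != b["creative"]
        or a["landing_page"] != b["landing_page"]
        or a["strategy"] != b["strategy"]
    )

same = {"creative": "ad_v1", "landing_page": "/homes", "strategy": "tcpa"}
only_signal_differs = dict(same)  # identical ads, just a different audience signal attached
print(is_real_segmentation(same, only_signal_differs))  # False: not real segmentation
```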
Strategic automation vs blind delegation
Smart automation in Google Ads is not about hitting a button and hoping for the best. Jyll’s view is that you should absolutely use things like broad match and smart bidding, but you need to understand roughly how they work and what you have asked them to optimize for.
“Something I do notice with some newer Google Ads practitioners who’ve grown up in the world where like it’s AI first, you know what I mean? They’re never touching exact match. They’re never touching manual bidding. They’re all in on the automation. And I love that. I’m not saying that’s a bad thing. But then the challenge is like not knowing what that actually means under the hood,” says Jyll.
Her philosophy is “trust but verify.” You trust automation to make millions of micro decisions, but you verify through reports: search terms, queries, audiences, placements, and performance by segment. Your job is to give the system good directions through conversion tracking, bid strategy, and campaign structure, then regularly check what it is actually doing.
If the results are off, the question is not “Why is Google so dumb?” but “What did I tell the system to do that led it here?” The gap between what the algorithm is doing and what you want is your responsibility to close, by refining goals, cleaning up tracking, tightening keywords, improving creative, or adjusting structure. Automation is powerful, but it only becomes strategic when you stay in the loop and know how to course-correct when it drifts.
The human role in an AI-first PPC world
In an AI-first PPC world, the human role is less about pushing buttons all day and more about understanding how things work so you can use automation wisely. Jyll’s point is that education really matters: when you know what bidding strategies do, how campaigns behave, and what different automations actually control, you have options when something goes wrong. Instead of feeling stuck, you can pull back, adjust settings, or rethink your setup because you understand the mechanics behind the curtain.
Once you have that foundation, you can get creative with how you combine tools. Maybe you let Google handle smart bidding, use a GPT-style tool to help generate and refine ad creative, and then stitch those together into a workflow that lets you scale without losing quality. That is what strategic automation really is: deciding where your human judgment adds the most value, and letting the machine handle the repetitive execution.
The key is to avoid blind delegation. These systems are powerful, but they are also very literal. They do exactly what you tell them, not what you meant. So you keep a human in the loop: you set clear goals, review reports, add checks and balances, and adjust when the results are off. Or as Jyll puts it, remember it is artificially intelligent. It is smart, but it is not magic, and it still needs you to guide, monitor, and course-correct.
How Jyll uses Gemini Deep Research
Jyll uses Gemini Deep Research as a way to think like her customers, not just to get quick answers.
“I use Gemini Deep Research, and rather than trying to spit an answer out to you right away, it puts a research plan together, you can adjust the research plan, then it goes out and checks like 30 to 50 different sources and comes back with an answer for you. I love seeing not just the final answer it comes back with, but looking at all the steps it takes, to see how it decides to research certain factors. So if it’s looking at, you know, what is the payment model, and is it online or in-person, and what about time zones, and what else is included, and is it a subscription, and all these things that I wouldn’t even have thought of, that that’s how people decide,” Jyll said.
For example, Deep Research might look at things like payment models, subscription vs one-time, online vs in-person, time zones, compatibility, and key features in B2B software. Some of those are factors a marketer might not have thought of, but they clearly matter to customers. For Jyll as a business owner, that is gold because it tells her what people are likely considering when they compare her offering to competitors.
From an ads perspective, this becomes a cheat sheet for messaging. If Deep Research keeps surfacing certain features or decision points, those are exactly the things to highlight in ad copy and landing pages. It helps you move from guessing what to say, to aligning your ads with what your audience already cares about and is actively checking before they buy.
AI tools for marketers: Gemini, NotebookLM, and ‘JyllBot’
Jyll’s main workflow is to simply talk through her ideas out loud and let the AI clean them up into something structured. When she is writing articles or her newsletter, she will dump all her thoughts or a coaching call transcript into Gemini, tell it which parts she wants to focus on, and let it turn that raw material into a clear draft. It is not out there researching the web for her, it is helping her organize what is already in her head into something readable and coherent. For her, AI as a thought organizer is one of the most valuable uses.
She takes the same idea further with NotebookLM and her custom “JyllBot.”
Instead of training a model on generic internet data, she feeds it only her own content: podcast transcripts, Search Engine Land articles, and other materials she has created. That way, when members use JyllBot, it sounds exactly like her and gives answers that stay true to her philosophy and style. This domain-specific approach lets her scale her expertise without losing her voice or accuracy. In short, AI works best for her when it is grounded in high-quality, specific input and used to shape and structure ideas, not replace them.
The problem with ChatGPT’s Google Ads advice
Jyll is pretty blunt about it: a lot of Google Ads advice from generic AI tools like ChatGPT is flat-out bad. Not because the AI is “stupid,” but because it is trained on a huge pile of old content and Google Ads changes constantly.
“If you just say, ‘hey, how should I improve my PPC campaign,’ with no context of any sort? Well, like yeah, then it’s going to rehash the 10 most common things that everybody would have said. But these AI systems get better if you put constraints on them and tell it what is your business, what have you tried in the past, what has worked, what hasn’t worked,” Jyll explains.
She also points out that you can improve AI output by asking better, more specific questions. If you tell the model what kind of business you run, what you have tried, what has or has not worked, and what tools or campaign types you are using, the advice becomes more grounded and useful. But even then, there is a hard limit: if the training data is mostly historical and the platform has evolved, the AI will still lean on outdated patterns.
The takeaway is not “never use AI,” but “do not treat it as a real-time Google Ads expert.” Use it for brainstorming, structure, and ideas, but always cross-check tactical recommendations against up-to-date knowledge, documentation, or trusted practitioners. For a fast-moving platform like Google Ads, recency and context matter more than ever, and generic AI tools simply do not keep up on their own.
AI makes averages, humans make winners
AI is really good at one thing: creating the average of everything it has seen. That means people with little experience can now get “decent” or “fine” results, which is both powerful and dangerous. As Jyll puts it, AI is an average maker. On its own, it is not exceptional. It will happily give any personal training gym the same generic headlines that could apply to thousands of others. That is exactly why those headlines are wrong for you.
The real magic happens when you bring your expertise to the AI. If you feed it what makes your business unique, your customer reviews, your landing pages, and even competitor research, it can help you shape messaging and ideas that are sharper and more specific. You are not asking it to invent brilliance out of thin air, you are using it to organize and scale your brilliance.
The key is understanding how these systems work and where they fall short. Once you see the limitations, you stop being limited by them. You use AI to handle the heavy lifting and pattern matching, while you supply the strategy, nuance, and differentiation. That is how you move from AI-powered average to AI-amplified exceptional.
AI Max: The new Google Ads experiment
AI Max is brand new, and it is very much in its awkward early phase. Jyll compares it to the first version of Performance Max: at the beginning, it felt terrible, then it learned, and now it can work really well if you set it up correctly.
AI Max’s job is to go out and find whatever searches it thinks will work for your business, based on your goals and data. That means the first days or weeks are going to look messy. Judging it on day one and saying “look how stupid this is” is like calling a newborn useless because it cannot walk yet. It has not learned anything yet.
The bigger point is that most advertisers do not actually need AI Max right now. It is not built for small accounts still fighting for basic impression share. It is for advertisers who have already done the basics: strong conversion tracking, at least 50 to 60 conversions in 30 days, broad match in place, and non-brand impression share already in the 30 to 40 percent range.
In other words, you have squeezed most of the value out of your current keyword setup and want to expand further. If you are a small business spending 30 dollars a day with low impression share and lots of missed demand, AI Max is not your missing piece. The problem is not the tool, it is that you are trying to use a high-powered exploration system before your account has the volume and foundation it needs to make that exploration useful.
Episode Transcript
Frederick Vallaeys: Hello and welcome to this episode of PPC Town Hall. My name is Fred Vallaeys. I’m your host. I’m also CEO and co-founder of Optmyzr, a PPC management software. For today’s episode, we have the great pleasure of welcoming Jyll Saskin Gales.
She’s a repeat guest to the show and she’s one of the top educators in the Google Ads space. She’s been talking a lot about audiences—that’s probably what most people know her for—but now she’s also working on a book and courses related to bidding. So can’t wait to have a great conversation and share some insights from talking to Jyll Saskin Gales.
Jyll, welcome back to the show.
Jyll Saskin Gales: Thanks, Fred. It’s great to be here.
Frederick Vallaeys: So yeah, Jyll, before we dive into some of the topics that I’ve teed up here, you do have a long background in Google Ads. You worked at Google. You’ve written one book. You’ve built a whole bunch of five-star courses. So take people through what you’ve been up to.
Jyll Saskin Gales: Absolutely. So I am a Google Ads coach. My main business is working one-on-one with business owners, freelancers, agencies, in-house marketers, and helping them get better results from Google Ads. So sometimes that means literally walking a small business owner through setting up a Google Ads account, skipping the default Performance Max campaign, and then setting up their first search campaign. And sometimes it’s working with advanced practitioners managing millions of dollars a month, advising them on strategy.
Beyond the coaching, I have a lot of other ways that I try to help people get better results from Google Ads. There’s my Inside Google Ads course, my Inside Google Ads podcast, my Inside Google Ads book, which is an Amazon bestseller, and then of course I really enjoy creating content online, whether it’s my YouTube channel, posting on LinkedIn, speaking at conferences where I get to see you and members of the Optmyzr team, or writing for Search Engine Land. I just genuinely love Google Ads. I love getting into the nitty-gritty. I genuinely think it’s fascinating, and I love being able to share that passion and excitement, but also that knowledge, with practitioners of all levels.
Frederick Vallaeys: Yeah, for sure. A fascinating and fast-moving field to work in. Now, you being so into the weeds and loving everything about Google Ads and these details, I think that tees up the first conversation. As automation and Performance Max and AI Max continue to take hold and become bigger, and we’re looking at generative AI doing more of the work that we used to do, how does that fit with someone who likes to have control over these things? What’s the big conversation here?
Jyll Saskin Gales: It’s really changed from controlling to guiding. And I know I’m not the first person to say that, but it’s definitely a perspective I’ve been espousing a lot lately. You know, those of us who’ve been doing Google Ads since it was called AdWords and even before, we are very used to the era of control—these are the exact keywords I want and these are the exact bids I want, and then step three, profit. And that’s just not the way it works anymore because consumer behavior has changed, because the SERPs have changed, because AI has changed.
So now getting in the weeds doesn’t mean I’m trying to set a bid for every keyword and I’m trying to create 200 different ad groups to have control. Now getting into the weeds means really understanding the way smart bidding works and how choosing a different bidding target can completely change the queries I show on, for example. Really seeing all these pieces as interconnected and then it’s our job as marketers to give the system what it needs—data, budget, assets, correct guidance—so it can give us what we need, which is conversions, profit, etc.
So controlling to guiding is the key theme, and it’s something I wholeheartedly embrace and preach, but I know that not everyone in the industry feels that way.
Frederick Vallaeys: No, and that makes sense. I mean, the types of settings that we control have shifted. At the end of the day, it’s the outcomes that we care about, and how you get there is what has changed. So you just, you know, evolve along with the settings of Google Ads and control the things you can control. And speaking of controlling things, let’s talk about the first one, and one that’s near and dear to your heart—audiences. You’ve written a book about it, you’ve done courses on it. But do you think people sometimes oversegment, and is that a risk?
Jyll Saskin Gales: I don’t see that so much in Google Ads, thankfully. I tend to see that sort of really specific audience segmentation and layering is more of a Meta Ads thing. I know that Meta Ads practitioners like to preach, you know, broad targeting, go broad, don’t need audiences at all. So I wouldn’t go that far on the Google Ads side.
But I find, you know, the bigger mistakes people tend to make with audiences aren’t about going too specific or too deep. Instead, it’s often thinking that the signals they’re giving a campaign are the actual targeting, when very often that’s not the case. I think people think they have more control over audiences today than they actually do in a lot of circumstances.
Frederick Vallaeys: So explain that nuance a little bit more then. So I think with keywords, we’re used to “this is the keyword we give you, that means you should show the ad for exactly that keyword.” And that obviously used to be more the case back in the day. Now it’s broad match and it’s still very much up to Google. But how are audiences different in that regard?
Jyll Saskin Gales: The way audiences are different is you can still do exact matches in audiences if you want to. Like if I choose to target the audience “in-market for yoga apparel,” my ads will actually only show to people that Google identifies as being in-market for yoga apparel. It’s not going to a close variant.
However, in a Performance Max campaign, we don’t have that capability. We have audience signals, which means you can tell Google the kinds of audiences you want it to focus on, but ultimately Performance Max cares more about your conversions and your bid strategy than your audiences. So let’s say I give the audience signal to a PMAX campaign of yoga apparel, and my bid strategy is to maximize conversions, and I have proper conversion tracking, and it goes out there and it finds that actually people in-market for winter clothes convert better—then it’s just going to show ads to people in-market for winter clothes. It doesn’t care about the signal I gave it. Whereas if it tries my signal and, you know what, yoga apparel works well, okay fine, it’ll do it.
And so that is what we call optimized targeting, where the algorithm is just focusing on who’s most likely to convert, for better or for worse. And so when you launch a Demand Gen campaign, for example, you may not realize that you also have optimized targeting turned on. So if you have that turned on in your campaigns, then your audience targeting gets turned into a signal as if it’s like PMAX. But if you turn optimized targeting off, then you do get that exact audience control, which we definitely don’t have on the search side anymore, even with exact match.
Frederick Vallaeys: So that makes sense. So you at some level control whether you’re specifically targeting that audience or whether that just becomes a signal. Now, as you described in PMAX, it’s not really a choice, right? It’s just that’s the way the campaign works. So it’s always kind of a suggestion to Google. So then in your experience, to what degree does adding an audience even help you? Does it help you if they’re just going to figure out anyway that, hey, there was this whole random other group of people that are buying this thing that you’re selling? So why would someone put an audience in in the first place?
Jyll Saskin Gales: Because it makes them feel good is my honest answer. I think that an audience signal was a beautiful way for Google to rebrand optimized targeting. Instead of optimized targeting, which is “oh, check this box and give us leeway to show ads to whoever we want”—no thank you. But when you reframe it as “we’re going to show ads to whoever we want, but if you want to bring your intelligence to Google, come give us a signal,” I can see why practitioners like that.
I think the difference or the utility is more obvious on the search theme side of things in PMAX, because just like an audience signal, search themes are like the kinds of searches you want it to show ads on, and ultimately it can show ads on them or not. And what’s really interesting is when you add search themes to a Performance Max campaign, it’ll tell you if it’s using your search themes or not—like if that actually caused it to go to searches it wouldn’t have targeted otherwise.
And my theory on this is that it’s kind of like a lose-lose in both scenarios. Like if it wasn’t going to show ads on those searches without your search themes, that means that it doesn’t think those searches are related to your landing page and your website. So actually, you’ve got some work to do. You weren’t giving the algorithm, you know, the information it needed. And if it does, you know, start serving on those search themes and Google doesn’t think those searches are related to your website, then it’s probably things that wouldn’t convert as well or wouldn’t do as well.
So my takeaway, to be very clear, is not saying don’t add search themes, don’t add audience signals. Like go for it. They can’t hurt you. You know, they can only help. But don’t think that because you’re adding search themes and audience signals, you’re controlling your PMAX campaign. You absolutely are not. You can’t control PMAX. You can only guide it and then get transparency after the fact to see what it actually did.
Frederick Vallaeys: Right. And I believe that that transparency after the fact, that is still helpful, right? Because that’s ultimately the thing that tells you, like you just said, “this search theme is just not triggering at all.” So maybe there is a mismatch in your landing page offer. So now you know if that search theme was really important to you, then go and figure out a way to fix that. Build a new campaign, a new landing page, set it up so that Google will see the relevance and show your ads for it. So it’s obviously not completely useless, as you’re saying. But it just may be that the control is not upfront, it’s at the tail end.
Jyll Saskin Gales: Or even the creative, right? Like maybe you weren’t serving on that search because you didn’t have assets that would perform well with that search. So maybe it’s a landing page, but it could also be that you need to improve your creative assets to be relevant to people searching for those things.
But we can see after the fact—like there have been a ton of great advances in PMAX this year: channel performance report, search term report, audience insights, asset-level performance. We can now see exactly what PMAX is doing to the same level as search or shopping, which is great. But the control piece isn’t there. So it’s still trying to guide with our data and suggestions and then after the fact see what came in and then use that to decide what we want to do next.
Frederick Vallaeys: Yeah. And so the asset-level reporting, that is an interesting one, because there is some modicum of control: if you see that a specific asset is not performing well, you can go and optimize that. So if you had to think about maybe your top three optimization levers for a PMAX campaign, where would you start?
Jyll Saskin Gales: Oh, my top three would definitely be the conversion settings, the bid strategy, and the assets. And I say that because, you know, PMAX can only run on a conversion-based bid strategy. There’s no manual, there’s no views. And a conversion-based bid strategy can’t work without conversion tracking. So maybe it’s purchases, maybe it’s leads, but you know, maybe you’re working with a really small business and so actually you’re going to set that PMAX to optimize for “begin checkout” instead of purchase just to help it ramp up.
So, you know, choosing the right conversion goals for the campaign at that point in time—number one—and adjusting those conversion goals can completely change what it does because that’s like the compass of the campaign.
And then next would be the bid strategy itself. You know, “maximize conversions” aims to spend the entire budget and then get conversions, whereas “target CPA” aims to hit a certain level of efficiency, and if we can hit it, spend the budget. So like really different directions there just by checking or unchecking a target box. So that would be number two.
And then assets will be number three. The actual creative you use, I think, is the real audience signal in PMAX. If you have images and text that really appeal to a certain audience and really don’t appeal to another audience, that’s what’s going to inform who your PMAX shows ads to, much more so than the arbitrary audience signal or search themes you choose to provide. So those are the key three levers I would suggest folks focus on to optimize PMAX and to guide PMAX.
Frederick Vallaeys: Yeah. And those first two, I think you would broadly set across all of your PMAX campaigns because your conversion goal is probably pretty similar. The third one, the assets, is pretty interesting, right? Because now does that mean you would have multiple PMAX campaigns or multiple asset groups so that you can have a bit of segmentation for the different audiences that you’re trying to appeal to? Or from a structural perspective, like what does a great campaign look like in your mind?
Jyll Saskin Gales: What does a great campaign look like? I would say, you know, it really depends on the size of the business and the variability of the services. If we are working with a large advertiser spending millions, then yes, we’re probably going to have multiple PMAX campaigns and within them have multiple asset groups. If we’re working with a smaller business, we may have one PMAX campaign with a filter to only advertise five products.
But the general guidance I can give is the same for PMAX or actually for any campaign type. It’s like the reason to start a different campaign is because you have a different conversion goal. You know, maybe some campaigns are optimizing for in-store visits and others are optimizing for phone calls—like that would be a reason to have different campaigns. Different locations. If you are doing different bid strategies or different budgets—so like maybe have one that’s really optimizing just “maximize conversions,” but another one’s on target ROAS. Or maybe if you have different services or different product areas, you want to allocate budget differently. Those are the key reasons. I’m sure I’m missing one or two, but those are the key reasons to have a different campaign.
And then within a campaign, a different asset group should be created when you’re trying to reach a different audience or advertise a different product. And this is a common mistake I’ll see as well when I audit accounts—it’ll be like a PMAX campaign, three asset groups, exact same assets in all three of them, and just different audience signals. And I just want to shake—it’s just a signal. You’re doing the same thing three times over.
Like the way to actually do it if you’re going to have different asset groups is different assets in the different asset groups, which in turn means that your ads will appeal to different kinds of people and therefore serve to different kinds of audiences or searchers.
Frederick Vallaeys: That’s great advice. If you saw me taking notes, it’s because I want to make sure that’s one of the audits we do in Optmyzr to point out that mistake. So yeah, great advice there.
Now, you talked about “it depends.” “It depends” is always the correct answer in PPC. And part of that comes from you saying, “okay, sometimes you got a big account, sometimes you got a small client, SMB.” And it’s interesting because I think on your website you position yourself as like, “listen, I spent what was it, six or nine years working with Google.”
Jyll Saskin Gales: Six years working with the big brands.
Frederick Vallaeys: And now I make all of that knowledge available to you. The question that I want to go through is how much of what they do should be the same, and are there misconceptions in terms of like, “oh, that only works if you’re a big brand.” So what mistakes do, say, SMBs make based on false assumptions?
Jyll Saskin Gales: Ooh, I think it is much easier to manage a million dollars a month than $1,000 a month. First of all, because the systems need data, and the more money you spend, the more data you have, the easier it becomes.
So the mistakes that I tend to see on the small business side really come down to not knowing what you don’t know. It’s the unknown unknowns. You know, it’s a business owner who, among the many things they have to do, is trying to figure out Google Ads. So they don’t know that a broad match keyword is going to match them to whatever Google feels like and not just their keywords. They don’t know that a keyword and a search term aren’t the same thing, so they don’t realize what they’re actually advertising on. When you go to create a new Google Ads account now, it spits you right into creating a Performance Max campaign. So they’ll think they’re running a search campaign when actually all their money is being spent on display.
So these are like the small business owner mistakes where the gap is just education, and then they can, you know, help themselves.
I would say with the larger businesses, they tend to struggle more with this transition we were talking about earlier, from controlling to guiding. You know, folks who’ve been doing this for 10 years, 15 years, 20 years—like we all have to change our mentalities, and that’s really hard to do. So I would say that mentality and strategy shift is what the larger advertisers struggle with.
And I think as well, the promise of Google Ads and digital ads in general is like perfect tracking and attribution. You know where every dollar is going. You know exactly what you’re getting in return. And in the past couple of years, due to a variety of reasons—privacy and cookies—like that’s not necessarily the case anymore. We have to model a lot more things. We’re realizing that all the data we have isn’t as perfect as we thought.
And so I find that can also be paralyzing for some of the larger advertisers. They want guarantees. They want to know exactly what’s going to happen. And like, we don’t know what’s going to happen. You know, COVID was unprecedented times. Now with all these generative AI tools, it’s unprecedented times. Like we don’t really know what’s happening. If you have the right strategy and mindset in place, I’m sure we’ll figure it all out together, but there’s a bit of this leap of faith needed now that folks aren’t really used to.
And so that’s something that—it’s interesting, the small business owners, like that’s not even on their radar. They’re chugging along. But for the larger advertisers, I find it’s like the mental stuff that’s more of a challenge than the actual platform stuff.
Frederick Vallaeys: Yes. And I suppose aligning the whole business behind that strategy. And then even little things, right? Like you may have your merchant feed, but that is derived from your inventory system. And now if you wanted to update the titles just for Google Ads, does that impact all of the other channels that you’re advertising in? And so even a small decision becomes a big decision and a big impact, and you have to talk to a variety of other teams, and that’s where they often struggle.
Jyll Saskin Gales: Yeah, absolutely. It’s a lot more complexity, but at the end of the day, I do think a large budget, even with all that complexity, is sometimes easier than trying to take a shoestring and make it go really, really far. And I think that has become harder, I’ll say. You know, small business—the promise of AdWords, right, is get your business online and the ad auction is the great equalizer. And I do still think that’s all true. I’m not saying it’s not true anymore, but it is way harder today to take $20 a day and get results with it than it was five or ten years ago. Inflation, right? Inflation is real in Google Ads as well.
Frederick Vallaeys: Yeah, exactly. And it’s easier than ever to get online. If you look at Vibe Coding, for example, I mean two minutes to make a beautiful, beautiful website. But now is anyone visiting that website, obviously, right? And so that’s where PMAX then—like you said, it’s a bit tricky because yes, it does get your business online, but a lot of SMBs want to see—okay, they want to prove to themselves, do the search, see their ad, and if it’s not there, it’s like it doesn’t exist for them, right?
But there’s so many other people that do see it, and getting that confidence level—and that’s where you said it’s an education problem as opposed to maybe a technical issue.
Jyll Saskin Gales: Yeah, exactly.
Frederick Vallaeys: Now, on to your next topic—the one I believe you’re getting into more now. Are you doing a course or a book on bidding?
Jyll Saskin Gales: I’m working on my next book about bidding, yes. In my Inside Google Ads course, I cover all these topics, but after tackling audience targeting with the first book, I knew bidding had to come next. Why? Because it’s fascinating and so misunderstood. And because it hasn’t actually changed that much since smart bidding came out—tweaks here and there, but it’s been pretty consistent.
And I just think bidding is so misunderstood. I think it’s more important than your targeting. It’s potentially more important than your creative—I think bidding and creative could duke it out. But I just felt like that was an area that was ripe for exploration. And, you know, when I write a book, part of it is explaining everything, but then I also get into my theories and advice and strategies to use and stuff like that. And so I just felt like there was a lot to say there. I’m about 40 pages into the writing right now.
Frederick Vallaeys: Well, I have a lot of questions about that. So let’s go. First, you said targeting is probably less important, and I agree with that, but I want to hear your take. So why? Because Google Ads was the search marketing system. It was all about the keyword. That’s where everything started. And now we’re saying, “oh, well, maybe targeting is like number 17 on the list of the things we should actually worry about.” It’s not that bad, but like why is targeting not as important anymore?
Jyll Saskin Gales: When I think of Google Ads, I think of a triangle: bidding, targeting, creative, and trying to balance those three. And so I think targeting is the least important now because I’ll put it this way—like if AI and automation takes over and you can’t control that piece, can you still get good results?
So if you can’t control your targeting at all, but you can control your bidding and you can control your creative, then I believe and have proven with my clients you can still get great results. That’s literally PMAX. That’s how PMAX works, right? You have to still have that great creative that will appeal to your target audience and just as importantly not appeal to your not-target audience. And you have to have the right bid strategy, and it can do it.
But if you play the triangle the other way, like what if I have bidding and targeting, but I can’t control my creative? Like oh, that’s going to be garbage. You could put an ad in front of the exact perfect person who’s the ideal customer, but if the ad just says “shop now,” like no, you’re not going to make sales.
And then the other way, if you have your targeting and your creative but the wrong bid strategy, you have no hope. I’ve seen campaigns completely do a 180—search campaigns on exact match keywords do a complete 180 in performance just by switching from manual CPC to target ROAS.
So that was my long-winded answer to say that I see bidding as really integral in all these places. And while targeting, of course—yeah, if you can target, great—but if we had to let something go, sorry to my book, that’s the one I would let go if it meant I could keep control over creative and bidding.
Frederick Vallaeys: Okay, that makes sense. And then the second question I want to build on what you said—you see a lot of mistakes in bidding. What’s the biggest mistake people make?
Jyll Saskin Gales: Trying to control their bidding rather than guiding their bidding.
Manual CPC is almost always a mistake nowadays. I’m not going to say always—you know, there are some strange accounts I see. But for the most part, it’s a mistake these days because high CPCs aren’t the enemy; low-quality traffic is the enemy. And I have just seen it time and time again in so many different ad accounts where we’re on manual bidding, our CPCs are like three to four dollars, “oh, our CPA isn’t good, oh, we’re not getting enough conversions.” And it’s like you turn on “maximize conversions,” CPCs quadruple overnight. I’ve seen CPCs multiply by 10 overnight, but you know what? The conversions go up as well.
So I’ve just seen it enough now that I am a true smart bidding believer. I think AI creative has a long way to go. Not sure if you’ll feel the same way on that. I think AI targeting is getting pretty good and only improving. But like AI bidding, it’s—it does its thing as long as you give it the right data and the right directions.
And so when I see smart bidding not performing, it’s usually more either user error or it’s day two when you just haven’t given it time to learn yet. And that’s probably the hardest part. If you’re switching from manual to automated, the first week or two might suck and you’d be like, “see, I told you I can do it better.” And it’s like, “well, no, no, no. Give it time to learn” before we judge.
Frederick Vallaeys: Yeah. Yeah. Makes total sense. And then how do you put guardrails in place to ensure that that two-week period actually is a two-week period and it doesn’t roll into four weeks and $10,000 and then you still get no results?
Jyll Saskin Gales: Yeah. It’s not even a specific time frame. Generally my smart bidding path, it’ll be like “maximize conversions.” I will even launch new campaigns on “maximize conversions” these days. And our goal is to get about 30 conversions in 30 days. If we’re not going to get that, it can take longer. Like there’s one Google Ads coaching client—I’ve been working with her for years. She’s my favorite. Don’t tell my others. And we recently launched a Performance Max campaign in one of her accounts, and first month was terrible. There was like two purchases, and then second month was terrible. There were like six purchases, and we were like, “we knew PMAX wasn’t going to work.”
But then we went and looked at the business overall, not just the PMAX campaign. And this PMAX campaign was only advertising three of her products, like three very specific products. Overall sales for those products had skyrocketed. Not attributable to anything else. Like the only change—she wasn’t doing any other email or social—just we launched PMAX, and then within a week, sales for this thing went up by a ton and stayed there.
And so that’s something else I say—sorry, it’s a little tangent from the question you asked—but thinking about guardrails and giving the right guidance, especially with a campaign like PMAX where ads can show everywhere in the platform, the data isn’t always going to tell the whole story. I’m happy to report that in this case, by month three, even in-platform reporting showed PMAX is profitable. But to give it the right guardrails, you have to feed it the right conversion data.
For e-commerce, that’s purchases. For leads, that’s ideally an actual customer, not just a form fill or a phone call. And then you need to give it at least 30 conversions in 30 days. Or if you’re a lower-volume account where that’s not going to happen, then it can take two, three, four months for the thing to learn. And if you give up on it too soon—you know, it’s like I always say, it’s like looking at my one-year-old baby and saying, “well, why isn’t he talking yet? I talk to him every single day and he can’t even talk yet. What an idiot.” But that’s what we’re doing to our campaigns when we only give them four conversions and say, “ugh, what an idiot, it’s not hitting the CPA I was able to get.”
Frederick Vallaeys: That’s a great analogy. And so then like these low-volume campaigns—so you said just give it a couple of months. Are there other techniques you like to use? I mean, a lot of people talk about micro and macro conversions. Is that a good way to cycle these campaigns up a little bit faster and then eventually switch into the conversion you actually care about, or do you see that as dangerous?
Jyll Saskin Gales: I do find micro conversions very helpful. And so to clarify for those who may not have used them before, a micro conversion is something that happens before conversion that isn’t the final thing you want, but must happen on the journey. So in e-commerce, like someone has to start checkout before they can purchase, but someone doesn’t necessarily have to view your “about” page. So like that wouldn’t make a good micro conversion.
So yes, for e-commerce, if we’re starting with a small business, sometimes I’ll make “add to cart,” you know, the conversion for the campaign just as a starting point. I might set it so that only the first “add to cart” counts—so if someone adds 20 things to cart, the campaign doesn’t go crazy. But I keep the other actions in there as well, so we might be optimizing for all three of “add to cart,” “begin checkout,” and “purchase.” Then, once we build up enough of the next action—once “begin checkout” reaches enough, which is sort of like 30 conversions—set “add to cart” to secondary. And once “purchase” reaches that 30 to 50 conversion threshold, set “begin checkout” to secondary.
And that way you can still see that data in there. You can see if there’s a big drop-off somewhere in the funnel. In lead gen, I mean, a form fill arguably is a micro conversion—then you’re waiting for that offline conversion data to come in later. So that can be very helpful for smaller businesses, but it can be tricky.
Sometimes there’s a business that they’re only going to get three or four leads a month, and they don’t have a CRM, they have a spreadsheet. So offline conversion tracking is not going to happen. And so in those circumstances, you can use things like on-page engagement or clicks or scrolls or things like that. But ultimately that might be a scenario where, well, we know the bidding isn’t getting the information it needs. So at least let’s make sure the targeting is super dialed in, exact as much as we’re able to. It’s tricky.
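The micro-conversion “ladder” Jyll walks through—optimize for add-to-cart first, then demote each earlier action to secondary once the next step down the funnel has enough volume—can be sketched as a small decision helper. This is an illustrative sketch only: the function, data shape, and action names are hypothetical (it is not a Google Ads API call), and the 30-conversion threshold is her rough rule of thumb from the conversation.

```python
# Sketch of the micro-conversion ladder: demote an action to "secondary"
# once any later funnel step has reached the conversion threshold.
FUNNEL = ["add_to_cart", "begin_checkout", "purchase"]  # ordered funnel steps
THRESHOLD = 30  # rough "enough conversions in ~30 days" rule of thumb

def classify_actions(counts_30d: dict[str, int]) -> dict[str, str]:
    """Return 'primary' or 'secondary' for each funnel action."""
    status = {}
    for i, action in enumerate(FUNNEL):
        later_steps = FUNNEL[i + 1:]
        # Demote this action once a later step has enough volume of its own.
        if any(counts_30d.get(step, 0) >= THRESHOLD for step in later_steps):
            status[action] = "secondary"
        else:
            status[action] = "primary"
    return status

# Early on, only add-to-cart has volume, so everything stays primary:
print(classify_actions({"add_to_cart": 45, "begin_checkout": 8, "purchase": 2}))
# Once begin_checkout crosses the threshold, add_to_cart gets demoted:
print(classify_actions({"add_to_cart": 200, "begin_checkout": 35, "purchase": 12}))
```

In a real account these counts would come from conversion reporting, and the primary/secondary change is made manually in the conversion goal settings—the point is just that the promotion rule is mechanical once you’ve defined the funnel order.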
Frederick Vallaeys: It is. You wrote on Search Engine Land about signal duplication as well as signal fragmentation. Talk a little bit about what that means and how your structure can basically help or hurt your data levels, and how to make sure that, like you said, you’re not counting 20 add-to-carts as individual conversions but counting what matters.
Jyll Saskin Gales: Yeah. So this is in relation to what we spoke about earlier with audience signals—like not having four different ad groups to test different audience signals. If you’re running a Demand Gen campaign and you have three different ad groups that are genuinely targeting different audiences, then by all means.
But even in that scenario, you know, you would want to make sure that your audiences are different enough or in search, making sure that if you have your different ad groups that they’re different enough and not all just stealing from one another.
I was on a coaching call earlier today with a client who had one ad group of people looking for houses for sale and another of people looking for new houses for sale. And so we had a long conversation about, you know, do we consolidate those together? Do we try to split those apart? What does it depend on? And there’s not a one-size-fits-all answer for that. It’s similar intent, so does it make more sense to have it together, or do we think that people looking for a new home are just so fundamentally different from other homebuyers that it deserves its own thing?
So I don’t have a blanket, you know, “here’s what you should do.” But I would look at your keywords or your audiences and make sure that they are different enough if you put them into different ad groups and that the creative, importantly, is different as well and conveys the different messaging that will appeal to those different searchers or those different audiences.
And yes, signals are just signals, not targeting.
Frederick Vallaeys: Great. So that’s bidding. Let’s talk a little bit about the broader topic of automation and what are your thoughts around strategic automation versus blind delegation.
Jyll Saskin Gales: Ooh, strategic automation versus blind delegation. I guess something I do notice with some newer Google Ads practitioners who’ve grown up in the world where like it’s AI first, you know what I mean? They’re never touching exact match. They’re never touching manual bidding. They’re all in on the automation. And I love that. I’m not saying that’s a bad thing. But then the challenge is like not knowing what that actually means under the hood, you know?
So broad match keywords—like yeah, broad match keywords can absolutely be the right choice, but if you look at your search terms report and 90% of it is irrelevant, maybe there’s something going on here we might want to look into.
So I’m very much with people and with Google Ads. I am a “trust but verify” kind of gal. So I like to make sure I give the automation good directions via the conversion tracking and the bid strategy, but then I will be checking the reports, which we now have on all campaign types, to see what it’s actually doing.
The key is if it’s not doing what I want, I don’t sit there and curse Google or, “oh, Google’s shaking the cushions. Why would they match me to this search?” It’s like no, it’s doing exactly what I told it to do. That might not be what I want, but it’s doing exactly what I told it to do. And so the gap between what I told it to do and what I want, I as the practitioner need to figure out how I adjust my directions so that those become the same—what it’s doing and what I want. And if you’re just all in on automation when it’s going well, that’s great. But if it doesn’t go well, then you don’t know how to fix it and you don’t know what’s wrong and you don’t know how to do that troubleshooting process. So that’s where it can become dangerous to say, “oh, turn my PMAX on. All right, I’ll check in on that in 30 days. I’m sure it’ll be fine.” Like well, it might be fine, but it might really not be.
Frederick Vallaeys: So a good reason to take those courses and build a real strong foundation in how Google Ads works, because ultimately, yes, I completely agree. You have to understand what these automations are doing, what they are controlling, so that if it doesn’t work out, you have recourse. You can actually dial it back and understand what wasn’t working.
And the other thing I love about understanding how things work is now you can start piecing things together in unique ways. You can say, “okay, well, let me use the best of bidding automation, and let me maybe figure out how to use GPT to help with the creative components that you were talking about, and let me string these things together so that I can scale my output tremendously.”
And I suppose that’s where we get into strategic automation. It’s about thinking, where is it we can add the value? And then the delegation goes to the machine. But like we’re saying, don’t blindly delegate—have checks and balances, have human-in-the-loop, have controls, because ultimately, you know, these systems are not dumb, but they can be very literal in following instructions, right? And then that doesn’t always end up with the right results.
Jyll Saskin Gales: I always say, remember, it is artificially intelligent. We always focus on the “intelligent” part and forget about the “artificial” part. So on another coaching call I had today—I had four calls today before this—the client had used ChatGPT to write all her headlines. She’s a business owner, so her attitude was, “ChatGPT gave me some headlines. Boom. Done.”
And looking at them, I could tell because they were just so generic. This person runs a personal training gym. And I said to her, I was like, literally any personal training gym in the entire world could use these exact headlines, which is why they’re not the right headlines for you. So you need to give ChatGPT or Gemini information about what makes your gym unique, what’s unique about your services, give it your customer reviews, give it your landing pages, have it do deep research on your competitors to see what they emphasize, you know, and then based on all that, okay, have it help you write headlines that uniquely position your personal training and aren’t just generically, “yay, personal training is great.”
Frederick Vallaeys: And that’s interesting. So you mentioned deep research. So what exactly did you have Deep Research do?
Jyll Saskin Gales: I actually—it’s my Gemini sweatshirt I’m wearing right now. So I use Gemini Deep Research, and rather than trying to spit an answer out to you right away, it puts a research plan together, you can adjust the research plan, then it goes out and checks like 30 to 50 different sources and comes back with an answer for you.
And so I use it selfishly in my business. I’ll be like, “I’m a small business owner and I need help with my Google Ads. Who should I hire?” And I love seeing not just the final answer of who it comes back with, but looking at all the steps it takes.
This is actually something I learned from Steve Toth. I attended SEO IRL last week, an SEO conference in Toronto he runs, and in his talk he mentioned how useful this is for SEO—I think it’s great for PPC as well—to see how it decides to research certain factors. So it’s looking at, you know, what is the payment model, is it online or in-person, what about time zones, what else is included, is it a subscription—all these things I wouldn’t even have thought of that go into how people decide.
He gave great examples from the B2B software space—you know, what is it compatible with, does it have these key features. Looking at how Deep Research thinks can be really helpful because it can tell you what your potential customers might be considering and searching for as they compare you to competitors. As a business owner, that’s super helpful info for me. But specifically when it comes to ads, it’s really helpful for seeing the kinds of features or talking points to include in your ad text.
Frederick Vallaeys: Yeah, I love that. Any other examples of using Gemini or others to do amazing things?
Jyll Saskin Gales: For Google Ads specifically?
Frederick Vallaeys: Anything that as a business person or as a human being we might find interesting.
Jyll Saskin Gales: Oh yes. I mean, I use Gemini quite a bit in my business. I think the main way I’m using it now is I will speak to it ideas and then it kind of cleans it up for me. So if I’m, you know, writing an article—like I write for Search Engine Land and WordStream and other publications—I will just talk, talk, talk all my ideas, and then it cleans it up for me. So it’s not going out and researching and pulling anything else in, but it’s helped me organize my ideas in a new way.
Same when I’m writing issues of my newsletter. I have a bi-weekly newsletter called “The Insider,” where in every issue I share a real story from a real Google Ads coaching call, like what the challenge was and how we solved it. And so again, I’ll give Gemini the transcript of the call, and then I’ll say, you know, “I really want to focus on this part and make sure we mention that, and I think this is interesting,” and then it drafts it for me.
So like as a thought organizer, I would say, is a really key way I’m using AI. And then something else that I just started testing with my Inside Google Ads course members is I created a Jyllbot using NotebookLM. Have you played with NotebookLM at all, or I guess you’re way beyond that in your AI skills?
Frederick Vallaeys: Well, yeah, I have played with it a long time ago.
Jyll Saskin Gales: Yeah. So I gave NotebookLM for now just like all my podcast transcripts, all my Search Engine Land articles, all my stuff. And I’m letting my members beta test it, and about 25 of them are. And they’re saying like the feedback—it’s saying exactly what Jyll would say because it’s only trained on exactly what I give it, you know, no more, no less. So I’m really excited in the potential there to scale my expertise and knowledge through AI.
Because oh man, I’ve seen the Google Ads advice that ChatGPT can give, and it is sad. It’s good for my business, but it is sad for the people who trust the advice.
Frederick Vallaeys: Interesting. And maybe that speaks to the point then, like you were saying before, that you just need to ask it the right questions and give it enough context.
Jyll Saskin Gales: Yes.
Frederick Vallaeys: Because I agree, if you just say, “hey, how should I improve my PPC campaign,” with no context of any sort, then yeah, it’s going to rehash the 10 most common things that everybody would have said. But these AI systems get better if you put constraints on them: tell it what your business is, what you’ve tried in the past, what has worked, what hasn’t. Then it can be really helpful in terms of suggesting things. But of course, always be careful when you give it numbers about your business, because the math may not pan out.
Jyll Saskin Gales: Actually, I’m curious what you think, Fred. I’ve also thought some of the issues might be because Google Ads changes so quickly, and they’re trained on everything that’s ever been published ever, that they have stuff in there from three years ago, five years ago, 10 years ago, and probably way more of that than like the more recent stuff. And so because of that, like I find they don’t know that exact match doesn’t mean exact match anymore or that Demand Gen is a campaign type and I’m not talking about the concept of demand generation. You know these systems better than I do. So do you think that might be contributing to it, or are they pretty good at knowing like the importance of recent versus older data?
Frederick Vallaeys: No, I don’t know that recency of the post gets taken into account when they calculate the weights. And I suppose that’s a good reason to force it to use its search capabilities or to do Deep Research so that it actually goes and finds what’s relevant. There’s even like new search engines, so you can specifically say, “find me things that have been announced in the last 30 days and factor that into how we should manage these campaigns.”
Currently I’m playing with Vibe Coding to give me a diff of what’s in the Google help materials so that I can point out, okay, here’s a little nuance of how they’ve changed things, and okay, we should talk about that. We should integrate it into our help materials for Optmyzr obviously, and articles that we write.
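Fred’s “diff the help materials” idea boils down to caching a copy of a documentation page and comparing it against the live version on each run. The sketch below shows the core of that with Python’s standard `difflib`; the actual Optmyzr tooling isn’t described in detail, so the function names are hypothetical, and the sample strings stand in for text that would really be fetched from a help-center URL (e.g. with `urllib.request.urlopen`).

```python
# Minimal sketch: surface what changed between a cached copy of a help
# page and the current version, using a unified diff.
import difflib

def diff_pages(old_text: str, new_text: str) -> list[str]:
    """Return unified-diff lines between the cached and live page text."""
    return list(difflib.unified_diff(
        old_text.splitlines(), new_text.splitlines(),
        fromfile="cached", tofile="live", lineterm="",
    ))

# In practice old_text comes from a file written on the previous run and
# new_text from fetching the help-center URL; inline strings keep the
# sketch self-contained.
old = "Exact match: matches the exact term.\nBroad match: related searches."
new = "Exact match: matches the meaning of the term.\nBroad match: related searches."
for line in diff_pages(old, new):
    print(line)  # changed lines show up prefixed with - and +
```

Each changed line is a candidate “little nuance” worth folding into help materials or articles, which is exactly the workflow Fred describes.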
Again, I think it goes back to understanding how these systems work so that you can take the limitations and not be bound by the limitations, but actually figure out how to do it better than anyone else. Because obviously the ability of these systems to do amazing work is there. You just have to know how to pull that out of it. And if that’s what you can do, you’re just going to scale yourself so much faster than anyone else.
Jyll Saskin Gales: It’s true. That’s why I say AI is an average maker: it allows people who don’t have the skills or knowledge to get average results. So the way to get amazing results with AI is for you, the human, to bring your amazing expertise to the AI to accomplish amazing things together. AI on its own is not amazing—it’s average. It’s literally the average of everything we know. So don’t forget your human amazingness, amplified with AI.
Frederick Vallaeys: Exactly. Well, good. Let’s wrap up. I want to talk about one more new shiny toy from Google. I’m not sure if you have much of an opinion about this, but AI Max. So the replacement in some way for DSAs, automatically generated creatives. It is an add-on to search campaigns. It’s not a separate campaign, but where does this fit in for you? Like how has it been useful, or are we still waiting?
Jyll Saskin Gales: It’s early days. I mean, the fact that you mentioned AI Max means we’re now going to get twice as many views because it’s literally all anyone wants to know about nowadays. Actually, my first LinkedIn post to ever get 100,000 impressions was when I posted about AI Max the day it came out. Fun fact.
But yes, AI Max is one of those things where it’s just so new. We don’t—we don’t know what we don’t know. Like when Performance Max first came out, it was terrible. It was terrible, and it learned and it got better, and now it’s less terrible. No, now it’s pretty good if you have the right things in place.
AI Max is in the early terrible phase, not because the product itself is terrible—I want to be really clear—but because we don’t really know how to use it yet. Again, I bring it back to the like—it’s the humility and human error, not that the tool is wrong.
AI Max’s job is to go out and basically find whatever searches it wants that it thinks will work well for your business. And so it really grinds my gears when I see people posting on LinkedIn, “you know, I turned on AI Max and look at these search terms on the first day. It’s so stupid.” And again, to bring it back to the baby analogy, it’s like, “I gave birth to this baby yesterday. He can’t walk yet. He’s so stupid.” Like yeah, you literally just turned it on. Of course it’s going to be stupid. It hasn’t learned anything yet.
So I think there’s great promise with AI Max, but I also don’t think most advertisers need AI Max right now. So I’ll turn this into an action. Like if you have maxed out your impression share, which for non-brand usually means 30 to 40%, but of course it always depends, and you want to spend more budget, but you’ve just maxed out the opportunity on your current keywords and you’re already using broad match, then sure, turn on AI Max and potentially expand even further.
But you’re going to want to have your conversion tracking in place—I would say more than 30 conversions in 30 days, probably more like 50 to 60—and at least target CPA. Target ROAS is more finicky, so probably target CPA in search. Have all those things in place. And if you’re a small business owner spending $30 a day, limited by budget, with less than 10% search impression share, and then you turn on AI Max—yeah, you’re not going to like it. But the problem’s not AI Max. The problem is you.
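Jyll’s readiness conditions can be read as a simple checklist. The sketch below encodes them as a function returning the list of blockers; the thresholds are her rough rules of thumb from this conversation (30–40% non-brand impression share as “maxed out,” 50–60 conversions in 30 days), not official Google guidance, and the function itself is purely illustrative.

```python
# Hypothetical checklist for "should I even test AI Max yet?",
# based on the conditions described in the conversation.
def ai_max_ready(impression_share: float, conversions_30d: int,
                 using_broad_match: bool, limited_by_budget: bool) -> list[str]:
    """Return the list of blockers; an empty list means worth testing."""
    blockers = []
    if impression_share < 0.30:   # non-brand "maxed out" is roughly 30-40%
        blockers.append("impression share not maxed out yet")
    if conversions_30d < 50:      # ~50-60 suggested, beyond the bare 30
        blockers.append("not enough conversion volume for smart bidding")
    if not using_broad_match:
        blockers.append("exhaust broad match before expanding further")
    if limited_by_budget:
        blockers.append("limited by budget: spend existing opportunity first")
    return blockers

# The small-advertiser scenario from the conversation: low impression
# share, low volume, limited by budget — every check fails.
print(ai_max_ready(0.08, 12, False, True))
```

An advertiser who clears all four checks gets an empty list back, which matches Jyll’s framing: AI Max is for expanding once the existing opportunity is genuinely exhausted.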
Frederick Vallaeys: And with that—Google Ads is complicated. For all the automation and new campaign types coming out, it just means more decisions for us, knowing exactly what parameters to put in place to make these new campaign types or settings successful. And that’s why sometimes we need a coach. So Jyll, thank you for sharing all your great insights. If anyone listening wants to get in touch with you, what should they do?
Jyll Saskin Gales: Thanks, Fred. The best place to find me is my website. You’ll find everything about my coaching, newsletter, courses, etc. there. And you can also follow me on LinkedIn. I post every weekday about Google Ads, and that’s where I get to meet and get to know lovely folks like you, Fred, and others in the industry, so we can all share ideas and grow and learn together.
Frederick Vallaeys: Great. Well, thank you, Jyll. And thank you everyone for watching. If you’ve enjoyed this episode with Jyll, please give it a like and share it with your friends. Subscribe to the channel so you know when the next episode comes out. Thanks for watching, and we’ll see everyone for the next one.