How To Measure PPC Performance When AI Controls The Auction (Answer: You Can’t. Sorry.)
Google gave you Smart Bidding and told you it was a gift. Like a box of chocolates except you don’t get to see what’s inside and also you can’t stop eating them and also they cost eight thousand dollars a month.
You used to know things. Cost per click. Which keyword converted. What time of day your best leads came in. You had dashboards. You had control. You felt like a marketer.
Now you have AI that bids for you, conversion tracking that’s “privacy-focused” (their words, not yours), and a Google rep who keeps saying “trust the algorithm” like it’s a mantra they learned at a mountain retreat where everyone wore the same polo shirt.
And you’re supposed to measure performance. Prove ROI. Show the boss what’s working.
Except you can’t. Because the auction is a black box, the data is a suggestion, and the guy who told you Smart Bidding was the future is the same guy selling Smart Bidding.
Let me be clear: you cannot measure PPC performance the way you used to when Google’s AI controls the auction. You can measure something. But it’s not performance. It’s a story Google tells you about performance, and the story changes depending on what they’re trying to sell you next quarter.
The Metrics You Trusted Are Gone (And They’re Not Coming Back)
Remember when you could see exactly which search term triggered your ad? When you could track a conversion back to a keyword, a device, a time of day, and make actual decisions based on actual data?
That’s over.
Google started by lumping your search terms into “low-volume” buckets. Then they expanded Broad Match until it meant “whatever the hell we think might be related.” Then they rolled out Performance Max, which is just Google saying “give us your budget and your creative and shut up.”
You can’t optimize what you can’t see. And Google has made damn sure you can’t see much.
The "SEO without the BS" version of this conversation is simple: they took your data, kept the insights for themselves, and sold you back a simplified dashboard with big green arrows that make the C-suite feel good during quarterly reviews.
Smart Bidding doesn’t tell you what it’s learning. It doesn’t explain why it bid higher on one auction and lower on another. It just… does things. And then it asks for more budget because the things it did “need time to optimize.”
Cool. Very scientific. Much data.
What Google Says You Should Measure (LOL)
Google’s official line is that you should focus on “outcome-based metrics.” Conversions. Revenue. ROAS. The stuff that matters to the business.
Which sounds reasonable until you realize that Google also controls how conversions are attributed, how revenue is reported, and whether the data you’re looking at is modeled, sampled, or just vibes.
Let me translate Google’s advice:
- “Focus on conversions” = Stop asking which keywords work and just keep spending
- “Trust Smart Bidding” = We’re not showing you the levers because you’d pull the wrong ones
- “Give the algorithm time” = Keep the money flowing while we figure out if this thing actually works
- “Use broad match with Smart Bidding” = Let us show your ad for whatever we want and call it machine learning
If this were SEO snake oil, they’d be selling you a course about it. But it’s PPC, so they just sell you more clicks.
The Conversion Tracking Theater
You set up conversion tracking. You fired the pixels. You validated the tags in Tag Manager. You even triple-checked that the thank-you page was firing correctly.
And then Google decided that actually, they’re going to model some of those conversions. Because privacy. Because iOS updates. Because they said so.
Modeled conversions are Google’s way of saying “we think this probably happened but we can’t prove it so we’re just gonna put it in the report and hope you don’t notice.”
And the kicker? You can’t turn it off. You can’t opt out. You can’t even cleanly separate modeled from observed conversions in the interface, so good luck explaining to your CMO which numbers in the dashboard are real and which deserve an asterisk that says “educated guess.”
The bad SEO advice equivalent would be someone telling you that impressions are the same as traffic. Except in PPC, modeled conversions are literally in your conversion column, inflating your numbers, making Smart Bidding look better than it is.
It’s performance theater. And you’re both the audience and the funding source.
Attribution Is a Controlled Hallucination
Google used to offer you six attribution models. Data-driven. Last-click. First-click. Linear. Time-decay. Position-based. Then in 2023 they quietly killed everything except data-driven and last-click, because choice was apparently confusing you.
And every single one of them is a story. A narrative. A way of looking at the same set of touches and deciding which one “counts.”
Data-driven attribution sounds impressive until you learn that it’s based on aggregated data from other advertisers and weighted by Google’s model of what “usually” drives conversions in your vertical. It’s not your data. It’s everyone’s data, averaged, smoothed, and served back to you as insight.
Last-click attribution gives all the credit to the final interaction, which makes your branded search campaigns look like gods and your prospecting campaigns look like expensive hobbies.
First-click makes your top-of-funnel stuff look great and ignores everything that actually closed the deal.
Every model is wrong. Some are useful. But when AI controls the bidding and the attribution model and the reporting, you’re not measuring performance—you’re watching Google’s version of your performance, edited for maximum ad spend.
Performance Max: The Ultimate “Just Trust Us” Product
Performance Max is what happens when Google gets tired of you asking questions.
You give them your assets. Your audiences. Your budget. And they show your ads… somewhere. To someone. At some cost. And they tell you it’s working.
You can’t see search terms. You can’t exclude placements with any real precision. You can’t control which asset shows in which context. You just… let it run. And then you look at the “Insights” tab, which is about as insightful as a horoscope written by someone who doesn’t know your sign.
“Your ads performed well in the Shopping category.” Cool. Which products? Which queries? What time of day? What device?
“We recommend increasing your budget to capture more conversions.” Based on what, exactly? What signal tells you there’s headroom? Or is this just the PPC equivalent of SEO gurus saying “content is king” and then ranking Reddit threads from 2009?
Performance Max is the final form of PPC: a black box that spends your money and shows you a dashboard that says “good job, keep going.”
If you wanted real SEO results, you’d ask for transparency. But this is PPC in 2024. Transparency is a legacy feature.
The Metrics That Still Exist (But Mean Less Than You Think)
You still have some numbers. Impressions. Clicks. CTR. Average CPC. Conversion rate. ROAS.
But here’s the thing: when AI controls the auction, those metrics are outputs, not inputs. You’re not controlling them anymore. You’re observing them. Like a nature documentary, except the narrator is a Google rep and the animals are your credit card charges.
Impressions: Sure, you got 50,000 impressions. Were they good impressions? High-intent? Or did Smart Bidding decide to show your ad to anyone who vaguely thought about your category six months ago? You don’t know. The algorithm knows. The algorithm isn’t telling.
Clicks: You got clicks. Congrats. Did they come from search terms you’d actually want to rank for? Or did Broad Match decide that “best running shoes” and “how to run away from problems” are basically the same thing? Check your search terms report. Oh wait, half of it says “other search terms.” Right.
Conversion rate: This one’s fun because it’s based on conversions that might be modeled and clicks that might be from garbage traffic. So you’re dividing a guess by a mystery and calling it performance.
ROAS: Return on ad spend. The holy grail. Except the “return” is based on Google’s attribution model, the “ad spend” includes clicks you never wanted, and the whole thing is optimized by an algorithm that has a vested interest in you spending more.
These metrics used to mean something when you controlled the levers. Now they’re just the numbers Google gives you so you have something to put in a slide deck.
What Actually Matters (And How Little You Can Do About It)
If you can’t measure PPC performance the old way, what can you measure?
Incrementality. Does PPC actually drive new revenue, or is it just intercepting people who were going to buy anyway? Run a geo holdout test. Turn off PPC in one region, leave it on in another, measure the difference. It’s not perfect, but it’s one of the few ways to see if your spend is doing anything beyond catching branded traffic you already owned.
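The geo holdout math fits on a napkin. Here’s a sketch with invented numbers — the region names, revenue figures, and pre-test ratio are all hypothetical, and a real test needs more regions and a longer window:

```python
# Hypothetical geo holdout analysis. All figures are illustrative,
# not from any real account.

def incremental_lift(test_revenue, control_revenue, baseline_ratio=1.0):
    """Estimate how much revenue PPC actually added.

    test_revenue:    revenue in regions where PPC stayed on
    control_revenue: revenue in regions where PPC was paused
    baseline_ratio:  test/control revenue ratio from a pre-test period,
                     used to normalize regions of different size
    """
    expected_without_ppc = control_revenue * baseline_ratio
    lift = test_revenue - expected_without_ppc
    return lift, lift / expected_without_ppc

# Pre-test, the "on" regions normally did 1.2x the "off" regions.
lift, pct = incremental_lift(test_revenue=130_000,
                             control_revenue=100_000,
                             baseline_ratio=1.2)
print(f"Incremental revenue: ${lift:,.0f} ({pct:.1%} lift)")
# -> Incremental revenue: $10,000 (8.3% lift)
```

Notice the point: without the baseline normalization, you’d naively credit PPC with $30,000 of “lift” that was really just bigger regions being bigger.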
Blended CAC. Customer acquisition cost across all channels. If PPC is part of your mix, measure what it costs to acquire a customer when you include PPC, SEO, email, social, the works. Then measure it without PPC. If the number doesn’t change much, guess what: PPC might be getting credit for conversions that other channels drove. Thanks, last-click attribution.
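Blended CAC is just division, but it’s division Google’s dashboard will never do for you. A sketch, with made-up numbers and seasonality waved away for simplicity:

```python
# Hypothetical blended-CAC comparison. All figures are invented.

def blended_cac(total_spend, new_customers):
    """Total acquisition spend across ALL channels / new customers won."""
    return total_spend / new_customers

# A month with PPC running: $50k on everything else + $20k on PPC.
with_ppc = blended_cac(total_spend=50_000 + 20_000, new_customers=500)

# A comparable month with PPC paused:
without_ppc = blended_cac(total_spend=50_000, new_customers=460)

print(f"CAC with PPC:    ${with_ppc:.2f}")     # $140.00
print(f"CAC without PPC: ${without_ppc:.2f}")  # $108.70
# PPC added $20k of spend for 40 extra customers: $500 per truly
# incremental customer, not the $140 the blended number implies.
```

If those two CAC figures barely move when PPC goes dark, the “PPC conversions” in your dashboard were mostly other channels’ customers wearing a Google costume.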
Actual profit. Revenue minus cost of goods sold minus ad spend minus everything else. If PPC is profitable after all costs, keep running it. If it’s not, Google’s dashboard full of green arrows is lying to you. This sounds obvious, but you’d be shocked how many companies are “hitting their ROAS target” while losing money on every sale.
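To see how “hitting the ROAS target” and “making money” can be two different things, run the arithmetic yourself. The figures below are hypothetical:

```python
# Hypothetical per-channel P&L. A campaign can hit a 4x ROAS target
# and still lose money once COGS and overhead are counted.

def ppc_profit(revenue, cogs_rate, ad_spend, other_costs):
    """Profit after cost of goods, ad spend, and everything else."""
    gross_margin = revenue * (1 - cogs_rate)
    return gross_margin - ad_spend - other_costs

revenue  = 100_000   # PPC-attributed revenue (per Google, so: suspect)
ad_spend = 25_000    # a 4.0x ROAS -- looks great on the dashboard

profit = ppc_profit(revenue, cogs_rate=0.6,
                    ad_spend=ad_spend,
                    other_costs=18_000)  # fulfillment, agency fee, etc.

print(f"ROAS: {revenue / ad_spend:.1f}x, actual profit: ${profit:,.0f}")
# -> ROAS: 4.0x, actual profit: $-3,000
```

A 4x ROAS on 40% gross margins with real overhead is a money-losing machine with a green arrow on it.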
Lift tests. Google offers conversion lift studies for some advertisers. They’re not perfect, but they’ll at least tell you if your campaigns are driving incremental conversions or just soaking up credit for shit that was happening anyway. Spoiler: most campaigns are doing less than you think.
But here’s the truth: even these metrics have asterisks. Incrementality tests assume Google won’t change the algorithm mid-test. Blended CAC assumes your attribution across channels is accurate (it’s not). Profit tracking assumes your conversion data is real (see: modeled conversions). Lift tests assume Google is running them in good faith (LOL).
You’re not measuring performance. You’re estimating it. With data provided by the company that profits when you keep spending.
The Part Where I Tell You What to Actually Do
You can’t measure PPC performance the way you used to. That’s not me being dramatic. That’s just the state of the game.
So what do you do?
Stop pretending you have control. You don’t. Google has the data. Google has the algorithm. Google has the auction. You have a dashboard and a budget. Act accordingly.
Measure what you can outside of Google. Use your CRM. Track customer lifetime value. Measure profit, not revenue. Ask your sales team where leads are coming from. Build your own attribution model if you have the data and the spite.
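If you have both the data and the spite, a minimum-viable attribution model over a CRM export is maybe twenty lines. The field names below are assumptions — adapt them to whatever your CRM actually spits out — and linear credit is just one story among many, but at least it’s *your* story:

```python
# A spiteful minimum-viable attribution model over first-party CRM data.
# The record shape ("value", "touches") is an assumption; map your own
# CRM export into it.

from collections import Counter

# Each closed deal with its ordered list of recorded touchpoints:
deals = [
    {"value": 5000, "touches": ["seo", "email", "ppc"]},
    {"value": 3000, "touches": ["ppc"]},
    {"value": 8000, "touches": ["referral", "seo"]},
]

def linear_credit(deals):
    """Split each deal's value evenly across every recorded touch."""
    credit = Counter()
    for deal in deals:
        share = deal["value"] / len(deal["touches"])
        for channel in deal["touches"]:
            credit[channel] += share
    return credit

for channel, value in linear_credit(deals).most_common():
    print(f"{channel:>10}: ${value:,.0f}")
```

Swap the credit rule (all-to-first, all-to-last, decayed) and watch the channel rankings reshuffle. That reshuffling is the whole lesson: attribution is a modeling choice, not a measurement.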
Run experiments. Geo tests. Holdout tests. Before-and-after tests. It’s the only way to see if PPC is doing anything beyond spending money and looking busy.
Stop trusting benchmarks. Your Google rep is going to tell you that your CTR is “above average for your industry.” Cool. Average compared to what? Other advertisers using the same black-box bidding system? That’s not a benchmark. That’s a participation trophy.
Treat Smart Bidding like a vendor, not a partner. You wouldn’t let a vendor refuse to show you invoices and then just trust they’re billing you fairly. Don’t do it with Google either. Demand transparency. When they don’t give it to you, assume the worst and plan accordingly.
Diversify. If PPC is your only acquisition channel and you can’t measure it properly, you’re one algorithm update away from a board meeting you don’t want to be in. Build other channels. Invest in SEO analysis that doesn’t require you to trust a black box. Email. Partnerships. Anything that doesn’t involve handing Google your budget and hoping for the best.
The Uncomfortable Truth Nobody Wants to Say
Most companies are flying blind on PPC. They’re spending five figures a month, sometimes six, and they genuinely do not know if it’s working.
They have dashboards. They have reports. They have ROAS targets and CPA goals and budget pacing spreadsheets. But they don’t actually know if the conversions Google is showing them are real, incremental, or just the algorithm taking credit for a branded search that was happening anyway.
And the industry is fine with this. The agencies are fine with it because they get paid on spend. The tools companies are fine with it because complexity sells software. Google is obviously fine with it because confusion drives compliance.
The only people who aren’t fine with it are the advertisers who actually have to defend their budget in a room full of people who think PPC is “the Google ads” and SEO is “when we show up first on Google.”
If you want SEO that works or PPC that works or anything that works, you need to start from the assumption that the people selling you the solution have an incentive to keep you confused.
Google benefits when you don’t understand the auction. Agencies benefit when the results are ambiguous enough that you can’t fire them. Tool companies benefit when the problem seems too complex to solve without a $400/month dashboard.
Your job is not to measure PPC performance the way you used to. That ship sailed when Google decided automation was more profitable than transparency.
Your job is to figure out if PPC is making you money despite the lack of visibility. And if it’s not, your job is to have the guts to say so in a meeting where everyone else is nodding along to the Google rep’s slide deck about “AI-powered bidding innovations.”
Why This Won’t Change (And Why That’s Actually Fine)
Google is not going to give you more data. They’re going to give you less.
Every update, every new product, every “innovation” moves in the same direction: more control for the algorithm, less visibility for you, more budget recommendations, fewer questions answered.
Performance Max is the blueprint. Broad Match with Smart Bidding is the future. Data-driven attribution is the default. And if you don’t like it, well, there’s always Bing.
(Just kidding. Bing has the same trajectory, just with fewer users and a chatbot that sometimes tells you to leave your spouse.)
This isn’t going to change because it’s working. Not for you—for Google. Revenue is up. Automation adoption is up. Advertisers are spending more and asking fewer questions.
And honestly? That’s fine.
Once you accept that PPC measurement is broken and it’s not getting fixed, you can stop trying to optimize a system you don’t control and start optimizing around it.
Measure profit. Measure incrementality. Measure customer lifetime value. Measure anything that Google doesn’t have an incentive to inflate.
And when the next SEO report or PPC white paper tells you that AI-powered bidding is “transforming the way advertisers achieve performance,” remember: the people writing that report get paid when you believe it.
You don’t have to believe it. You just have to make money despite it.
Frequently Asked Questions
- Why can’t I track PPC conversions the way I used to?
- Because Google shifted control from advertisers to AI-driven automation. Smart Bidding, broad match expansion, and limited search term visibility mean you no longer see which keywords, queries, or audiences drive results. Google replaced granular tracking with modeled conversions and aggregated data-driven attribution, giving you outcomes without the underlying insights that let you optimize manually.
- Is Google Ads hiding data from me on purpose?
- Yes. Google groups search terms into “other” or “low-volume” buckets, models conversions you can’t verify, and locks key Performance Max insights behind vague recommendations. This isn’t accidental. Reducing transparency pushes advertisers toward automated bidding products that increase spend and reduce the need for Google to justify auction mechanics or attribution decisions. Complexity favors the platform, not the advertiser.
- What metrics actually matter if AI controls bidding?
- Focus on incrementality, blended customer acquisition cost, and profit after all costs. Run geo holdout tests to measure whether PPC drives new revenue or just intercepts existing demand. Track lifetime value and CAC across all channels to see if PPC is truly contributing or inflating attribution. Relying solely on Google-reported ROAS or conversion rate means trusting a black box that profits when you spend more.
- Are PPC gurus lying about their conversion tracking dashboards?
- Not always lying—sometimes just ignorant or incentivized to ignore the gaps. Many dashboards show modeled conversions, last-click attribution, and algorithm-optimized metrics without questioning whether those numbers reflect true incrementality. Agencies and influencers benefit from complexity and confident reporting, even when the underlying data is incomplete, sampled, or shaped by Google’s attribution model rather than actual customer behavior.
- How do I prove PPC ROI when Google won’t show me what’s working?
- Measure profit, not platform metrics. Calculate actual margin after ad spend and fulfillment costs. Use CRM data to track which leads convert and their source beyond Google’s attribution. Run controlled experiments: turn off PPC in test markets and compare revenue. If your business grows when PPC is paused, Google’s dashboard was giving you credit for conversions that were happening anyway. Real ROI lives outside the Ads interface.
- Is Smart Bidding just a black box that spends my budget faster?
- Essentially, yes. Smart Bidding optimizes toward the goal you set, but it does so using data and auction signals you can’t see or verify. Google controls the attribution model, the conversion tracking, and the bid adjustments. The algorithm isn’t neutral—it’s designed by a company that profits when you spend more. Smart Bidding works for some advertisers, but “trust the algorithm” is advice that conveniently benefits Google whether or not it benefits you.
- Can I still measure PPC performance without trusting Google’s reporting?
- Yes, but it requires building measurement outside the platform. Track conversions in your CRM or analytics tool independently. Use first-party data to calculate customer acquisition cost and lifetime value. Run incrementality tests to measure lift. Compare revenue trends when PPC is on versus off. You won’t get keyword-level attribution, but you’ll know whether PPC is genuinely profitable or just taking credit for traffic you already owned. Real performance measurement happens outside Google’s dashboard.