With the advent of machine learning in various facets of our lives, whether at home with personal AI assistants like Alexa or at work, the technological and digital landscape is rapidly changing. The nature of search has begun to shift and adapt in response to these changes as well. As digital marketers, we often find ourselves working manually to improve results, pulling back the curtain and shifting the gears of paid campaigns down to the most minute details. With Google's automated bid strategies, collectively called "Smart Bidding," machine learning offers a logical solution to test in the effort to optimize your campaigns.
Smart Bidding strategies are automated bid strategies that use machine learning to optimize for conversions or conversion value; they include Target CPA, Maximize Conversions, Target ROAS, and Enhanced CPC. More information about Smart Bidding can be found in Google's help documentation.
In this post, we’ll focus on Target CPA and Maximize Conversions. According to Google, Target CPA will get as many conversions as possible within your budget at your set target CPA. Maximize Conversions gets you the most conversions possible while spending your budget.
In May, we wrote a blog post on our experiences with the Maximize Conversions beta, and since then we've tested the setting in its fully rolled-out version. Results may differ between the two experiment types, but since Google doesn't publicly share its algorithm updates, we can't pinpoint which differences those updates may have caused. All the more reason to test!
Automated bidding, when it works well, can help reach your goals faster, especially if they're well-defined. Have a strict ROAS goal? Try Target ROAS to stay on track. Need as many conversions as possible and have a flexible budget? Maximize Conversions might be the one for you.
Automated bidding also frees up time to work on other critical tasks. If you spend 15 minutes per day reviewing and updating your bids, automated bidding can help you save over an hour per week. Throughout the year, that’s over a full week of time to explore other strategies.
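As a rough sanity check on that math (assuming a five-day workweek, with a 40-hour workweek as the yearly baseline):

```python
# Back-of-the-envelope check of the time-savings claim above.
MINUTES_PER_DAY = 15    # daily bid review time from the post
WORKDAYS_PER_WEEK = 5   # assumption: five-day workweek
WEEKS_PER_YEAR = 52

weekly_minutes = MINUTES_PER_DAY * WORKDAYS_PER_WEEK   # minutes saved per week
yearly_hours = weekly_minutes * WEEKS_PER_YEAR / 60    # hours saved per year

print(f"Saved per week: {weekly_minutes} minutes")   # more than an hour
print(f"Saved per year: {yearly_hours} hours")       # more than a 40-hour workweek
```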
Additionally, Google has been emphasizing (and improving!) automated bidding strategies, so it’s not going away anytime soon. Time to hop on board!
We’ve tested Target CPA across a few clients and have seen some great results. (All examples in this section happen to be B2B, but note that Target CPA successes are not limited to B2B!)
One B2B client wanted to reduce CPA in their remarketing campaigns, so they tested Target CPA. Conversions increased 69% while CPA decreased 61%. Total cost decreased 36%.
Another B2B client's non-brand CPA was over three times what they were comfortable paying. To start, the team targeted a CPA 15% below the current CPA. Initially, CPA decreased 42% while total spend decreased 48%. Conversion volume stayed nearly flat, ultimately decreasing by just one conversion.
Some advertisers may see CPA decrease only slightly, while other metrics improve. One of our B2B clients saw CPA decrease by 8% while total conversions increased by 46%. Although CPA did not reach the target that was set, it still fell.
Other times, clients saw more mixed results. A higher education client targeted CPA 20% below their current CPA for 30 days: overall, CPA fell 9%, while conversions stayed relatively flat. Performance varied by campaign—one non-brand campaign saw CPA drop $78 and conversions increase by 67%, while another non-brand campaign saw CPA increase by $150 and conversions decrease by 50%.
Our team adjusted target CPA levels at the campaign level and ran this experiment for another 30 days, seeing improved CPA performance—CPA decreased 23% and conversions decreased 16%, but CVR increased by 5%.
Just like Target CPA, we’ve seen some positive and not-so-positive results with Maximize Conversions. (The two examples of positive performance we’ll share are both, coincidentally, higher education advertisers. But that doesn’t mean that good performance is limited to higher education, or that all higher education advertisers will always see positive performance.)
We tested Maximize Conversions for one of our higher education clients when it was still in beta (the same higher ed client above that saw mixed results with Target CPA). Conversions increased 71%, while CPA decreased 45% and average CPC decreased 37%.
For another higher education client, we tested Maximize Conversions after the setting was fully rolled out to advertisers. Conversions increased 7%, while CPA decreased 11% and average CPC decreased 9%.
One of our B2B clients ran two Maximize Conversions experiments (the same client that saw conversions increase 69% and CPA decrease 61% with Target CPA). The first experiment was highly successful: conversions increased from 39 to 53 and CPA decreased from $108 to $79. Based on that performance, Maximize Conversions was implemented in the campaign, but since the rollout, CPA has climbed to roughly three times its pre-implementation level. In the second experiment, conversions increased from 13 to 22, but CPA more than quadrupled, from $39 to $165.
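When comparing experiment arms like these, it helps to compute percent changes consistently. A minimal sketch (the helper name is ours; the sample figures are from the first experiment above):

```python
def pct_change(before: float, after: float) -> float:
    """Percent change from a baseline value to an experiment value."""
    return (after - before) / before * 100

# First Maximize Conversions experiment described above:
conversions_change = pct_change(39, 53)   # conversions: 39 -> 53
cpa_change = pct_change(108, 79)          # CPA: $108 -> $79

print(f"Conversions: {conversions_change:+.0f}%")  # about +36%
print(f"CPA: {cpa_change:+.0f}%")                  # about -27%
```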
Google’s algorithms are constantly learning and improving, but that doesn’t mean they’ll always work for every campaign. Set yourself up for success and consider the following tips that we’ve learned from Smart Bidding tests:
- Testing Maximize Conversions can put your CPA at risk, so test this setting only if conversion volume matters more to you than CPA and spend.
- If you need to stay within a specific CPA range, test Target CPA instead. Keep in mind that your conversion volume may decrease.
- Rather than changing all your settings and risking performance, run an AdWords Experiment! You can run a 50/50 split to get results more quickly, or a lower-risk split, like 80/20, if you're concerned about poor performance.
- Test specific campaigns rather than your whole account at once. Whether your results are good or bad, you won’t risk performance in your entire account. Make sure you choose campaigns that get enough volume to produce statistically significant results. Then, regardless of performance, try testing a second round of campaigns to see if performance differs.
- Check in on performance roughly once per week. Understand that the algorithm takes time to learn; you likely won't see performance pick up until at least two weeks after the experiment begins, and it usually takes at least four weeks for the algorithm to optimize toward the true winner. If performance is significantly worse before you hit the four-week mark, end your experiment early, but keep in mind that results can improve significantly with more experiment time.
- For Target CPA specifically, start with a target above your ultimate goal to see how performance shifts initially. Then, adjust your target based on conversion volume. If the target CPA is set too low, the algorithm may not be able to optimize properly, and your results could tank.
- Weed out inefficient ad spend that could slow down the algorithm’s machine learning. Audit your search terms report for irrelevant impressions and clicks that could dilute the data. No time to sift through the queries? Recruit Seer to provide a PPC Efficiency Audit that will tell you exactly which keywords to negate, and also put checks and balances on Google’s smart bidding features.
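On the statistical-significance point above: one common way to check whether a conversion-rate difference between a control and an experiment arm is real is a two-proportion z-test. A minimal sketch in Python (the function name and the sample numbers are illustrative, not from the post):

```python
from math import sqrt, erf

def two_proportion_p_value(conv_a: int, clicks_a: int,
                           conv_b: int, clicks_b: int) -> float:
    """Two-sided p-value for a difference in two conversion rates."""
    p_a = conv_a / clicks_a
    p_b = conv_b / clicks_b
    pooled = (conv_a + conv_b) / (clicks_a + clicks_b)
    se = sqrt(pooled * (1 - pooled) * (1 / clicks_a + 1 / clicks_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Illustrative example: 40/1000 control vs. 65/1000 experiment conversions.
p = two_proportion_p_value(40, 1000, 65, 1000)
print(f"p-value: {p:.3f}")  # below 0.05, so the difference is likely real
```

If the p-value stays above 0.05, the campaign probably hasn't accumulated enough volume yet to call a winner.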
The biggest takeaway for Google's Smart Bidding is to test, test, test. As marketers, we have the ability to approach our campaigns scientifically and to thoughtfully introduce new optimizations. Not every campaign will bloom with exciting CPA or CVR results, but making the effort to test these new bidding strategies is a key part of optimizing your campaigns.