Insights

Google’s Bard Looks Bigger than Bing’s ChatGPT Integration: Seer Perspectives on Google’s ‘Live from Paris’ Release

By Sam Poelstra, with perspectives from Wil Reynolds, Teresa Lopez, Lisa Devieux, John Lovett, Brittani Hunsaker, Alisa Scharf, Cori Graft

Yesterday (2/8/23), during a live event from its Paris office, Google answered Bing’s integration of ChatGPT.

Though some of the new features were previously unveiled at SearchOn 2022, the intended message was clear: ‘We’ve been using AI for years, but in case you forgot that we lead the industry, here’s how we’re bringing our own AI technology to multimodal search. Meet Bard, our own artificial intelligence powered by LaMDA (Language Model for Dialogue Applications).’

We’ve compiled information from the event and included perspectives from some of Seer’s most knowledgeable subject matter experts to help you understand what’s happening. 

“I believe that keywords that trigger PAAs are more likely to be disrupted. If Google or Bing is already showing ‘People Also Ask’ style answers, that means they have a grasp on the follow-up questions to come, which is exactly what we see in Bing’s ChatGPT suggestions.”

WIL REYNOLDS | FOUNDER
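If you want a quick read on how exposed your own keyword set is to this kind of disruption, the minimal sketch below is one way to start. It assumes a keyword-level CSV export from a rank-tracking tool; the file name and columns (keyword, serp_features, monthly_volume) are our own placeholders, so map them to whatever your tool actually exports.

```python
import pandas as pd

# Hypothetical rank-tracker export; the file and column names
# (keyword, serp_features, monthly_volume) are placeholders.
df = pd.read_csv("rank_tracking_export.csv")

# Flag keywords where a People Also Ask box already appears on the
# SERP. Per Wil's point above, these are the queries most likely to
# be disrupted by conversational answers.
df["has_paa"] = df["serp_features"].str.contains(
    "people also ask", case=False, na=False
)

at_risk = df[df["has_paa"]].sort_values("monthly_volume", ascending=False)
print(f"{len(at_risk)} of {len(df)} tracked keywords trigger a PAA box")
print(at_risk[["keyword", "monthly_volume"]].head(10))
```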


Here’s how Bard and AI are impacting the SERP, Google Lens, and Google Maps:

Google Search Changes: Chat AI front and center, above ads? 

It’s clear that Google has shifted its priorities in response to the waves created by ChatGPT. While Bing has integrated a version of ChatGPT into the sidebar of its desktop search results page, Google plans to integrate Bard front and center at the top of the SERP.

The examples show Bard appearing even above Google Ads. Click-through rate disruption, anyone?


That’s just the beginning. 

For queries that are “NORA, or No One Right Answer,” as described by Prabhakar Raghavan during the announcement, users will be able to refine their search queries with additional, generative questions suggested by Bard. 

To us, this sounds similar to what People Also Ask rich results attempt to do, but in a more intuitive and responsive manner, one that’s more conversational but may lead to more zero-click searches. 

In the examples shown below, a small box allows users to “Check” the results from Bard, but we don’t yet know what that looks like. Will users be able to see exactly where the information is synthesized from to verify its validity?

Bard is now open to Trusted Testers. Raghavan was sure to center Google’s AI Principles, emphasizing that Bard must meet “quality, safety, and groundedness” standards, matching the quality guidelines Google sets for all its products, before scaling globally.

Seer Team Perspectives:

“Where I see AI having the strongest hold is on searches where user intent leans more on the research and informational stage, where we also most often see People Also Ask results. In paid ads, these are often the users (or queries) we are working to negate when optimizing towards a primarily 'ready to buy' or acquisition audience.

I would expect changes over the coming year to PPC Impression Share and Click-Through Rate as the technology gets up to speed.

I also expect that Google will start to adapt the SERP based on the learnings. I wouldn't be surprised if these changes and learnings prove to be an ad performance boost in the end, automatically weeding out low-intent users.

In preparation for this shift, I recommend monitoring your core metrics across key intent-based query groupings. At Seer, we're already looking at ways to adapt our existing technology to map this for our clients.”

BRITTANI HUNSAKER | DIRECTOR OF PAID MEDIA
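To put Brittani’s recommendation into practice, a first pass at benchmarking intent-based query groupings might look like the minimal sketch below. The export format and column names (query, impressions, clicks, conversions) are assumptions, and the keyword heuristics are deliberately crude; a production version would use a real intent taxonomy or classifier.

```python
import pandas as pd

# Hypothetical search terms export; column names are assumptions.
df = pd.read_csv("search_terms_report.csv")

# Deliberately crude keyword heuristics for intent bucketing.
INFORMATIONAL_HINTS = {"how", "what", "why", "best", "vs", "guide"}

def intent_bucket(query: str) -> str:
    words = set(query.lower().split())
    return "informational" if words & INFORMATIONAL_HINTS else "transactional"

df["intent"] = df["query"].apply(intent_bucket)

# Snapshot core metrics per intent bucket today, so post-Bard SERP
# changes can be measured against a baseline.
benchmark = df.groupby("intent").agg(
    impressions=("impressions", "sum"),
    clicks=("clicks", "sum"),
    conversions=("conversions", "sum"),
)
benchmark["ctr"] = benchmark["clicks"] / benchmark["impressions"]
print(benchmark)
```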


"From a user perspective, I'm really curious about the "trust" aspect. With PAAs, we could see where answers are coming from and determine if it seems like a reliable source to us. Does that ability to know where it's coming from affect trust?”

TERESA LOPEZ | SENIOR LEAD, CONSUMER INSIGHTS


“This has huge potential to change the search terms themselves, much like the promise of voice search. I’m most excited to watch how the language of what we search changes, becomes more long tail, and provides additional detail into users’ decision-making factors. Previously we needed a lot of additional signals to ID those wants and needs, but if users become more descriptive, marketers can meet those needs in a much better and more meaningful way.”

LISA DEVIEUX | ASSOCIATE DIRECTOR, STRATEGY & ANALYTICS


“As the AI bandwagon steamrolls its way into public consciousness during this first half of 2023, it will undoubtedly become a new reality in the way consumers search. Without question, marketing copy, creative, and campaigns will be generated with the help of AI fodder. I am keenly interested to see how many Marketers will utilize analytics tagging to determine the effectiveness of the glut of AI-generated content versus good ol’ human-generated material. The true test of value will be whether consumers buy into the AI content on SERPs and click through to actually convert, or whether they’re not enticed into buying what the robots have to shill.”

JOHN LOVETT | VP, ANALYTICS & INSIGHTS


“I’m most interested to see the impact of these advancements on ‘NORA, or No One Right Answer’ queries. Google has long made an effort to keep users on the SERP by providing direct answers to objective queries. This new direction takes aim at mildly subjective queries, e.g., ‘Which is easier to learn, guitar or piano?’ For me, the obvious question following the result for that query is, ‘Says who?’ Is this according to an expert I should explicitly trust? According to quantitative data? For some innocuous queries, users may not need to look further. But for meaningful searches, users will often seek the same qualities Google itself alleges to prioritize: Experience, Expertise, Authoritativeness, and Trustworthiness.”

ALISA SCHARF | DIRECTOR, SEO


Google Lens: If you can see it, you can search it

“Your camera is the next keyboard,” and it will allow you to search for specific places, products, and food so you can visit, buy, or cook exactly what you’re looking for without knowing exactly which words to search. This “if you can see it, you can search it” philosophy aims to push users toward searching with multiple senses, integrating text, voice, and image search into a single query to enhance and deepen the results provided.


Here’s what’s launching:

  • In the coming months, generative AI will allow merchants to use a small series of product photos to create a 360 product view, without complex design processes.
  • Google Lens can now translate text from one language to another and then transpose that translation over the original image. 
  • In the coming months, you’ll be able to search inside the images friends send you on your Android device. Users will be directed to the web to find out more about what’s in the image, whether that’s a landmark or a clothing item.
  • Using multisearch, you’ll be able to use a photo of an existing patterned shirt to find similar patterns across other items when you swipe up. 
  • In the US, you can now take a picture or screenshot and add “near me” to find businesses that may sell the item you’re looking for. This will roll out globally in the coming months.
  • In the coming months, you’ll also be able to use multisearch on any image across the mobile search results page. 

Seer Team Perspectives:

“This could be a really valuable improvement for CPG organizations, specifically those participating in Shopping Ads. With limited targeting capabilities, the Google Shopping experience has a lot that can be improved upon in matching products that exactly meet users’ needs. You have to assume Google will utilize its investments in image search to better automate ad serving, which could start to cut into shopping on social platforms if done right.”

LISA DEVIEUX | ASSOCIATE DIRECTOR, STRATEGY & ANALYTICS


“This is a wonderful advancement for users. Text can only take you so far with some queries, and the implications of being able to effectively visually search can infuse new search volume demand into the products and services our Clients are offering today. It remains to be seen how Clients can best win for these queries, but I agree with Lisa’s take on starting with the Google Shopping experience. Whether you’re responsible for Paid or Organic Search, the time to optimize your Merchant Center was yesterday."

ALISA SCHARF | DIRECTOR, SEO


“In the future, this could even impact the way stores stock their shelves and clothing racks if users are able to quickly scan the image of a product and see everything from nutrition information to size & fit. Merchants will need to ensure their information on product feeds and sites is as real-time as possible to meet user needs."

SAM POELSTRA | SENIOR MANAGER, STRATEGY & ANALYTICS
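As one simplified illustration of what “as real-time as possible” could mean in practice, the sketch below flags rows in a product feed export that haven’t been refreshed recently. The file layout (id, availability, price, and an ISO 8601 last_updated column) is our assumption, not a Merchant Center requirement; adapt it to your actual feed.

```python
import csv
from datetime import datetime, timedelta, timezone

# Hypothetical feed layout: id, availability, price, last_updated
# (ISO 8601 with a timezone offset). Adjust to your actual export.
STALE_AFTER = timedelta(hours=24)
now = datetime.now(timezone.utc)

stale_ids = []
with open("product_feed.csv", newline="") as f:
    for row in csv.DictReader(f):
        last_updated = datetime.fromisoformat(row["last_updated"])
        if now - last_updated > STALE_AFTER:
            stale_ids.append(row["id"])

print(f"{len(stale_ids)} products not refreshed in the last 24 hours")
print(stale_ids[:10])
```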


Google Maps: Augment your reality to choose the exact experience you want 

Google’s stated goal here is “to ensure that AI is powering a more visual and intuitive map,” and when Chris Phillips took the stage, he emphasized “finding places on the go.”

Users will be able to use “Search with Live View”, an AI + augmented reality feature that helps users find things nearby just by looking through their phones.


This new feature emphasizes how online and offline user journeys impact each other, and how more and more, they’ve become one single journey. 

Users will be able to access images of the store interior, reviews, and availability information about a location as they walk down the street toward it. This further enhances the power that Google Business Profile has to impact the moment a potential customer chooses whether to walk into a brick-and-mortar store and buy a product. 

Using Immersive View, users will be able to view a rich model of the location they want to explore before walking inside. The feature is intended to help users find out the busiest times of day before they visit, how the location will look at different hours, and how weather conditions might impact their trip.

Phillips also mentioned this feature will help users “find areas that are busy and identify building entrances.”

Seer Team Perspectives:

“This announcement is a logical next step for the Google Maps platform, and from a business owner perspective it could conceptually allow for more opportunities to reach users. From a user perspective, however, this functionality still relies on UGC, which has always been Google’s Achilles’ heel. A fully accurate index of business information continues to be a struggle to this day.”

LISA DEVIEUX | ASSOCIATE DIRECTOR, STRATEGY & ANALYTICS


With so much unknown about the rollout of these features, it’s difficult to predict exactly how user behavior will be impacted, but Cori Graft, Seer’s Associate Director of SEO, says it best in this post about Google vs. ChatGPT:

“Who knows [what will happen next]? But what I feel confident saying today is that Google will do what it always does: it will leverage its unmatched index and user base to refine its algorithms and deliver the best results it possibly can. It’ll be exciting to watch how this all unfolds over the next few months!”

CORI GRAFT | ASSOCIATE DIRECTOR, SEO


Stay tuned for our upcoming post that will show you what data you need to organize now in order to get ahead of these changes, and how benchmarking today can allow for data-backed decision-making in the future, no matter what Google decides to do next.
