Will Google AI search kill search traffic and discoverability?

David Pierce, writing on The Verge about the future of Google search and how the company wants to use AI to transform its homepage:

A few seconds later, the glowing is replaced by an AI-generated summary: a few paragraphs detailing how good sourdough tastes, the upsides of its prebiotic abilities, and more. To the right, there are three links to sites with information that Reid says “corroborates” what’s in the summary.

It looks very useful indeed, but who is surprised that Google wants to bypass the sources? Not many people click on source links, and Google prefers to keep everyone on its homepage. This "corroborates" thing looks a lot like a thank-you note at the end of movie credits. Of course I'm being overly dramatic here, but it makes it easier to explain what I fear is coming sooner rather than later.

What I'm wondering is why Google isn't using AI on the ranking algorithm itself, as a sort of reinvented "I'm Feeling Lucky": something that isolates the best three or so links for a given search, for instance. They could then offer the AI summary as an option, behind an "I don't have time" button or something like that.

It would be a use of AI that would both:

- improve the results themselves, by surfacing the handful of genuinely good links;
- save time for the users who want it, via the optional summary.

Instead, this AI snapshot thing doesn't change Google's search results at all; it just adds a layer on top of them so users don't have to navigate through the crappy results.
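The idea above, reranking instead of summarising, can be sketched in a few lines. This is a toy illustration only: `relevance` stands in for whatever quality signal an AI reranker would actually produce, and everything here is hypothetical.

```python
# Toy sketch of the "reinvented I'm Feeling Lucky" idea: instead of
# layering a summary on top of ten links, pick the handful of best
# sources and offer the summary as an opt-in.
from dataclasses import dataclass

@dataclass
class Result:
    url: str
    relevance: float  # hypothetical AI-assigned quality score

def best_links(results: list[Result], k: int = 3) -> list[str]:
    """Return the k most relevant URLs, the 'lucky' shortlist."""
    ranked = sorted(results, key=lambda r: r.relevance, reverse=True)
    return [r.url for r in ranked[:k]]

results = [
    Result("https://example.com/sourdough-guide", 0.92),
    Result("https://example.com/seo-spam", 0.31),
    Result("https://example.com/bakery-science", 0.87),
    Result("https://example.com/recipe-blog", 0.74),
]
print(best_links(results))
```

The point of the design is that the user still lands on websites; the AI only decides which three, and the summary stays optional.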

Pierce adds:

Not all searches will spark an AI answer — the AI only appears when Google’s algorithms think it’s more useful than standard results, and sensitive subjects like health and finances are currently set to avoid AI interference altogether. But in my brief demos and testing, it showed up whether I searched for chocolate chip cookies, Adele, nearby coffee shops, or the best movies of 2022. AI may not be killing the 10 blue links, but it’s definitely pushing them down the page.

They want the eyeballs to stay on Google: the summary first, the links after. They want to be the point of entry to the web, the front desk of the internet hotel; but instead of quickly and efficiently pointing you towards what you need, they want you to stay in the lobby the whole time and only leave if you really, really need to. Fuck that. That's not a portal or a lobby; that's a cruise ship, or a goddamn open-plan office.

I also wonder what replacing the 10 blue links on search results means for the web itself. If Google's AI snapshot bypasses websites and considerably reduces the traffic sent to them, what does that mean for website owners? Sure, the regular results are listed below the AI box, and the AI answer is generated a few moments later, but we know how this works: people won't scroll. If the incentive of regular visitors coming from Google search is lost (and that can represent a big chunk of a site's traffic), many publishers might slow down, stop altogether, or never start a website in the first place. And then what? What will Google's AI feed on if most of the sources dry up? There's no word on how the monetisation of this homepage will benefit the sources used to create the AI snapshot, unless you consider being listed in the "corroborates" section a valuable incentive.

Imagine if Amazon suddenly removed all branding from the products sold on its website, and customers only received generic packaging, with a tiny note at the bottom of the box mentioning the actual brand and product they bought. Would you start a new brand and sell products on Amazon then? Probably not. Once Google does this AI thing, people will just say "oh, I read it on Google" when asked where they learned something; the source websites will become the equivalent of Reddit users posting great answers: basically anonymous.

At least the AI era will make Google care a little more than before about its monopoly on search, now that it could one day be under threat, something that wasn't really conceivable before. But it also raises questions about web discoverability and the next generation of websites.

I'm a bit perplexed that, in The Verge's article, no one mentions AI as a way to improve the existing website rankings, only as a way to summarise information on top of the regular, mostly shitty results. AI is apparently not what I was hoping for: not a search assistant that makes things better between users and websites, but a whole new middleman turning its back on websites. Again, I'm being overly dramatic on purpose, but I'm just not sure about the extent of the "overly" here.

Should we all start changing our robots.txt files and block AI bots from crawling our content? Too soon? Too late? For the first time in my life, I'm actually wondering what's best.
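For what it's worth, robots.txt is a blunt instrument here: blocking Googlebot outright removes you from regular search too. The best one could hope for is a crawler token scoped to AI features; the `AI-Crawler` user agent below is a placeholder for illustration, not something Google actually offers as of this writing.

```
# Hypothetical: block an AI-specific crawler while staying in
# regular search. "AI-Crawler" is a placeholder user agent.
User-agent: AI-Crawler
Disallow: /

# Everything else (including regular search crawlers) stays allowed.
User-agent: *
Allow: /
```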