Google has released research on a new algorithm that takes existing web pages, cuts out the irrelevant content, and summarises the important information into an original article. These summarised paragraphs can answer users’ queries without the need to visit another web page, creating Google’s very own Wikipedia.
In the research, Wikipedia topics are used as the search queries, and the SERPs are used as the source for the algorithm’s summarising and paraphrasing.
So, how does Google’s new algorithm work, and will it hurt your website’s traffic? Let our search optimisation experts explain.
Google’s new algorithm is similar to those used to generate featured snippets, where only the most relevant words and phrases are kept to answer a question in one useful excerpt.
The new algorithm works in two parts: extractive summaries and abstractive summaries.
An extractive summary is used to condense a web page by ditching the unnecessary content and extracting only the most relevant, informative, and useful parts.
Next, an abstractive summary is used to paraphrase the original text into unique content.
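To make the extractive step concrete, here is a minimal sketch using a simple word-frequency heuristic. This is purely our illustration of the general idea of extractive summarisation, not Google’s actual model, which the research builds from a neural network trained on Wikipedia articles:

```python
import re
from collections import Counter

def extractive_summary(text, num_sentences=2):
    """Toy extractive summariser: score each sentence by how frequent
    its words are across the whole text, then keep the top-scoring
    sentences in their original order."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence):
        tokens = re.findall(r"[a-z']+", sentence.lower())
        # Normalise by sentence length so long sentences don't dominate.
        return sum(freq[t] for t in tokens) / max(len(tokens), 1)

    top = set(sorted(sentences, key=score, reverse=True)[:num_sentences])
    # Re-emit the chosen sentences in the order they appeared.
    return " ".join(s for s in sentences if s in top)
```

A real system would then pass this extracted text to a second, abstractive model that rewrites it in new words, which is where the paraphrasing (and the risk of invented facts) comes in.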
The only problem is that almost a third of abstractive summaries end up containing fake facts.
“We show that generating English Wikipedia articles can be approached as a multi-document summarisation of source documents.
We show that this model can generate fluent, coherent multi-sentence paragraphs.”
Google hasn’t stated whether or not it will generate its own content from yours, but given that the extraction process can take information from any public web page, and even books, the answer is probably yes.
Does this mean your website is at risk of losing traffic? Potentially.
Time will tell whether Google’s summarised snippets will steal your potential customers or not.
In the meantime, this summarisation algorithm could mean easy answers for users of Google Voice Assistant, offering useful and concise responses to voice search queries.