The last few months have been quite exciting! Google rolled out the major Panda 4.0 update in May, Panda 4.1 is currently sweeping the US and Penguin 3.0 is likely to be rolled out sometime in the next few weeks. Why is this exciting, you might ask?
While I sympathise with websites that have suffered from traffic reductions or even been deindexed from Google, these algorithms were created with the sole purpose of cleaning up webspam, penalising malicious manipulations of the algorithm and ensuring the best, most useful content is given prime exposure in search results.
I’m excited that I get to see clients who have committed to removing poor-quality links and focusing on developing high-quality content finally see their hard work come to fruition, outrank their competitors and achieve the rankings they deserve.
If you’re flipping out about Penguin 3.0, check out my last post for a complete guide for Penguin-proofing.
While the Penguin and Panda algorithm updates caused tremors in their wake, they only affected sites that engaged in offsite link spam or produced poor-quality content – around 3% of English search queries. Then there’s the Hummingbird algorithm, the biggest alteration to the Google algorithm in history, which affects a whopping 97% of search queries!
So while everyone is talking about the Penguin, I thought it would be a good opportunity to revisit the Hummingbird algorithm, highlight what we’ve learned about it, and provide a guide to writing amazing on page content that’s semantically optimised for Hummingbird.
Hummingbird was introduced in August 2013 as a major rewrite of the Google search algorithm. Its purpose was to refresh the way the search engine understands a query and decides which results are best to display.
Prior to Hummingbird (and this is still fairly common today), the way to rank for keywords in Google was to pick a keyword and place it in prominent on-page elements: title tags, headings and body copy.
But Google is getting smarter. It believes users should search with a query rather than a keyword. Enter the Knowledge Graph: now when you type a query like “what is the circumference of the Earth”, the Knowledge Graph, which is largely populated from Wikipedia, will answer it directly.
But what about my search terms, and how can I optimise for Hummingbird, you might ask? Let’s look briefly at how the algorithm works:
When a query is submitted to Google, it’s broken down into intent, entity & location categories, and the algorithm uses these points of reference to best understand the query. Then it displays the best results, based on Knowledge Graph and website content it has crawled.
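To make the intent/entity/location breakdown concrete, here is a deliberately simplified toy sketch. Google’s real query understanding is vastly more sophisticated; the word lists below are hypothetical placeholders, used only to illustrate the idea of splitting a query into these three categories.

```python
# Toy illustration only: hypothetical word lists stand in for Google's
# real (and far richer) intent and location understanding.
INTENT_WORDS = {"find", "buy", "what", "how", "where", "best"}
LOCATIONS = {"brisbane", "sydney", "melbourne"}  # hypothetical lookup table


def decompose_query(query):
    """Split a search query into rough intent, entity and location parts."""
    tokens = query.lower().split()
    intent = [t for t in tokens if t in INTENT_WORDS]
    location = [t for t in tokens if t in LOCATIONS]
    entity = [t for t in tokens if t not in INTENT_WORDS and t not in LOCATIONS]
    return {"intent": intent, "entity": entity, "location": location}
```

For example, `decompose_query("find science jobs brisbane")` would separate the intent (“find”), the entity (“science jobs”) and the location (“brisbane”), which is the kind of breakdown the algorithm uses as points of reference.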
Back in 2011, Matt Cutts popularised a phrase that would be repeated over and over: “Content is King”. The repetitions, memes and infographics exploded, Content Marketing saw a huge spike in search queries, and I stood there in disagreement. I understand why Matt said what he said: to encourage webmasters to think laterally about creating great content for their sites.
But here’s what I don’t like about the statement: it implies that so long as you have a decent amount of content on your site, that’s all you need. Without further elaboration, business owners and marketing managers with little to no SEO knowledge would have gone away and had their teams develop “quality content” with no strategic direction, or accepted a proposal from a freelance content writer, paying a premium price for some great content that, again, lacked strategic direction.
Semantic Search is all about relevance: how relevant your on-page content is to your focus keyword, how relevant your LSI keywords are to your focus keyword, and how relevant your backlinks are.
Your content should be highly relevant to your identified keywords; there’s no use in publishing a Science Jobs blog post if you’re just going to talk about science, or recruitment, in general. Google is smarter than ever, and putting together some nice-sounding content with a keyword isn’t going to cut it anymore.
Content needs to be targeted, relevant, compelling, enjoyable to read and free of spelling and punctuation errors. Best practice is at least 500 words per targeted page. That might seem like a lot, but once you get writing about your topic (what it is, why it’s important and how your company helps with it), you’ll reach that word count with ease!
Whether it’s services, products or topics for blog posts, each piece should be planned around a keyword that has search volume; your content writing efforts are wasted if no one is searching for your content’s targeted terms!
Latent Semantic Indexing (LSI) keywords are best explained as search-query synonyms. Because pages containing LSI keywords can appear both for their own search query and for that of their relevant counterpart, as per the below example, they act as supplementary keywords that strengthen the relevance of a webpage.
Below is a very quick guide to the use of the Keyword Planner for finding appropriate keywords.
You can reach the Keyword Planner via adwords.google.com by selecting Tools > Keyword Planner.
From here, enter your targeted keyword (e.g. science jobs) and check that your settings under Targeting match the above if you’re targeting Google in Australia.
Next click Get Ideas and then select the Keyword Ideas tab. You’ll get a breakdown of the keyword(s) you searched for, and related terms.
From the above we can see that science jobs has 590 searches per month, with some related terms below (ensure you are sorting keyword ideas by relevance).
Without using any advanced tools, you can get an idea of the Latent Semantic Indexing (LSI) keywords, or keyword synonyms, that you can use throughout the body of your content to add extra relevance.
In this example, a 500-word Science Jobs page would use the following keyword targeting:
Focus Keyword: Science Jobs (use 3 times)
LSI Keyword 1: Forensic Science jobs (use 1-2 times)
LSI Keyword 2: Environmental Science jobs (use 1-2 times)
You can add additional LSI keywords if you want; however, it’s best to stick to three at most, as you don’t want to make the content too keyword-heavy or dilute the focus keyword.
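If you want a quick sanity check on those targets, a small script can count keyword occurrences and total words in a draft. This is a hypothetical helper of my own, not a Google tool; note that because “science jobs” appears inside “forensic science jobs”, the focus-keyword count will include the LSI occurrences as well.

```python
import re


def keyword_report(text, focus_keyword, lsi_keywords):
    """Count occurrences of the focus and LSI keywords in a draft,
    plus the total word count, as a quick keyword-density sanity check.

    Note: counts use simple substring matching, so a focus keyword that
    appears inside an LSI phrase (e.g. "science jobs" inside
    "forensic science jobs") is counted both times.
    """
    words = re.findall(r"[\w'-]+", text)
    lowered = text.lower()
    counts = {kw: lowered.count(kw.lower())
              for kw in [focus_keyword] + list(lsi_keywords)}
    return {"word_count": len(words), "keyword_counts": counts}
```

Run it over your draft with the focus keyword and your two or three LSI keywords, and check the counts against the targets above (three uses of the focus keyword, one or two of each LSI keyword) before publishing.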
Link volume is still one of the biggest factors for ranking in search engines. Google is well aware of unnatural link-building activities, hence the implementation of Penguin and the recent penalisation of Private Blog Networks.
The most valuable backlinks are the ones surrounded by relevant content. For example, a page about Science Jobs would benefit most from a post on a science site, with content specific to jobs in science and branded anchor text (your brand name) linking to the Science Jobs page.
If you’ve got events or products, Schema.org structured data markup (microdata) helps highlight this data to search engines and can affect the way the information is presented in search results. Rich video snippets may be dead, but it’s still worth adding schema markup to give Google a better understanding of on-page content.
There are a range of schema generator tools around. Google’s Structured Data Markup Helper works well, and its Structured Data Testing Tool shows how a page will be displayed in search results, based on the schema used in the submitted HTML code or URL.
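To show what this markup looks like in practice, here is a minimal sketch of Schema.org Event microdata; the event name, date and venue are hypothetical placeholders, and the generator tools above can produce equivalent markup for your own pages.

```html
<!-- Minimal Schema.org Event microdata sketch; all details are
     hypothetical placeholders. -->
<div itemscope itemtype="http://schema.org/Event">
  <h2 itemprop="name">Science Careers Expo</h2>
  <time itemprop="startDate" datetime="2014-11-20">20 November 2014</time>
  <div itemprop="location" itemscope itemtype="http://schema.org/Place">
    <span itemprop="name">Brisbane Convention Centre</span>
  </div>
</div>
```

Paste markup like this into the Structured Data Testing Tool to confirm Google is reading the event name, date and location correctly.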
Is your content lacking strategic direction? Chat to Search Factory’s Content Development team today for rich, targeted, strategic content creation or a Content Audit on your current site!
This information was recently presented. Below are the slides: