Every second of every day, Google processes around 40,000 search queries from all over the world. That’s more than 3.5 billion searches every day. With more than 100 million active websites hosted on the internet, how does the company sort through all this information?
It’s no secret that Google uses algorithms to match search results with queries. Algorithms are the computer processes and formulas used to match the search to the best possible webpages. Google’s algorithms rely on more than 200 unique signals or “clues” that make it possible to guess what the searcher wants. These signals include things like the keywords on websites, the freshness of content, location and authority.
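The idea of combining many signals into a single ranking score can be sketched as a weighted sum. This is purely illustrative: the signal names, weights and pages below are invented for the example and bear no relation to Google’s actual signals or their values.

```python
# Toy illustration of weighted ranking signals. The signals and
# weights are invented for this sketch, not Google's real ones.
SIGNAL_WEIGHTS = {
    "keyword_match": 0.4,
    "authority": 0.3,
    "freshness": 0.2,
    "location_match": 0.1,
}

def score_page(signals):
    """Combine per-signal scores (each 0.0-1.0) into one ranking score."""
    return sum(SIGNAL_WEIGHTS[name] * value for name, value in signals.items())

# Hypothetical pages with per-signal scores
pages = {
    "example.com/a": {"keyword_match": 0.9, "authority": 0.7,
                      "freshness": 0.2, "location_match": 0.5},
    "example.com/b": {"keyword_match": 0.6, "authority": 0.4,
                      "freshness": 0.9, "location_match": 0.8},
}

# Order pages for the results page, highest combined score first
serp = sorted(pages, key=lambda url: score_page(pages[url]), reverse=True)
```

Under these made-up weights, the page with the stronger keyword match and authority outranks the fresher one, which is the intuition behind weighting signals rather than counting them equally.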
When you conduct a search, the algorithm assigns a weighting to each web page, which determines where the page appears on the Google search engine results page (SERP). As well as the established search results we’re all used to, Google regularly tries out new features to help users get the information they need more quickly, such as:
- Answers: Displays immediate answers and information for things such as the weather, sports scores and quick facts.
- Autocomplete: Predicts what you might be searching for.
- Freshness: Shows the latest news and information, including timely results for specific search dates.
- Images: Shows you image-based results with thumbnails.
- News: Results from online newspapers and blogs from around the world.
- Video: Shows you video results aggregated from various sources, including YouTube, Vimeo and on-site video.
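The Autocomplete feature above can be sketched as prefix matching over a log of past queries. This is a minimal, hypothetical illustration: real autocomplete also draws on context, trends and personalisation, and the query log here is invented.

```python
from collections import Counter

# Hypothetical log of past queries and how often each was searched
query_log = Counter({
    "weather today": 120,
    "weather tomorrow": 45,
    "web design": 80,
})

def autocomplete(prefix, k=3):
    """Suggest up to k past queries sharing the typed prefix,
    most frequently searched first."""
    matches = [q for q in query_log if q.startswith(prefix)]
    return sorted(matches, key=lambda q: -query_log[q])[:k]
```

Typing "wea" would suggest "weather today" before "weather tomorrow", because the more popular query wins the tie on prefix.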
Google’s technology is also fine-tuned to adjust searches from mobile devices such as tablets and smartphones, highlighting mobile-friendly websites. Advanced search options also allow the user to tailor the results, excluding unrelated fields and narrowing what is returned.
Around 500–600 times a year, Google makes minor changes to its algorithm. However, occasionally there is a major update that can affect search results in a big way. The purpose of Google’s frequent revisions is to weed out the efforts of people who intentionally violate webmaster guidelines to win higher rankings for poor-quality sites.
The latest update, released in October 2014, is Penguin 3.0. Just like its predecessors, Penguin 3.0 is designed to improve search results by eliminating spam and links that don’t appear to be authentic. Webmasters who use too many backlinks with the same anchor text, or who rely on landing or doorway pages, will need to revise or remove them to preserve their page ranking.
Another algorithm, Google Pigeon, was launched in July 2014 and focuses on improving local search results. It uses location to return the most relevant results, which is especially good news for small businesses listed in the Yellow Pages or local directories. Google Hummingbird was released in 2013 and seeks to understand the query by incorporating synonyms and context in the results. It considers not only each word but also how the word was used in the query.
Beyond the algorithms themselves, there are several other factors that impact page rankings.
- Content: There is a measurable correlation between content quality and rankings. Pages with poor-quality or duplicated content should be removed, rewritten or blocked from being indexed by the search engine. A keen webmaster will keep updating the content to prevent their pages from being downgraded in the search rankings. Experts suggest that longer content (over 500 words) is trending higher in the page rankings.
- On-Page Technical SEO: With on-page SEO, the keyword remains an important part of the overall concept. The keyword should be used cleverly in the title, as well as in the description, body and sub-headings. However, there is no need to write it in every sentence; overuse of the keyword serves no purpose and should be avoided.
- Backlinks: Backlinks are still important, but the number of keyword backlinks continues to decrease. Big brands appear to benefit the most from backlinks. Every backlink is like a vote, but Google takes the source of the link into account and adjusts its value: a backlink from a high-quality, relevant source of good standing carries much more weight than, for example, a link from a directory.
- Site Load: The amount of time it takes for your site to load is very important, especially since more people are accessing the internet from mobile devices.
- User Behaviour: Click-through rate and time-on-site are usually higher for better-ranking sites, which also tend to have a much lower bounce rate than lower-ranking pages.
Google will continue to change its algorithms to adapt to the way we use the search engine. To maintain a high page ranking, the strategy for webmasters should be to produce great content and target the right keywords to match. Even if you have done a poor job with SEO in the past, a keen understanding of the algorithms and a focus on improving the user experience are bound to work in your favour.