What is the Google Algorithm?

Google’s philosophy on search has always centered on the perfect search engine, one that gives you exactly what you are looking for. To that end, the company constantly seeks ways to return better and more relevant web search results and thus improve the user experience. That is why the Google algorithm is always changing: it is a work in progress aimed at delivering accurate, first-rate information in response to search queries.

When retrieving and ranking web pages for display in its SERPs (search engine results pages), Google’s algorithm weighs the relevance of a page’s content to the query. This is a major consideration because Google wants to return results that address exactly what the user needs. To do that, Google uses continually updated algorithms to examine the content and importance of web pages and determine the best fit for any given query. It also personalizes results based on the user’s web history and location. Furthermore, human reviewers evaluate web pages against a set of guidelines to ensure relevance and utility.

PageRank (PR), a link analysis algorithm developed by Larry Page and Sergey Brin while they were computer science PhD candidates at Stanford University, formed the basis of the early Google search engine and was initially the main determinant of a web page’s ranking on Google. Over the years, however, many other factors were added to curb rank manipulation, and while the importance of PR has diminished, it remains a major ranking factor.
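To make the idea concrete, here is a minimal sketch of the classic PageRank formulation, in which a page’s score is built up from the scores of the pages linking to it and computed by power iteration. The damping factor of 0.85 is the value commonly cited for the original algorithm; the tiny link graph is hypothetical.

```python
# Minimal PageRank sketch (classic formulation, computed by power iteration).
# The example link graph is hypothetical; 0.85 is the commonly cited damping factor.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    ranks = {page: 1.0 / n for page in pages}  # start from a uniform distribution

    for _ in range(iterations):
        new_ranks = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = ranks[page] / len(outlinks)  # split this page's rank among its outlinks
                for target in outlinks:
                    new_ranks[target] += damping * share
            else:
                # A dangling page (no outlinks) spreads its rank across all pages.
                for target in pages:
                    new_ranks[target] += damping * ranks[page] / n
        ranks = new_ranks
    return ranks

# A tiny hypothetical web: page "a" is linked to by both "b" and "c".
print(pagerank({"a": ["b"], "b": ["a"], "c": ["a", "b"]}))
```

In the sample graph, the page that both other pages link to ends up with the highest score, which is exactly the intuition behind PR: links act as votes.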

One recent significant change to the Google algorithm, made in the first quarter of 2011 and widely known as the Panda update, is an emphasis on high-quality content. This means that the content on a web page should not only be highly relevant to the search terms but also well written (free of spelling and grammar errors, with ideas well organized and well presented), informative, authoritative, and preferably in-depth. Any page filled with gibberish will not stand a chance, and writing for the human reader, not for web bots, is more important than ever.

In addition, a page should contain no web spam, no elements that facilitate phishing, and no malware that initiates attacks on users’ computers. Google not only de-indexes such pages but blocks them entirely in the interest of optimum user experience. Spam includes, but is not limited to, scraped or copied content, keyword stuffing, pure PPC (pay-per-click) pages, and pages with unhelpful content. This is nothing new in the algorithm, of course; it has been part of it for a long time.

One thing the Google algorithm is said to have been ignoring for some years now is the meta keywords tag on a page, a move that avoids the rank manipulation to which other early search engines, which relied more heavily on meta tags and on-page content for ranking pages, were susceptible. However, meta keywords still benefit rankings on Bing (which now also powers Yahoo! Search in the US and Canada, and by 2012 will power all of Yahoo! worldwide). But do not overdo them, or Google may flag your pages as spam for keyword stuffing; a quick way to inspect a page’s keywords tag is sketched below.
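For anyone auditing their own pages, here is a small sketch that extracts the contents of a meta keywords tag using Python’s standard html.parser module; the sample markup is hypothetical.

```python
# Sketch: extract the contents of a <meta name="keywords"> tag with Python's
# standard-library HTML parser. The sample page below is hypothetical.
from html.parser import HTMLParser

class MetaKeywordsParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # Look for <meta name="keywords" content="..."> in the page head.
        if tag == "meta" and attrs.get("name", "").lower() == "keywords":
            self.keywords = [k.strip() for k in attrs.get("content", "").split(",")]

html = '<html><head><meta name="keywords" content="search, ranking, SEO"></head></html>'
parser = MetaKeywordsParser()
parser.feed(html)
print(parser.keywords)  # ['search', 'ranking', 'SEO']
```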

The number and quality of backlinks to a web page are still very important on Google. This is one facet of SEO that should never be ignored. One must work steadily on getting other websites to link to one’s own site, preferably websites with high-PR pages, authority, a good web history, and/or age.

The one sure thing about Google search is that its algorithms will keep changing to foil those who seek to manipulate their rankings on the SERPs. But just as you can’t keep a good man down, you can’t keep a good website down either. Make your web pages user-friendly, authoritative, and high-quality, then get other user-friendly, authoritative, and high-quality websites to link to them, and you will definitely not go wrong.