Google updates its core search ranking algorithm every few months to improve its search engine's performance and offer a better search experience to its users, with the most recent core update rolled out in January 2020. But these core algorithm updates often become a headache for website owners and search engine professionals, as every update shakes up website rankings and impacts traffic.
Here is an example of a medical website that experienced fluctuations after major Google core updates over the past few years. The black dots in the figure represent the days when the updates were rolled out.
While some sites see a drop in their traffic, other sites may see an increase. After all, when some sites lose their rankings, others move up into the top positions on the search engine results page, and their traffic rises accordingly.
Websites that experience a decline in traffic may find it challenging to get back to their previous positions, and some sites may never return to their prior levels at all. SEO professionals often have to put a lot of effort into understanding what went wrong, why the rankings dropped, and what needs to be done to recover.
With more than 200 Google ranking factors, sifting through the possible reasons why your website's ranking dropped can be quite overwhelming. But you cannot afford the luxury of ignoring SEO at any point, as doing so only sends your website traffic south. So stick with SEO and comply with Google's ranking factors to avoid being penalized whenever Google rolls out an update.
But then, how exactly do you know how well you comply with Google's ranking factors when Google reveals very little about its algorithm and updates? And how can you fine-tune your webpages to please Googlebot and drive more traffic to your website?
Sadly, most of the time there is no single answer or shortcut. You have to play along, learning through trial and error what works and what doesn't, and keep working toward significantly improving your site overall.
Rather than merely matching your competitors, you should aim to beat them in every category. If you can do this successfully, you are better positioned to improve your rankings during major core updates.
News publisher surges after dropping heavily in June & “Discovers” an interesting side-effect:
The next site I’ll cover is a news publisher that was heavily impacted by the June core update. The site dropped by 74% overnight in June and the site owner reached out to me for advice. I wasn’t able to take them on as a client due to my schedule, but I did take a look at the site and provided several recommendations.
News publisher drops by 74% after June core update.
As with many sites that are impacted by core updates, there were a number of things that could be problematic (and not just one). For example, there was an aggressive ad situation, including a potential above-the-fold (ATF) problem. There were many ads on each page, and some larger ones at the top of articles pushed the main content down the page. In addition, there were a lot of older, thin articles on the site, and some were written by writers without much expertise.
So, the site owner went through the process of analyzing and removing a lot of thin content while continuing to publish higher-quality content. I saw a drop of about 2K URLs in the coverage report when reviewing valid, indexed pages (reflecting the removal of the thinner articles).
Beyond that, the site is in a dynamic niche and has been earning links from other sites in that niche (naturally, based on the content). That can also help from an authoritativeness and trust standpoint. Remember, Google’s Gary Illyes explained in the past that E-A-T is heavily influenced by links and mentions from well-known sites. Also, Google published a whitepaper where it explained more about how it assesses expertise, authoritativeness, and trust. Google explained that one of the best-known signals it uses is PageRank, or links across the web, to understand authoritativeness. So, the right links do matter.
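Since PageRank comes up above only in passing, here is a toy power-iteration sketch of the underlying idea that links from other pages confer authority. The link graph, damping factor, and iteration count below are made-up illustrative assumptions; this is in no way Google's actual implementation.

```python
# Toy PageRank via power iteration. The link graph is hypothetical.
links = {
    "site_a": ["site_b", "site_c"],
    "site_b": ["site_c"],
    "site_c": ["site_a"],
    "site_d": ["site_c"],
}

def pagerank(graph, damping=0.85, iterations=50):
    """Return a rough PageRank score per page for a small link graph."""
    n = len(graph)
    ranks = {page: 1.0 / n for page in graph}
    for _ in range(iterations):
        # Every page starts each round with the "random jump" share
        new_ranks = {page: (1 - damping) / n for page in graph}
        for page, outlinks in graph.items():
            if not outlinks:
                continue
            # A page passes a share of its own rank to each page it links to
            share = damping * ranks[page] / len(outlinks)
            for target in outlinks:
                new_ranks[target] += share
        ranks = new_ranks
    return ranks

if __name__ == "__main__":
    for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
        print(f"{page}: {score:.3f}")
```

In this toy graph, the page that attracts the most links (site_c) ends up with the highest score, which is the intuition behind "the right links do matter."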
How to beat Google's algorithm updates and drive traffic to your website?
As most of us already know, Google places immense importance on the content quality and structure of a webpage. So the best way to beat any algorithm update is to keep creating great content that Google's algorithms can't ignore.
So what does 'great content' mean? What was considered quality content five years ago may not qualify as great content today. There is far more content online now than there was five years back, and website owners and digital marketers are rigorously producing more for their websites each day. Hence, Google keeps evolving its machine learning algorithms to filter the best content out of this ocean and give its users the best possible search experience.
Google’s advice to survive its algorithm changes
Here is a list of questions Google suggests every website owner, content creator, and digital marketer consider when evaluating content.
Content and quality questions:
Does the content provide original information, reporting, research or analysis?
Does the content provide a substantial, complete or comprehensive description of the topic?
Does the content provide insightful analysis or interesting information that is beyond the obvious?
If the content draws on other sources, does it avoid simply copying or rewriting those sources and instead provide substantial additional value and originality?
Does the headline and/or page title provide a descriptive, helpful summary of the content?
Does the headline and/or page title avoid being exaggerating or shocking in nature?
Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
Would you expect to see this content in or referenced by a printed magazine, encyclopedia or book?
Does the content present information in a way that makes you want to trust it, such as clear sourcing, evidence of the expertise involved, background about the author or the site that publishes it, such as through links to an author page or a site’s About page?
If you researched the site producing the content, would you come away with the impression that it is well-trusted or widely-recognized as an authority on its topic?
Is this content written by an expert or enthusiast who demonstrably knows the topic well?
Is the content free from easily-verified factual errors?
Would you feel comfortable trusting this content for issues relating to your money or your life?
Presentation and production questions:
Is the content free from spelling or stylistic issues?
Was the content produced well, or does it appear sloppy or hastily produced?
Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
Does the content have an excessive amount of ads that distract from or interfere with the main content?
Does content display well for mobile devices when viewed on them?
Does the content provide substantial value when compared to other pages in search results?
Does the content seem to be serving the genuine interests of visitors to the site or does it seem to exist solely by someone attempting to guess what might rank well in search engines?
Google also advises SEO professionals to review its Search Quality Rater Guidelines to understand how well their content adheres to E-A-T (Expertise, Authoritativeness, and Trustworthiness). E-A-T is a very important ranking factor for most sites, and demonstrating good E-A-T can potentially improve the search rankings of your webpages.
Now that we have seen Google's advice, here are a few suggestions from us. If you have been negatively affected by a Google update, there are indeed a lot of things you should check. Ideally, compare both the frontend and backend of your site against competitors whose rankings have improved.
For example, in the case of content, you may want to look for elements like heavy use of marketing language, excessive CTAs, unreliable or missing author information, gaps in the information the content provides, or content that serves no real purpose.
Whew, that's a lot of things to check, right? We know it too. So here are eight quick tips you can use to recover if you have been hit by a Google core algorithm update.
Depend on real expertise and always try to create accurate content –
Many websites have different authors for their articles. Linking every article to the corresponding author's bio page is a must: it brings transparency about who the authors are and what their expertise is, so make sure every author has an associated bio on a dedicated author bio page. If an article is not written by an expert, that's still fine, but make sure it is reviewed by experts or professionals in the domain to ensure its accuracy. Accuracy and trustworthiness matter a lot in improving your rankings.
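As a minimal sketch of one way to make that article-to-author link explicit to search engines, the snippet below generates schema.org Article/Person JSON-LD that points an article at a dedicated author bio page. The function name, URLs, and author details are hypothetical placeholders, not something from the original article.

```python
import json

def author_jsonld(article_title, article_url, author_name, author_bio_url):
    """Build schema.org JSON-LD linking an article to its author's bio page.
    All values passed in are illustrative placeholders."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": article_title,
        "url": article_url,
        "author": {
            "@type": "Person",
            "name": author_name,
            "url": author_bio_url,  # dedicated author bio page
        },
    }, indent=2)

if __name__ == "__main__":
    # Example usage with hypothetical values
    print(author_jsonld(
        "How to Recover from a Core Update",
        "https://example.com/recover-core-update",
        "Jane Doe",
        "https://example.com/authors/jane-doe",
    ))
```

The resulting JSON-LD can be embedded in the article page so crawlers can connect the content to a named, verifiable author.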
Refer to Google’s crawl error report to spot gaps –
As you probably know already, for your website to feature in top search results, Googlebot must first crawl it. Site owners can give Google granular instructions on how to crawl their website using Google Search Console. But in addition to this, there are a few more things to keep in mind to ensure your site is crawled properly, such as using accurate status codes. For instance, returning a 302 (temporary) redirect instead of a 301 (permanent) redirect can mislead the Google robots crawling your website, because it doesn't tell them that the original page has moved permanently. One of the best ways to ensure there are no gaps in Google's crawling is to refer to the crawl error report in Search Console, which keeps you covered on status code errors or anything else preventing your website from being crawled properly.
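To illustrate the status code point, here is a minimal sketch (assuming the third-party `requests` library and a hypothetical list of old URLs) that flags pages answering with a temporary 302 where a permanent 301 was probably intended:

```python
import requests

# Hypothetical old URLs that should permanently redirect to new locations
OLD_URLS = [
    "https://example.com/old-article-1",
    "https://example.com/old-article-2",
]

def check_redirects(urls):
    """Report the first status code each URL returns, without following redirects."""
    for url in urls:
        resp = requests.get(url, allow_redirects=False, timeout=10)
        if resp.status_code == 302:
            print(f"{url} -> 302 (temporary); consider a 301 if the move is permanent")
        elif resp.status_code == 301:
            print(f"{url} -> 301 (permanent redirect, as expected)")
        else:
            print(f"{url} -> {resp.status_code}")

if __name__ == "__main__":
    check_redirects(OLD_URLS)
```

A quick script like this is only a spot check; the crawl error report in Search Console remains the authoritative place to see what Google itself ran into.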
Audit content accuracy and scientific/medical consensus –
Including any content on your web pages that directly contradicts scientific, medical, or historical consensus is dangerous. In its Search Quality Rater Guidelines, Google itself instructs that content contradicting well-established consensus should be rated "Fails to Meet," which is the lowest rating level.
Reach out to top sites for quality backlinks –
Google's search algorithms always give some weight to quality backlinks when it comes to authoritativeness, but acquiring backlinks from sites with low authority helps very little. To earn quality backlinks, the best thing to do is focus on creating excellent content that addresses the questions or concerns of your target audience and is worth sharing. Once you have genuinely interesting content, you can reach out to sites with good authority that might be interested in sharing it. Great content can even attract quality backlinks on its own.
Generate epic or 10x content –
While creating content, don't try to match what your competitors have published for a search term; focus instead on creating epic content that is ten times better than the best result currently ranking for that keyword or phrase. This kind of content is generally called 10x content, and it should stand out from the existing results for the search term in terms of design, data, drama, and depth.
Make use of semantically linked words in content to establish relevance to the search term –
Today, you don't really need to repeat a specific keyword over and over in your content to rank for it; in fact, doing so can have a negative impact, which we discuss later in this article. Google's bots are remarkably intelligent and can identify synonyms and related keywords quite well. And with the advanced algorithms in use today, Google can return precise results for users' search queries, including long-tail keywords. It is Google's RankBrain algorithm that allows Google to give context to complex search queries. To optimize for RankBrain, use the group of related terms around the topic you are targeting throughout your content to show Google's bots that your page is relevant to what the user is searching for. This can help improve your rankings significantly.
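As a rough illustration (not anything Google publishes), the sketch below counts how often a target phrase and a hand-picked set of related terms appear in a page's text. It can hint at whether copy leans on exact-match repetition or covers the topic with related vocabulary; the target phrase, term list, and sample text are all hypothetical.

```python
import re
from collections import Counter

# Hypothetical target phrase and semantically related terms for a page about core updates
TARGET = "core update"
RELATED_TERMS = ["algorithm change", "ranking drop", "search rankings", "recover traffic"]

def term_coverage(page_text: str) -> Counter:
    """Count occurrences of the target phrase and each related term (case-insensitive)."""
    text = page_text.lower()
    counts = Counter()
    for term in [TARGET] + RELATED_TERMS:
        counts[term] = len(re.findall(re.escape(term), text))
    return counts

if __name__ == "__main__":
    sample = (
        "After the core update, many sites saw a ranking drop. "
        "Understanding the algorithm change helps you recover traffic "
        "and stabilise search rankings over time."
    )
    for term, count in term_coverage(sample).items():
        print(f"{term}: {count}")
```

If the exact keyword dominates while the related terms barely appear, the copy may read as stuffed rather than naturally relevant.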
Focus on user experience (UX) –
A good user experience can help with rankings too. Page load speed, click-through rate, and even mobile-friendliness are major ranking factors, as Google firmly believes they are essential to giving its users the best search experience. So, to earn a good click-through rate, work on catchy titles and meta descriptions, and make sure they are in sync with the content on the page. Also ensure that your webpages are mobile-friendly and easy to navigate. Ultimately, you should be looking to offer a great user experience to your visitors.
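As a minimal sketch of a quick on-page check (assuming the third-party `requests` library; the character limits are common SEO rules of thumb, not official Google numbers, and the URL is hypothetical), this script pulls a page's title and meta description and flags ones that are missing, overly long, or lacking a mobile viewport tag:

```python
import re
import requests

# Rough, commonly cited display limits -- rules of thumb, not official Google figures
TITLE_MAX = 60
DESCRIPTION_MAX = 160

def audit_page(url: str) -> None:
    """Print simple title/description/viewport checks for a single URL."""
    html = requests.get(url, timeout=10).text

    # Naive regex extraction; a real audit would use a proper HTML parser
    title = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    desc = re.search(r'<meta[^>]+name=["\']description["\'][^>]+content=["\'](.*?)["\']',
                     html, re.I | re.S)
    viewport = re.search(r'<meta[^>]+name=["\']viewport["\']', html, re.I)

    title_text = title.group(1).strip() if title else ""
    desc_text = desc.group(1).strip() if desc else ""

    print(f"Title ({len(title_text)} chars): {title_text or 'MISSING'}")
    if len(title_text) > TITLE_MAX:
        print("  -> title may be truncated in search results")
    print(f"Description ({len(desc_text)} chars): {desc_text or 'MISSING'}")
    if len(desc_text) > DESCRIPTION_MAX:
        print("  -> description may be truncated in search results")
    print("Viewport meta tag:", "present" if viewport else "MISSING (mobile-friendliness issue)")

if __name__ == "__main__":
    audit_page("https://example.com/")  # hypothetical URL
```

Checks like these won't measure how compelling a title is, but they catch the mechanical issues (missing, duplicated, or truncated snippets) that quietly hurt click-through rates.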
Focus on relevance more than optimization –
Recently, Google started rolling out an update that uses a deep learning model called BERT (Bidirectional Encoder Representations from Transformers), which aims to better understand natural language and the intent behind phrases rather than specific keywords, making Google less dependent on keyword matching. The intent of the update is to identify content that is genuinely relevant and rank it above content that has been optimized simply by stuffing it with keywords. This means you should create content that reads naturally for your users and refrain from forcing in keywords in ways that make the content unnatural to a reader.
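To make the "meaning over keywords" idea concrete, here is a minimal sketch using the open-source sentence-transformers library and a small pretrained BERT-style model. The query, passages, and model choice are illustrative assumptions; this only demonstrates semantic similarity scoring and is not Google's system.

```python
# Minimal semantic-matching sketch, assuming the sentence-transformers package
# (pip install sentence-transformers). Illustrative only; not Google's ranking code.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # small, widely used example model

query = "how do I recover traffic after a google core update"
passages = [
    "Steps to regain search rankings following a major algorithm change.",
    "Our bakery offers fresh bread and pastries every morning.",
]

# Encode the query and candidate passages, then score them by cosine similarity
query_emb = model.encode(query, convert_to_tensor=True)
passage_embs = model.encode(passages, convert_to_tensor=True)
scores = util.cos_sim(query_emb, passage_embs)[0]

for passage, score in zip(passages, scores):
    print(f"{float(score):.3f}  {passage}")
```

Note that the relevant passage scores highest even though it shares almost no exact words with the query, which is precisely the behavior that makes keyword stuffing less effective.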
At the end of the day, the intent of Google's algorithm updates is to improve the search experience for users looking for information. With every algorithm update, Google is trying to surface the best possible content in terms of quality and relevance for the person searching on any given topic.
Ultimately, you too have to think like Google and build customer-friendly websites; a site that is friendly to its users will also be friendly to Google. We believe this is the best way to get along with Google's bots and become resilient enough not to be hurt by its algorithm updates.