Essential Checks for Organic Traffic and Ranking Drops

April 4, 2022 | SEO

Whether you work in-house or agency side, most of us have come across an organic traffic drop on our website at some point in our careers. And the SEO team at Semetrical are no exception! There can be multiple reasons for an organic traffic drop, but before jumping to conclusions and investing hours into finding solutions, it’s important to step back and work through a list of checks to pinpoint the reason for the drop. Once you have established when the traffic drop occurred and have a potential reason, it’s wise to investigate further and come up with solutions to reverse the trend.

I emphasise the need to take a step back from the issue and pinpoint when the drop happened, as there can be logical reasons for a traffic drop that might not be down to technical issues, website changes or algorithm updates. The last thing you want is to spend a day investigating all the technical changes implemented on your website over the past few weeks to see if there is a correlation with the traffic drop, when actually seasonality was the trigger all along. When an organic traffic drop happens you will want to check right from the start if it corresponds with your rankings dropping, or if your rankings are stable or have improved.

The most common causes of traffic drops on a website include:

  • Seasonality or external events
  • Technical changes
  • On-page changes
  • Algorithm updates
  • SERP layout changes
  • Lost backlinks
  • Tracking tag issues

Seasonality or external events

If search demand suddenly decreases for your products or services, it will cause your traffic to drop. Seasonality will not affect all industries, but if your products or services are not consistently used or bought throughout the year, your website can be affected by seasonality. Seasonality could be the root cause of your traffic drop if you see:

  • A sudden traffic decline organically and across all other channels.
  • Keyword ranking positions stay the same or increase, but traffic is declining.
  • Impressions declining, but rankings have stayed consistent or increased.

Here are the steps to take when checking if seasonality could be the reason behind the sudden drop in traffic.

  1. Compare the date range of when the traffic dropped to the equivalent period the month before, and identify the core areas of the website that have seen a decline in traffic. (Not all areas may be affected).
  2. Look at both “queries” and “pages” as queries may be too granular to pinpoint the areas affected.
  3. Zero in on the specific areas or URLs that have been affected the most to see if rankings have had any impact on the drop, or if it is due to impressions dropping. (Review the last 12 months).
  4. Review the main countries your website ranks in and identify whether it is seasonal to a specific country or if it is worldwide.
  5. Cross-reference relevant search queries on Google Trends and Google Ads to see if these two tools highlight a demand drop.
  6. Review traffic the year before to see if there was a similar drop in traffic around the same period.
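The year-on-year comparison in the steps above can be sketched in a few lines. This is a simplified illustration, not a Semetrical tool, and the session figures and threshold are hypothetical placeholders:

```python
# Rough seasonality signal: traffic fell sharply versus last month, but
# sits close to the same period last year. All numbers are hypothetical.

def pct_change(current, baseline):
    """Percentage change from baseline to current."""
    return (current - baseline) / baseline * 100

def seasonality_signal(this_period, prior_month, same_period_last_year,
                       threshold=-20):
    """Seasonality is plausible if traffic dropped versus the prior month
    but is roughly in line with the same period last year."""
    vs_last_month = pct_change(this_period, prior_month)
    vs_last_year = pct_change(this_period, same_period_last_year)
    dropped = vs_last_month <= threshold
    in_line_yoy = abs(vs_last_year) < abs(vs_last_month) / 2
    return dropped and in_line_yoy

# Example: 12,000 sessions this month, 18,000 last month,
# 12,500 in the same month last year -> likely seasonal.
print(seasonality_signal(12_000, 18_000, 12_500))
```

If the year-on-year figure is also far below last year, seasonality alone is unlikely to explain the drop and the other checks in this article become more important.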

At Semetrical, we have come across sites that have had a traffic drop due to seasonality causing a significant drop in demand. The traffic drop came as a surprise to the client as this was a new area of the site, so they had not experienced YOY seasonality before. For this specific website, not all areas of the site were affected and not all markets they operated in saw the seasonality drops.

In the example below, the client’s average rank stayed fairly stable and had actually slightly increased (Purple), but impressions saw a significant drop (Blue) in the US market. For the equivalent terms in the UK market, the client did not experience the same seasonal drops. This is a clear example of when seasonality was the main reason for the drop in traffic.

Search Console:

Google Trends & Google Ads:

Technical changes

Technical issues and changes on your website, whether expected or unexpected, can have an effect on your traffic and rankings. If a change has occurred on your website that prevents Google from crawling and indexing pages that were previously ranking, Google will eventually drop those pages from its index, resulting in ranking and traffic drops. The most common technical issues affecting websites include noindex rules being placed on the wrong URLs, robots.txt rules blocking whole sections of a website, and redirects being implemented on pages that should remain live. Technical issues/changes could be the root cause of your traffic drop if you see:

  • A sudden traffic decline across the board or to a specific part of your website;
  • Keyword rankings suddenly drop across the board or to a specific part of the website.

These are the steps you should take when checking if technical issues could be the reason behind a traffic drop:

  1. Review Google Analytics to see which paths on the website have seen a decline in traffic – this will help segment the problem.
  2. Review Search Console to see which paths have also declined in traffic and identify if rankings have also dropped or have stayed consistent. If rankings have stayed consistent, but impressions have dropped, then the cause most likely won’t be technical.
  3. If both rankings and traffic have declined, then conduct basic technical checks, which involve reviewing the robots.txt file and noindex tags.
  4. Cross-reference the paths that have dropped with the robots.txt and noindex reports in DeepCrawl.
  5. If you see a rule or noindex tag blocking the path that has seen a drop in traffic, then the addition of that rule is most likely causing the drop.
  6. Additionally, look at the “warning” alerts in Search Console to see what Google is flagging as an issue, which could also help identify the problem.

We have faced the challenge of unexpected noindex rules or robots.txt rules being added to a client’s website, resulting in traffic and ranking drops.

An example of this is where we segmented different paths on a client’s website in Search Console to identify which paths caused the decline. Once the path was identified, we cross-referenced the path to the robots.txt file to see if a rule was blocking all URLs running off that path from ranking. In this scenario, there was an unexpected robots.txt rule added, blocking content from ranking. There are now lots of tools on the market such as ContentKing that can help prevent this from happening with real-time alerts when a robots.txt file has changed, so it can be picked up before Google finds the new rule.

On-page changes

On-page changes on your website can impact traffic and rankings, so it is important to test before rolling them out across your whole website. Unexpected on-page changes such as metadata rollbacks can also contribute to drops, and this is more common than you think! The most common contributors to traffic drops include:

  • Metadata changes 
  • Restructuring page templates and repositioning content
  • De-optimising content
  • Change in brand direction, causing page content to be changed with different keywords

On-page changes could be the root cause of your traffic drop if you see:

  • A sudden traffic decline across the board or to a specific part of your website;
  • Keyword rankings suddenly drop across the board or to a specific part of the website;
  • No technical rules in place blocking content from ranking.

Work through these steps when checking if on-page changes could be the reason:

  1. Review Google Analytics to see which paths on the website have seen a decline in traffic – this will help segment the problem.
  2. Review Search Console to see which paths have also declined in traffic and identify if rankings have also dropped or have stayed consistent. If rankings are the same, then it most likely won’t be down to on-page changes.
  3. Once you have isolated the areas of the site that saw a decline, visit a tool such as Wayback Machine. Enter a URL that sits on the URL path that was generating traffic prior to the traffic drop.
  4. Compare the Wayback Machine stored version to the current version to investigate if any changes have been made that could have contributed to the drop.
  5. Speak to internal stakeholders or cross-reference go-live changes from your roadmap to the dates of the URL traffic drops to see if there is a correlation.

At Semetrical, we have undertaken numerous rank and traffic drop projects where companies have experienced a gradual decline or an unexpected drop.

One client saw a continual traffic drop to their homepage over time due to changes in brand direction and messaging. Historically they optimised their homepage for a group of high traffic keywords and ranked very well for those terms, but over time their brand took a different direction to go after a different customer persona. The shift in brand direction led to the homepage being optimised for a completely different set of keywords that had less search volume and were a lot more competitive to rank for. Ultimately, the website started to bring in a higher quality customer base due to the shift in direction and change in keywords, but it did considerably reduce the amount of monthly traffic to their website.

For another client’s website, changes to their title tags caused a drop in traffic and rankings over time. Historically, their top-level category title tags included different product variations, but de-optimising the pages and removing those variations prevented the category pages from ranking for that keyword set. We managed to uncover these changes by spot-checking category page title tags in Wayback Machine and getting a snapshot of title tag changes over time per URL. We then cross-referenced the timestamps of the metadata changes in Wayback Machine to traffic and ranking drops in Search Console. This process can be very manual, but in our case, we created a script that automates title tag checks in bulk at a URL level on Wayback Machine. (We will be writing a blog post on how we created and used this script shortly!)
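To give a flavour of this kind of check, here is a rough sketch (not Semetrical's actual script) that pulls an archived title via the Internet Archive's public Wayback availability API. The URL and timestamp are placeholders:

```python
# Sketch: fetch the <title> of the closest archived snapshot of a URL
# via archive.org's availability API, so old titles can be compared with
# current ones. Example URL and timestamp below are hypothetical.
import json
import re
from urllib.request import urlopen

def extract_title(html):
    """Pull the <title> text out of an HTML document, or None."""
    match = re.search(r"<title[^>]*>(.*?)</title>", html,
                      re.IGNORECASE | re.DOTALL)
    return match.group(1).strip() if match else None

def archived_title(url, timestamp):
    """Return the title of the closest archived snapshot, or None."""
    api = ("https://archive.org/wayback/available"
           f"?url={url}&timestamp={timestamp}")
    data = json.load(urlopen(api))
    snapshot = data.get("archived_snapshots", {}).get("closest")
    if not snapshot:
        return None
    html = urlopen(snapshot["url"]).read().decode("utf-8", "replace")
    return extract_title(html)

# Compare a pre-drop snapshot's title with today's to spot de-optimisation:
# old = archived_title("example.com/category/widgets", "20210301")
```

Run over a list of URLs with a pre-drop timestamp, this gives a quick bulk view of which title tags changed around the time traffic fell.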

Algorithm updates

Google updates are occurring more frequently these days. In 2021, Google officially confirmed 11 updates and there were 8 unconfirmed updates according to RankRanger.

Changes happen daily on Google, but when an algorithm update is officially announced, it can have a positive or negative impact on your website.

To identify if an algorithm update is the cause of your drop, check whether:

  • The date of your traffic/ranking drop correlates with industry news publications from the same time period.
  • Your website’s visibility dropped drastically in third-party tools.
  • Your website’s visibility change corresponds with competitors in the industry whose visibility increased or decreased at the same time.

Work through these steps when checking if an algorithm update is the reason behind the drop:

  1. Take the dates of your website traffic or ranking drop and compare them to industry news articles from the same time period to see if they correlate with an algorithm update.
  2. Log in to SEMrush, SearchMetrics or other third-party tools and overlay Google updates on the client’s visibility graph to see if there have been any sharp declines in visibility.
  3. Once it has been decided that it is linked to an algorithm update, it’s important to segment the areas of the website most affected. It could be the whole domain or certain keyword types/paths.
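The date comparison in step 1 can be scripted against a list of known update dates. The list below is a small illustrative subset of confirmed 2021 core updates, not a complete record, and the five-day window is an arbitrary assumption:

```python
# Flag whether a traffic drop date falls within a few days of a known
# Google update. The update list is an illustrative subset only.
from datetime import date, timedelta

CONFIRMED_UPDATES = {
    "June 2021 core update": date(2021, 6, 2),
    "July 2021 core update": date(2021, 7, 1),
    "November 2021 core update": date(2021, 11, 17),
}

def nearby_updates(drop_date, updates, window_days=5):
    """Return the names of updates that began within window_days of the drop."""
    window = timedelta(days=window_days)
    return [name for name, start in updates.items()
            if abs(drop_date - start) <= window]

print(nearby_updates(date(2021, 7, 3), CONFIRMED_UPDATES))
```

A match does not prove causation, but it tells you which update's industry coverage to read before segmenting the affected keywords.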

At Semetrical, we have undertaken post algorithm analysis for a number of different websites where their traffic suddenly dropped overnight. There can be lots of reasons why a website will be demoted as part of an algorithm update.

When we investigated an algorithm drop for one of our clients, we noticed that they did not drop for all keyword areas being tracked, only for a specific area of the industry. It is crucial to segment your keywords to pinpoint whether the impact is sitewide, or whether the site has been impacted in a specific area.

Once we identified the keyword areas being affected, we undertook market/competitor research to understand who was rewarded and who got demoted. It is important to review the sites that were rewarded, as this can usually help formulate a recovery strategy. Our findings indicated that the client’s website was hyper-relevant to a specific keyword vertical (the area that did not drop), but was not seen as relevant enough for another keyword vertical (the area which dropped). This was due to their backlink profile and on-page optimisations across the website. Once we identified the reasons, a roadmap was put in place to re-optimise the website and invest heavily in digital PR and link building efforts to increase the website’s authority in the affected keyword area.

SERP layout changes

Search engine results pages (SERPs) change layout on a regular basis, and this can affect your traffic. New features can reduce the click-through rate (CTR) a website receives, even when it ranks in the same position before and after the SERP change.

Some of the changes that have been made over the past few years that can impact traffic include:

  • Featured snippets
  • Knowledge graphs
  • Additional paid ads at the top of result pages
  • People also ask
  • Top stories carousel 
  • Indented results

To identify if SERP layout changes are the cause of your drop, check if:

  • The date of your traffic drop correlates with industry news publications referencing the introduction of new SERP features.
  • The date of your traffic drop correlates with new or an increase in SERP features being referenced within your rank tracking platform for the keywords that have seen a drop.

We recommend taking these steps when checking if SERP layout changes are the reason for your drop in traffic.

  1. Take the dates of your website traffic drop and compare them to industry news articles from the same time period to see if they correlate with any changes.
  2. Isolate the URLs that have seen a drop in traffic and identify the keywords that the URL ranks for.
  3. Log in to your rank tracking platform of choice and isolate the keywords linked to the URLs that have seen a drop in traffic.
  4. Take a date range from before the traffic drop and compare it to a date range after the drop.
  5. See if the number of SERP features has increased or decreased within the interface, or download the SERP report from your ranking platform and cross-reference SERP features between both date ranges to see if there is an increase.
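The cross-referencing in the final step amounts to a set difference per keyword. Here is a minimal sketch over two hypothetical rank-tracker exports (the keywords and features are made up):

```python
# Compare SERP features per keyword between a pre-drop and a post-drop
# export from a rank tracker. All data below is hypothetical.

def new_serp_features(before, after):
    """Per keyword, the features present after the drop but not before."""
    return {kw: sorted(set(after.get(kw, [])) - set(feats))
            for kw, feats in before.items()
            if set(after.get(kw, [])) - set(feats)}

before = {"blue widgets": ["paid ads"],
          "widget reviews": ["people also ask"]}
after = {"blue widgets": ["paid ads", "featured snippet"],
         "widget reviews": ["people also ask"]}

print(new_serp_features(before, after))
```

Keywords that gained features such as featured snippets or extra paid ads can lose clicks even at a perfectly stable ranking position, which is exactly the pattern this check surfaces.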

Lost links

Over the years, you may have built a large number of high-quality backlinks to your website via different strategies. However, once you gain a backlink from an external website, it doesn’t mean that backlink will exist forever.

Building backlinks to your website helps signal to search engines that your website is authoritative and trustworthy, but when your website loses numerous backlinks from an authoritative website, it can have an impact on both your traffic and rankings.

To identify if a loss in backlinks is the cause of your drop, check:

  • The date of your traffic/rank drops correlates with a large number of backlinks becoming “lost” in third party backlink tools.

Take these steps to see if a loss in backlinks is the reason:

  1. Log in to your preferred third-party backlink tool, visit the new and lost backlinks reports, and see if a large influx of lost links correlates with your traffic loss.
  2. Segment the lost links report by domain rating to see if you have experienced a loss of links from highly authoritative sites.
  3. In addition to the above, segment the lost links report by target URL on your website to see if the URLs that have experienced a loss in traffic/ranks are the same URLs that have lost links.
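Segmenting a lost-links export by domain rating is a quick grouping exercise. A sketch over hypothetical export rows (the domains and ratings are made up):

```python
# Bucket a "lost links" export by domain rating to see whether the
# losses come from authoritative sites. Rows below are hypothetical.
from collections import Counter

def dr_buckets(lost_links, size=20):
    """Count lost links per domain-rating bucket, e.g. '60-79'."""
    counts = Counter()
    for link in lost_links:
        low = (link["domain_rating"] // size) * size
        counts[f"{low}-{low + size - 1}"] += 1
    return dict(counts)

lost = [{"domain": "news-site.com", "domain_rating": 78},
        {"domain": "blog.example", "domain_rating": 23},
        {"domain": "press-hub.net", "domain_rating": 71}]

print(dr_buckets(lost))
```

A cluster of losses in the highest buckets is far more likely to explain a ranking drop than a long tail of low-authority links disappearing.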

Tracking codes

This is an interesting one! If you have seen a drop in traffic in Google Analytics but you are not seeing the drop in Search Console, then it could be down to unreliable data and your site’s tracking code not working correctly.

This can occur when technical changes have been made to your website or when a new version of your site has been released. It is always important to have checks in place when technical updates do occur on your website to make sure site tracking stays intact.

To identify if your tracking code setup could be the cause of your drop, check the following:

  • Google Analytics or similar analytics platform shows a significant drop in traffic, but Search Console and equivalent platforms do not show a traffic drop.
  • Rankings have not dropped in your preferred rank tracking platform.
  • Tracking tags are not firing when running tests on a sample set of templates.

Steps to take to see if your tracking is the reason for the loss of traffic:

  1. Log in to Google Analytics or your equivalent platform and see if there is a significant drop in traffic. The drop might not be site-wide, so you will want to check each major template and the associated subfolders that run off those templates to isolate the issue.
  2. Log in to Search Console or the equivalent platform and see if there is a similar drop in traffic. If there is not a similar drop, it could flag that tracking has been removed or is not working correctly.
  3. Visit a sample set of URLs that seem to have dropped in traffic and test to see if analytics tags are firing as expected.
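A first-pass version of step 3 is simply checking whether the analytics snippet is present in each template's HTML. This sketch looks for the gtag.js loader; in practice you would fetch the live pages, and the HTML and measurement ID below are placeholders:

```python
# Check a sample of page templates for the gtag.js analytics snippet.
# The HTML documents and measurement ID are hypothetical placeholders.
import re

def has_ga_tag(html, measurement_id):
    """True if the gtag.js loader for the given ID is present."""
    pattern = rf"googletagmanager\.com/gtag/js\?id={re.escape(measurement_id)}"
    return re.search(pattern, html) is not None

homepage = ('<script async src="https://www.googletagmanager.com/'
            'gtag/js?id=G-XXXX111"></script>')
category = "<html><body>No tracking snippet here</body></html>"

print(has_ga_tag(homepage, "G-XXXX111"), has_ga_tag(category, "G-XXXX111"))
```

A missing snippet on one template but not another would explain a drop that appears in Google Analytics while Search Console data stays flat. Note this only confirms the tag is present in the source; a tool like Google Tag Assistant is needed to confirm it actually fires.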


In summary, there can be a number of reasons for a traffic or rank drop on your website, but when this does happen you need to narrow down the potential reasons, otherwise you could go down a rabbit hole and waste hours or days investigating.

It is always best to start with quicker tests, such as reviewing tracking tags and checking if your drops correlate with a Core Update, before spending your time reviewing technical or on-page changes.

If you have experienced a traffic or rank drop and are struggling to identify the root cause or are currently experiencing a sudden drop and need an SEO agency to support the investigation, please reach out to our technical SEO team.
