What I Found Out About Drops in SEO Traffic and Rankings
Updated on: 10 March 2021
Who has the time to search through pages and pages of Google results? People want information fast – that’s why I work hard to ensure that my websites are visible through high rankings in the search engine results pages (SERPs). As you know, higher rankings bring more traffic, which means more leads and sales.
But it’s not enough to reach the top position – a lot of work goes into staying there. I’ve had my fair share of climbing the Google ranks only to face a drop in search rankings and traffic later on.
If this is what you’re encountering as well, fret not – I’ve done some digging to help you understand why that happens and where you can begin to diagnose what went wrong.
On-page SEO issues
To quickly summarise, on-page SEO involves the elements on a website that affect its ranking. They include things like page speed, keywords and mobile friendliness – which tell search engines whether a page serves quality content and is relevant to a search query.
Your content has grown old and outdated
Freshness is one of the ways Google determines the quality of content, which means if your website remains stagnant without any updates, it is likely you won’t be reaping the benefits of growth. Your strategy could be as simple as posting a blog update every week to maintain the traffic you’re getting.
But you shouldn’t lose sight of your old content either – hop on over to your Google Analytics to find pages that can be further improved. For example, pages with low traffic but high conversions, or pages that earned links on social media but dropped in the search rankings, are ideal candidates for optimisation. You can make improvements like updating keywords, checking for broken links, and inserting fresh visuals. Then get Google to recrawl those URLs so the updated pages can be re-indexed and drive more relevant traffic.
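To make that triage concrete, here is a minimal sketch of the filter I apply to an Analytics export. The field names, thresholds and sample pages are all hypothetical – adjust them to whatever your own export actually contains:

```python
# Flag pages worth re-optimising: low traffic but a high conversion rate.
# Field names and thresholds are illustrative, not a Google Analytics API.

def pages_to_optimise(rows, max_sessions=500, min_conversion_rate=0.05):
    """Return pages with low traffic but a high conversion rate."""
    flagged = []
    for row in rows:
        rate = row["conversions"] / row["sessions"] if row["sessions"] else 0.0
        if row["sessions"] <= max_sessions and rate >= min_conversion_rate:
            flagged.append(row["page"])
    return flagged

export = [
    {"page": "/pricing", "sessions": 320, "conversions": 24},    # converts well
    {"page": "/blog/seo-tips", "sessions": 4800, "conversions": 12},
    {"page": "/contact", "sessions": 150, "conversions": 2},
]
print(pages_to_optimise(export))  # ['/pricing']
```

Pages the filter surfaces already prove they convert – they just need more eyeballs, which is exactly what a content refresh buys.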
You’ve redesigned your website
A redesign can quietly undo your SEO: URLs change without redirects, metadata gets dropped, and staging settings can slip into production. On WordPress, I make sure to visit Settings > Reading and turn off the built-in option “Discourage search engines from indexing this site” after any design update. It’s also best to work with an SEO agency when planning a new website launch – they can spot design and server issues that affect SEO performance which a regular web developer might miss.
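One quick post-relaunch check: that WordPress setting works mainly by injecting a robots meta tag into every page, so scanning the served HTML for a `noindex` directive catches it (and similar mistakes on other platforms). A standard-library sketch, with an illustrative HTML snippet standing in for a real fetched page:

```python
# Detect an accidental "noindex" robots meta tag in a page's HTML.
# The sample HTML below is illustrative; in practice you'd fetch a live page.

from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True

html = "<html><head><meta name='robots' content='noindex, nofollow'></head></html>"
parser = RobotsMetaParser()
parser.feed(html)
print(parser.noindex)  # True -> this page is telling search engines to skip it
```

Run it against a handful of key pages right after launch and you will spot a forgotten “discourage indexing” setting before Google does.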
Your robots.txt is causing indexing problems
Setting up your robots.txt file properly directs Google’s crawlers through your website as intended. A common on-site technical error I run into during audits is a stray “Disallow: /” line in the robots.txt file, which tells crawlers to stay away from every page on the site – not just one – and leads to a traffic drop once those pages fall out of the index. Used deliberately, though, Disallow rules are still handy for specific low-quality pages that you want to keep out of Google.
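To see the difference in scope, here is a small sketch using Python’s standard-library robots.txt parser, which interprets the rules roughly the way a well-behaved crawler would. The domain and paths are placeholders:

```python
# Contrast a site-killing "Disallow: /" with a deliberate, targeted rule.

from urllib.robotparser import RobotFileParser

# The accidental version: one slash blocks the entire site.
blocked = RobotFileParser()
blocked.parse(["User-agent: *", "Disallow: /"])

# The deliberate version: block only a low-quality section.
targeted = RobotFileParser()
targeted.parse(["User-agent: *", "Disallow: /internal-search/"])

print(blocked.can_fetch("Googlebot", "https://example.com/blog/post"))   # False
print(targeted.can_fetch("Googlebot", "https://example.com/blog/post"))  # True
print(targeted.can_fetch("Googlebot",
                         "https://example.com/internal-search/foo"))     # False
```

The single character after “Disallow:” is the difference between hiding one thin section and de-listing your whole site, which is why this line deserves a look in every audit.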
Off-page SEO issues
Moving on to off-page, these are activities done outside the website to build trust and authority for ranking. For instance, earning links from other reputable websites effectively vouches for the quality of your content, which improves both search engines’ and users’ perception of your site.
Links were removed from sites
If 10 high-authority websites point back to your pages and 3 of them return 404 status codes, suffer server issues or remove the links, your traffic is likely to be affected. This happens more often than you might think.
What I do is perform link profile audits in Google Search Console, or use third-party tools like Ahrefs, to check for lost backlinks. While you may not be able to reinstate those links, you can always kickstart your link building strategies again.
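A triage pass over that audit data can be very simple. In the sketch below the status codes are hard-coded so the logic is easy to follow – in practice they would come from fetching each referring page or from a backlink tool’s export, and the URLs are made up:

```python
# Sort referring pages into healthy, lost, and server-error buckets.
# Status codes are hard-coded here; normally they'd come from live checks.

LOST_STATUSES = {404, 410}              # the linking page is gone
SERVER_ERROR_STATUSES = range(500, 600) # the linking site is struggling

def triage_backlinks(backlinks):
    """Split referring pages by HTTP status so you know what to chase."""
    report = {"healthy": [], "lost": [], "server_error": []}
    for url, status in backlinks.items():
        if status in LOST_STATUSES:
            report["lost"].append(url)
        elif status in SERVER_ERROR_STATUSES:
            report["server_error"].append(url)
        else:
            report["healthy"].append(url)
    return report

sample = {
    "https://partner-blog.example/roundup": 200,
    "https://old-directory.example/listing": 404,
    "https://news-site.example/feature": 503,
}
print(triage_backlinks(sample))
```

The “lost” bucket feeds outreach or replacement link building; the “server_error” bucket is usually worth re-checking later before writing the link off.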
A website linking to you is down or penalised
Similarly, if a website linking to you went down permanently or was penalised by Google, you could lose the traffic that came through that link. This isn’t preventable on your end, which is why ongoing link acquisition is critical – it finds other websites that can link to you and potentially replace that traffic.
Issues related to Google SERPs and Google Analytics
Google is constantly rolling out new updates to search and its ranking factors. Like many SEO professionals, I always have to keep myself up to date on whether I’m adopting the current best practices for SEO. On that note, being well-versed in using Google Analytics correctly also helps me avoid errors that could affect my ranking.
Rich or featured snippets update
Rich and featured snippets typically appear in “position zero”, at the top of the first page, providing the opportunity to drive traffic and earn more clicks from the SERPs. If you had a snippet driving traffic and, due to SERP changes, it no longer displays in that coveted spot, that alone might explain your traffic loss.
Changes in user search behaviour
User search behaviour influences search rankings, since people search for whatever is relevant to their current problem. Over time, search behaviour can change, which results in less traffic to a page that was previously ranking well.
This can happen due to global events like the Covid-19 pandemic: Google searches for face masks soared in 2020, while searches for industries like travel and hotels fell significantly. Seasonal changes and trends can likewise shift consumer behaviour – and search rankings along with it.
Incorrect tag setup
As great as Google Analytics is, one thing I’ve learned is to make sure my tag setup is correct, since it directly affects the accuracy of my reported traffic numbers. A common error is selecting the wrong Google Tag Manager account – especially when working with multiple Analytics accounts – when setting up a report. Incorrect filter settings and invalid code formatting can also cause tag issues. One way to verify that your tag is working is Google Tag Assistant.
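Alongside Tag Assistant, a crude but effective sanity check is to confirm that the container ID you expect actually appears in the served HTML. The sketch below scans a page for Google Tag Manager container IDs; `GTM-ABC123` and the snippet are made-up placeholders:

```python
# List every GTM container ID referenced in a page's HTML, so a wrongly
# published container stands out. GTM-ABC123 is a placeholder ID.

import re

def find_gtm_containers(html):
    """Return all Google Tag Manager container IDs found in the page."""
    return sorted(set(re.findall(r"GTM-[A-Z0-9]+", html)))

page = """
<script async src="https://www.googletagmanager.com/gtm.js?id=GTM-ABC123"></script>
"""
ids = find_gtm_containers(page)
print(ids)                  # ['GTM-ABC123']
print("GTM-ABC123" in ids)  # True -> the container I expected is present
```

If the list contains an ID you don’t recognise – or is missing the one you expect – you have found your reporting discrepancy before a month of bad data accumulates.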
Underuse or overuse of Disavow Tool
Doing regular link audits is crucial to root out undesirable backlinks that are spammy or low-quality, which can harm your site’s SEO and affect your ranking. With Google’s Disavow Tool in Search Console, you can ask the search engine to ignore those links and avoid penalties. Besides your own manual analysis, you can also use external backlink audit tools to spot toxic backlinks and clean up your link profile.
However, I wouldn’t be quick to disavow links indiscriminately, as this can undermine my own SEO efforts. Try manually requesting the removal of a link first, and treat disavowing as a last resort.
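When a link does have to go, the disavow file Google expects is plain text with one entry per line – either a full URL or a whole domain prefixed with `domain:`, with `#` lines as comments. The domains below are made-up examples:

```
# Disavow a single page that links to me:
https://link-farm.example/page-with-my-link.html
# Disavow every link from an entire domain:
domain:spammy-directory.example
```

Keeping the comments in the file is worth it – six months later you will want to remember why each entry is there.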
This wraps up some of the reasons that may be behind your drop in search ranking and traffic. Hopefully you have a better understanding of the issues that are plaguing your site’s SEO – and their fixes.
SEO is a great strategy to include in your digital marketing efforts, but you might hit bumps in the road that will leave you downright clueless and overwhelmed. Let our team do the heavy lifting for you with our SEO services and improve how your website is ranked in search results!