URL parameters, often called query strings, are key-value pairs appended to a site's address immediately after the question mark. They are commonly used to track marketing campaigns, session IDs, and language preferences, among other things.
Here are some illustrative examples:

- `https://example.com/shoes?color=red` (a filter parameter)
- `https://example.com/page?sessionid=12345` (a session ID)
- `https://example.com/?utm_source=newsletter` (a campaign tracker)
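To make the structure concrete, here is a minimal Python sketch (the URL is hypothetical) showing how everything after the question mark breaks down into parameters:

```python
from urllib.parse import urlsplit, parse_qs

# Hypothetical product URL with a filter, a size option, and a campaign tracker
url = "https://example.com/shoes?color=red&size=9&utm_source=newsletter"

query = urlsplit(url).query   # everything after the "?"
params = parse_qs(query)      # parsed into a dict of key -> list of values

print(params)
# {'color': ['red'], 'size': ['9'], 'utm_source': ['newsletter']}
```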
These URL parameters appear mainly in e-commerce stores that use faceted navigation. Faceted navigation generates parameterized URLs that help visitors filter and sort items across categories. However, these generated URLs make the work of search engines much more difficult.
The key problem with URL parameters
When URL parameters are generated by site filters, there is a high risk of creating issues that hurt the page's ranking. The filters add new URLs without adding any new value to the site, because the filtered pages contain essentially the same content. If you do not address the issue promptly, the site's search performance can suffer.
A good way to test these filters is to check whether each filter actually produces a different page. In the examples outlined above, a search engine would see two filtered pages as duplicate content even though they sit in different categories. This creates a real dilemma, because search engine bots cannot determine which of the two pages should be indexed.
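The duplicate-content check above can be sketched in Python: strip each URL's query string and group by the remaining path, so any path carrying more than one parameterized URL is a likely duplicate cluster (the crawl list is hypothetical):

```python
from collections import defaultdict
from urllib.parse import urlsplit

# Hypothetical list of URLs generated by faceted navigation
urls = [
    "https://example.com/shoes?color=red",
    "https://example.com/shoes?color=blue",
    "https://example.com/boots?size=9",
]

groups = defaultdict(list)
for url in urls:
    groups[urlsplit(url).path].append(url)  # same path => likely duplicate content

duplicates = {path: us for path, us in groups.items() if len(us) > 1}
print(duplicates)
# {'/shoes': ['https://example.com/shoes?color=red',
#             'https://example.com/shoes?color=blue']}
```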
The same problem occurs when several pages of a site target similar keywords. In that situation, search engine bots struggle to establish which pages are more important, and if the pages you consider less important get ranked instead, your Google Analytics reports will look distorted. Even though such pages are rarely flagged as duplicate content outright, they can still cause serious ranking problems.
Avoiding pitfalls of URL parameters
Though several solutions can address the problem without hurting your SEO, the first step is confirming that the URL parameters are indeed the cause. When you implement URL parameters, always check whether the resulting copy changes significantly, paying particular attention to the meta descriptions and titles.
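One way to automate that check is to compare the title and meta description each parameterized URL actually serves. A minimal sketch using Python's standard-library HTML parser (the sample markup is hypothetical):

```python
from html.parser import HTMLParser

class HeadScanner(HTMLParser):
    """Extract the <title> text and meta description from a page's HTML."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            a = dict(attrs)
            if a.get("name") == "description":
                self.description = a.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Hypothetical HTML fetched from a filtered URL
html = '<head><title>Red Shoes</title><meta name="description" content="Buy red shoes"></head>'
scanner = HeadScanner()
scanner.feed(html)
print(scanner.title, "|", scanner.description)  # Red Shoes | Buy red shoes
```

If two filtered URLs produce identical titles and descriptions, they are strong duplicate-content candidates.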
Here are the main solutions you can implement to address the issue.
- Use the URL parameter tools: Google Search Console's URL Parameters tool tells Googlebot how to handle URLs with specific parameters. Bing Webmaster Tools offers a similar feature for ignoring unwanted parameters.
- Use the site's robots.txt: This file tells bots which content, pages, and posts they may not crawl, and you can use it to block search query parameters. Take extra caution when editing robots.txt so you do not disallow important content. For example, adding the rule (Disallow: /*?*) blocks crawlers from every URL that contains a query string.
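The effect of the Disallow: /*?* rule is simply that any URL containing a query string is blocked, which you can mimic in a quick sanity-check script before deploying the rule (the helper name is my own):

```python
from urllib.parse import urlsplit

def blocked_by_param_rule(url: str) -> bool:
    """Mimic the effect of `Disallow: /*?*`: block any URL with a query string."""
    return bool(urlsplit(url).query)

print(blocked_by_param_rule("https://example.com/shoes?color=red"))  # True
print(blocked_by_param_rule("https://example.com/shoes"))            # False
```

Running your full URL list through a check like this shows exactly which pages would disappear from crawling before you commit the change.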
- Canonical tags: These special tags tell search engines that certain pages are copies of another page and specify which one should be indexed. Though your webmaster can set the rel=canonical tags by hand, tools such as Yoast can implement them automatically, which can be very helpful for e-commerce sites.
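A common automated approach is to compute each page's canonical URL by stripping the parameters that never change the content, then emit the rel=canonical tag. In this sketch the list of strippable parameter names is an assumption; adjust it for your own site:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Assumed set of parameters that never change page content
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid"}

def canonical_url(url: str) -> str:
    """Strip assumed tracking/session parameters, keeping meaningful ones."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k.lower() not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

tag = f'<link rel="canonical" href="{canonical_url("https://example.com/shoes?utm_source=fb&color=red")}" />'
print(tag)
# <link rel="canonical" href="https://example.com/shoes?color=red" />
```

Note that the filter parameter (color) survives while the tracking parameter is dropped, so genuinely different pages keep distinct canonical URLs.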