30 Jul, 2024

3.5 better robots exclusion checker 🤖

In our first iteration of the robots checker, we only checked the meta robots tag to figure out whether the page you were viewing told robots to "index", "noindex", "follow" or "nofollow" it.

Of course we added appropriate red, orange and green icons to help you quickly identify the status, but we knew we could do better.

So with this update we also:

  • Do a live check against the robots.txt file. When you visit a URL affected by an "Allow" or "Disallow" directive in robots.txt, the extension will display the exact rule that applies.

    Additionally, the entire robots.txt file will be shown with the relevant rule highlighted (if applicable). With one click you can open up the robots.txt itself, or copy it to your clipboard. Pretty neat, right? (There's a sketch of how this rule matching works right after this list.)
  • Look at the X-Robots-Tag headers. Identifying robots directives in the HTTP headers used to be challenging, but not anymore with this extension.

    Any exclusions will be clearly highlighted, along with the complete HTTP header, making them easy to spot! And of course you can also copy the full HTTP header to your clipboard to easily share it with your development team. (A sketch of reading these headers follows below.)
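
If you're curious how robots.txt rule matching works under the hood, here's a minimal sketch in TypeScript. It assumes a simplified grammar (no `*` or `$` wildcards) and Google's precedence rules, where the most specific (longest) matching path wins and "Allow" wins a tie. The `RobotsRule` type and `matchRule` function are illustrative, not the extension's actual internals.

```typescript
interface RobotsRule {
  directive: "allow" | "disallow";
  path: string; // path prefix from the robots.txt file
}

// Return the rule that applies to a URL path, or null if none match.
// Longest matching path wins; on a tie, "allow" wins (Google's tie-breaker).
function matchRule(rules: RobotsRule[], urlPath: string): RobotsRule | null {
  let best: RobotsRule | null = null;
  for (const rule of rules) {
    if (!urlPath.startsWith(rule.path)) continue;
    if (
      best === null ||
      rule.path.length > best.path.length ||
      (rule.path.length === best.path.length && rule.directive === "allow")
    ) {
      best = rule;
    }
  }
  return best;
}

// Example: the more specific Disallow beats the site-wide Allow.
const rules: RobotsRule[] = [
  { directive: "allow", path: "/" },
  { directive: "disallow", path: "/private/" },
];
console.log(matchRule(rules, "/private/page.html")); // disallow /private/
```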
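
And here's a rough sketch of pulling robots directives out of the HTTP response headers. It assumes a plain HEAD request is enough to see them (the extension itself inspects the page's own response) and ignores user-agent-scoped directives like "googlebot: noindex"; the `getRobotsHeaderDirectives` helper is hypothetical.

```typescript
// Hypothetical helper: fetch a URL's headers and return any
// X-Robots-Tag directives, normalized to lowercase.
async function getRobotsHeaderDirectives(url: string): Promise<string[]> {
  const response = await fetch(url, { method: "HEAD" });

  // Headers.get() folds repeated X-Robots-Tag headers into a single
  // comma-separated string, so split it back into individual directives.
  const raw = response.headers.get("x-robots-tag");
  if (raw === null) return [];
  return raw.split(",").map((directive) => directive.trim().toLowerCase());
}

// Usage: flag a page that is excluded via the HTTP header.
getRobotsHeaderDirectives("https://example.com/").then((directives) => {
  console.log(directives.includes("noindex") ? "noindex found" : "indexable");
});
```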

All in all, this update will save you even more time by making it clear at a glance whether a particular page on your website is accessible to search engines or not.