Screaming Frog SEO Spider is a piece of software that crawls a website much like Google's robots do, extracting the information needed to improve its SEO. The audits performed by this software are very extensive: from 404 errors and broken links to the analysis of meta descriptions and Title tags, Screaming Frog is certainly one of the best allies of the modern SEO expert.

The price of Screaming Frog: free or paid?

The free version of Screaming Frog allows you to crawl up to 500 URLs. For occasional SEO work, this free option may be sufficient. However, obtaining a Screaming Frog license is preferable in order to experience the full power of the software.

It is a very versatile tool with many applications. At Rablab, we use Screaming Frog on a daily basis to:

Create solid transfer plans containing all the essential information for the migration or redesign of a website.

For those wishing to get straight to the heart of the matter, here is an overview of how a crawl can be done with our favorite tool. Once the software is open, the process can be as simple as pasting the URL to crawl into the field located to the right of the logo and clicking Start. However, we suggest fine-tuning your settings to ensure a successful crawl!

The Configuration tab is full of options that allow you to make decisive adjustments, more specifically the Spider sub-menu. The different options and tabs let you select precisely which elements the Screaming Frog robots will explore (e.g. images, PDFs, external links, etc.). Unchecking the less relevant ones can save time in your export and classification work.

It is very common to crawl a site whose sitemap is not referenced in its robots.txt, and the robots sometimes have trouble finding this specific file. Consequently, the final export may not contain all the pages: if the interlinking is not done correctly, the orphan pages will certainly be missing. Our “SEO Pro Tip” of the day is therefore to always insert the sitemap before the crawl, as shown below:

When the crawl is finally done, and before moving on to the final export, I strongly recommend briefly checking the status of the pages directly in the Screaming Frog interface. Indeed, during your explorations you may notice certain irregularities, such as the 429 status code (Too Many Requests). As described by Mozilla in its article on status code 429, this response returned by the server indicates that the user has made too many requests in a given amount of time. Usually, this is a security measure on the server side, intended to protect the website against malicious attacks and to avoid a potential server crash.

To solve this problem, there are a few solutions at our disposal:

Reduce the crawling speed (the number of pages crawled per second). To lower the number of requests per second, so as not to overload the server or trigger its protections, select Configuration > Speed from the menu. In this new window, simply decrease the maximum number of URLs per second, as seen in this screenshot:

It is also recommended to take a look at the robots.txt file to see whether it contains a crawl delay.
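As a side note, the page-status check recommended above can also be done outside the interface once you have a CSV export of the crawl. The sketch below is a minimal Python example under one assumption: that the export contains a "Status Code" column, as Screaming Frog's Internal export does; the file name used in the usage comment is purely illustrative.

```python
import csv
from collections import Counter

def status_summary(csv_path):
    """Count how many crawled URLs returned each HTTP status code."""
    with open(csv_path, newline="", encoding="utf-8") as fh:
        reader = csv.DictReader(fh)
        # "Status Code" matches the column header in recent Screaming Frog
        # exports; adjust it if your version labels the column differently.
        return Counter(row["Status Code"] for row in reader)

# Hypothetical usage, with an illustrative file name:
# print(status_summary("internal_all.csv"))
```

A tally like `Counter({'200': 950, '404': 12})` makes it easy to spot error pages before they slip into a transfer plan.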
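To make the 429 behaviour concrete, here is a small sketch of what a rate limit like Screaming Frog's Configuration > Speed setting amounts to: a fetch loop that never exceeds a given number of requests per second and pauses when the server answers 429. This is an illustrative standard-library example, not Screaming Frog's actual implementation; the user-agent string and the 10-second fallback delay are assumptions, and only the numeric form of the Retry-After header is handled.

```python
import time
import urllib.request
import urllib.error

def retry_after_seconds(value, default=10):
    """Interpret a Retry-After header, assuming the numeric (seconds) form.

    Falls back to `default` (an arbitrary choice here) when the header is
    missing or uses the HTTP-date form.
    """
    return int(value) if value and value.isdigit() else default

def polite_fetch(urls, max_per_second=2.0):
    """Fetch URLs no faster than max_per_second, backing off on HTTP 429."""
    min_interval = 1.0 / max_per_second
    statuses = {}
    for url in urls:
        started = time.monotonic()
        req = urllib.request.Request(
            url, headers={"User-Agent": "example-crawler/1.0"}  # illustrative UA
        )
        try:
            with urllib.request.urlopen(req) as resp:
                statuses[url] = resp.status
        except urllib.error.HTTPError as err:
            statuses[url] = err.code
            if err.code == 429:
                # Honour the server's requested pause before continuing.
                time.sleep(retry_after_seconds(err.headers.get("Retry-After")))
        # Pace the loop so it never exceeds max_per_second requests.
        elapsed = time.monotonic() - started
        if elapsed < min_interval:
            time.sleep(min_interval - elapsed)
    return statuses
```

Lowering `max_per_second` here plays the same role as decreasing the maximum number of URLs per second in the Speed window.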
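Finally, the robots.txt crawl-delay check mentioned above can be automated with Python's standard library. In this sketch the robots.txt content is hard-coded for illustration; against a real site you would instead call `rp.set_url("https://yoursite.com/robots.txt")` followed by `rp.read()`.

```python
from urllib import robotparser

# Sample robots.txt content, hard-coded so the example is self-contained.
SAMPLE_ROBOTS_TXT = [
    "User-agent: *",
    "Crawl-delay: 5",
]

rp = robotparser.RobotFileParser()
rp.parse(SAMPLE_ROBOTS_TXT)

# crawl_delay() returns the declared delay for the given user agent,
# or None when robots.txt does not specify one.
delay = rp.crawl_delay("*")
print(delay)  # → 5
```

If a delay is declared, setting the crawl speed at or below one request per that many seconds keeps the crawler within what the site asks for.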