The Screaming Frog SEO Spider is a small desktop program you can install on your PC or Mac which spiders websites' links, images, CSS, scripts and apps from an SEO perspective.

The Screaming Frog SEO Spider allows you to quickly analyse, audit and review a site from an onsite SEO perspective. You can view, analyse and filter the crawl data as it's gathered and updated continuously in the program's user interface. It fetches key onsite page elements for SEO, presents them in tabs by type and allows you to filter for common SEO issues, or slice and dice the data how you see fit by exporting into Excel. The spider allows you to export key onsite SEO elements (URL, page title, meta description, headings, etc.) to Excel so the data can easily be used as a base for SEO recommendations.

Screaming Frog SEO Spider is an ideal tool to analyse and report a website's problems. It's particularly good for analysing medium to large sites where manually checking every page would be extremely labour intensive (or impossible!) and where you can easily miss a redirect, meta refresh or duplicate page issue.

The program is easy to use: we only need to adjust some basic settings and enter the URL of the site we want to analyse. The first thing we must do is configure the "spider" (this is the name given to the component that visits and collects information from the web pages that make up the site). After minutes or hours, depending on the size and depth of the site, we will get a report with useful information that we can filter and rearrange to look for possible failures or errors in the website.

Some of the options that we can set are the following:

Check Images, CSS, JavaScript, SWF: this option enables the program to check CSS, JS and other files on the web pages, reporting any broken link found.
Check External Links: if our site has links to other sites, this option will check that those links are not broken.
Follow "nofollow" internal or external links: with this option the spider will either follow "nofollow" links or ignore them. This option is very useful if we want to know the total number of "dofollow" pages that our site contains.
Crawl subdomains: if our site has multiple subdomains and we want to spider them too, we need to check this option.
Ignore robots.txt file: if we have blocked certain areas of our site using the robots.txt file and we want the spider to ignore that file and inspect all areas of the website, we need to check this option.
Limit total number of pages to crawl: with this option we can limit the number of pages that the spider will crawl.
Limit depth of search: with this option we can set the spider to only crawl a few clicks away from the home page.
Request authentication: if any of the web pages that are part of the site are password-protected, checking this option makes the program ask for a username and password to access and analyse the protected pages.
Respect noindex: if we mark this option, the spider will not crawl pages with the "noindex" meta tag.
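To make the "Limit depth of search" and "Respect noindex" options described above more concrete, here is a minimal sketch of how a breadth-first crawler can implement both. It runs against a small in-memory site table rather than real HTTP requests, and every name in it (`SITE`, `crawl`) is hypothetical for illustration, not part of Screaming Frog itself.

```python
# Sketch of a depth-limited crawl that can respect "noindex".
# The site is modelled as an in-memory dict (hypothetical data),
# so the logic can be shown without making network requests.
from collections import deque

# Hypothetical site: url -> (has_noindex_meta, outgoing_links)
SITE = {
    "/": (False, ["/about", "/blog"]),
    "/about": (False, ["/team"]),
    "/blog": (True, ["/blog/post-1"]),  # marked noindex
    "/team": (False, []),
    "/blog/post-1": (False, []),
}

def crawl(start, max_depth, respect_noindex=True):
    """Breadth-first crawl up to max_depth clicks from the start page."""
    seen, queue, report = {start}, deque([(start, 0)]), []
    while queue:
        url, depth = queue.popleft()
        noindex, links = SITE[url]
        if respect_noindex and noindex:
            continue                      # skip noindex pages entirely
        report.append(url)
        if depth < max_depth:             # "Limit depth of search"
            for link in links:
                if link not in seen:
                    seen.add(link)
                    queue.append((link, depth + 1))
    return report
```

With `max_depth=2` and noindex respected, `/blog` is skipped and `/blog/post-1` is never discovered; disabling `respect_noindex` brings both back into the crawl.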
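The export workflow mentioned earlier (URL, page title, meta description, headings out to Excel) amounts to writing one row per crawled page. This is a minimal sketch of that tabular shape using Python's standard `csv` module; the page data and function name are hypothetical, and Excel opens the resulting CSV directly.

```python
# Sketch of exporting key onsite elements to a CSV that Excel can open.
# The crawl results below are made-up sample data for illustration.
import csv
import io

pages = [
    {"url": "/", "title": "Home", "meta_description": "Welcome", "h1": "Home"},
    {"url": "/about", "title": "About", "meta_description": "", "h1": "About us"},
]

def export_csv(pages, fileobj):
    """Write one row per page with the key onsite SEO elements."""
    writer = csv.DictWriter(
        fileobj, fieldnames=["url", "title", "meta_description", "h1"]
    )
    writer.writeheader()
    writer.writerows(pages)

buf = io.StringIO()
export_csv(pages, buf)
```

An empty `meta_description` cell, as on the `/about` row, is exactly the kind of gap that becomes easy to spot once the crawl is laid out as a table.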