SEO Crawler Documentation

Learn how to use the SEO Crawler to analyze your website

Overview: The SEO Crawler is a real-time website analysis tool that identifies SEO issues, broken links, and optimization opportunities across your entire website. The free tier includes up to 500 pages per crawl.

Features

  • Crawl up to 500 pages for free
  • Identify broken links and 404 errors
  • Analyze meta tags (title, description)
  • Check heading structure (H1-H6)
  • Real-time progress monitoring
  • Export to CSV, TSV, or JSON

How to Use

  1. Enter your website URL - Start with your homepage or any page you want to begin crawling from
  2. Set max pages - Choose how many pages to crawl (up to 500 in free tier)
  3. Select crawl speed - Choose from Gentle to Intensive based on your server capacity
  4. Start the crawl - Watch results appear in real-time as pages are analyzed
  5. Export results - Download your report in CSV, TSV, or JSON format

Configuration Options

Crawl Speed
Setting     Concurrent Requests   Best For
Gentle      1                     Shared hosting, rate-limited servers
Careful     2                     Standard shared hosting
Standard    4                     Most websites (default)
Efficient   6                     VPS or dedicated servers
Intensive   8                     High-performance servers
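
The presets map directly to how many pages are requested at once. The stdlib Python sketch below is not the crawler's actual code; it only illustrates what a concurrency cap does in practice. The URLs and max_workers=4 (matching the Standard preset) are placeholder assumptions.

```python
# Illustrative only -- a worker pool capped at 4 requests in flight,
# the same knob the Gentle (1) through Intensive (8) presets control.
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

def fetch_status(url):
    # Fetch one page and report its HTTP status code.
    with urlopen(url, timeout=10) as resp:
        return url, resp.status

urls = ["https://example.com/", "https://example.com/about"]  # placeholders

with ThreadPoolExecutor(max_workers=4) as pool:  # "Standard" = 4 concurrent
    for url, status in pool.map(fetch_status, urls):
        print(status, url)
```

A lower cap is gentler on the server but slower; a higher cap finishes sooner at the cost of more simultaneous load.
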
Preset Templates
  • Basic Website - 100 pages, standard speed, respects robots.txt
  • Blog Only - 200 pages, filters to /blog/ paths only
  • E-commerce - 500 pages, excludes cart/checkout/account pages
  • WordPress - 300 pages, excludes /wp-admin/ and /wp-includes/
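
To make the presets easier to compare, here is one hypothetical way to write them down as plain settings. The field names are invented for illustration and are not the tool's actual configuration keys.

```python
# Hypothetical settings view of the four presets listed above;
# the dictionary keys are assumptions, not real config fields.
PRESETS = {
    "Basic Website": {"max_pages": 100, "speed": "Standard", "respect_robots": True},
    "Blog Only":     {"max_pages": 200, "include_only": r"^/blog/"},
    "E-commerce":    {"max_pages": 500, "exclude": r"/(cart|checkout|account)"},
    "WordPress":     {"max_pages": 300, "exclude": r"^/wp-(admin|includes)/"},
}
```
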
Advanced Options
  • User Agent - Simulate different browsers or search engine bots
  • Include Only - Regex pattern to limit the crawl to specific paths
  • Exclude - Regex pattern to skip certain paths (see the sketch after this list)
  • Search For Text - Find pages containing specific text
  • Crawl subdomains - Include subdomains in the crawl
  • Respect robots.txt - Honor the site's crawl rules (recommended)
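
The documentation does not pin down the exact regex dialect, but as a rough model the two filters behave like the sketch below. The patterns shown reuse paths from the presets (/blog/, /wp-admin/) and are assumptions, not required values.

```python
# Rough model of how Include Only / Exclude might gate each discovered
# URL path; the patterns and exact behavior are illustrative assumptions.
import re

include_only = re.compile(r"^/blog/")        # keep only blog paths
exclude = re.compile(r"^/wp-admin/|^/cart")  # skip admin and cart pages

def should_crawl(path):
    # A path must match Include Only (if set) and must not match Exclude.
    if include_only and not include_only.search(path):
        return False
    if exclude and exclude.search(path):
        return False
    return True

for path in ["/blog/seo-tips", "/wp-admin/index.php", "/about"]:
    print(path, "->", should_crawl(path))
# /blog/seo-tips -> True, /wp-admin/index.php -> False, /about -> False
```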

Understanding Results

Column   Description
URL      The page address that was crawled
Status   HTTP status code (200 = OK, 301/302 = Redirect, 404 = Not Found, etc.)
Title    The page's <title> tag content
Issues   Number of SEO errors or warnings found
Links    Total number of links on the page
Size     Page size in KB
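
Exported reports are plain data, so they are easy to post-process. The sketch below assumes a CSV export whose header row matches the columns above (URL, Status, Title, Issues, Links, Size) and a hypothetical filename, report.csv; check your actual export's headers before reusing it.

```python
# Sketch: triage an exported report. Assumes the headers match the
# column names documented above and the file is named report.csv.
import csv

with open("report.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# Broken pages first: anything that came back 404.
broken = [r for r in rows if r["Status"] == "404"]

# Then pages with the most SEO issues, worst first.
flagged = sorted(
    (r for r in rows if int(r["Issues"]) > 0),
    key=lambda r: int(r["Issues"]),
    reverse=True,
)

print(f"{len(broken)} broken pages, {len(flagged)} pages with issues")
for r in flagged[:10]:
    print(r["Issues"], r["URL"], r["Title"])
```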

Best Practices

Crawl Responsibly

Only crawl websites you own or have permission to crawl. Use appropriate crawl speeds to avoid overloading servers.

  • Start with a small max page count to test settings
  • Keep "Respect robots.txt" enabled unless you have a specific reason to disable it
  • Use filters to focus on specific sections of large sites
  • Export results to share with your team or import into other tools

Quick Tips

Pro Tip: Use the "Export to CSV" feature to share findings with your team or import into other SEO tools like Screaming Frog or Google Sheets.


Performance: For sites over 500 pages, run multiple targeted crawls using the Include/Exclude filters.
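
If you split a large site into several targeted crawls this way, the exports can be stitched back together afterwards. A minimal sketch, assuming CSV exports with the column layout above and hypothetical filenames:

```python
# Combine several targeted-crawl exports into one deduplicated report.
# The filenames are hypothetical; columns follow the results table above.
import csv

exports = ["crawl-blog.csv", "crawl-shop.csv", "crawl-docs.csv"]
seen, merged = set(), []

for path in exports:
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row["URL"] not in seen:  # keep the first copy of each page
                seen.add(row["URL"])
                merged.append(row)

if merged:
    with open("combined.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=merged[0].keys())
        writer.writeheader()
        writer.writerows(merged)
print(f"Merged {len(merged)} unique pages")
```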


Monitoring: Run regular crawls to track SEO improvements and catch new issues.

Need Help?

If you encounter issues or have questions about the SEO Crawler, feel free to reach out:

Contact Support