SEO Crawler Documentation

Overview: The SEO Crawler is an advanced website analysis tool that identifies SEO issues, broken links, and optimization opportunities across your entire website.

Features

  • Crawl up to 500 pages for free
  • Identify broken links and 404 errors
  • Analyze meta tags (title, description)
  • Check heading structure (H1-H6)
  • Detect duplicate content
  • Analyze image optimization
  • Map internal linking structure

How to Use

  1. Enter your website URL - Start with your homepage or any page you want to begin crawling from
  2. Configure crawl settings - Set depth limits, include/exclude patterns, and crawl speed
  3. Start the crawl - The tool will systematically discover and analyze pages
  4. Review results - Get detailed reports on issues found and optimization recommendations
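The discovery loop behind steps 2 and 3 can be sketched as a breadth-first crawl with a depth limit, a page budget, and include/exclude patterns. This is a minimal offline sketch, not the tool's actual implementation: the toy site, the regex-based pattern syntax, and the `get_links` callback are all illustrative assumptions.

```python
import re
from collections import deque

def crawl(start_url, get_links, max_depth=2, max_pages=500,
          include=None, exclude=None):
    """Breadth-first page discovery with depth and page limits.

    get_links(url) -> list of URLs found on that page (a stand-in
    for fetching and parsing real HTML).
    """
    seen = {start_url}
    queue = deque([(start_url, 0)])
    visited = []
    while queue and len(visited) < max_pages:
        url, depth = queue.popleft()
        visited.append(url)
        if depth >= max_depth:
            continue  # depth limit: record the page but don't expand it
        for link in get_links(url):
            if link in seen:
                continue
            if include and not re.search(include, link):
                continue
            if exclude and re.search(exclude, link):
                continue
            seen.add(link)
            queue.append((link, depth + 1))
    return visited

# Toy in-memory "site" standing in for fetched pages.
site = {
    "/": ["/blog", "/shop", "/admin"],
    "/blog": ["/blog/post-1", "/blog/post-2"],
    "/shop": ["/shop/item-1"],
}
pages = crawl("/", lambda u: site.get(u, []), exclude=r"^/admin")
print(pages)
# → ['/', '/blog', '/shop', '/blog/post-1', '/blog/post-2', '/shop/item-1']
```

Note that `/admin` never enters the queue because the exclude pattern filters links before they are scheduled, which is also why exclude rules keep a crawl fast: filtered pages are never fetched at all.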

Best Practices

Do:
  • Start with a small section of your site first
  • Review robots.txt before crawling
  • Set appropriate crawl delays
  • Use filters to focus on specific sections
Don't:
  • Crawl sites you don't own without permission
  • Use aggressive crawl settings on production sites
  • Ignore rate limiting recommendations
  • Crawl password-protected areas
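Two of the practices above, reviewing robots.txt and respecting crawl delays, can be checked with Python's standard `urllib.robotparser`. The robots.txt content below is made up for illustration; in practice you would fetch it from the site's `/robots.txt` before crawling.

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt rules (an assumption, not a real site's file).
robots_txt = """\
User-agent: *
Disallow: /admin/
Crawl-delay: 2
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("*", "https://example.com/blog/"))   # True
print(parser.can_fetch("*", "https://example.com/admin/"))  # False
print(parser.crawl_delay("*"))                              # 2
```

Honoring the reported crawl delay between requests is the simplest way to avoid aggressive settings on production sites.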

Common Issues Found

  • Missing meta descriptions
  • Duplicate title tags
  • Title tags too long (>60 characters)
  • Meta descriptions too short or too long

  • Missing H1 tags
  • Multiple H1 tags per page
  • Broken heading hierarchy
  • Duplicate content across pages
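Several of the checks above (title length, missing meta description, H1 count) can be sketched with Python's standard `html.parser`. The 60-character threshold is a common guideline rather than a hard rule, and the sample HTML is illustrative.

```python
from html.parser import HTMLParser

class PageAuditor(HTMLParser):
    """Collect the title, meta description, and H1 count from one page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = None
        self.h1_count = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit(html):
    p = PageAuditor()
    p.feed(html)
    issues = []
    if not p.title:
        issues.append("missing title")
    elif len(p.title) > 60:  # common guideline, not a hard rule
        issues.append("title too long")
    if p.meta_description is None:
        issues.append("missing meta description")
    if p.h1_count == 0:
        issues.append("missing H1")
    elif p.h1_count > 1:
        issues.append("multiple H1 tags")
    return issues

html = "<html><head><title>Home</title></head><body><h1>A</h1><h1>B</h1></body></html>"
print(audit(html))  # → ['missing meta description', 'multiple H1 tags']
```

Running an auditor like this over every crawled page, then grouping identical titles, is also how duplicate title tags are typically detected.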

Quick Tips

Pro Tip: Use the "Export to CSV" feature to share findings with your team or import into other SEO tools.
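A CSV export like the one the Pro Tip describes takes only a few lines with Python's standard `csv` module; the column names and findings below are made up for illustration.

```python
import csv
import io

# Hypothetical findings from a crawl (illustrative data only).
findings = [
    {"url": "/blog/post-1", "issue": "missing meta description", "severity": "warning"},
    {"url": "/shop", "issue": "multiple H1 tags", "severity": "warning"},
]

buf = io.StringIO()  # swap in open("report.csv", "w", newline="") for a real file
writer = csv.DictWriter(buf, fieldnames=["url", "issue", "severity"])
writer.writeheader()
writer.writerows(findings)
print(buf.getvalue())
```

A header row with stable column names makes the file easy to import into spreadsheets and other SEO tools.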

Performance: For large sites (1000+ pages), consider using multiple targeted crawls instead of one comprehensive crawl.

Scheduling: Set up regular crawls to monitor your site's SEO health over time.

Need Help?

If you encounter issues or have questions about the SEO Crawler, feel free to reach out:

Contact Support