# SEO Crawler Documentation

Learn how to use the SEO Crawler to analyze your website.

## Overview

The SEO Crawler is a real-time website analysis tool that identifies SEO issues, broken links, and optimization opportunities across your entire website. The free tier includes up to 500 pages per crawl.

## Features
- Crawl up to 500 pages for free
- Identify broken links and 404 errors
- Analyze meta tags (title, description)
- Check heading structure (H1-H6)
- Real-time progress monitoring
- Export to CSV, TSV, or JSON
## How to Use

1. Enter your website URL - Start with your homepage or any page you want to begin crawling from.
2. Set max pages - Choose how many pages to crawl (up to 500 in the free tier).
3. Select crawl speed - Choose from Gentle to Intensive based on your server's capacity.
4. Start the crawl - Watch results appear in real time as pages are analyzed (a sketch of the underlying crawl loop follows this list).
5. Export results - Download your report in CSV, TSV, or JSON format.
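Under the hood, a crawl is essentially a loop: fetch a page, record its status, extract its links, and queue new same-domain URLs until the page limit is reached. The sketch below illustrates that idea in plain Python using only the standard library; it is not the tool's actual implementation, and names like `crawl` and `LinkParser` are ours.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collects href values from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=100):
    """Breadth-first crawl limited to the start URL's domain."""
    domain = urlparse(start_url).netloc
    queue, seen, results = deque([start_url]), {start_url}, []
    while queue and len(results) < max_pages:
        url = queue.popleft()
        try:
            with urlopen(url, timeout=10) as resp:
                status = resp.status
                body = resp.read().decode("utf-8", "replace")
        except Exception as exc:  # broken link, timeout, DNS failure, ...
            results.append((url, f"error: {exc}"))
            continue
        results.append((url, status))
        parser = LinkParser()
        parser.feed(body)
        # Queue newly discovered same-domain links for later visits.
        for href in parser.links:
            absolute = urljoin(url, href)
            if urlparse(absolute).netloc == domain and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return results
```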
## Configuration Options

### Crawl Speed
| Setting | Concurrent Requests | Best For |
|---|---|---|
| Gentle | 1 | Shared hosting, rate-limited servers |
| Careful | 2 | Standard shared hosting |
| Standard | 4 | Most websites (default) |
| Efficient | 6 | VPS or dedicated servers |
| Intensive | 8 | High-performance servers |
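The "Concurrent Requests" column is simply how many pages are fetched at the same time. As a rough illustration, here is how those settings could map to a worker pool in Python; the speed names mirror the table above, but the mapping is an assumption about the tool's internals, not a published detail.

```python
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

# Hypothetical mapping from speed setting to pool size (mirrors the table).
SPEED_TO_WORKERS = {
    "gentle": 1, "careful": 2, "standard": 4, "efficient": 6, "intensive": 8,
}

def fetch_status(url):
    """Return (url, HTTP status) or (url, error message)."""
    try:
        with urlopen(url, timeout=10) as resp:
            return url, resp.status
    except Exception as exc:
        return url, f"error: {exc}"

def fetch_all(urls, speed="standard"):
    """Fetch URLs with at most SPEED_TO_WORKERS[speed] requests in flight."""
    with ThreadPoolExecutor(max_workers=SPEED_TO_WORKERS[speed]) as pool:
        return list(pool.map(fetch_status, urls))
```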
Preset Templates
- Basic Website - 100 pages, standard speed, respects robots.txt
- Blog Only - 200 pages, filters to /blog/ paths only
- E-commerce - 500 pages, excludes cart/checkout/account pages
- WordPress - 300 pages, excludes /wp-admin/ and /wp-includes/
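If it helps to see the presets as concrete settings, the dictionary below is one hypothetical way to express them; the keys (`max_pages`, `speed`, `include_only`, `exclude`, `respect_robots_txt`) are illustrative names, not the tool's actual configuration format.

```python
# Hypothetical preset definitions mirroring the list above.
PRESETS = {
    "basic_website": {
        "max_pages": 100,
        "speed": "standard",
        "respect_robots_txt": True,
    },
    "blog_only": {
        "max_pages": 200,
        "include_only": r"^/blog/",           # crawl only blog paths
    },
    "ecommerce": {
        "max_pages": 500,
        "exclude": r"/(cart|checkout|account)(/|$)",
    },
    "wordpress": {
        "max_pages": 300,
        "exclude": r"/wp-(admin|includes)/",
    },
}
```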
### Advanced Options
- User Agent - Simulate different browsers or search engine bots
- Include Only - Regex pattern to limit crawl to specific paths
- Exclude - Regex pattern to skip certain paths
- Search For Text - Find pages containing specific text
- Crawl subdomains - Include subdomains in the crawl
- Respect robots.txt - Honor the site's crawl rules (recommended)
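To show how the Include Only and Exclude patterns and the robots.txt option interact, here is a small Python sketch. The regex patterns and the `MyCrawler/1.0` user agent are made-up examples; `urllib.robotparser` is the standard library's robots.txt parser.

```python
import re
from urllib.parse import urlparse
from urllib.robotparser import RobotFileParser

# Example patterns (assumptions, not shipped defaults):
include_only = re.compile(r"^/blog/")      # limit crawl to blog paths
exclude = re.compile(r"/(tag|author)/")    # skip tag and author archives

def should_crawl(url, robots):
    """Apply Include Only, then Exclude, then the site's robots.txt rules."""
    path = urlparse(url).path
    if include_only and not include_only.search(path):
        return False
    if exclude and exclude.search(path):
        return False
    return robots.can_fetch("MyCrawler/1.0", url)

robots = RobotFileParser("https://example.com/robots.txt")
robots.read()  # fetches and parses the robots.txt file
print(should_crawl("https://example.com/blog/seo-basics", robots))
```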
## Understanding Results
| Column | Description |
|---|---|
| URL | The page address that was crawled |
| Status | HTTP status code (200=OK, 301/302=Redirect, 404=Not Found, etc.) |
| Title | The page's `<title>` tag content |
| Issues | Number of SEO errors or warnings found |
| Links | Total number of links on the page |
| Size | Page size in KB |
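Once you export a report, the columns above are easy to post-process. The snippet below is a sketch assuming a CSV export whose header matches the column names in this table, plus a hypothetical `crawl-report.csv` filename; it pulls out broken pages and pages with issues.

```python
import csv

broken, with_issues = [], []
with open("crawl-report.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        status = row["Status"]
        if status.isdigit() and int(status) >= 400:  # 404, 500, ...
            broken.append(row["URL"])
        issues = row.get("Issues") or "0"
        if issues.isdigit() and int(issues) > 0:
            with_issues.append((row["URL"], issues))

print(f"{len(broken)} broken pages, {len(with_issues)} pages with issues")
```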
## Best Practices

### Crawl Responsibly
Only crawl websites you own or have permission to crawl. Use appropriate crawl speeds to avoid overloading servers.
- Start with a small max page count to test settings
- Keep "Respect robots.txt" enabled unless you have a specific reason
- Use filters to focus on specific sections of large sites
- Export results to share with your team or import into other tools
## Quick Tips

Pro Tip: Use the "Export to CSV" feature to share findings with your team or to import them into other tools such as Screaming Frog or Google Sheets.

Performance: For sites larger than 500 pages, run multiple targeted crawls using the Include/Exclude filters, then merge the exports (see the sketch after these tips).
Monitoring: Run regular crawls to track SEO improvements and catch new issues.
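Following the performance tip above, here is one way you might merge several targeted crawl exports into a single de-duplicated report. The file names are hypothetical, and the merge keys on the URL column from the results table.

```python
import csv

def merge_reports(paths, out_path="combined-report.csv"):
    """Combine CSV exports, keeping one row per URL (later files win)."""
    rows_by_url, fieldnames = {}, None
    for path in paths:
        with open(path, newline="", encoding="utf-8") as f:
            reader = csv.DictReader(f)
            fieldnames = fieldnames or reader.fieldnames
            for row in reader:
                rows_by_url[row["URL"]] = row
    with open(out_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows_by_url.values())

merge_reports(["blog-crawl.csv", "shop-crawl.csv"])
```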
## Need Help?
If you encounter issues or have questions about the SEO Crawler, feel free to reach out:
Contact Support