Pre-built crawler profiles for product discovery, competitor monitoring, and network expansion. From light weekly scans to exhaustive deep crawls: configure your data extraction strategy.
Each profile sets crawl frequency, page limits, extraction depth, and rate limiting. The crawler uses schema.org product data for structured extraction.
Weekly crawl limited to 500 pages. 10 req/min rate limit. For monitoring competitor pricing or discovering new partner products.
Daily crawl up to 5,000 pages. Extracts prices, images, reviews, and schema.org data. 30 req/min.
Daily crawl up to 50,000 pages with deep link following. Extracts variants, reviews, availability. For full network indexing.
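The three tiers above differ only in a handful of knobs. As a rough sketch (field and profile names are illustrative, not SellerZoom's actual configuration API), a profile might look like:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CrawlProfile:
    name: str
    frequency: str            # how often the crawl runs
    max_pages: int            # hard page limit per crawl
    rate_limit_rpm: Optional[int]  # requests per minute; None if unspecified
    deep_links: bool          # follow pagination and variant links

# Values taken from the tiers above.
LIGHT = CrawlProfile("light", "weekly", 500, 10, deep_links=False)
STANDARD = CrawlProfile("standard", "daily", 5_000, 30, deep_links=False)
DEEP = CrawlProfile("deep", "daily", 50_000, None, deep_links=True)
```

The Deep Crawl tier's rate limit is not published, so it is left as `None` here.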
SellerZoom's crawler uses schema.org product markup to extract structured product data from any ecommerce website. This enables two key capabilities: discovering products from potential network partner stores, and monitoring competitor pricing for stores in your vertical.
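Schema.org product markup is typically embedded as JSON-LD in `<script type="application/ld+json">` blocks. A minimal stdlib-only sketch of that extraction step (the class name and the sample HTML are hypothetical, not SellerZoom internals):

```python
import json
from html.parser import HTMLParser

class ProductJSONLD(HTMLParser):
    """Collects schema.org Product objects from ld+json script blocks."""

    def __init__(self):
        super().__init__()
        self._in_ldjson = False
        self.products = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and ("type", "application/ld+json") in attrs:
            self._in_ldjson = True

    def handle_endtag(self, tag):
        if tag == "script":
            self._in_ldjson = False

    def handle_data(self, data):
        if not self._in_ldjson:
            return
        try:
            obj = json.loads(data)
        except json.JSONDecodeError:
            return  # malformed ld+json block; skip it
        if isinstance(obj, dict) and obj.get("@type") == "Product":
            self.products.append(obj)

# Sample page fragment with schema.org Product markup.
html = """<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Product",
 "name": "Widget", "offers": {"@type": "Offer", "price": "19.99"}}
</script>"""

parser = ProductJSONLD()
parser.feed(html)
print(parser.products[0]["name"])  # Widget
```

Because the markup is machine-readable by design, the same extractor works on any store that publishes it, which is what makes partner discovery and competitor monitoring possible without a per-site integration.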
The crawler respects robots.txt and rate limits by default. The Light Crawl profile is designed for minimal server impact: 10 requests per minute, 500 pages maximum, weekly frequency. For stores that need comprehensive product databases, the Deep Crawl profile follows pagination links and extracts variant-level data across up to 50,000 pages.
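Robots.txt compliance and request spacing can both be handled with the Python standard library. A sketch of the polite-fetching pattern (class name, defaults, and user agent string are assumptions for illustration):

```python
import time
import urllib.robotparser

class PoliteFetcher:
    """Checks robots.txt and spaces requests to a requests-per-minute budget."""

    def __init__(self, base_url: str, rate_limit_rpm: int = 10,
                 user_agent: str = "ExampleCrawlerBot"):
        self.user_agent = user_agent
        self.min_interval = 60.0 / rate_limit_rpm  # seconds between requests
        self._last_request = 0.0
        self.robots = urllib.robotparser.RobotFileParser(
            base_url.rstrip("/") + "/robots.txt")
        # self.robots.read()  # network call; skipped in this offline sketch

    def allowed(self, url: str) -> bool:
        # True only if robots.txt permits this agent to fetch the URL.
        return self.robots.can_fetch(self.user_agent, url)

    def throttle(self) -> None:
        # Sleep just long enough to keep under the rate limit.
        wait = self.min_interval - (time.monotonic() - self._last_request)
        if wait > 0:
            time.sleep(wait)
        self._last_request = time.monotonic()
```

At the Light Crawl tier's 10 req/min, `min_interval` works out to one request every 6 seconds.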
Crawled products are processed through the same embedding pipeline as directly connected store products. This means crawled products can appear in cross-store network recommendations, expanding the product universe available to shoppers without requiring the crawled store to install SellerZoom. When a crawled product generates a sale, the crawler-based affiliate model handles attribution and commission.
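The key design point is a single ingestion path: whether a product arrives from a connected store or the crawler, it goes through the same embedding step and becomes a recommendation candidate. A minimal sketch of that shape (the `Product` fields and the stand-in `embed` function are hypothetical, not the real pipeline):

```python
from dataclasses import dataclass, field

@dataclass
class Product:
    product_id: str
    title: str
    source: str  # "connected" (store installed the app) or "crawled"
    embedding: list = field(default_factory=list)

def embed(text: str) -> list:
    # Stand-in for the real embedding model: here, just the title length.
    return [float(len(text))]

def ingest(product: Product) -> Product:
    # One pipeline for both sources, so crawled products can surface
    # in cross-store recommendations alongside connected ones.
    product.embedding = embed(product.title)
    return product
```

Keeping one pipeline means recommendation ranking never needs to special-case where a product came from; only attribution does.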
Activate pre-built profiles and start seeing results within 24 hours.
Get Started Free