What is Max Crawl Pages and how does it work?
max_crawl_pages is a new parameter added to SERP API and to the Google Shopping Products and Amazon Products endpoints of Merchant API. In the following paragraphs, we explain what it is and how it works.
As you probably know, our APIs support the depth parameter, which indicates how many SERP results to return in the API response. When you set a task with a specific depth, our API crawls as many search engine result pages as necessary to provide the specified number of results.
However, until recently we didn't offer an option to limit the number of pages to crawl. So if you wanted to receive SERP results from the first page of a search engine only, you couldn't. Depending on the keyword and numerous other factors, a search engine might return a different number of results on its first page, so you could never guess what depth to set to receive results from the first page only.
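To make the depth parameter concrete, here is a minimal sketch of a task payload. The field names (keyword, language_code, location_code) follow the common shape of our SERP API task requests, but treat them as assumptions and check the current API documentation before use.

```python
# Hypothetical task payload (field names are assumptions; verify them
# against the current SERP API reference before sending a real request).
# With "depth": 100, the API crawls as many result pages as it needs
# to collect 100 SERP results.
task = {
    "keyword": "coffee makers",
    "language_code": "en",
    "location_code": 2840,  # example location code
    "depth": 100,
}
```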
To tackle the issue, our dev team developed the max_crawl_pages parameter.
How it works
max_crawl_pages simply limits the number of search engine result pages our API crawls.
Let us explain how it works with an example.
Suppose you sent a request to Bing SERP API with a depth of 100, and:
- The first page of Bing contains 7 SERP results;
- The second page contains 53 results;
- And the third one contains 40 results.
In this case, our API would crawl three pages of Bing to provide you with 100 SERP results.
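The depth-driven crawling described above can be sketched as a short simulation (illustrative only, not DataForSEO's internal code): the crawler keeps fetching pages until the requested number of results is collected.

```python
# Illustrative simulation of depth-based crawling: fetch pages until
# the requested depth (number of results) is reached.
def crawl(page_result_counts, depth):
    pages, results = 0, 0
    for count in page_result_counts:
        if results >= depth:
            break
        pages += 1
        results += count
    return pages, min(results, depth)

# Bing pages from the example above return 7, 53, and 40 results.
print(crawl([7, 53, 40], depth=100))  # → (3, 100): three pages crawled
```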
By contrast, if you sent the same request with "max_crawl_pages": 1, our API would crawl only the first page of Bing, returning 7 SERP results in the response.
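The capped behavior can be sketched the same way (again, an illustrative simulation, not DataForSEO's internal code): crawling stops at the page limit even if fewer results than depth have been collected.

```python
# Illustrative simulation: max_crawl_pages caps how many pages are
# crawled, so the task can return fewer results than depth requests.
def crawl(page_result_counts, depth, max_crawl_pages):
    pages, results = 0, 0
    for count in page_result_counts:
        if results >= depth or pages >= max_crawl_pages:
            break
        pages += 1
        results += count
    return pages, min(results, depth)

# With "max_crawl_pages": 1, only Bing's first page (7 results) is crawled.
print(crawl([7, 53, 40], depth=100, max_crawl_pages=1))  # → (1, 7)
```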