
SERP API pricing & depth update FAQ

On September 19, 2025, we introduced changes to the way Organic SERP API requests are billed.

The depth (the number of results) covered by the base price of Organic SERP APIs is now 10 (15 for Naver), with a 25% discount for every additional page in the same task.

This FAQ explains why the change was necessary, how it affects both desktop and mobile SERPs, what you can do to manage costs effectively, and more.

Is the second page scrolled to / fetched in the same session / using the same IP?

Yes. Additional pages are requested within the same session and typically under the same IP as well, but please note that each scroll counts as a separate page. For the time being, 99% of cases are processed following this logic; however, we reserve the right to adjust our system if we find that varying session parameters produces more accurate results.

What performance difference should I expect for Live queries with only Page 1 vs Pages 1–10?

A task that fetches a single Page 1 completes fastest. Fetching deeper results (e.g., top-100, or 10 pages) takes, on average, 3–4 times longer than just the first page.

Why is the first page charged at full price, and subsequent pages at a discount?

Page 1 is the most resource-intensive. It typically contains various SERP features such as ads, rich snippets, AI Overviews, and other extra elements. Collecting and parsing these results requires more effort and investment. Subsequent pages are lighter, so we offer a 25% discount on them.

Does the change apply to all search engines of Organic search type and not just Google? Does it affect mobile SERPs as well? Why?

The new depth update applies across all Organic search engine types of SERP API, for both desktop and mobile (where it is available).

Pagination has always existed across all search engines. However, until now, we consistently charged a flat rate for a fixed number of results (typically 100), ensuring a uniform experience for our customers. We chose to do so even though mobile SERPs and some search engines required extra resources for scrolling or page loading to collect the data, because such requests were relatively rare. Now that fewer results per page has become the norm, the change was inevitable. Google’s update served as the catalyst for introducing a new billing rule across all Organic search engine types of SERP API. This approach reflects the actual resources involved, keeps billing consistent and transparent, and ensures fairness for every customer.

How do I get 100 results in API v3?

In v3, use either the max_crawl_pages or the depth parameter to control the number of pages you collect. For example, to obtain 100 Google Organic search results (10 pages) for a single task, set "max_crawl_pages": 10 or "depth": 100.

If the specified depth / max_crawl_pages is higher than the number of results/pages in the response, the difference will be refunded to your account balance automatically.
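Here is a minimal sketch of such a request in Python, assuming the v3 Live Advanced endpoint for Google Organic and HTTP Basic authentication; the keyword, location_code, and language_code values are purely illustrative:

# Sketch: request 100 Google Organic results (10 pages) in a single task.
import requests

API_URL = "https://api.dataforseo.com/v3/serp/google/organic/live/advanced"
AUTH = ("login", "password")  # replace with your DataForSEO credentials

payload = [{
    "keyword": "serp api pricing",   # illustrative keyword
    "location_code": 2840,           # e.g., United States
    "language_code": "en",
    "depth": 100                     # or "max_crawl_pages": 10 — both yield 10 pages
}]

data = requests.post(API_URL, json=payload, auth=AUTH, timeout=120).json()
items = data["tasks"][0]["result"][0]["items"]
print(len(items), "items returned")

Such a task is billed as one base price for Page 1 plus nine additional pages at 25% off.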

Is there an offset or some other parameter to load specific pages?

Yes, you can achieve this using "search_param": "start={number}" (e.g., "search_param": "start=11"), which works like an offset parameter. Since the page size is fixed at 10 results, you can define Page 3 as start=20, Page 10 as start=90, etc. Note that rank values (rank_absolute and rank_group) count the results starting from the first page of the crawl, so if you fetch results starting from the second page with "search_param": "start=11", "rank_absolute": 1 will mean the actual rank is 11.
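As a sketch, assuming the same Live Advanced endpoint as in the earlier example, a request starting from the second page could look like this; the correction at the end simply adds the 10 skipped results back to rank_absolute:

# Sketch: start crawling from the second page and restore real SERP positions.
import requests

API_URL = "https://api.dataforseo.com/v3/serp/google/organic/live/advanced"
AUTH = ("login", "password")

offset = 10  # the first page (10 results) is skipped
payload = [{
    "keyword": "serp api pricing",   # illustrative keyword
    "location_code": 2840,
    "language_code": "en",
    "search_param": "start=11",      # works like an offset parameter
    "depth": 10                      # collect one page from that offset
}]

data = requests.post(API_URL, json=payload, auth=AUTH, timeout=120).json()

for item in data["tasks"][0]["result"][0]["items"]:
    # rank_absolute counts from the first crawled page, so add the offset back
    print(item["rank_absolute"] + offset, item.get("url"))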

How does the new pricing apply? Does it only affect the Standard queue endpoints?

The new pricing applies to all methods (Live and Standard) and all queues (Normal and High priority) for search engines of Organic search type.

The pricing works as follows. The number of results (depth) covered by the base price of Organic SERP APIs now corresponds to the first page. The default depth value is 10 for all search engines of the Organic search type (se_type), except Naver, which defaults to "depth": 15. Additional pages in the same task are 25% off: each next page = 0.75 × base price.

Base price for Page 1 (10 results, or 15 for Naver):

$0.002 for Live method
$0.0006 for Normal priority, Standard POST-GET method
$0.0012 for High priority, Standard POST-GET method

Price per each additional page retrieved within the same task (25% off):

$0.0015 for the Live method
$0.00045 for Normal priority, Standard POST-GET method
$0.0009 for High priority, Standard POST-GET method

Example cost calculation for Normal priority (10 pages, 100 results):

First page: $0.0006
Next pages (2nd–10th): 9 × (0.75 × $0.0006) = $0.00405
Total: $0.0006 + $0.00405 = $0.00465

The general calculation formula is as follows:

Total cost = Base price (1st page, according to method/priority) + 0.75 (25% discount) × Base price × Number of additional pages

Here’s the cost for all queues for 100 results (10 pages).

Standard method, Normal priority queue: $0.0006 + 0.75 × $0.0006 × 9 = $0.00465
Standard method, High priority queue: $0.0012 + 0.75 × $0.0012 × 9 = $0.0093
Live method: $0.002 + 0.75 × $0.002 × 9 = $0.0155
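The same numbers can be reproduced with a short helper that applies the formula above to the base prices listed in this FAQ:

# Sketch: reproduce the pricing formula.
# Total = base price (page 1) + 0.75 × base price × number of additional pages.
BASE_PRICES = {
    "live": 0.002,
    "standard_normal_priority": 0.0006,
    "standard_high_priority": 0.0012,
}

def task_cost(method: str, pages: int) -> float:
    """Cost of one Organic SERP task that retrieves `pages` pages."""
    base = BASE_PRICES[method]
    return base + 0.75 * base * (pages - 1)

# 100 results = 10 pages
for method in BASE_PRICES:
    print(method, round(task_cost(method, 10), 6))
# live -> 0.0155, standard_normal_priority -> 0.00465, standard_high_priority -> 0.0093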

How is the calculate_rectangles parameter priced?

Previously, using the calculate_rectangles parameter added 2 base prices to the total task cost. Now, it adds only 1 base price.

How can I minimize my costs under the new model? Can I collect results only from a specific page?

Yes, you can collect results from a specific page or pages only, and doing so is one of the approaches that can help you reduce your data spending.

For example, if the last known rank is 34 (which corresponds to page 4), you could check only pages 3, 4, and 5 to find the necessary website. In this case, use "search_param": "start=30" and set "depth": 30 or "max_crawl_pages": 3 to limit the number of collected pages. Note that rank values (rank_absolute and rank_group) count the results starting from the first page of the crawl, so "rank_absolute": 1 will mean the actual rank is 31.
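A sketch of this approach, again assuming the Live Advanced endpoint used in the earlier examples (the keyword and target domain are illustrative):

# Sketch: re-check a keyword around a previously known rank (here: 34).
# Only 3 pages are crawled, so the task costs 1 base price + 2 discounted pages.
import requests

API_URL = "https://api.dataforseo.com/v3/serp/google/organic/live/advanced"
AUTH = ("login", "password")

offset = 30               # "search_param": "start=30" -> rank_absolute 1 means rank 31
target = "example.com"    # the website you are tracking

payload = [{
    "keyword": "serp api pricing",
    "location_code": 2840,
    "language_code": "en",
    "search_param": f"start={offset}",
    "max_crawl_pages": 3  # or "depth": 30
}]

data = requests.post(API_URL, json=payload, auth=AUTH, timeout=120).json()

for item in data["tasks"][0]["result"][0]["items"]:
    if item.get("domain") == target:
        print("current rank:", item["rank_absolute"] + offset)
        break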

Besides implementing this logic, we also recommend reconsidering your check frequency and/or reducing the depth (e.g., from 100 to 50), as well as increasing the price for your end users.
