Adjusting sampleCount in Batch Statistical API for Polygon Data

We are using the Batch Statistical API. When checking the JSON file downloaded to the specified S3 bucket, the sampleCount is 1. We understand that the sampleCount should represent the number of pixels included within the target polygon for download. Since the polygon being used for the download contains multiple pixels, we expect the sampleCount to be 2 or more. How can we configure it so that we can download with a sampleCount of multiple pixels?

Hello Sagri,

If you can provide the Statistical API request that you are using, that would help in debugging your issue.

My first guess, without seeing what you are doing, is that you have set the resolution of the calculation in meters, i.e. resx: 20, resy: 20, but your geometry is in WGS84, so your resolution is interpreted in degrees instead of meters. With a 20 x 20 degree resolution you only get one pixel per geometry.
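To make the mismatch concrete, here is a hedged sketch of the aggregation part of a Statistical API request (the time range and values are made up for illustration). The key point is that resx/resy have no unit of their own; they are interpreted in the units of the geometry's CRS:

```python
# Illustrative Statistical API request fragment (field values are examples).
# resx/resy are interpreted in the CRS units of the request geometry:
# with an EPSG:4326 (WGS84) geometry, "20" means 20 DEGREES, which is far
# larger than a typical field polygon -- hence sampleCount = 1.
aggregation = {
    "timeRange": {"from": "2023-06-01T00:00:00Z", "to": "2023-06-30T23:59:59Z"},
    "aggregationInterval": {"of": "P30D"},
    "resx": 20,  # 20 meters if the geometry CRS uses meters (e.g. EPSG:3857)
    "resy": 20,  # 20 degrees if the geometry CRS is EPSG:4326 -- one pixel!
}
```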

The solution would be either to reproject the polygons to a CRS that uses meters (e.g. EPSG:3857) or to specify the resolution as an appropriate value in degrees (e.g. about 0.0002 degrees for roughly 20 m).
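If you go the second route and keep the geometry in WGS84, a rough meters-to-degrees conversion looks like this. This is only an approximation (one degree of latitude is about 111,320 m, and a degree of longitude shrinks with the cosine of the latitude); for precise work, reprojecting to a metric CRS is the cleaner option:

```python
import math

# Rough conversion from a resolution in meters to resx/resy in degrees,
# for use with a WGS84 (EPSG:4326) geometry. Approximation only.
def meters_to_degrees(meters, latitude_deg):
    lat_deg = meters / 111_320.0  # ~meters per degree of latitude
    lon_deg = meters / (111_320.0 * math.cos(math.radians(latitude_deg)))
    return lon_deg, lat_deg

# Example: a 20 m resolution at ~35 degrees N (roughly central Japan)
resx_deg, resy_deg = meters_to_degrees(20, latitude_deg=35.0)
print(resx_deg, resy_deg)  # both on the order of 0.0002 degrees
```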


First, I’d like to express my gratitude. Thank you so much, Jonas.

Indeed, I had set the aggregation's resx and resy to 10, and the CRS of the gpkg file I used for the download was WGS84. As you advised, after changing the CRS to EPSG:3857 I was able to download successfully. Thank you very much.

I have an additional question: is it possible to use the JGD2011 geodetic system, which is used in Japan? Based on this URL, it seems like it is not available.

Additionally, I’d like to ask how pixels are determined to be included within a polygon area. The sampleCount for the polygon I used this time was 10, but when I ran the same polygon through Google Earth Engine, 5 pixels were downloaded, so I’m curious why the counts differ. In Google Earth Engine, pixels whose center point lies inside the polygon are included; what rule does Sentinel Hub use?
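The two common inclusion rules can indeed give different counts for the same polygon and grid. The toy sketch below (not Sentinel Hub's or Earth Engine's actual rasterization code, just an axis-aligned illustration) compares a "pixel center inside" rule with an "any overlap" rule on the same box:

```python
# Toy comparison of two pixel-inclusion rules on a regular grid with its
# origin at (0, 0). The polygon is simplified to an axis-aligned box.
def count_pixels(box_min, box_max, pixel_size, rule):
    """Count grid pixels covered by the box under the given rule.

    rule: 'center'    -- count pixels whose center lies inside the box
          'intersect' -- count pixels whose area overlaps the box at all
    """
    count = 0
    n = int(box_max[0] // pixel_size) + 2  # enough cells to cover the box
    for i in range(n):
        for j in range(n):
            x0, y0 = i * pixel_size, j * pixel_size
            x1, y1 = x0 + pixel_size, y0 + pixel_size
            if rule == "center":
                cx, cy = x0 + pixel_size / 2, y0 + pixel_size / 2
                hit = (box_min[0] <= cx <= box_max[0]
                       and box_min[1] <= cy <= box_max[1])
            else:  # 'intersect': the two rectangles overlap
                hit = (x0 < box_max[0] and x1 > box_min[0]
                       and y0 < box_max[1] and y1 > box_min[1])
            if hit:
                count += 1
    return count

# An 18 x 18 box from (6, 6) to (24, 24) on a 10 m grid: only one pixel
# center (15, 15) falls inside, but nine pixels touch the box.
print(count_pixels((6, 6), (24, 24), 10, "center"))     # -> 1
print(count_pixels((6, 6), (24, 24), 10, "intersect"))  # -> 9
```

So a 10-vs-5 discrepancy is plausible purely from the inclusion rule (plus any difference in grid alignment and resolution between the two services), independent of the data itself.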