Statistical API 'EXECUTION ERROR' for larger polygons


I have just signed up for a trial account and I am noticing that the workflow in the Python documentation is not working with larger, more complex geometries (> ~20,000 sq km).

(Sentinel Hub Statistical API — Sentinel Hub 3.3.0 documentation)

It works beautifully with simple, smaller polygons, but anything larger (the size of a large county or region) seems to crash with the following error:

Is there a limit or something that might cause this?

I am converting all polygons I pass through to EPSG:3857.
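For context, the reprojection is done along these lines (a minimal sketch assuming shapely and pyproj; the example polygon is made up and much smaller than the real ones):

```python
from shapely.geometry import Polygon
from shapely.ops import transform
from pyproj import Transformer

# A small WGS84 (lon/lat) polygon as a stand-in for the real geometries.
wgs84_poly = Polygon([(14.0, 46.0), (14.5, 46.0), (14.5, 46.3)])

# always_xy=True keeps (lon, lat) axis order regardless of CRS defaults.
transformer = Transformer.from_crs("EPSG:4326", "EPSG:3857", always_xy=True)
mercator_poly = transform(transformer.transform, wgs84_poly)
```

`mercator_poly` is then what goes into the Statistical API request geometry.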

[{'data': [{'interval': {'from': '2019-09-02T00:00:00Z',
                         'to': '2019-09-03T00:00:00Z'},
            'error': {'type': 'EXECUTION_ERROR'}},
           {'interval': {'from': '2019-09-04T00:00:00Z', 'to': '2019-09-05T00:00:00Z'},
            'error': {'type': 'EXECUTION_ERROR'}},
           {'interval': {'from': '2019-09-07T00:00:00Z', 'to': '2019-09-08T00:00:00Z'},
            'error': {'type': 'EXECUTION_ERROR'}},
           {'interval': {'from': '2019-09-09T00:00:00Z', 'to': '2019-09-10T00:00:00Z'},
            'error': {'type': 'EXECUTION_ERROR'}},
           {'interval': {'from': '2019-09-12T00:00:00Z', 'to': '2019-09-13T00:00:00Z'},
            'error': {'type': 'EXECUTION_ERROR'}},
           {'interval': {'from': '2019-09-14T00:00:00Z', 'to': '2019-09-15T00:00:00Z'},
            'error': {'type': 'EXECUTION_ERROR'}},
           {'interval': {'from': '2019-09-17T00:00:00Z', 'to': '2019-09-18T00:00:00Z'},
            'error': {'type': 'EXECUTION_ERROR'}},
           {'interval': {'from': '2019-09-19T00:00:00Z', 'to': '2019-09-20T00:00:00Z'},
            'error': {'type': 'EXECUTION_ERROR'}},
           {'interval': {'from': '2019-09-22T00:00:00Z', 'to': '2019-09-23T00:00:00Z'},
            'error': {'type': 'EXECUTION_ERROR'}},
           {'interval': {'from': '2019-09-27T00:00:00Z', 'to': '2019-09-28T00:00:00Z'},
            'error': {'type': 'EXECUTION_ERROR'}},
           {'interval': {'from': '2019-09-29T00:00:00Z', 'to': '2019-09-30T00:00:00Z'},
            'error': {'type': 'EXECUTION_ERROR'}}],
  'status': 'FAILED'}]

Any help much appreciated.


The limit is the same as for the Process API: 2500 x 2500 px.

StatAPI is typically used over e.g. fields (let's call them objects), and a request is then done for each object. In principle the whole country could be an object as well, but you have to tweak the resolution of your request so that the whole country fits into a 2500 x 2500 px box.
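For example, a quick way to find the coarsest per-pixel resolution that keeps a bounding box within that limit (a minimal sketch; the box dimensions are made up):

```python
def min_resolution(width_m, height_m, max_px=2500):
    """Coarsest per-pixel resolution (in metres) at which a bounding box of
    width_m x height_m metres still fits into a max_px x max_px request.
    2500 px is the Process API / Statistical API dimension limit."""
    return max(width_m, height_m) / max_px

# A region ~600 km across needs at least 240 m/px to stay under the limit.
print(min_resolution(600_000, 400_000))  # 240.0
```

Anything coarser than the returned value (e.g. `resolution=(250, 250)` for a 600 km box) will fit.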

Oh fantastic, thank you for the really quick reply.

I changed the resolution from (10, 10) to (250, 250) and it worked:

aggregation = SentinelHubStatistical.aggregation(
    evalscript=features_evalscript,
    time_interval=yearly_time_interval,
    aggregation_interval='P1D',
    resolution=(250, 250)
)