Bounding Box Too Large Error at 500 m Resolution (Compared to 100 m Resolution)

At 100 m resolution, the BBox [-115.0, 35.0, -114.0, 37.0] downloads OK (shape (2, 2242, 844, 13)).

If I change the resolution to 500 or 1000, I get an error saying the box is too large.

Making the box much smaller [-114.75, 36.25, -114.0, 36.5] doesn't seem to help; the resolution of 500 (or 1000) seems to be what causes the problem.

It's possible I am mistaken, but shouldn't the 5x coarser (i.e. 500 m) resolution result in an array with roughly 1/5 of the rows/columns of the 100 m shape?
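
As a rough sanity check, this is the shape I would expect at 500 m, using the numbers from the 100 m shape above (just back-of-the-envelope arithmetic):

rows_100, cols_100 = 2242, 844  # height/width of the 100 m download above
print(rows_100 / 5, cols_100 / 5)  # roughly 448 x 169 pixels expected at 500 m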

import datetime
import time
import pickle

import numpy as np
import geopandas

from eolearn.core import SaveToDisk, FeatureType, OverwritePermission, LinearWorkflow
from eolearn.io import SentinelHubInputTask
from sentinelhub import CRS, BBox, DataSource

layer = 'BANDS-S2-L1C'
save = SaveToDisk('mojave', overwrite_permission=2, compress_level=1)

input_task = SentinelHubInputTask(
    resolution=500,
    bands_feature=(FeatureType.DATA, 'bands'),
    additional_data=[(FeatureType.MASK, 'dataMask')],
    time_difference=datetime.timedelta(minutes=120),
    data_source=DataSource.SENTINEL2_L1C,
    max_threads=10,
)

time_i = ('2020-01-01', '2020-06-01')  # placeholder dates; time_i is defined elsewhere in my script

bb_1 = BBox(bbox=[-115.0, 35.0, -114.0, 37.0], crs=CRS.WGS84)
# bb_1 = BBox(bbox=[-114.75, 36.25, -114.0, 36.5], crs=CRS.WGS84)

workflow = LinearWorkflow(input_task, save)

result_bb1 = workflow.execute({input_task: {'bbox': bb_1, 'time_interval': time_i},
                               save: {'eopatch_folder': '/deepdata/moj'}})

eopatch_nev = result_bb1[save]

This is the error I get when I change the resolution to 500 on the same box that downloads fine at 100:
DownloadFailedException: During execution of task SentinelHubInputTask: Failed to download from:
https://services.sentinel-hub.com/api/v1/process
with HTTPError:
400 Client Error: Bad Request for url: https://services.sentinel-hub.com/api/v1/process
Server response: "{"error":{"status":400,"reason":"Bad Request","message":"The bounding box area is too large! Please zoom in.","code":"RENDERER_EXCEPTION"}}"

Many thanks;-)

I see you are using a BBox in WGS84.
In the Sentinel Hub API, the resolution parameter is in the same units as the rest of the request, in your case degrees, and 500 degrees is not really a meaningful resolution.
I am not familiar with sentinelhub-py and it might be that the package converts the units in the background, but I would recommend that you also try with the value converted to degrees.

Furthermore, each dataset also has a minimum resolution defined, e.g. Sentinel-2 with PREVIEW level goes down to 250 m.
See
https://docs.sentinel-hub.com/api/latest/#/data/Sentinel-2-L1C?id=previewmode

Many thanks for your prompt response. According to this (https://eo-learn.readthedocs.io/en/latest/_modules/eolearn/io/processing_api.html#SentinelHubInputTask), resolution is in meters.

class SentinelHubInputTask(SentinelHubInputBase):
:param data_source: Source of requested satellite data.
:type data_source: DataSource
:param size: Number of pixels in x and y dimension.
:type size: tuple(int, int)
:type resolution: Resolution in meters, passed as a tuple for X and Y axis.
:type resolution: tuple(int, int)

[I think the first ‘type resolution’ should actually be ‘param resolution’]

Changing the resolution to 250 doesn't work either; ~180 seems to be the upper limit. Again, possibly different if Eval3 is used directly?

Unlike some of the other classes, there is no resx/resy parameter for the SentinelHubInputTask class, even though it uses Eval3 (which does have resx/resy when used natively), but "def generate_evalscript" does not; hence my use of the resolution parameter (as 500 meters).

Are classes such as SentinelHubInputTask being deprecated for direct use of Eval3? If so what is the equivalent of “max_threads”?

Again many thanks for all your help;-)

Apologies, I probably should have posted my original issue under a different category (given the minimum resolution is 250 m)… the error returned, stating "the bounding box area is too large", led me to believe there was a different issue.

error":{“status”:400,“reason”:“Bad Request”,“message”:“The bounding box area is too large! Please zoom in.”,“code”:"RENDERER_EXCEPTION

Hi @MMM!

Let me try to help with the issues you are having.

The resolution parameter in the SentinelHubInputTask is in fact in meters, because in the background the package projects to UTM in order to estimate the image size.
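
A quick way to see what pixel dimensions this works out to is sentinelhub-py's bbox_to_dimensions helper, which does the WGS84-to-UTM projection for you (just a sketch using your bbox):

from sentinelhub import BBox, CRS, bbox_to_dimensions

bb_1 = BBox(bbox=[-115.0, 35.0, -114.0, 37.0], crs=CRS.WGS84)
print(bbox_to_dimensions(bb_1, resolution=100))  # (width, height), close to the 844 x 2242 you got at 100 m
print(bbox_to_dimensions(bb_1, resolution=500))  # about 5x fewer pixels along each axis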

The issue lies in the inflexible SH request inside of eo-learn, where only the DETAIL preview mode is available (up to 250 m resolution). More info here, as @gmilcinski already mentioned. To allow for coarser resolutions, one has to set the preview mode to PREVIEW or EXTENDED_PREVIEW; however, I think this is somewhat difficult to achieve in eo-learn, if not impossible. An issue already exists for this here, but no action has been taken yet.

The reason why it works up to 180 m but not at 250 m in your case might be that the original geometry is in WGS84 and not UTM, so something odd may be happening between the resolution you specify and the resolution the service actually works with; the two might not be the same. This is just speculation, I don't really understand it either.

Are classes such as SentinelHubInputTask being deprecated for direct use of Eval3? If so what is the equivalent of “max_threads”?

The SentinelHubInputTask has a more "raw" approach to the service, where you don't really have control over the evalscript; in this case it is auto-generated. The max_threads parameter allows for multithreaded downloading.
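
If you do end up building SentinelHubRequest objects directly, the closest equivalent of max_threads that I know of is the max_threads argument of SentinelHubDownloadClient.download. A rough sketch, assuming you have already prepared a list of requests called my_requests:

from sentinelhub import SHConfig, SentinelHubDownloadClient

config = SHConfig()  # your Sentinel Hub credentials

# my_requests is assumed to be a list of SentinelHubRequest objects built elsewhere
download_list = [request.download_list[0] for request in my_requests]
data = SentinelHubDownloadClient(config=config).download(download_list, max_threads=10)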

I also think that the error you get in this case, "The bounding box area is too large! Please zoom in.", is ambiguous or misleading, since the issue lies in the resolution, not the bbox size.

If you want a quick and dirty fix, you can hard-code the preview mode inside the data filter field in https://github.com/sentinel-hub/sentinelhub-py/blob/master/sentinelhub/sentinelhub_request.py#L115; otherwise I'll push for resolving this properly, because I have also run into it in the past.
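
For reference, in the Process API request body the setting sits in the dataFilter of each input data object; something along these lines (only a sketch of the relevant fragment, the exact dict layout inside sentinelhub_request.py may differ and the dates are made up):

# Fragment of a Process API "input.data" entry; previewMode is the field to hard-code.
# Valid values are DETAIL (the default), PREVIEW and EXTENDED_PREVIEW.
data_entry = {
    "type": "S2L1C",  # Sentinel-2 L1C
    "dataFilter": {
        "timeRange": {"from": "2020-01-01T00:00:00Z", "to": "2020-06-01T00:00:00Z"},  # example dates
        "previewMode": "EXTENDED_PREVIEW",  # allows coarser resolutions than the default DETAIL mode
    },
}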

Cheers!