Internal Server Error when downloading HLS image subsets

I am trying to download portions of Harmonized Landsat Sentinel (HLS) images, and I am getting an error; the message is pasted at the end of this post. The script I’m using worked a few weeks ago, but one of the images for day 180 of 2023 (June 29, 2023) can no longer be downloaded. I have tried several times over the last couple of days, and the error persists. Please let me know if there is anything I can do to resolve this issue, or if you need additional information.

Here is the error:

500 Server Error: Internal Server Error for url: https://services-uswest2.sentinel-hub.com/api/v1/process
Server response: "{"status": 500, "reason": "Internal Server Error", "message": "Illegal request to s3://lp-prod-protected/HLSL30.020/HLS.L30.T18TXQ.2023180T153815.v2.0/HLS.L30.T18TXQ.2023180T153815.v2.0.B01.tif. HTTP Status: '404'", "code": "RENDERER_EXCEPTION"}"

Hi Ned,

Could you please share the request code? It is difficult to debug without it. In the meantime, there are plenty of examples here in relation to HLS data.
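Also, the HTTP 404 in your error message means the underlying B01 asset could not be found in LP DAAC's archive. One way to check whether NASA still catalogues that granule is to query the public CMR granule-search API. Here is a minimal sketch; the `short_name` and `readable_granule_name` parameters are from the CMR search API, and the granule ID is taken from your error message:

```python
# Sketch: build a NASA CMR granule-search query for one HLS granule,
# to check whether the archive still lists it.

def build_cmr_query(granule_id: str, short_name: str = "HLSL30") -> str:
    """Return a CMR granule-search URL for the given HLS granule ID."""
    base = "https://cmr.earthdata.nasa.gov/search/granules.json"
    return f"{base}?short_name={short_name}&readable_granule_name={granule_id}"

url = build_cmr_query("HLS.L30.T18TXQ.2023180T153815.v2.0")
print(url)
```

You can then fetch the URL (e.g. with `requests.get(url).json()`) and inspect the `feed.entry` list: if it is empty, CMR no longer lists the granule, which would point to an archive-side problem rather than your script.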

Thank you, William, for the quick reply. The script I am using has worked in the past, and in this instance it accessed several images before reaching day 180. Here is the code I’m using:

import datetime

import geopandas as gpd
from sentinelhub import CRS, BBox, DataCollection, SHConfig
from eolearn.core import (
    EOWorkflow,
    FeatureType,
    OutputTask,
    SaveTask,
    linearly_connect_tasks,
)
from eolearn.io import SentinelHubEvalscriptTask

config = SHConfig()  # Sentinel Hub credentials are read from the local profile

shapefile = '/home/nedhorning/Abe/Paddocks/PaddocksUTM18N_21August2024_Aggregated.shp'
eopatch_path = "/home/nedhorning/Abe/Paddocks2023FarmAggregate21August2024/"
epsg = "EPSG:32618"
time_interval = ("2023-04-25", "2023-11-15")    # time interval of downloaded data
resolution = 30   # resolution of the request (in metres)
vt_bbox = [624360, 4732860, 777150, 4991340]   # Coordinates for VT [xmin, ymin, xmax, ymax]
expansion_dist = 100  # Distance in meters to expand each side of a bounding rectangle 
# time difference parameter (minimum allowed time difference; if two observations are closer than this,
# they will be mosaicked into one observation)
time_difference = datetime.timedelta(hours=12)

gdf = gpd.read_file(shapefile)

bands_evalscript = """
    //VERSION=3

    function setup() {
      return {
        input: [{
          bands: ["CoastalAerosol", "Blue", "Green", "Red", "NIR_Narrow", "SWIR1",
                  "SWIR2", "Cirrus", "QA", "dataMask"],
          units: "DN"
        }],
        output: [{
            id: "ms_data",
            bands: 8,
            sampleType: SampleType.UINT16
        }, {
            id: "quality_bands",
            bands: 2,
            sampleType: SampleType.INT16
        }]
      }
    }

    function evaluatePixel(sample) {
        let bandsDN = [sample.CoastalAerosol, sample.Blue, sample.Green, sample.Red,
                       sample.NIR_Narrow, sample.SWIR1, sample.SWIR2, sample.Cirrus];
        let qualityBands = [sample.QA, sample.dataMask];
        return {
            ms_data: bandsDN,
            quality_bands: qualityBands
        };
    }
"""

# this will add all HLS multispectral bands as DN (int) rather than reflectance (float)
add_ms_bands = SentinelHubEvalscriptTask(
    features=[(FeatureType.DATA, "ms_data")],
    evalscript=bands_evalscript,
    data_collection=DataCollection.HARMONIZED_LANDSAT_SENTINEL,
    resolution=resolution,
    time_difference=time_difference,
    config=config,
    max_threads=3
)

# this will add the HLS pixel quality bands (QA and dataMask) as integers
add_quality_bands = SentinelHubEvalscriptTask(
    features=[(FeatureType.DATA, "quality_bands")],
    evalscript=bands_evalscript,
    data_collection=DataCollection.HARMONIZED_LANDSAT_SENTINEL,
    resolution=resolution,
    time_difference=time_difference,
    config=config,
    max_threads=3
)

farm_list = []

for i in range(len(gdf)):
    print(f"\rProcessing aggregate {i+1} of {len(gdf)}", end='', flush=True)
    # Get the current polygon
    polygon = gdf.loc[i, "geometry"]

    # Extract the polygon's coordinates
    x_coords, y_coords = polygon.exterior.coords.xy

    # Get the filename by concatenating the "farm" and "name" attributes
    farm = gdf.loc[i, 'Farm']
    farm_list.append(farm)

    xmin = int(min(x_coords) - expansion_dist)
    xmax = int(max(x_coords) + expansion_dist)
    ymin = int(min(y_coords) - expansion_dist)
    ymax = int(max(y_coords) + expansion_dist)
    
    # region of interest (adjust_bounding_box is a helper defined elsewhere in the script)
    roi_coordinates = adjust_bounding_box(xmin, ymin, xmax, ymax)
    roi_bbox = BBox(roi_coordinates, crs=CRS(epsg))
    
    save = SaveTask(eopatch_path + "numpyHLS", overwrite_permission=2, compress_level=0)
    
    output_task = OutputTask(farm)
    
    #workflow_nodes = linearly_connect_tasks(add_indices, add_ms_bands, save, output_task)
    workflow_nodes = linearly_connect_tasks(add_ms_bands, add_quality_bands, save, output_task)
    workflow = EOWorkflow(workflow_nodes)

    result = workflow.execute(
        {
            workflow_nodes[0]: {"bbox": roi_bbox, "time_interval": time_interval},
            workflow_nodes[-2]: {"eopatch_folder": farm},
        }
    )
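As an aside on the output: because the evalscript requests `units: "DN"`, the arrays that get saved are scaled integers. My understanding is that HLS v2.0 reflectance bands use a 0.0001 scale factor and that `dataMask == 0` marks invalid pixels, so the DNs can be converted back to reflectance with a small helper like this sketch (the scale factor and masking convention are assumptions to verify against your data):

```python
import numpy as np

HLS_SCALE = 0.0001  # assumed HLS v2.0 reflectance scale factor

def dn_to_reflectance(dn: np.ndarray, data_mask: np.ndarray) -> np.ndarray:
    """Convert HLS DN values to float reflectance, setting no-data pixels to NaN."""
    refl = dn.astype(np.float32) * HLS_SCALE
    refl[data_mask == 0] = np.nan  # dataMask == 0 marks invalid pixels
    return refl

dn = np.array([[1234, 5000], [0, 10000]], dtype=np.uint16)
mask = np.array([[1, 1], [0, 1]], dtype=np.uint8)
print(dn_to_reflectance(dn, mask))
```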
    
    

@william.ray I reran the script today to see if it was working. It failed at the same iteration, but now I’m getting a different error: DownloadFailedException: Failed to download from:
https://services-uswest2.sentinel-hub.com/api/v1/process
with HTTPError:
500 Server Error: Internal Server Error for url: https://services-uswest2.sentinel-hub.com/api/v1/process
Server response: "{"status": 500, "reason": "Internal Server Error", "message": "java.util.concurrent.ExecutionException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: Failed to get EDL S3 credentials!", "code": "RENDERER_EXCEPTION"}"

Hi Ned,

Thanks again for the info. This does look like something we need to fix on our side. Once it is fixed, I will update this thread. In the meantime, you can track the status here.
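Until it is resolved, you could also guard each iteration of your loop so that one failing granule does not abort the remaining farms. Here is a minimal, self-contained sketch; in your script you would pass a closure around `workflow.execute` and catch `DownloadFailedException` from `sentinelhub` rather than a bare `Exception`:

```python
# Sketch: run each item independently and collect failures
# instead of letting one bad download stop the whole batch.

def run_with_skip(items, execute):
    """Call execute(item) for every item; return (results, failures)."""
    results, failures = {}, {}
    for item in items:
        try:
            results[item] = execute(item)
        except Exception as err:  # in practice, catch DownloadFailedException
            failures[item] = str(err)
    return results, failures

# Toy stand-in for workflow.execute, so the sketch runs on its own:
def demo(farm):
    if farm == "farm_b":
        raise RuntimeError("500 Internal Server Error")
    return farm.upper()

ok, bad = run_with_skip(["farm_a", "farm_b"], demo)
print(ok, bad)
```

That way you get a dictionary of the farms that failed and can retry just those once the archive issue is fixed.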

@william.ray I was out for the last week but tested my script today. The "java.util.concurrent.ExecutionException" error has been resolved, thank you, but I am now getting the initial error mentioned at the top of this thread. The script I ran (excerpted above) still failed when retrieving the image "s3://lp-prod-protected/HLSL30.020/HLS.L30.T18TXQ.2023180T153815.v2.0/HLS.L30.T18TXQ.2023180T153815.v2.0.B01.tif", as noted in my initial post.