Incorrect aggregation in Statistical API

Please execute the following three requests and inspect the intervals received in response. The three requests are identical except for aggregationInterval:

(1) One day aggregation:
URL='https://services.sentinel-hub.com/api/v1/statistics'
curl -X POST ${URL} \
  -H 'Content-Type: application/json' \
  -H "Authorization: Bearer ${TOKEN}" \
  -d '{
  "input": {
    "bounds": {
      "bbox": [
        12.44693,
        41.870072,
        12.541001,
        41.917096
      ]
    },
    "data": [
      {
        "dataFilter": {},
        "type": "sentinel-2-l2a"
      }
    ]
  },
  "aggregation": {
    "timeRange": {
      "from": "2021-07-06T00:00:00Z",
      "to": "2021-07-20T23:59:59Z"
    },
    "aggregationInterval": {
      "of": "P1D"
    },
    "width": 512,
    "height": 343.697,
    "evalscript": "//VERSION=3\nfunction setup() {\n return {\n input: [{\n bands: [\n \"B04\",\n \"B08\",\n \"SCL\",\n \"dataMask\"\n ]\n }],\n output: [\n {\n id: \"data\",\n bands: 3\n },\n {\n id: \"scl\",\n sampleType: \"INT8\",\n bands: 1\n },\n {\n id: \"dataMask\",\n bands: 1\n }]\n };\n}\n\nfunction evaluatePixel(samples) {\n let index = (samples.B08 - samples.B04) / (samples.B08+samples.B04);\n return {\n data: [index, samples.B08, samples.B04],\n dataMask: [samples.dataMask],\n scl: [samples.SCL]\n };\n}\n"
  },
  "calculations": {
    "default": {}
  }
}' > P1D.json

There are three 1-day intervals, apparently corresponding to the days with available imagery:
{"from":"2021-07-09T00:00:00Z","to":"2021-07-10T00:00:00Z"}
{"from":"2021-07-14T00:00:00Z","to":"2021-07-15T00:00:00Z"}
{"from":"2021-07-19T00:00:00Z","to":"2021-07-20T00:00:00Z"}
which is OK.
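
For reference, the interval objects listed here can be extracted from the saved response with jq, assuming the usual Statistical API response layout of a top-level data array whose elements each carry an interval object:

jq -c '.data[].interval' P1D.json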

(2) One hour aggregation:
curl -X POST ${URL} \
  -H 'Content-Type: application/json' \
  -H "Authorization: Bearer ${TOKEN}" \
  -d '{
  "input": {
    "bounds": {
      "bbox": [
        12.44693,
        41.870072,
        12.541001,
        41.917096
      ]
    },
    "data": [
      {
        "dataFilter": {},
        "type": "sentinel-2-l2a"
      }
    ]
  },
  "aggregation": {
    "timeRange": {
      "from": "2021-07-06T00:00:00Z",
      "to": "2021-07-20T23:59:59Z"
    },
    "aggregationInterval": {
      "of": "PT1h"
    },
    "width": 512,
    "height": 343.697,
    "evalscript": "//VERSION=3\nfunction setup() {\n return {\n input: [{\n bands: [\n \"B04\",\n \"B08\",\n \"SCL\",\n \"dataMask\"\n ]\n }],\n output: [\n {\n id: \"data\",\n bands: 3\n },\n {\n id: \"scl\",\n sampleType: \"INT8\",\n bands: 1\n },\n {\n id: \"dataMask\",\n bands: 1\n }]\n };\n}\n\nfunction evaluatePixel(samples) {\n let index = (samples.B08 - samples.B04) / (samples.B08+samples.B04);\n return {\n data: [index, samples.B08, samples.B04],\n dataMask: [samples.dataMask],\n scl: [samples.SCL]\n };\n}\n"
  },
  "calculations": {
    "default": {}
  }
}' > PT1h.json

Actual result: There are three 1-hour intervals:
{"from":"2021-07-09T00:00:00Z","to":"2021-07-09T01:00:00Z"}
{"from":"2021-07-14T00:00:00Z","to":"2021-07-14T01:00:00Z"}
{"from":"2021-07-19T00:00:00Z","to":"2021-07-19T01:00:00Z"}
Each interval begins at midnight and statistics are reported as NaN.
Expected result: the intervals should begin at 10:00 UTC because that is the time when the imagery was acquired, and the actual statistics should be calculated.
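
To make the NaN values easy to spot when reproducing this, the reported means can be printed per interval with jq; the path below assumes the statistics are nested under outputs.<output_id>.bands.<band>.stats (the usual Statistical API response layout), with the output id "data" coming from the evalscript above:

jq -c '.data[] | {interval, means: [.outputs.data.bands[].stats.mean]}' PT1h.json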

(3) No aggregation:
curl -X POST ${URL} \
  -H 'Content-Type: application/json' \
  -H "Authorization: Bearer ${TOKEN}" \
  -d '{
  "input": {
    "bounds": {
      "bbox": [
        12.44693,
        41.870072,
        12.541001,
        41.917096
      ]
    },
    "data": [
      {
        "dataFilter": {},
        "type": "sentinel-2-l2a"
      }
    ]
  },
  "aggregation": {
    "timeRange": {
      "from": "2021-07-06T00:00:00Z",
      "to": "2021-07-20T23:59:59Z"
    },
    "aggregationInterval": {
      "of": ""
    },
    "width": 512,
    "height": 343.697,
    "evalscript": "//VERSION=3\nfunction setup() {\n return {\n input: [{\n bands: [\n \"B04\",\n \"B08\",\n \"SCL\",\n \"dataMask\"\n ]\n }],\n output: [\n {\n id: \"data\",\n bands: 3\n },\n {\n id: \"scl\",\n sampleType: \"INT8\",\n bands: 1\n },\n {\n id: \"dataMask\",\n bands: 1\n }]\n };\n}\n\nfunction evaluatePixel(samples) {\n let index = (samples.B08 - samples.B04) / (samples.B08+samples.B04);\n return {\n data: [index, samples.B08, samples.B04],\n dataMask: [samples.dataMask],\n scl: [samples.SCL]\n };\n}\n"
  },
  "calculations": {
    "default": {}
  }
}' > empty.json

Actual result: the request fails with HTTP 400 (Bad Request).
Expected result:
This is more of a feature request: it should be possible to skip aggregation entirely and let the API consumer aggregate the data in time however they want.

Hi @chris,

thanks for these observations and suggestions.
We will have a look into both. A custom aggregation is still possible, but the aggregation interval must correspond to the whole time range, e.g.:

"aggregation": {
    "timeRange": {
            "from": "2021-07-06T00:00:00Z",
            "to": "2021-07-21T00:00:00Z"
      },
    "aggregationInterval": {
        "of": "P15D"
    },
...
}
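
For completeness, plugging this into your original request gives a full command along these lines (an untested sketch: same bbox, evalscript, width and height as in your examples, only the aggregation block is changed, and the output file name P15D.json is arbitrary):

curl -X POST ${URL} \
  -H 'Content-Type: application/json' \
  -H "Authorization: Bearer ${TOKEN}" \
  -d '{
  "input": {
    "bounds": {
      "bbox": [
        12.44693,
        41.870072,
        12.541001,
        41.917096
      ]
    },
    "data": [
      {
        "dataFilter": {},
        "type": "sentinel-2-l2a"
      }
    ]
  },
  "aggregation": {
    "timeRange": {
      "from": "2021-07-06T00:00:00Z",
      "to": "2021-07-21T00:00:00Z"
    },
    "aggregationInterval": {
      "of": "P15D"
    },
    "width": 512,
    "height": 343.697,
    "evalscript": "//VERSION=3\nfunction setup() {\n return {\n input: [{\n bands: [\n \"B04\",\n \"B08\",\n \"SCL\",\n \"dataMask\"\n ]\n }],\n output: [\n {\n id: \"data\",\n bands: 3\n },\n {\n id: \"scl\",\n sampleType: \"INT8\",\n bands: 1\n },\n {\n id: \"dataMask\",\n bands: 1\n }]\n };\n}\n\nfunction evaluatePixel(samples) {\n let index = (samples.B08 - samples.B04) / (samples.B08+samples.B04);\n return {\n data: [index, samples.B08, samples.B04],\n dataMask: [samples.dataMask],\n scl: [samples.SCL]\n };\n}\n"
  },
  "calculations": {
    "default": {}
  }
}' > P15D.json

This should return a single 15-day interval covering 2021-07-06 to 2021-07-21.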

Best, Anja