How are pixels from different orbits combined in mosaic products?

I am back again with another orbit-related question about mosaic products!

When I order a Sentinel 2 mosaic that covers multiple Sentinel 2 relative orbits, what happens in the areas that are covered by more than one orbit? Are pixels from all possible orbits evaluated together in the evaluatePixel function, or does evaluatePixel operate on the data from each orbit separately and then combine the independent results later? The mosaicking documentation seems relevant, but I do not fully understand the distinction between ORBIT and TILE as it relates to combining data from separate orbits.

My goal is to account for orbit artifacts in the mosaic output images:

Based on visual patterns in that image, it looks like all pixels, regardless of orbit, are evaluated simultaneously in evaluatePixel.

Is it possible to write an output layer that indicates which relative orbits contributed to the result for each pixel?

Would it be possible to run evaluatePixel separately by relative orbit and write out separate images per orbit in the same request? I imagine it would be simpler to run separate requests for each orbit.

Here is the evalscript that I used to produce the mosaic image above:

//VERSION=3
// based on this evalscript:
// https://github.com/sentinel-hub/custom-scripts/blob/master/sentinel-2/cloudless_mosaic/L2A-first_quartile_4bands.js
function setup() {
    return {
        input: [{
            bands: [
                "B08", // near infrared
                "B03", // green
                "B02", // blue
                "SCL"  // pixel classification
            ],
            units: "DN"
        }],
        output: [
            {
                id: "default",
                bands: 3,
                sampleType: SampleType.UINT16
            }
        ],
        mosaicking: "ORBIT"
    };
}

// acceptable images are ones collected on specified dates
function preProcessScenes(collections) {
    var allowedDates = [%(date_string)s]; // filled in with string from python script
    collections.scenes.orbits = collections.scenes.orbits.filter(function (orbit) {
        var orbitDateFrom = orbit.dateFrom.split("T")[0];
        return allowedDates.includes(orbitDateFrom);
    });
    return collections;
}

function getValue(values) {
    values.sort(function (a, b) {
        return a - b;
    });
    return getMedian(values);
}

// function for pulling median (second quartile) of values
function getMedian(sortedValues) {
    var index = Math.floor(sortedValues.length / 2);
    return sortedValues[index];
}

function validate(samples) {
    var scl = samples.SCL;
    if (scl === 3) {          // SC_CLOUD_SHADOW
        return false;
    } else if (scl === 9) {   // SC_CLOUD_HIGH_PROBA
        return false;
    } else if (scl === 8) {   // SC_CLOUD_MEDIUM_PROBA
        return false;
    } else if (scl === 7) {   // SC_CLOUD_LOW_PROBA
        // return false;
    } else if (scl === 10) {  // SC_THIN_CIRRUS
        return false;
    } else if (scl === 11) {  // SC_SNOW_ICE
        return false;
    } else if (scl === 1) {   // SC_SATURATED_DEFECTIVE
        return false;
    } else if (scl === 2) {   // SC_DARK_FEATURE_SHADOW
        // return false;
    }
    return true;
}

function evaluatePixel(samples, scenes) {
    var clo_b02 = [];
    var clo_b03 = [];
    var clo_b08 = [];
    var clo_b02_invalid = [];
    var clo_b03_invalid = [];
    var clo_b08_invalid = [];
    var a = 0;
    var a_invalid = 0;
    for (var i = 0; i < samples.length; i++) {
        var sample = samples[i];
        if (sample.B02 > 0 && sample.B03 > 0 && sample.B08 > 0) {
            var isValid = validate(sample);
            if (isValid) {
                clo_b02[a] = sample.B02;
                clo_b03[a] = sample.B03;
                clo_b08[a] = sample.B08;
                a = a + 1;
            } else {
                clo_b02_invalid[a_invalid] = sample.B02;
                clo_b03_invalid[a_invalid] = sample.B03;
                clo_b08_invalid[a_invalid] = sample.B08;
                a_invalid = a_invalid + 1;
            }
        }
    }
    var gValue;
    var bValue;
    var nValue;
    if (a > 0) {
        gValue = getValue(clo_b03);
        bValue = getValue(clo_b02);
        nValue = getValue(clo_b08);
    } else if (a_invalid > 0) {
        gValue = getValue(clo_b03_invalid);
        bValue = getValue(clo_b02_invalid);
        nValue = getValue(clo_b08_invalid);
    } else {
        gValue = 0;
        bValue = 0;
        nValue = 0;
    }
    return {
        default: [nValue, gValue, bValue]
    };
}

Hi @hrodman,

great questions!

As you said, mosaicking is relevant here:

  • If you set it to TILE, Sentinel Hub will not do any mosaicking. Values from all tiles will be available to you in the samples object in the evaluatePixel function, and the tiles’ metadata in the scenes object. How you then use them in the evaluatePixel function is up to you.
    This is nicely illustrated here: https://youtu.be/kbw3OyYkbA4?t=1378
    I like to use this request to check which tiles or dates the values in the samples object come from, when I am not sure.

  • If you set it to ORBIT, Sentinel Hub will mosaic the values from tiles acquired on the same day. In other words, in areas where two or more tiles from the same day overlap, you will get only one value for that day in the samples object in the evaluatePixel function.
    This explanation could help, although it does not explicitly address your questions: https://youtu.be/kbw3OyYkbA4?t=1600 Or use this request to check which days have values in samples and which tiles’ values were mosaicked into one value. A quick way to see the difference yourself is sketched right after this list.
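
If it helps, here is a minimal sketch (untested) that makes the difference visible: it simply returns how many samples each pixel receives, so overlap areas stand out. With TILE every overlapping tile contributes its own sample, while with ORBIT same-day tiles are merged first:

//VERSION=3
// Minimal sketch: output the number of samples each pixel receives
// under the chosen mosaicking mode. Overlap areas show higher counts
// with TILE than with ORBIT, where same-day tiles are merged first.
function setup() {
    return {
        input: [{ bands: ["B02"], units: "DN" }],
        output: { id: "default", bands: 1, sampleType: "UINT8" },
        mosaicking: "TILE" // switch to "ORBIT" to compare
    };
}

function evaluatePixel(samples) {
    return [samples.length];
}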

So, your conclusion:

Based on visual patterns in that image, it looks like all pixels, regardless of orbit, are evaluated simultaneously in evaluatePixel.

is correct. But it is actually you, as a user, who has control over what will be returned in areas where multiple tiles or orbits overlap.

Is it possible to write an output layer that indicates which relative orbits contributed to the result for each pixel?

That is an interesting idea. I think it should be possible, but I have not tried it myself, so I cannot say for sure. I would approach it like this (a rough sketch follows the list):

  • find at which positions in samples the final “nValue”, “gValue”, “bValue” values sit
  • get the original tile id from scenes for these positions
  • parse the original tile id to get the absolute orbit id and convert it to a relative orbit id (example, but you have actually done that before, right?)
  • and return the relative orbit ids in a separate output
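
Here is an untested sketch of those steps. The tileOriginalId field name and the absolute-to-relative orbit offsets are assumptions on my part, so please verify them against the scenes object documentation and the Sentinel 2 orbit numbering before relying on them:

// Untested sketch. Assumes mosaicking: "ORBIT" and that each
// scenes[i].tiles[j] entry exposes a tileOriginalId string like
// "S2A_OPER_MSI_L2A_TL_..._A024971_T31UFT_...": verify the exact
// field names for your processing version.

// Median that also reports which sample (i.e. which orbit) it came from.
function medianWithIndex(pairs) {
    // pairs: array of { value: <band value>, index: <position in samples> }
    pairs.sort(function (a, b) { return a.value - b.value; });
    return pairs[Math.floor(pairs.length / 2)];
}

// Parse the absolute orbit ("_A024971_" -> 24971) from a tile id and
// convert it to a relative orbit. The offsets (140 for S2A, 26 for
// S2B) are assumed conversion constants: please double-check them.
function toRelativeOrbit(tileOriginalId) {
    var absOrbit = parseInt(tileOriginalId.match(/_A(\d{6})_/)[1], 10);
    var offset = tileOriginalId.indexOf("S2A") === 0 ? 140 : 26;
    return ((absOrbit - offset) % 143) + 1;
}

// In evaluatePixel you would then collect { value, index } pairs
// instead of bare values, e.g.
//     clo_b03_pairs.push({ value: sample.B03, index: i });
// pick the median with medianWithIndex(), look up its orbit via
//     var tile = scenes[m.index].tiles[0]; // tiles of one orbit share it
//     var orbit = toRelativeOrbit(tile.tileOriginalId);
// and return [orbit] in an additional single-band output declared in
// setup(), e.g. { id: "orbits", bands: 1, sampleType: "UINT8" }.

One thing to keep in mind: your script sorts each band independently, so the per-band medians can come from different orbits; you may want to track the index for one reference band only.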

Would it be possible to run evaluatePixel separately by relative orbit and write out separate images per orbit in the same request? I imagine it would be simpler to run separate requests for each orbit.

If you know in advance how many relative orbits there are, then it should be possible. Again, I have not tried it myself, but you would need to get the relative orbit ids from scenes and then split (or group) the values in samples based on the relative orbit ids. Then the whole process that you now have in evaluatePixel would need to be run for each of these groups.
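
An untested sketch of that grouping, reusing the hypothetical toRelativeOrbit() from the previous sketch; the orbit ids (8 and 108) and the medianComposite() helper, which would wrap your existing median/validation logic, are placeholders:

// Untested sketch: group samples by relative orbit, then run the
// existing compositing logic once per group. medianComposite() stands
// for a refactored version of your median/validation code.
function evaluatePixel(samples, scenes) {
    var byOrbit = {}; // relative orbit id -> samples from that orbit
    for (var i = 0; i < samples.length; i++) {
        var orbit = toRelativeOrbit(scenes[i].tiles[0].tileOriginalId);
        if (!byOrbit[orbit]) {
            byOrbit[orbit] = [];
        }
        byOrbit[orbit].push(samples[i]);
    }
    // One output per known orbit, each declared in setup(), e.g.
    // { id: "orbit_8", ... } and { id: "orbit_108", ... }:
    return {
        orbit_8: medianComposite(byOrbit[8] || []),
        orbit_108: medianComposite(byOrbit[108] || [])
    };
}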

Did I manage to clarify things? If not, I am happy to extend my answer further; just let me know which parts.
I would also be very interested in seeing your results (if you manage to improve your mosaics) in case you can share them.

Best, Anja