Multi-temporal scene classification

Hello,

I want to do a multi-temporal scene classification, is it possible? I have seen it done for a min or max NDVI band, but there is no resource for scene classification. I tried to do it myself, and here is my evalscript:

//VERSION=3

function RGBToColor(r, g, b, dataMask) {
  return [r / 255, g / 255, b / 255, dataMask];
}

function setup() {
  return {
    input: ["SCL", "dataMask"],
    output: { bands: 4 },
    mosaicking: "ORBIT"
  };
}

function filterScenes(scenes, inputMetadata) {
  return scenes.filter(function (scene) {
    // Here we limit the data to the period (TO date - 1 month) to (TO date)
    return scene.date.getTime() >= (inputMetadata.to.getTime() - 1 * 31 * 24 * 3600 * 1000);
  });
}

function evaluatePixel(samples) {
  // Not-vegetated (dark yellow)
  var px = RGBToColor(255, 230, 90, samples.dataMask);
  for (var i = 0; i < samples.length; i++) {
    var SCL = samples[i].SCL;
    switch (SCL) {
      // Vegetation (green)
      case 4: return RGBToColor(0, 160, 0, samples.dataMask);
    }
  }
  return px;
}

I don't know why, but it does not work.

Note: I am very new to satellite data and evalscripts.

thanks…

Hello! Welcome to the Sentinel Hub Forum!

Can you explain in more detail what you mean by a multi-temporal scene classification? I see that you are trying to use the SCL band from Sentinel-2 L2A in your evalscript. Do you only wish to use this band, or do you also want to use the spectral bands?

If you only want to use the SCL band, what would your logic be for selecting which acquisition to use?

We do have an example of visualising the SCL band for individual scenes, which I have shared below:

//VERSION=3

 function RGBToColor (r, g, b,dataMask){
	return [r/255, g/255, b/255,dataMask];
}

function setup() {
   return {
    input: ["SCL","dataMask"],
    output: { bands: 4 }
  };
}

function evaluatePixel(samples) {
    const SCL=samples.SCL;
    switch (SCL) {
    // No Data (Missing data) (black)    
    case 0: return RGBToColor (0, 0, 0,samples.dataMask);
        
    // Saturated or defective pixel (red)   
    case 1: return RGBToColor (255, 0, 0,samples.dataMask);

    // Dark features / Shadows (very dark grey)
    case 2: return RGBToColor (47,  47,  47,samples.dataMask);
        
    // Cloud shadows (dark brown)
    case 3: return RGBToColor (100, 50, 0,samples.dataMask);
        
    // Vegetation (green)
    case 4: return RGBToColor (0, 160, 0,samples.dataMask);
        
    // Not-vegetated (dark yellow)
    case 5: return RGBToColor (255, 230, 90,samples.dataMask);
        
    // Water (dark and bright) (blue)
    case 6: return RGBToColor (0, 0, 255,samples.dataMask);
    
    // Unclassified (dark grey)
    case 7: return RGBToColor (128, 128, 128,samples.dataMask);
    
    // Cloud medium probability (grey)
    case 8: return RGBToColor (192, 192, 192,samples.dataMask);
        
    // Cloud high probability (white)
    case 9: return RGBToColor (255, 255, 255,samples.dataMask);
    
    // Thin cirrus (very bright blue)
    case 10: return RGBToColor (100, 200, 255,samples.dataMask);
        
    // Snow or ice (very bright pink)
    case 11: return RGBToColor (255, 150, 255,samples.dataMask);

    default : return RGBToColor (0, 0, 0,samples.dataMask);  
    }
}

You can see this visualisation in the EO Browser application.

If you are interested in more advanced classification methods, I would recommend reading about eo-learn, the Python library developed by our EO Research team. There are lots of great resources in the documentation that you can tap into.

If you have any more questions, please let us know!

Just to add to this: EO Browser has two options in the Visualize tab for the timespan over which data is shown.
The default is a timespan of one day ("single date"). The other is a custom timespan, where you set the start and end yourself.
It's best to use a longer timespan with multi-temporal scripts, because in most areas a one-day timespan only contains data from a single acquisition.

Thanks for your reply,

I want to classify vegetated areas using the scene classification, and to make it more accurate I want to evaluate pixels at different times. For example, if I evaluate just one date, like 10.08.2022, the image may have some clouds, but if I use the whole month before, it will be better. I will take the 5-6 images from that month and then select the pixels that belong to the vegetation class. You can look at the max NDVI example, where the maximum NDVI over a period is found; I want to do the same thing for scene classification.

But how can I express this in an evalscript?

Mosaicking needs to be set in the setup() function by adding mosaicking: Mosaicking.ORBIT or mosaicking: "ORBIT", as you already have.

This tells the platform to provide evaluatePixel() with an array of band values for the pixel, one entry per acquisition time, instead of only the already mosaicked band values.
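For example, the setup() could look like this (identical to what you already have, just using the Mosaicking constant instead of the string):

function setup() {
  return {
    input: ["SCL", "dataMask"],
    output: { bands: 4 },
    mosaicking: Mosaicking.ORBIT // equivalent to the string "ORBIT"
  };
}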

As mentioned, as a result of setting the mosaicking, the samples parameter of evaluatePixel() is now an array that looks like this:

[
  { // values for acquisition time X
    "B01": 0.7662,
    "B02": 0.6874,
    "B03": 0.6669,
    "dataMask": 1
  },
  { // values for acquisition time Y
    "B01": 0,
    "B02": 0,
    "B03": 0,
    "dataMask": 0
  },
  ...
]

If you don’t set the mosaicking or if you set it to SIMPLE, the samples parameter is an object that looks like this:

{ // mosaicked values from all acquisition times
  "B01": 0.053000000000000005,
  "B02": 0.0359,
  "B03": 0.0227,
  "dataMask": 1
}

Note: the bands can differ; these two examples contain B01, B02, B03 and dataMask because that's what I had ready.
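As an illustration, with your input of ["SCL", "dataMask"], each element of samples would look roughly like this (SCL holds the integer class value, e.g. 4 for vegetation):

{
  "SCL": 4,
  "dataMask": 1
}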

Some lines in the evaluatePixel() of your first message correctly treat samples as an array, but others do not.

function evaluatePixel(samples) {
  // Not-vegetated (dark yellow)
  // var px = RGBToColor(255, 230, 90, samples.dataMask); // problem: treats samples as an object (to get the dataMask)
  // a temporary "solution" is to get the dataMask value from the first element
  var px = RGBToColor(255, 230, 90, samples[0].dataMask);

  // goes through the array of elements with values at different acquisition times
  for (var i = 0; i < samples.length; i++) {
    var SCL = samples[i].SCL;
    switch (SCL) {
      // Vegetation (green)
      // returning already here can mean ignoring values from other acquisition times
      // (e.g. SCL is 4 for only 1 acquisition time out of n acquisition times);
      // a better solution would be to count how many times SCL is 4 and set the lightness / darkness
      // of the colour outside of the for loop based on that count
      // case 4: return RGBToColor(0, 160, 0, samples.dataMask); // problem: treats samples as an object to get the dataMask
      // solution: get the dataMask value for the current sample
      case 4: return RGBToColor(0, 160, 0, samples[i].dataMask);
    }
  }
  return px;
}
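Putting the pieces together, here is a minimal sketch of one possible multi-temporal version. It assumes (my choice, adjust as needed) that a pixel is coloured green when SCL equals 4 (vegetation) in more than half of its valid acquisitions, and dark yellow otherwise:

//VERSION=3

function RGBToColor(r, g, b, dataMask) {
  return [r / 255, g / 255, b / 255, dataMask];
}

function setup() {
  return {
    input: ["SCL", "dataMask"],
    output: { bands: 4 },
    mosaicking: Mosaicking.ORBIT
  };
}

function filterScenes(scenes, inputMetadata) {
  return scenes.filter(function (scene) {
    // keep only the last month before the TO date, as in your script
    return scene.date.getTime() >= inputMetadata.to.getTime() - 31 * 24 * 3600 * 1000;
  });
}

function evaluatePixel(samples) {
  var vegetationCount = 0;
  var validCount = 0;
  // count how many acquisitions have valid data and how many of those are classified as vegetation (SCL = 4)
  for (var i = 0; i < samples.length; i++) {
    if (samples[i].dataMask === 1) {
      validCount++;
      if (samples[i].SCL === 4) {
        vegetationCount++;
      }
    }
  }
  // no valid acquisitions: return a fully transparent pixel
  if (validCount === 0) {
    return RGBToColor(0, 0, 0, 0);
  }
  // vegetation in more than half of the valid acquisitions -> green, otherwise dark yellow (not vegetated)
  if (vegetationCount / validCount > 0.5) {
    return RGBToColor(0, 160, 0, 1);
  }
  return RGBToColor(255, 230, 90, 1);
}

Instead of a hard threshold, you could also scale the green channel by vegetationCount / validCount, as suggested in the comments above.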


This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.