Using other cloud storage instead of an Amazon S3 bucket (Batch Processing API)

Hello @maxim.lamare!

I have already started using your new example of the Batch Processing API to prepare Sentinel-1 images for larger areas. I have a question and would appreciate your guidance on it:

Is it possible to use other cloud storage instead of an Amazon S3 bucket? If so, could you please let me know how to configure it?

Kind regards,
Behzad

Hi Behzad,

Cool to hear that you are looking into Batch Processing. I find it makes life so much easier for processing large areas. Currently the process only supports Amazon object storage. There are future plans for adding other cloud storage options, but I don’t have a date for this.

Maxim

Hello Maxim (@maxim.lamare)!

Thank you for the info. Yes, I found that for large areas eo-learn uses more processing units, and I think Batch Processing could help me save some.

I am going to use it for Sentinel-1 and, honestly, I could not find suitable documentation regarding the evalscript. I thought you might be able to help me with this. I want to calculate different aggregation modes (mean, max, etc.), but I cannot find anything about it.

Thank you in advance for your help :)

Behzad

In Batch Processing, you would use the same evalscript as in a normal API request.

There are some very basic Sentinel-1 evalscript examples in the API documentation, as well as more complex ones in the custom scripts repository for diverse applications.
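
For illustration, a multi-temporal aggregation for Sentinel-1 GRD could look something like the sketch below. Note that this is only a minimal example: the `VV` band and the `ORBIT` mosaicking setting follow the documentation, but the choice of mean as the aggregation is an assumption you would adapt to your use case.

```javascript
//VERSION=3
// Minimal sketch of a multi-temporal Sentinel-1 GRD evalscript.
// With mosaicking set to "ORBIT", evaluatePixel receives one sample
// per acquisition in the requested time range, so per-pixel temporal
// aggregations (mean, max, ...) can be computed.
function setup() {
  return {
    input: [{ bands: ["VV"] }],
    output: { bands: 1, sampleType: "FLOAT32" },
    mosaicking: "ORBIT"
  };
}

function evaluatePixel(samples) {
  // Mean VV backscatter over all acquisitions; replace the loop body
  // (e.g. with Math.max) for other aggregation modes.
  let sum = 0;
  for (const sample of samples) {
    sum += sample.VV;
  }
  return [samples.length > 0 ? sum / samples.length : 0];
}
```

The key point is the `mosaicking: "ORBIT"` setting: it is what turns `samples` into an array with one entry per acquisition, so any per-pixel statistic can be computed inside `evaluatePixel`.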

Hi,

Is there any update on this? Is it still not possible to use other destinations (besides S3, for example GCS)?

Not yet, but it is still on our timeline.
That said, it should be pretty simple to write a Lambda function, in combination with SNS, that would copy the files from AWS to GCP and then delete them…
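
As a rough sketch of that idea (not an official recipe): a Node.js Lambda subscribed to the SNS topic receiving the S3 event notifications could stream each new object to GCS and then delete the original. The bucket name `my-gcs-bucket` is a placeholder, and the GCS client is assumed to be authorised via a service account.

```javascript
// Sketch: copy new S3 objects to Google Cloud Storage, then delete them.
// Triggered by an S3 event notification delivered through SNS.
import { S3Client, GetObjectCommand, DeleteObjectCommand } from "@aws-sdk/client-s3";
import { Storage } from "@google-cloud/storage";

const s3 = new S3Client({});
const gcs = new Storage(); // credentials via GOOGLE_APPLICATION_CREDENTIALS

export const handler = async (event) => {
  // SNS wraps the original S3 event notification as a JSON string.
  const s3Event = JSON.parse(event.Records[0].Sns.Message);

  for (const record of s3Event.Records) {
    const bucket = record.s3.bucket.name;
    const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, " "));

    // Stream the object from S3 straight into GCS.
    const { Body } = await s3.send(
      new GetObjectCommand({ Bucket: bucket, Key: key })
    );
    await new Promise((resolve, reject) => {
      Body.pipe(gcs.bucket("my-gcs-bucket").file(key).createWriteStream())
        .on("finish", resolve)
        .on("error", reject);
    });

    // Remove the original only after the copy has succeeded.
    await s3.send(new DeleteObjectCommand({ Bucket: bucket, Key: key }));
  }
};
```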
