Sentinel-1 SLC on AWS?

Many thanks for making Sentinel-1 GRD available as Cloud Optimized GeoTIFFs!

Are SLC products available on AWS right now as well?

In the “Data structure for Sentinel-1” section of the Sentinel on AWS webpage, one can read:

[product type] = GRD - GRD or SLC

but aws s3 ls s3://sentinel-s1-l1c/ shows only a single GRD folder.

SLC products are not yet available on AWS. We will start working on this in a couple of weeks, after the GRD sync has proven to be stable.
Note that the current plan is to keep only a rolling archive of 2 months of SLC data on AWS.

Thank you. Just out of curiosity, why only 2 months? Is it a disk space limitation?

SLC products are great but accessing them through Copernicus Open Data Hub and mirrors is a real bottleneck. Having them available as cloud optimized tiffs on AWS would make them easily accessible for anyone.

Well, SLCs are pretty large and every TB costs something. The AWS Public Dataset team has decided to allocate that much storage initially.
When they see user uptake, they might reconsider this option.

If you have a good “story” to tell about how/why you need more data, I am sure they would love to hear it; it might convince them to put additional resources into it…

Hi, any news on this?
Would be fantastic to have SLC on AWS.

There are some SLC data actually available:
https://registry.opendata.aws/sentinel1-slc-seasia-pds/

The overall replication of data was taken over by another company and we have not received an update from them yet.


Hi @gmilcinski, do you know which company took over the replication of SLC data?

The sentinel1-slc-seasia-pds S3 bucket provides zipped SAFEs, which prevents efficient cloud access and, in my humble opinion, defeats the whole purpose of mirroring SLC data on an S3 bucket.
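To illustrate why zipped SAFEs block efficient cloud access, here is a minimal stdlib-only sketch (the member name is purely illustrative, not a real SAFE layout): a DEFLATE-compressed zip member is one opaque stream, so reading any pixel window means decompressing from the start of that member, whereas a COG's internal tiling lets a client fetch just the needed byte ranges over HTTP.

```python
import io
import zipfile

# Build a tiny stand-in for a zipped SAFE (member name is illustrative only).
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("S1A.SAFE/measurement/iw1-slc-vv.tiff", b"\x00" * 100_000)

with zipfile.ZipFile(buf) as zf:
    info = zf.getinfo("S1A.SAFE/measurement/iw1-slc-vv.tiff")
    # The member is stored as a single DEFLATE stream: there is no
    # tile-level random access inside it, so a reader must pull and
    # decompress the whole member to reach an arbitrary pixel window.
    print(info.compress_type == zipfile.ZIP_DEFLATED)  # True
```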

On the AWS Registry page it states it is done by:
https://earthobservatory.sg/

I am not a SAR expert and I have never done InSAR. I generally agree with your assessment on the cloud-native approach and ZIP files. That being said, as far as I know, SLCs are pretty well compressed, so unzipping everything would cost a lot. Also, most InSAR processes take tens of minutes to execute, so it might not be much of an overhead to copy the file to the VM’s storage and unzip it…
One always has to think about costs and benefits… Would you rather have unzipped files and three times less data?
Again, I am not an expert, so just my thoughts…

Hi all, we are still interested in making more SLC data available as an AWS Public Dataset. Grega mentioned the subset of SE Asia, but we are also looking at ways of making a larger set of data available in other formats. That work is ongoing, but unfortunately I do not have an update at this time. But it’s on our radar (SAR joke!!!).

Thank you @gmilcinski and @jflasher for your answers. It’s good to hear that more SLC data could become available on AWS!

@gmilcinski, when you say that

SLCs are pretty well compressed, so unzipping everything would cost a lot

do you mean in terms of CPU cost of the unzipping operation, or in terms of storage space?

Actually, the SLC TIFF files are not compressed, and converting them to COGs with compression enabled results in an overall size very close to that of the original zipped SAFE, while enabling efficient cloud access.

Let’s take an example:

S1A_IW_SLC__1SDV_20191116T205157_20191116T205223_029941_036ABF_1783.zip file size is 4.0 GB. After unzip, the SAFE folder is 6.8 GB. Individual sizes of the 6 tiff files are:

943.2 MiB s1a-iw1-slc-vh-20191116t205159-20191116t205221-029941-036abf-001.tiff
943.2 MiB s1a-iw1-slc-vv-20191116t205159-20191116t205221-029941-036abf-004.tiff
1.2 GiB s1a-iw2-slc-vh-20191116t205157-20191116t205222-029941-036abf-002.tiff
1.2 GiB s1a-iw2-slc-vv-20191116t205157-20191116t205222-029941-036abf-005.tiff
1.2 GiB s1a-iw3-slc-vh-20191116t205158-20191116t205223-029941-036abf-003.tiff
1.2 GiB s1a-iw3-slc-vv-20191116t205158-20191116t205223-029941-036abf-006.tiff

If one re-encodes these 6 tiff files to COG with compression enabled, their file sizes are reduced to

498.0 MiB s1a-iw1-slc-vh-20191116t205159-20191116t205221-029941-036abf-001.tiff
638.2 MiB s1a-iw1-slc-vv-20191116t205159-20191116t205221-029941-036abf-004.tiff
656.9 MiB s1a-iw2-slc-vh-20191116t205157-20191116t205222-029941-036abf-002.tiff
812.7 MiB s1a-iw2-slc-vv-20191116t205157-20191116t205222-029941-036abf-005.tiff
653.0 MiB s1a-iw3-slc-vh-20191116t205158-20191116t205223-029941-036abf-003.tiff
767.6 MiB s1a-iw3-slc-vv-20191116t205158-20191116t205223-029941-036abf-006.tiff

totalling 4.0 GB, which is equivalent to the initial size of the zip.
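A quick sanity check on that total (using the rounded MiB values from the listing, so the result is approximate):

```python
# Rounded per-file COG sizes (MiB) copied from the listing above.
cog_sizes_mib = [498.0, 638.2, 656.9, 812.7, 653.0, 767.6]
total_mib = sum(cog_sizes_mib)
total_gib = total_mib / 1024
print(f"{total_mib:.1f} MiB ≈ {total_gib:.2f} GiB")  # 4026.4 MiB ≈ 3.93 GiB
```

which lines up with the ~4 GB size of the original zip.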

Efficient cloud access to SLC makes sense because one doesn’t necessarily want to process an entire scene, but rather a single burst in a single subswath (i.e. ~ 1/30 in size with respect to the total scene, for the routine IW mode).

These are very useful insights.
When talking about these options a while ago, we were told (but have not checked it) that converting to COGs would make the SLCs incompatible with several standard software modules. Was this info incorrect?

And yes, I meant in storage.

@jflasher @gmilcinski Any update on the SLC data availability? We are looking at the Indian subcontinent region but are unable to find any relevant open-data resource.

Hi @gunjan, aside from the seasia-pds dataset referenced above, there have been no updates for your specific AOI, but we are actively working to make a portion of the SLC archive available through our program. I hesitate to provide a definitive timeline, but it continues to be a priority. I would love to see them as COGs as well, which would make processing more efficient. I will update this thread when I have more information to share.


That would be very helpful. Meanwhile, I got hold of the Earth Observation Data Access Gateway library (github.com), which provides access to a comprehensive list of open data stores through a single interface. I am going to use it for SLC data.

I should add that you can always set up tokens/credentials to access the SLC data directly from ASF. That is what was done for this dataset; they set up a pipeline to generate ARD tiles using backscatter/coherence.