Can you guys add your Sentinel-1 RTC analysis-ready data to the open-access registry, as you have with your Sentinel-2 L2A data? I found it in the Sentinel Hub Playground, so clearly it is there. I just need a fast way to access the GeoTIFFs. Even an S3 bucket behind a paid subscription would be great (I'd like to avoid any kind of API access).
Hi @justjohnp,
What you see in Playground and EO Browser is our API performing RTC (and orthorectification, speckle filtering, etc., depending on what the user chooses) on the fly.
If you need the data over a specific AOI on an S3 bucket, you can make use of our Sentinel-1 CARD4L tool; see the manuals here.
If you are interested in an area over Africa, you can check Digital Earth Africa's CARD4L bucket, which contains the COGs we have produced using the above-mentioned tool.
OK, so it sounds like if I want RTC data over 6-month periods for 40,000 different areas/polygons (around 1 hectare each), the API is the only way? Is it fast enough to handle this kind of request even though it's processing on the fly? I tried using SNAP's GPT and it couldn't finish in a reasonable amount of time even with optimized code, so I'm surprised your API can be fast.
Not sure what counts as "fast" for you. I imagine it will take up to a few seconds for each polygon/observation. With the Enterprise-S plan you can make about 600 requests per minute, so you should have results in a day or two. Even with the Basic option it should be done in a few days.
Check this sentinelhub-py example; just reconfigure it to get the relevant RTC data instead of S2.
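For context, a minimal sketch of what such a reconfigured request could look like (the AOI coordinates are placeholders, and the processing options are my assumptions for a typical terrain-corrected gamma0 setup; check the Process API docs for the full list):

```python
from sentinelhub import (
    CRS, BBox, DataCollection, MimeType, SHConfig,
    SentinelHubRequest, bbox_to_dimensions,
)

config = SHConfig()  # assumes sh_client_id / sh_client_secret are configured

# Evalscript returning terrain-corrected VV/VH backscatter as 32-bit floats
evalscript = """
//VERSION=3
function setup() {
    return {
        input: ["VV", "VH"],
        output: { bands: 2, sampleType: "FLOAT32" }
    };
}
function evaluatePixel(sample) {
    return [sample.VV, sample.VH];
}
"""

bbox = BBox((14.500, 46.050, 14.501, 46.051), crs=CRS.WGS84)  # placeholder ~1 ha AOI

request = SentinelHubRequest(
    evalscript=evalscript,
    input_data=[
        SentinelHubRequest.input_data(
            data_collection=DataCollection.SENTINEL1_IW,
            # one acquisition date per request; a wider interval is mosaicked
            time_interval=("2019-06-01", "2019-06-02"),
            other_args={
                "processing": {
                    "backCoeff": "GAMMA0_TERRAIN",  # radiometric terrain correction
                    "orthorectify": True,
                    "demInstance": "COPERNICUS_30",
                }
            },
        )
    ],
    responses=[SentinelHubRequest.output_response("default", MimeType.TIFF)],
    bbox=bbox,
    size=bbox_to_dimensions(bbox, resolution=10),
    config=config,
)

data = request.get_data()  # list with one array per request
```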
Thanks - yes, a day or two is "fast" enough for me.
Follow-up question: how does the SH API process the RTC from S1 GRD so quickly? Is the code openly available somewhere (or can you briefly explain what package you use to do this)? I've only seen SNAP's GPT used, which is very slow.
It is proprietary code, the result of tens of man-years of effort: a combination of development, optimization, and engineering.
As this is part of our business model, it is unfortunately not available as open source.
Got it! So regarding requests for RTC data through our plan - does each polygon count as one request, even if it covers a large date range and many scenes (e.g. time_interval=('2019-06-01', '2020-06-01'))? Do we need to care about processing units?
It depends a bit on how you construct the requests. If you go the "easy" way, i.e. using the sentinelhub-py example I pasted above, each polygon/observation will count as one request. However, this should not really be an issue, as you have an Enterprise account, so there is an unlimited number of requests per month; there is just the requests-per-minute limit.
As your polygons are very small, the PU consumption will not amount to much - each request/observation will probably cost around 0.01 PU (the minimum amount).
Assuming you process 40,000 polygons with about 50 observations each, you will make 2 million requests and consume around 20,000-50,000 PU. This is way below your monthly limit (400,000 PU per month).
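Written out, the back-of-the-envelope estimate (using the minimum 0.01 PU charge per small request mentioned above):

```python
polygons = 40_000
observations = 50          # rough per-polygon observation count assumed above
pu_per_request = 0.01      # minimum PU charge for a tiny request

total_requests = polygons * observations        # 2,000,000 requests
min_total_pu = total_requests * pu_per_request  # 20,000 PU at the minimum rate
```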
So, if I were you, I would go the easy way: create a script that runs this example, then run it in 10-20 parallel threads and keep it running, as sketched below.
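A minimal sketch of the parallel loop (here `build_request` and `polygons` are hypothetical placeholders; `build_request` would configure the request shown above for one polygon):

```python
from concurrent.futures import ThreadPoolExecutor

def download_polygon(polygon):
    # build_request is a hypothetical helper returning a SentinelHubRequest
    # configured for one polygon, as in the example above
    request = build_request(polygon)
    return request.get_data(save_data=True)  # saves the TIFFs to disk

# polygons: placeholder iterable of your 40,000 AOIs
with ThreadPoolExecutor(max_workers=15) as executor:  # 10-20 threads, per above
    results = list(executor.map(download_polygon, polygons))
```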
If you need some help with the above, let us know and we will try to create a more specific example.
Fantastic! That covers us for fast calculation of RTC polarimetric backscatter from S1, then. Any chance you have a product that can similarly calculate interferometric coherence quickly over an S1 time series for a specific polygon? That is hard to find (I've only found a random S3 bucket that covers 2020: Global Seasonal Sentinel-1 Interferometric Coherence and Backscatter Data Set - Registry of Open Data on AWS).
Yeah, that is indeed much harder, and not something we can offer.
CreoDIAS has on-demand processing, which costs 1.45 EUR per scene (see Sentinel-1 CARD-COH6). These data can easily be ingested in Sentinel Hub and then clipped per polygon, but it might get expensive if you want to run this over a large area…