API Implementation and tile consumption

Hello, I'm pretty new here and I have started working with the API. I want to embed a cloudless mosaic of Norway on my website.
Now I wonder how this mosaic is hosted:

https://data.linz.govt.nz/layer/93652-nz-10m-satellite-imagery-2017/

Do they use the Sentinel Hub API? And does every reload of the page consume processing units and requests? That would mean the units in my plan would be used up within a day.

Kind regards
Max

Hi Max,

Indeed, we produce the cloudless mosaic of New Zealand (updated yearly), but the data is hosted by the LINZ Data Service and displayed on the website using an XYZ tile service.
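To give you an idea of how an XYZ tile service works: the client addresses pre-rendered tiles by zoom level and column/row index, using the standard Web Mercator ("slippy map") scheme. The sketch below shows the generic tile indexing math; it is not specific to the LINZ service, and the tile URL in the comment is a hypothetical example.

```python
import math

def deg2tile(lat_deg: float, lon_deg: float, zoom: int) -> tuple[int, int]:
    """Return the (x, y) indices of the Web Mercator tile containing a point."""
    lat_rad = math.radians(lat_deg)
    n = 2 ** zoom  # number of tiles along one axis at this zoom level
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.log(math.tan(lat_rad) + 1.0 / math.cos(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

# A map client would then fetch e.g. https://example-tiles.test/{z}/{x}/{y}.png
x, y = deg2tile(60.0, 10.0, 6)  # a point in southern Norway at zoom 6
```

Because the tiles are static files, serving them costs no processing on the imagery side, which is exactly why this delivery method does not touch any Sentinel Hub quota.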

If you would like to implement a cloudless mosaic of Norway on your website, you have several options depending on how much work you want to do yourself.

Cloudless mosaic generation

Firstly, you could generate the cloudless mosaic yourself using the Batch Processing API. I would strongly recommend against generating the mosaic on the fly: because of the sheer volume of images needed to generate the product, the process would be extremely slow. The Batch Processing API lets you generate the product once on our cloud-based servers in a scalable and efficient manner. We have written a blog post on how to produce a cloudless mosaic using Sentinel Hub services, and the code is available in our custom scripts repository.
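For orientation, a Batch request is a single JSON payload posted to the Batch Processing endpoint. Below is a rough Python sketch of its shape only — the evalscript, time range, grid parameters and bucket path are placeholders, and field names should be checked against the current Batch Processing API reference before use.

```python
import json

# Illustrative Batch Processing request body -- NOT a working configuration.
EVALSCRIPT = "//VERSION=3\n// ... cloudless-mosaic script from the custom scripts repo ..."

batch_request = {
    "processRequest": {
        "input": {
            "bounds": {
                # Rough bounding box over Norway (WGS84) -- adjust to your AOI
                "bbox": [4.0, 57.7, 31.5, 71.5],
                "properties": {"crs": "http://www.opengis.net/def/crs/EPSG/0/4326"},
            },
            "data": [{
                "type": "sentinel-2-l2a",
                "dataFilter": {"timeRange": {
                    "from": "2023-01-01T00:00:00Z",
                    "to": "2023-12-31T23:59:59Z",
                }},
            }],
        },
        "evalscript": EVALSCRIPT,
    },
    # The tiling grid splits the AOI into tiles that are processed in parallel
    "tilingGrid": {"id": 1, "resolution": 10.0},
    # Results are delivered to your own object storage bucket (placeholder path)
    "output": {"defaultTilePath": "s3://<your-bucket>/mosaic/<tileName>/<outputId>.tif"},
    "description": "Norway cloudless mosaic",
}

payload = json.dumps(batch_request)
```

The key point is that this request runs once: the heavy temporal compositing happens server-side, and the output lands in your bucket as finished imagery.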

Alternatively, if you don’t want to spend time testing the script on small test areas, fine-tuning it, then scaling up, our Austrian team offers the generation of cloudless mosaics as a service.

Delivery to the website

Once you have generated the cloudless mosaic of Norway you will want to deliver it to your website for display. Here again you have two options.

The first option (least effort) is to use Sentinel Hub services to deliver the data to your website. You would ingest your mosaic as BYOC (Bring Your Own COG) and serve it to your website using WMS. This removes the complexity of storing the images and setting up a tile server, but it will consume Processing Units and Requests.
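With this route, every map tile on your site is an OGC GetMap request against your Sentinel Hub configuration instance. A minimal sketch of building such a request follows — the instance ID and layer name are placeholders you would replace with values from your dashboard. (Note that WMS 1.3.0 with EPSG:4326 uses latitude-first axis order in the BBOX.)

```python
from urllib.parse import urlencode

def wms_getmap_url(instance_id: str, layer: str, bbox: tuple, width: int, height: int) -> str:
    """Build a WMS 1.3.0 GetMap URL for a Sentinel Hub configuration instance."""
    base = f"https://services.sentinel-hub.com/ogc/wms/{instance_id}"
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,  # the layer configured in your dashboard
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),  # lat-first for WMS 1.3.0 + EPSG:4326
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return f"{base}?{urlencode(params)}"

# Placeholder instance id and layer -- substitute your own from the dashboard
url = wms_getmap_url("YOUR-INSTANCE-ID", "NORWAY-MOSAIC", (57.7, 4.0, 71.5, 31.5), 512, 512)
```

Each such request is what counts against the Processing Units and Requests in your plan.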

The second option is to store the product and set up the server integration yourself. You won't be using units from your plan, but you will have to manage the infrastructure on your side.

I hope this answers your questions. Don’t hesitate to get in touch by email or here if you have more questions.


Hello Maxim,

thank you for your answer! This helps me a lot. What I still don't understand is the difference between using the Batch Processing API and running my script as a custom script linked as a WMS layer from my dashboard. I used a modified version of the custom script you mentioned to create a cloudless mosaic, and I was able to add it to QGIS, for example.

If you run the custom script every time you query the mosaic over WMS, for instance, you will consume a lot of PUs: the script requests a temporal stack of data to compute the mosaic on the fly for each query. Batch lets you run the script once (the Batch Processing API consumes a third of the PUs of the normal Process API) and build your mosaic. When you then query your product, you are no longer requesting lots of bands each time.
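To make the difference concrete, here is a back-of-envelope comparison. It assumes only the basic rule of thumb that one 512×512, 3-band request is roughly 1 PU, scaling with pixel count and band count, and it ignores the other PU multipliers — so treat the numbers as purely illustrative; the acquisition and band counts are made-up assumptions.

```python
def approx_pus(width: int, height: int, n_bands: int) -> float:
    """Very rough PU estimate: a 512x512 pixel, 3-band request ~ 1 PU."""
    return (width * height) / (512 * 512) * (n_bands / 3)

TILE = 512
BANDS = 4         # e.g. three visible bands plus a cloud-mask band (assumption)
ACQUISITIONS = 30  # scenes in the temporal stack for one on-the-fly query (assumption)

# On-the-fly mosaicking: every WMS query pulls the whole temporal stack
per_query_on_the_fly = approx_pus(TILE, TILE, BANDS) * ACQUISITIONS

# Pre-built mosaic served via BYOC: one query reads a single 3-band image
per_query_prebuilt = approx_pus(TILE, TILE, 3)
```

Even under these simplified assumptions the on-the-fly query costs tens of times more PUs per tile than serving the pre-built product, and that cost is paid again on every page load.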

However, this depends on your use case: with Batch you make a "fixed" product over a given time period (for the New Zealand example, we update it each year). With WMS you can be more dynamic (e.g. the last 6 months of data at the time of the query), but at a higher processing cost. Apart from some very specific cases, I would recommend the Batch Processing option.