Does mapnik support pyramids/overviews?

I have DEMs from all over the world (thanks, #mapzen!) in 1×1° tiles. I generate elevation tints, slopeshades and hillshades from them for each tile, then use a vrt file to create a single file I can use to render my maps.

My problem begins when I want to render low zoom levels (0-8). At first I was rendering mostly around Europe, but now I’m rendering for friends and the places they visit. For low zooms I used to use gdalwarp to create x4, x16 and x256 ‘small’ GeoTIFF versions, but now that I span almost all the world, those are still too big.

So I was wondering about adding pyramid/overview info to those tiles with gdaladdo, but I don’t know whether gdalbuildvrt supports them, or whether mapnik is clever enough to use them in situations where reading the whole original dataset would be too much. Does anyone have any experience with that?
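For reference, the gdaladdo route I’m considering would look roughly like this (a sketch only; the directory layout and overview factors are assumptions, and whether mapnik actually uses the resulting .ovr files is exactly the open question):

```shell
# Add read-only external overviews (.ovr side-car files) to each
# 1x1-degree tile; -ro leaves the source TIFF untouched, and
# -r average is a reasonable resampler for continuous data like DEMs.
for f in tiles/*.tif; do
    gdaladdo -ro -r average "$f" 2 4 8 16 32 64
done

# Rebuild the vrt over the tiles; GDAL itself can use the tiles'
# external overviews when reading through the vrt.
gdalbuildvrt terrain.vrt tiles/*.tif
```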

OK, you beat me to it.
Maybe there’s a reason I process my DEMs in Python and not directly with gdal command-line tools.
See DEM-scripts/2023/hillshade_mem.py at master · yvecai/DEM-scripts · GitHub for my last and most efficient (to date) script handling a worldwide DEM in reasonable chunks.

According to GDAL: Overviews no longer work after "Raster Overzoom Quest" · Issue #3822 · mapnik/mapnik · GitHub there is some level of support; it just doesn’t seem to be properly documented. I’ll keep looking.

You can pre-compute lower zoom rasters with gdalwarp and declare the same number of layers, like here: mapnik-opensnowmap.org/k_snow_map/project.yml at 7e800a2faa0afe0b8b2e0df67cc53d47eb95e039 · OpenSnowMap-org/mapnik-opensnowmap.org · GitHub
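Something like this hypothetical fragment (layer ids, file names and zoom cut-offs are all made up; the linked project.yml has the real thing):

```yaml
# Hypothetical sketch: one pre-warped raster per zoom range, so
# Mapnik never has to read more pixels than the zoom level needs.
Layer:
  - id: hillshade-low
    Datasource: { type: gdal, file: terrain-x256.tif }
    properties: { minzoom: 0, maxzoom: 5 }
  - id: hillshade-mid
    Datasource: { type: gdal, file: terrain-x16.tif }
    properties: { minzoom: 6, maxzoom: 8 }
  - id: hillshade-full
    Datasource: { type: gdal, file: terrain.vrt }
    properties: { minzoom: 9 }
```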

You mean like this?

For low zooms I used to use gdalwarp to create x4, x16 and x256 ‘small’ GeoTIFF versions, but now that I span almost all the world, those are still too big.

I have the impression you got my question wrong, but then it’s Saturday night, I’m tired, and it might be me misreading you.

In any case, I found out that whether or not mapnik supports overviews, the vrt format does not, so in the end I split the regions I render by continent and have one vrt file for each. Then the ’warps are small enough that I can compute and use them.

I meant more like /2, /4, /8, /16, /32, /64…

But you’re right, there may be a misunderstanding. What do you mean by too big: you can’t compute the raster or Mapnik is really slow?

Somehow I’m running out of memory computing it, and the resulting TIFFs are huge, even though they’re quite sparse: I don’t render the whole world, just some areas that interest me. For instance, I rendered a big chunk of Europe, but also big mountains like the Aconcagua or the Dhaulagiri in Nepal.

In any case, the vrt I’m currently trying to warp is 165,601 × 122,401 pixels, and the command is:

gdalwarp -co BIGTIFF=YES -co TILED=YES -co COMPRESS=LZW -tr 123.68832310363732 -123.68832310363732 terrain.vrt terrain-medium.tif
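A quick back-of-the-envelope on those numbers (the ×4 factor below is an assumption: the requested -tr of ~123.69 m/px is four times ~30.9 m/px, a plausible source resolution for 1-arc-second data in webmercator):

```python
# Sanity check on the warp above; sizes taken from the vrt.
src_w, src_h = 165_601, 122_401
factor = 4  # assumed ratio between target and source resolution

print(f"source: {src_w * src_h / 1e9:.1f} Gpx")  # about 20 Gpx

dst_w, dst_h = src_w // factor, src_h // factor
print(f"target: {dst_w} x {dst_h} px")

# uncompressed size of an RGBA output, in GiB
print(f"target RGBA: {dst_w * dst_h * 4 / 2**30:.1f} GiB")
```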

Processing these kinds of rasters takes time, literally hours, even days with a worldwide DEM: that’s expected.
Also, vrts are a very good way to postpone computation, which is not necessarily a good idea. Build GeoTIFFs in 4326 ASAP, then the final rendering rasters in webmercator.

Also, make sure you use gdalwarp with sensible options like creationOptions=["TILED=YES", "NUM_THREADS=4", "BIGTIFF=YES", "COMPRESS=DEFLATE"] (there’s an equivalent from the command line).
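The command-line equivalent of those creation options would be roughly this (paths are placeholders; -multi/-wo NUM_THREADS additionally parallelize the warp itself, not just the compression):

```shell
# Same creation options as the Python creationOptions list above,
# passed with -co; -multi enables multi-threaded warping as well.
gdalwarp -multi -wo NUM_THREADS=4 \
         -co TILED=YES -co NUM_THREADS=4 \
         -co BIGTIFF=YES -co COMPRESS=DEFLATE \
         terrain.vrt terrain-medium.tif
```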

vrts are a very good way to postpone computation, which is not necessarily a good idea

The vrt is there to collate all the 1×1° TIFFs. I’m definitely trying to move computation forward, not postpone it; I don’t want to waste time on every rendering.

Build geotiff in 4326 ASAP, then the final rendering rasters in webmercator.

Not sure what you mean by this. I’m actually reprojecting to WebMerc and compensating the DEMs (see https://www.grulic.org.ar/~mdione/glob/posts/trying-to-calculate-proper-shading/) so that at render time there are no extra calculations.
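If I read the linked post right, the compensation boils down to something like this (a sketch of the idea, not the author’s exact code):

```python
from math import cos, radians

def compensation(lat_deg: float) -> float:
    """Scale factor for elevations in a webmercator DEM.

    In webmercator, horizontal distances are stretched by 1/cos(lat),
    so elevations must be stretched by the same factor to keep slopes
    (and therefore hillshading) looking correct.
    """
    return 1.0 / cos(radians(lat_deg))

print(compensation(0.0))             # 1.0 at the equator
print(round(compensation(60.0), 3))  # 2.0 at 60N
```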

Do you accept PRs? I think I can make that code way more readable :slight_smile:

Sure, feel free. It’s just my notes anyway; it’s not like I use those scripts that often, but next time I may be happy to find them improved.

These are my scripts:

This does different processes on a single 1x1° DEM.

This launches the other in parallel and then generates the vrts and the ‘overview’ versions.

Compensation algo as from that article:

I see all these scripts not only as personal tools, but also as documentation for other people looking for examples about using GDAL et al.

Build geotiff in 4326 ASAP, then the final rendering rasters in webmercator.

Not sure what you mean by this.

In 4326, a degree is a degree, so there’s no need to compensate anything if you compute the rendering rasters with gdaldem in 4326, then finally reproject to webmercator for Mapnik.
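As a hypothetical recipe of that order of operations (file names made up; 111120 is GDAL’s documented -s scale for elevations in meters over horizontal units in degrees):

```shell
# 1. Hillshade in the DEM's native 4326; -s converts the horizontal
#    units (degrees) to the vertical units (meters).
gdaldem hillshade -s 111120 terrain-4326.vrt hillshade-4326.tif \
        -co TILED=YES -co COMPRESS=DEFLATE

# 2. Reproject the finished hillshade once, for Mapnik.
gdalwarp -t_srs EPSG:3857 -r lanczos \
         -co TILED=YES -co BIGTIFF=YES -co COMPRESS=DEFLATE \
         hillshade-4326.tif hillshade-3857.tif
```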

Yeah, that’s exactly what I want to avoid: recomputing on each render; I prefer to precompute everything. In any case, I still use Lanczos for interpolation, but I guess/hope that this way I can avoid any reprojection. I never really benchmarked it. Also, I’m pretty sure EPSG:4326, a.k.a. WGS 84, distorts shades, especially around 50N and beyond. Read my post for the rationale.

You can compute Hillshading in 4326, then reproject (precompute) a big raster in webmercator and use it to render.
That being said, I agree, the hillshade look is not the same and this is a matter of taste that can’t really be discussed.

So for your gdalwarp memory issue, my advice would be to try to loop on smaller chunks and stitch them afterwards, taking care of the edges of course.
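A minimal sketch of that chunking idea in Python (window arithmetic only; the actual per-chunk gdalwarp and the final mosaic are left out, and chunk/margin sizes are made up):

```python
def chunk_windows(width, height, chunk=8192, margin=32):
    """Split a raster into chunk x chunk windows, reading `margin`
    extra pixels on each side so edge effects (from resampling or
    hillshading kernels) can be cropped away before stitching."""
    windows = []
    for y0 in range(0, height, chunk):
        for x0 in range(0, width, chunk):
            x_read = max(0, x0 - margin)
            y_read = max(0, y0 - margin)
            x_end = min(width, x0 + chunk + margin)
            y_end = min(height, y0 + chunk + margin)
            windows.append((x_read, y_read, x_end - x_read, y_end - y_read))
    return windows

# e.g. a 20000 x 10000 raster in 8192-pixel chunks
wins = chunk_windows(20_000, 10_000)
print(len(wins))  # 3 columns x 2 rows = 6 windows
```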

Computationally it’s not that different; visually, yes. OTOH, my issue is resizing for lower (<9) zooms, not the processing itself. I can do that in less than an hour.

Did you try gdaladdo -ro?

That’s a 20Gpx image, should it take 24h to resize at 25x25%? I do 75x75% resizing of 24Mpx photos in less than a second…

Do you mean -ro? You think it’s trying to write to the vrt? Does writing to the vrt even work?

Ah, gdaladdo, not running it anymore…