I have DEMs from all over the world (thanks, #mapzen!) in 1×1° tiles. I generate elevation tints, slopeshades and hillshades from them for each tile, then use a VRT file to combine them into a single dataset I can use to render my maps.
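For reference, the combining step I mean is just a mosaic build; a minimal sketch with the GDAL Python bindings (file names and glob pattern are illustrative, and it assumes the osgeo bindings are installed), equivalent to `gdalbuildvrt world.vrt tiles/*.tif`:

```python
# Sketch: combine per-tile GeoTIFFs into one VRT mosaic.
# Equivalent CLI: gdalbuildvrt world.vrt tiles/*.tif
import glob

try:
    from osgeo import gdal  # assumed installed
except ImportError:
    gdal = None

def build_mosaic(pattern="tiles/*.tif", out="world.vrt"):
    """Build a VRT over all tiles matching pattern (paths are illustrative)."""
    if gdal is None:
        raise RuntimeError("GDAL Python bindings not available")
    tiles = sorted(glob.glob(pattern))
    gdal.BuildVRT(out, tiles)
```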
My problem begins when I want to render low zoom levels (0-8). At first I was rendering mostly around Europe, but now I'm rendering for friends and the places they visit. For low zooms I used to use gdalwarp to create x4, x16 and x256 'small' GeoTIFF versions, but now that I cover almost the whole world, even those are too big.
So I was wondering about adding pyramid/overview info to those tiles with gdaladdo, but I don't know whether gdalbuildvrt supports overviews, or whether Mapnik is clever enough to use them in situations where reading the whole original dataset would be too much. Does anyone have any experience with that?
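For what it's worth, the kind of overview build I have in mind would look something like this with the Python bindings (a sketch; the tile path is hypothetical and the resampling choice is just an assumption), equivalent to `gdaladdo -r average tile.tif 4 16 256`:

```python
# Sketch: add internal overviews to a GeoTIFF tile.
# Equivalent CLI: gdaladdo -r average tile.tif 4 16 256
try:
    from osgeo import gdal  # assumed installed
except ImportError:
    gdal = None

# Decimation factors matching the x4/x16/x256 downscaled versions
OVERVIEW_FACTORS = [4, 16, 256]

def add_overviews(path, factors=OVERVIEW_FACTORS):
    """Build overviews in place on one tile (path is illustrative)."""
    if gdal is None:
        raise RuntimeError("GDAL Python bindings not available")
    ds = gdal.Open(path, gdal.GA_Update)
    ds.BuildOverviews("AVERAGE", factors)
    ds = None  # flush and close
```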
OK, you beat me to it.
Maybe there's a reason I process my DEMs in Python and not directly with the GDAL command-line tools.
See DEM-scripts/2023/hillshade_mem.py at master · yvecai/DEM-scripts · GitHub for my latest and, to date, most efficient script for handling a worldwide DEM in reasonable chunks.
For low zooms I used to use gdalwarp to create x4, x16 and x256 ‘small’ GeoTIFF versions, but now that I span almost all the world, those are still too big.
I have the impression you got my question wrong, but then it’s Saturday night, I’m tired, and it might be me misreading you.
In any case, I found out that whether or not Mapnik supports overviews, the VRT format does not, so in the end I split the regions I render by continent and keep one VRT file per continent. Then the warps are small enough that I can compute and use them.
Somehow I'm running out of memory computing it, and the resulting TIFFs are huge on disk even though they're quite sparse, since I don't render the whole world, just some areas that interest me. For instance, I rendered a big chunk of Europe, but also big mountains like Aconcagua and Dhaulagiri in Nepal.
In any case, the current VRT it's trying to warp is 165,601 × 122,401 pixels; the command is:
Processing this kind of raster takes time, literally hours, even days with a worldwide DEM: that's expected.
Also, VRTs are a very good way to postpone computation, which is not necessarily a good idea. Build GeoTIFFs in EPSG:4326 ASAP, then the final rendering rasters in Web Mercator.
Also, make sure you use gdalwarp with sensible options like creationOptions=["TILED=YES", "NUM_THREADS=4", "BIGTIFF=YES", "COMPRESS=DEFLATE"] (there's an equivalent from the command line).
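To make that concrete, a minimal sketch of a warp call with those creation options (the downscale factor, helper name and paths are illustrative, not your actual command), with the command-line equivalent in the comment:

```python
# Sketch: downscaled warp with the creation options mentioned above.
# Equivalent CLI:
#   gdalwarp -co TILED=YES -co NUM_THREADS=4 \
#            -co BIGTIFF=YES -co COMPRESS=DEFLATE in.vrt out.tif
try:
    from osgeo import gdal  # assumed installed
except ImportError:
    gdal = None

CREATION_OPTIONS = ["TILED=YES", "NUM_THREADS=4",
                    "BIGTIFF=YES", "COMPRESS=DEFLATE"]

def downscale(src, dst, factor=4):
    """Hypothetical helper: warp src to dst at 1/factor of its size."""
    if gdal is None:
        raise RuntimeError("GDAL Python bindings not available")
    ds = gdal.Open(src)
    gdal.Warp(dst, ds,
              width=ds.RasterXSize // factor,
              height=ds.RasterYSize // factor,
              resampleAlg="lanczos",
              creationOptions=CREATION_OPTIONS)
```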
Build GeoTIFFs in EPSG:4326 ASAP, then the final rendering rasters in Web Mercator.

Not sure what you mean by this.
In 4326, a degree is a degree, so there's no need to compensate for anything if you compute the rendering rasters with gdaldem in 4326 and only reproject to Web Mercator at the end for Mapnik.
Yeah, that's exactly what I want to avoid: recomputing on each render, I prefer to precompute everything. In any case, I still use Lanczos for interpolation, but I guess/hope that this way I can avoid any reprojection. I never really benchmarked it. Also, I'm pretty sure EPSG:4326, a.k.a. WGS 84, distorts shades, especially around 50°N and beyond. Read my post for the rationale.
You can compute hillshading in 4326, then reproject (precompute) a big raster in Web Mercator and use that to render.
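That two-step pipeline could be sketched like this (paths and the helper name are illustrative; note the scale option is just the documented gdaldem unit conversion between horizontal degrees and vertical meters, not a latitude compensation):

```python
# Sketch: hillshade in EPSG:4326, then one reprojection to Web Mercator.
# Equivalent CLI: gdaldem hillshade -s 111120 dem.vrt shade.tif
#                 gdalwarp -t_srs EPSG:3857 shade.tif shade_3857.tif
try:
    from osgeo import gdal  # assumed installed
except ImportError:
    gdal = None

# gdaldem's documented -s value when x/y are degrees and z is meters
DEG_TO_M_SCALE = 111120

def hillshade_then_warp(dem, shade, shade_3857):
    """Hypothetical helper: shade in 4326, precompute a 3857 raster."""
    if gdal is None:
        raise RuntimeError("GDAL Python bindings not available")
    gdal.DEMProcessing(shade, dem, "hillshade",
                       scale=DEG_TO_M_SCALE, computeEdges=True)
    gdal.Warp(shade_3857, shade, dstSRS="EPSG:3857",
              resampleAlg="lanczos")
```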
That being said, I agree, the hillshade look is not the same and this is a matter of taste that can’t really be discussed.
So for your gdalwarp memory issue, my advice would be to loop over smaller chunks and stitch them together afterwards, taking care of the edges, of course.
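The chunking itself is just window arithmetic; a sketch of the idea, with a small overlap so the stitched edges line up (the chunk and overlap sizes are arbitrary assumptions):

```python
# Sketch: split a big raster into read windows with a small overlap,
# so each piece can be warped independently and mosaicked afterwards
# without edge artifacts (trim the overlap when stitching).
def chunk_windows(width, height, chunk=8192, overlap=16):
    """Yield (xoff, yoff, xsize, ysize) windows covering the raster."""
    for yoff in range(0, height, chunk):
        for xoff in range(0, width, chunk):
            x0 = max(xoff - overlap, 0)
            y0 = max(yoff - overlap, 0)
            x1 = min(xoff + chunk + overlap, width)
            y1 = min(yoff + chunk + overlap, height)
            yield (x0, y0, x1 - x0, y1 - y0)
```

Each window could then be fed to something like gdal.Translate with srcWin=... and the pieces merged back at the end.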
Computationally it's not that different; visually, yes. OTOH, my issue is resizing for the lower zooms (<9), not the processing itself, which I can do in less than an hour.