I export a bunch of DEM files (10000x5000) from Photoscan and I need to combine them with gdal_merge.py, but the problem is that it consumes all the available memory and can't finish processing.
What workaround can be applied to limit RAM usage?
gdal_merge allocates memory for the whole output raster at once, so it runs quickly for datasets that fit into memory. If that is not your case, use the gdalwarp tool, which processes the data in tiles so you can control how much memory it uses:
gdalwarp --config GDAL_CACHEMAX 512 -wm 4096 input_1.tif input_2.tif merged.tif
where GDAL_CACHEMAX is the memory for the I/O cache and -wm is the warp memory limit, which controls the tile size. Both are in MB. The input DEMs are listed first and the output file comes last.
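For a mosaic of many large DEM tiles it can also help to write a tiled, compressed GeoTIFF so the output itself stays manageable. A minimal sketch, assuming the exported tiles match dem_*.tif (the file names, memory values, and creation options are placeholders to adapt to your data):

# merge all exported DEM tiles into one tiled, LZW-compressed GeoTIFF,
# keeping GDAL's raster cache and the warper's working memory bounded
gdalwarp --config GDAL_CACHEMAX 512 -wm 500 \
    -of GTiff -co TILED=YES -co COMPRESS=LZW -co BIGTIFF=YES \
    dem_*.tif merged.tif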
-of GTiff -co TILED=YES. -wm 4096 feels like quite a large value (https://trac.osgeo.org/gdal/wiki/UserDocs/GdalWarp); -wm 500 may be better, but it is easy to test with your data and computer. – user30184 Nov 02 '16 at 14:40
Another option is gdalbuildvrt mosaic.vrt *.tif, then converting using gdal_translate -of GTiff mosaic.vrt mosaic.tif (and using -co to apply relevant creation options for the output TIFF). – tlrss Jan 22 '24 at 14:54
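The VRT route from the last comment avoids holding the mosaic in memory at all: the VRT is just an XML index of the tiles, and gdal_translate then streams the pixels out in chunks. A minimal sketch, assuming the exported tiles are the only .tif files in the directory (file names and creation options are placeholders):

# build a virtual mosaic; no pixel data is copied at this step
gdalbuildvrt mosaic.vrt *.tif
# materialize it as a tiled, compressed GeoTIFF
gdal_translate -of GTiff -co TILED=YES -co COMPRESS=LZW mosaic.vrt mosaic.tif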