In Python, I generate a virtual raster using gdal.Warp with a cutline, as shown below:
gdal.Warp(warpedFilePath, ds, format='VRT', cutlineDSName=csvFilePath, srcNodata=0, dstAlpha=True, cropToCutline=True, dstSRS='EPSG:4326')
If the cutline lies strictly within the image, the VRT file has a "normal" size (a few KB) and the subsequent processing (translating it to a GeoTIFF) is fast. However, if the cutline edges coincide with the image edges, I get a huge VRT file, several MB, and translating it to a GeoTIFF takes over an hour. Looking at the Cutline value stored in the VRT, I can see that every border pixel is being added to the polygon.
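For reference, a quick sketch like the following is enough to see how many points end up in the cutline that gdal.Warp writes into the GDALWarpOptions block of the VRT (warpedFilePath is the path from the call above):

import xml.etree.ElementTree as ET

# Count the coordinate pairs stored in the CUTLINE option of the warped VRT.
tree = ET.parse(warpedFilePath)
for opt in tree.iter('Option'):
    if opt.get('name') == 'CUTLINE':
        wkt = opt.text or ''
        # Each comma in the WKT separates two coordinate pairs.
        print('cutline vertices:', wkt.count(',') + 1)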
Is this "normal" behaviour? Is there an option to reduce the number of points in the cutline?
My GDAL version is 2.2.4. The input raster is in UTM and the cutline is a CSV file in EPSG:4326.
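For context, the step that becomes slow is a plain gdal.Translate of the warped VRT to a GeoTIFF, roughly like this (outputTiffPath is just a placeholder name):

from osgeo import gdal

# Materialise the warped VRT as a GeoTIFF; this is the step that takes over an hour
# when the cutline edges coincide with the image edges.
gdal.Translate(outputTiffPath, warpedFilePath, format='GTiff')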
Edit: CSV file for the cutline
ID, WKT
1, "POLYGON ((-4.68694239006474 54.9863989533752,-4.68563773368381 54.9553739114236,-4.92089428671608 54.9518432821996,-4.948233 55.021719,-4.711947 55.051303,-4.68694239006474 54.9863989533752))"
Unfortunately I cannot provide the image: it is too large (around 1 GB) and it is bound by commercial terms. That said, I am fairly convinced the same would happen with a free Landsat image, for instance.