I have been implementing the approach described here: Finding regions or zones in raster DEM of similar slope or aspect values in ArcGIS for Desktop?
to calculate first the "spherical variance" and then the "spherical standard deviation (SD)" on a 50 cm DEM of an island.
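For reference, below is how I reproduce the spherical SD calculation outside ArcGIS to sanity-check the values. This is only a minimal NumPy/SciPy sketch of the vector-dispersion approach from the linked question: the function name, the window size, and the use of `scipy.ndimage.uniform_filter` for the focal sums are my own choices, and I use the sqrt(-2 ln(R/n)) definition of the spherical SD.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def spherical_sd(slope_deg, aspect_deg, size=3):
    """Spherical standard deviation of surface normals in a square
    focal window (size x size cells). Slope and aspect in degrees,
    aspect measured clockwise from north."""
    slope = np.radians(slope_deg)
    aspect = np.radians(aspect_deg)

    # Unit normal vector of each cell's surface
    x = np.sin(slope) * np.sin(aspect)
    y = np.sin(slope) * np.cos(aspect)
    z = np.cos(slope)

    # Focal sums of the components (uniform_filter returns the window
    # mean, so multiply by the number of cells in the window)
    n = size * size
    sx = uniform_filter(x, size) * n
    sy = uniform_filter(y, size) * n
    sz = uniform_filter(z, size) * n

    # Mean resultant length R/n: close to 1 where the normals are
    # nearly parallel (flat areas, ledges), smaller where they disperse
    r_bar = np.sqrt(sx**2 + sy**2 + sz**2) / n
    r_bar = np.clip(r_bar, 1e-12, 1.0)  # guard log() against rounding

    # Spherical SD: 0 for identical normals, growing with dispersion
    return np.sqrt(-2.0 * np.log(r_bar))
```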
My intention is to identify any ledges that may exist on the island's cliffs, as well as the relatively flat areas.
However, I have found it difficult to decide, and to justify, how to classify the resulting spherical SD values. I understand that the closer a value is to zero, the smaller the variation of surface orientations in that area (i.e. gentler, flatter terrain), and in my understanding I could classify such areas as the ledges I am looking for. However, I am unsure below what spherical SD value cells should be classified as ledges (e.g. SD = 0.1, SD = 0.2, and so on).
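For concreteness, this is how I currently apply a trial cutoff once the SD raster is computed; the default of 0.2 below is only a placeholder I have been experimenting with, not a justified value, and choosing it is exactly what I am asking about.

```python
import numpy as np

def tentative_flat_mask(sd, cutoff=0.2):
    """Boolean mask of cells treated as candidate flat areas/ledges.

    `cutoff` is a placeholder, not a justified threshold; picking it
    is the open question."""
    return np.isfinite(sd) & (sd < cutoff)
```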
I am looking for answers pointing me towards a defensible threshold for classifying spherical SD values as flat areas and ledges. Ideally, a book or a scientific paper that has done something similar would be perfect.