When using coordinates in decimal degrees, how many decimal places actually make sense?
For example, if one is producing a table of base camp locations in a report, there's no point in displaying a number like 62.946916, -135.018639 when 62.95, -135.02 is sufficient. On the other hand, when giving someone a waypoint for GPS navigation, 62.947, -135.019 is more warranted (it's manually keyed in, so why type six decimal places when three will do?), while for locating a geocache four decimal places are desirable, and six or more for a survey stake.
In other words, what is the easiest way to determine for any given purpose how many decimal places should be displayed?
There are technical answers at Algorithm for offsetting a latitude/longitude by some amount of meters, but they're not readily adaptable to quick-reference-handbook use. I'm looking for a handy chart to refer to, like "the nth decimal place is X distance in ground units (at Y latitude)" or "the nth decimal place is X distance in ground units (at ___ major city/landmark)".
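To illustrate the kind of chart I have in mind, here is a rough sketch that generates one. It assumes a spherical Earth with roughly 111.32 km of ground distance per degree of latitude (a common approximation, not a precise geodesic calculation), and scales longitude by the cosine of the latitude; the function name `decimal_place_resolution` is just something I made up for the example:

```python
import math

def decimal_place_resolution(n, latitude_deg=0.0):
    """Approximate ground distance (meters) represented by the nth
    decimal place of a coordinate in decimal degrees.

    Returns (lat_meters, lon_meters). The latitude spacing is roughly
    constant everywhere; the longitude spacing shrinks toward the
    poles by cos(latitude).
    """
    meters_per_degree_lat = 111_320.0   # rough spherical-Earth mean
    step = 10.0 ** (-n)                 # size of the nth decimal place, in degrees
    lat_m = meters_per_degree_lat * step
    lon_m = lat_m * math.cos(math.radians(latitude_deg))
    return lat_m, lon_m

# Quick-reference chart at 63 degrees N (near the example coordinates)
for n in range(7):
    lat_m, lon_m = decimal_place_resolution(n, latitude_deg=63.0)
    print(f"{n} decimal places: lat {lat_m:>12.3f} m, lon {lon_m:>12.3f} m")
```

Something along these lines would answer the question for one latitude, but what I'm after is the already-tabulated version one could drop into a handbook.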