
Obviously, different applications require different levels of accuracy, but I'm wondering whether there is some standard threshold that the field typically considers "good" (or "good enough").

For example, if I'm trying to locate a tree in the woods, I'd probably need sub-meter accuracy, or close to it, in order to ever find that tree again. However, locating the crest of a mountain would likely only require accuracy to several meters because of its conspicuousness in the landscape.

However, maybe in both applications some "generalized" accuracy threshold could be calculated as:

accuracy threshold = xx% × (grain size of the object)
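To illustrate, here is a minimal Python sketch of that rule. The 10% fraction is purely a placeholder to show the idea, not an established standard:

```python
def accuracy_threshold(grain_size_m: float, fraction: float = 0.10) -> float:
    """Positional accuracy threshold in meters, as a fraction of the
    feature's grain size. The default 10% is an arbitrary placeholder."""
    return grain_size_m * fraction

# A single tree crown (~5 m across) -> ~0.5 m threshold
print(accuracy_threshold(5.0))    # 0.5
# A broad mountain crest (~500 m across) -> ~50 m threshold
print(accuracy_threshold(500.0))  # 50.0
```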

So is there such a standard, or is it simply up to the researcher to make that decision, given the context?

1 Answer


In my experience, six decimal places of a coordinate in degrees (roughly 11 cm on the ground) is "good enough" for most purposes, and I believe that to be the general opinion across applications, but yes, it ultimately depends on your requirements.

See the post below for some detail:

Measuring accuracy of latitude and longitude?
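To make the ~11 cm figure concrete, here is a rough Python sketch. The 111,320 m-per-degree-of-latitude constant is an approximation, and longitude spacing shrinks with the cosine of latitude:

```python
import math

DEG_LAT_M = 111_320.0  # approximate meters per degree of latitude

def precision_m(decimal_places: int, latitude_deg: float = 0.0):
    """Approximate (latitude, longitude) ground precision in meters
    represented by the given number of decimal places of a degree."""
    step = 10.0 ** -decimal_places          # smallest representable step, in degrees
    lat_m = step * DEG_LAT_M                # meters of latitude per step
    lon_m = lat_m * math.cos(math.radians(latitude_deg))  # longitude shrinks with latitude
    return lat_m, lon_m

print(precision_m(6))        # ~ (0.111, 0.111) m at the equator, i.e. ~11 cm
print(precision_m(6, 60.0))  # longitude precision roughly halves at 60° N
```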

jlmt