I hope I'm asking this on the right Stack Exchange site; please bear with me if I'm not.
I'm developing some GPS-based applications. My precision requirements are not very high, but I need to know the possible errors.
I have learned that the fourth decimal place of a coordinate in degrees gives roughly 10 meters of precision. That should be enough for me.
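To make that figure concrete, here is the back-of-the-envelope arithmetic as I understand it (a sketch that assumes a spherical Earth with roughly 111.32 km per degree of latitude; the constant and the class name are my own):

```java
// Rough meters-per-degree arithmetic behind the "fourth decimal ~ 10 m" figure.
// Assumes a spherical Earth; real geodesy (WGS84) differs slightly.
public class DegreePrecision {
    // One degree of latitude spans roughly 111.32 km anywhere on the sphere.
    static final double METERS_PER_DEGREE_LAT = 111_320.0;

    public static void main(String[] args) {
        double step = 0.0001; // resolution of the fourth decimal place

        // Latitude: ~11 m per 0.0001 degrees, independent of where you are.
        double latMeters = step * METERS_PER_DEGREE_LAT;

        // Longitude: shrinks with cos(latitude); at 60 degrees north a degree
        // of longitude is only about half as wide as one at the equator.
        double latitudeDeg = 60.0;
        double lonMeters = step * METERS_PER_DEGREE_LAT
                * Math.cos(Math.toRadians(latitudeDeg));

        System.out.printf("0.0001 deg of latitude  ~ %.1f m%n", latMeters); // ~11.1 m
        System.out.printf("0.0001 deg of longitude ~ %.1f m at %.0f N%n",
                lonMeters, latitudeDeg);                                    // ~5.6 m
    }
}
```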
The real question is how quickly I can get that precision reliably in different environments (indoors, outdoors under open sky, in a forest, under cloud cover, in a city, etc.).
The applications I'm developing are for handheld devices, so I would prefer to keep the GPS active for as short a time as possible. As it stands, the intervals at which I use the GPS are governed more by battery life than by precision; now I'm trying to balance the two.
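For context, this is a minimal sketch of the duty-cycling I have in mind, assuming Android's LocationManager API (the timeout constant and the class name are my own placeholders, and permission handling is omitted):

```java
import android.content.Context;
import android.location.Location;
import android.location.LocationListener;
import android.location.LocationManager;
import android.os.Bundle;
import android.os.Handler;
import android.os.Looper;

// Duty-cycled GPS: power the receiver up, take one fix, power it down again.
public class DutyCycledGps implements LocationListener {
    private static final long FIX_TIMEOUT_MS = 30_000; // my own guess at a sane cutoff

    private final LocationManager locationManager;
    private final Handler handler = new Handler(Looper.getMainLooper());

    public DutyCycledGps(Context context) {
        locationManager = (LocationManager) context.getSystemService(Context.LOCATION_SERVICE);
    }

    // Wake the GPS for a single fix (ACCESS_FINE_LOCATION must already be granted).
    public void requestSingleFix() {
        locationManager.requestLocationUpdates(
                LocationManager.GPS_PROVIDER,
                0,    // minTime: report as soon as a fix is available
                0f,   // minDistance: any movement counts
                this);
        // Give up after a timeout so a cold start indoors doesn't drain the battery.
        handler.postDelayed(() -> locationManager.removeUpdates(this), FIX_TIMEOUT_MS);
    }

    @Override
    public void onLocationChanged(Location location) {
        // getAccuracy() is the estimated error radius in meters; instead of taking
        // the first fix I could keep listening until it drops below ~10 m.
        locationManager.removeUpdates(this); // power the receiver down again
    }

    @Override public void onStatusChanged(String provider, int status, Bundle extras) {}
    @Override public void onProviderEnabled(String provider) {}
    @Override public void onProviderDisabled(String provider) {}
}
```

What I can't estimate is how long that first usable fix typically takes in each of the environments listed above, which is what drives both the timeout and the polling interval.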