5

I'm looking for a fast way, preferably without trigonometric operations (just multiply, divide, add, subtract and sqrt), to determine whether a point is visible. E.g. say I am facing 0 degrees N, I am at 51.40, -1.08, the camera has an HFOV of 30 degrees and the point is at 51.20, -1.085: the test should return TRUE, because the point would be facing me and is within the 30 degree FOV.

In edge cases it must return TRUE (a false positive) rather than FALSE; otherwise points may be hidden when they are actually visible. Currently my augmented reality project uses atan2 to calculate the actual angle, but if I can skip that calculation for points which are definitely not visible, every little bit helps. I only need it to be accurate over distances of less than 2 km.

Here's my code to calculate horizontal and vertical position. It's only accurate over distances of less than 5 km, but accuracy isn't critical anyway, as more distant waypoints aren't drawn. This project is open source, so it's okay for me to post a lot of code.

res->est_dist = hypot2(p_viewer->p.lat - p_subj->lat, p_viewer->p.lon - p_subj->lon) * 2000.0f;
// Save precious cycles if outside of visible world.
if(res->est_dist > cliph)
    goto quick_exit;
// Compute the horizontal and vertical angle to the point. 
res->h_angle = angle_dist_deg(RAD2DEG(atan2(p_viewer->p.lon - p_subj->lon, p_viewer->p.lat - p_subj->lat)), p_viewer->yaw);
res->v_angle = RAD2DEG(atan2(p_viewer->p.alt - p_subj->alt, res->est_dist)); // constant scale factor: needs to be adjusted
// Normalize the results to fit in the field of view of the camera.
res->h_angle += p_viewer->hfov / 2;
res->v_angle += p_viewer->vfov / 2;
// Compute projected X and Y position.
res->pos.x = (res->h_angle / p_viewer->hfov) * p_viewer->width;
res->pos.y = (res->v_angle / p_viewer->vfov) * p_viewer->height;

See that call to atan2 for the h_angle? If I could eliminate it by saying "this point is definitely not visible" it would be nice.

whuber
Thomas O
  • Your code does not work correctly due to the large distortions in the projection you are using (a version of Plate Carree). You have to use a reasonably distortion-free projection when computing horizontal angles (or else use spherical trigonometry). – whuber Dec 03 '10 at 20:13
  • @whuber Ah, but it does! I am only using very short distances, and distortion is acceptable. If you want to see very similar code in action: (in Python) http://www.youtube.com/watch?v=V_fjUtqqX0M It's for augmented reality, so any distortions converge to zero as you approach the waypoints; therefore, distortions don't matter (as long as they aren't overwhelming.) – Thomas O Dec 03 '10 at 20:16
  • @Thomas Even at short distances the distortions are grievous except near the Equator. Their effect is to modify the apparent HFOV. For example, near 60 degrees latitude looking north your computation treats the HFOV as being around 15 degrees, not 30 degrees. If you don't care about the HFOV and don't mind it changing as you rotate around a fixed point of view, then you're ok. But there are better solutions, as outlined in my response and comments to it. – whuber Dec 03 '10 at 20:21
  • @whuber Comes down to processing speed really. Most people will be in the same area (it's for model planes), so distortions in FOV can be accounted for on start up if necessary. – Thomas O Dec 03 '10 at 20:23
  • @Thomas OK, but be aware that the distortion changes--tremendously--with the direction of view. Ignoring this issue gives the fastest processing speed, but accommodating it is an easy fix that ought to require a few tens of cycles at the most (look up a cosine in a table and perform two divisions). At any rate, the method sketched in my response shows how to avoid using arctangents. – whuber Dec 03 '10 at 20:25
  • @whuber, Without a hardware FPU, a division is about 300 cycles; atan2 is about 800 cycles. Cosine is about 500 cycles; plus two divides and it might be about as fast. I like this solution though because it works well enough. A future version will have a more powerful (about 8x faster) processor so I might be able to slip some more operations in. Maybe I could even find one with a hardware FPU, or switch to using fixed point math. – Thomas O Dec 03 '10 at 20:31
  • @Thomas Note that you never really have to divide if you compute (or look up or approximate) the secant instead: then you only have to multiply. For example, you can obtain 0.74% accuracy in sec(x) for all x between 45 and 55 degrees with the formula 1/cos(x) = sec(x) = 0.032668x - 0.06619. If you use scaled integers instead of floats, I presume the operations are only a cycle or two, not hundreds of cycles: there's a lot to be gained with that trick. – whuber Dec 03 '10 at 20:43
  • @whuber: My MCU manufacturer (Microchip) has a software fixed-point Q16 library. I was considering using it; it might be faster. It has most trig functions. – Thomas O Dec 03 '10 at 20:47
  • @Thomas Speeding up the calcs using fixed point or integer arithmetic is a good idea. It's always going to help to avoid inverse trig functions, no matter how you do the arithmetic. The heart of the trick is to use direction vectors and inner products as I have described: that's how you replace trig with basic arithmetic. – whuber Dec 03 '10 at 20:50
  • @whuber: Yep, it's probably a better idea because the fixed point library also uses the 40-bit accumulators which support add, subtract, multiply, multiply-accumulate, euclidean distance etc. But these are only available on the dsPIC33F processors and I'm hoping to migrate to PIC24 after a while, which doesn't have these accumulators. So I might have to stick with floating point for now. It renders at about 20-30 frames per second, which is fast enough for me. – Thomas O Dec 03 '10 at 20:53

2 Answers

6

Given that your inputs are in degrees, trigonometric operations are unavoidable. At a minimum you need to account for the distortion of longitude relative to latitude at all points off the equator.

The simplest approximation (based on a spherical model of the earth and suitable only for small distances) I can think of projects the two points (lat1, lon1), (lat2, lon2) to the points

(x1, y1) = (lon1 * cos(lat1), lat1) and

(x2, y2) = (lon2 * cos(lat2), lat2)

in the Euclidean plane. (This is a Sinusoidal projection.) The direction vector from the viewpoint (#1) to the target point (#2) therefore is

(u, v) = (x2-x1, y2-y1)

with length n = Sqrt(u^2 + v^2), whence the unit direction vector equals

(u/n, v/n).

If you can specify the direction you're facing as a vector, rather than as an angle, you can avoid some more trig. Otherwise, a direction t degrees east of north has to be converted to the unit direction vector

(a, b) = (sin(t), cos(t)).

Finally, visibility is tested by comparing the inner product

(u/n, v/n) . (a, b) = (a*u + b*v)/n

to the cosine of half the horizontal field of view (another trig value, but it can be precomputed once and for all). The inner product is less than this cosine exactly for the invisible points, so the target is visible when the inner product is at least the cosine.

This method requires computing one cosine for each base point and target point, as well as a sine and a cosine for the direction of view.

For example, with a HFOV of 30 degrees you can look 15 degrees to the right and left. The cosine of 15 degrees equals 0.965926. If you are standing at lon = -1.080 and lat = 51.40 and the target point is located at lon = -1.085 and lat = 51.20, then

(x1, y1) = (-1.080 * cos(51.4), 51.4) = (-0.67379, 51.4)

(x2, y2) = (-1.085 * cos(51.2), 51.2) = (-0.67987, 51.2)

(u, v) = (-0.00608, -0.200)

n = 0.200092

(u/n, v/n) = (-0.03036, -0.99954).

The direction vector for due north is (sin(0), cos(0)) = (0, 1). Its inner product with (u/n, v/n) equals -0.99954. Because this is (a lot) less than 0.965926, the point is invisible. (This, I hope, was obvious at the outset, because the target point is south of--that is, behind--the point of view.) If you happened to be looking due south, your direction vector would be (sin(180), cos(180)) = (0, -1), the inner product would equal +0.99954, and because this exceeds 0.965926, you would conclude that the target point is visible.


In some cases you can avoid a lot of trig. If you have a single base point and many nearby target points, the cosines of all the latitudes will be approximately equal to the cosine of the base point's latitude. In that case you need to compute it only once and reuse that value in all the other calculations. No matter how many target points there are, you then need only four trig evaluations for each combination of base point, HFOV, and direction of view. Of course, because this is an approximation (albeit a good one), it would be silly to worry about "edge cases" (which I presume are points right at the edge of the field of view).

whuber
  • My processor has no hardware FPU, so all operations are done in software. It currently fits in 5,940 cycles for each waypoint and is accurate enough. I don't need high accuracy, and I'm sure there's at least a way to approximate it. Have a look at how I did it with my code sample. – Thomas O Dec 03 '10 at 20:10
  • 1
    The method I have described needs only a table of cosines. You could replace the calculation of a cosine by a O(1) lookup (involving division and truncation) into a reasonably accurate table. In fact, if you're willing to compute the cosine and sine for the direction of view and the cosine for half the HFOV, you can adapt the table for the range of latitudes in your area, making it small and highly accurate. You might not need a table at all if the north-south extent of your area is sufficiently small: the cosine of the latitude would be essentially constant. – whuber Dec 03 '10 at 20:17
  • @Thomas O - you can use approximations or the real sine / cosine formulae – dassouki Dec 03 '10 at 20:25
  • @whuber: How many entries? More entries = more precision, but more program memory. 128KB doesn't go far...! – Thomas O Dec 03 '10 at 20:25
  • @Thomas I suspect that if you scale the coordinates appropriately you can do most of these calculations in integer arithmetic to reasonable accuracy. This is possible because the direction vector will have small coefficients (because distances are small) and the unit normal vector's coefficients are bounded between -1 and 1. That potentially decreases the computational effort by around 90%. – whuber Dec 03 '10 at 20:29
  • @Thomas: Because you are not too concerned with accuracy, you can cover the full range of cosines with three entries (for 0, 45, and 90 degrees) and interpolate among them. Alternatively, use simple approximating functions. For instance, a quick calculation in Excel indicates that cos(x) = 1.023512 - 0.00616x for x between 0 and 45 degrees with an error never worse than 0.04. Over a narrower range between 40 and 60 degrees, use 1.328415 - 0.01386x: the error is never worse than 0.008. You need only store two numbers and do one multiplication and one addition. – whuber Dec 03 '10 at 20:36
  • @whuber Thanks for the approximation. I could also use this to rotate the viewport. Every cycle counts! Do you know of one from 45..90 degrees? – Thomas O Dec 03 '10 at 20:43
  • @Thomas: Set up a set of equations and values in Excel and find the best coefficients using the Solver add-in. – whuber Dec 03 '10 at 20:45
  • @whuber Don't have Excel, but I think OpenOffice.org Calc has a solver like function. Or I could use Python. – Thomas O Dec 03 '10 at 20:46
  • @Thomas Or use R. Or any statistical package. Or any package that can do optimization (e.g., MatLab, Mathematica). Regardless of what you use, it's more valuable to you to be able to find exactly the solutions you need rather than having to ask me (and trust that I got them right every time!). – whuber Dec 03 '10 at 20:47
4

First: What whuber said about trig being unavoidable.

Since you're doing augmented reality, here is my recommendation:

  • Drop the lat/lon for everything as early as possible in your data pipeline: convert to 3D cartesian coordinates and use those instead, and keep them around so you don't have to re-do it all the time. (There's the trig!)
  • Then use traditional frustum culling to drop points that aren't visible.
    • Using OpenGL? Don't want to do the math+bookkeeping for a bunch of plane-side tests? Use gluProject with the current modelview and projection matrices and viewport. If the resulting coordinate is inside the screen rectangle (in pixels) and has a z value between 0 and 1, it will be visible on the camera! (From memory, so check the manual.)

Here's what your computational costs look like:

  • Four trig operations to convert from spherical coordinates (lat/lon/alt, or phi/theta/rho, depending if you're a geographer or a mathematician) to cartesian.
    • However, you need to do it only once! (This is relevant to this question too: since you're now working in cartesian coordinates, you just add the vector and you're done -- accurately, too, without fudge factors.)
  • A bunch of multiplies and adds (three vector-matrix multiplies), plus three divisions and six comparisons for the actual visibility test. Fast!

To summarise, Spherical Coordinates Bad. Cartesian Good.

Dan S.
  • Do you have any tips on converting the coordinates? The position of the viewer may move so these would have to be updated, but if the cost is only a few sin or cos calculations when the coordinates change it would probably be worth it. – Thomas O Dec 03 '10 at 22:18
  • Actually, I created a second question for this, because it sounds like a good optimisation: http://gis.stackexchange.com/questions/4147/lat-lon-alt-to-spherical-or-cartesian-coordinates – Thomas O Dec 03 '10 at 22:23
  • I answered over there. :) Also ... not to make a pedantic point, but the real reason to go with cartesian in this case isn't for optimization's sake but for ease of manipulation. There are other cases where working with spherical coordinates makes sense, but visibility testing isn't one of them, and (of course) neither is translating by cartesian vectors ... – Dan S. Dec 03 '10 at 23:07