
I have a camera looking at a flat plane that is a known distance away from it. I also know the camera's intrinsic parameters.

The image from the camera looks like this (it's grayscale, but that shouldn't matter):

It has notable lens distortion, so I correct the image via OpenCV's undistort function. The result looks like this:

Now my question is: How can I calculate the real-world size that each pixel corresponds to on the plane? For example, if I measure that one of the rectangles is 50 pixels wide, how can I calculate how many millimeters that would be?

Does each pixel correspond to the same real-world area in the undistorted image?

Zciurus

2 Answers


How accurate do you expect/need to be?

IMHO, between the inaccuracy of the focal length (especially since you are not focused at infinity) and your lens correction, the GIGO principle says that you aren't going to be very accurate. Better to put a ruler/tape measure in the photo and calibrate the thing experimentally. You may find that the pixel/mm ratio isn't completely uniform.
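For illustration, here is a minimal sketch of that experimental approach in Python; the pixel coordinates and the 100 mm ruler span below are made-up placeholders, and in practice you would read them off your own undistorted image:

```python
import math

# Two ruler marks a known real-world distance apart, located by hand
# (or with a feature detector) in the undistorted image.
# All values below are hypothetical placeholders.
p1 = (412.0, 300.5)    # pixel coordinates of the first mark
p2 = (983.0, 306.2)    # pixel coordinates of the second mark
ruler_mm = 100.0       # known distance between the marks, in mm

pixel_dist = math.dist(p1, p2)       # separation in pixels
mm_per_px = ruler_mm / pixel_dist    # experimental scale factor
print(f"{mm_per_px:.3f} mm per pixel")
```

Repeating this with the ruler in different parts of the frame tells you whether the ratio is uniform, which is exactly the uniformity question raised in the comments below.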

xenoid
  • Hmm, okay. What about the second part of my question: is the pixel size uniform across the image? Let's say I experimentally figured out that 1 px corresponds to 1.4 mm in the center of the (undistorted) image. Can I assume the same mm/px size for pixels near the edge? – Zciurus Jun 22 '23 at 09:37
  • That's the purpose of calibration... you take several shots with the ruler in various positions and you'll find out. And also, it all depends how accurate you want to be. I'd say that if 5% is good enough then a single measurement is OK; below 1% you'll probably need several measurements; and in between, you do some that may just tell you they weren't necessary. – xenoid Jun 22 '23 at 10:11
  • ... and if your setup is fixed, you can even take a picture of a grid and, using the shot of the grid and the grid itself, compute a displacement map to apply to your picture, to make it exact. I have done so to calibrate my scanner (with the help of some Gimp scripts). – xenoid Jun 22 '23 at 10:13
  • The camera model I'm using is calibrated by the manufacturer. I can read out its intrinsic parameters via software and then perform an undistortion as needed. Ideally I would like to calculate the mm/px value (of the undistorted image) just from the intrinsics and the distance to the plane. (I want to achieve an automated process, so manually calibrating each unit would be annoying) Is there a way to calculate the FOV of the undistorted image? From that I could easily work out the projected pixel size at a given distance. – Zciurus Jun 22 '23 at 11:15
  • If you shoot a rectangular grid from up close, the sides of the grid, being farther from the camera, should appear smaller (with the corollary that many straight lines would appear curved). And this is not the case, because most lenses compensate for this (rectilinear lenses), and your OpenCV correction is also doing this. So, is the lens correction expanding the sides or shrinking the center, and is OpenCV doing the same or the opposite? If you have several units you don't need to calibrate each; they should all behave like the one you took your measurements on. – xenoid Jun 22 '23 at 12:00

If you know the focal length, find out the field of view from https://www.nikonians.org/reviews/fov-tables; then it's just trigonometry:

pixel size = 2 * tan(horizontal FOV / 2) * distance / image width in pixels

So at 50 mm on a 12 Mpx full-frame camera (4000 px wide) at 5 m distance, the horizontal FOV is about 39.6° and you get

2 * tan(39.6°/2) * 5000 mm / 4000 px ≈ 0.9 mm

(The field is 2 × tan(19.8°) × 5000 ≈ 3600 mm wide, spread over 4000 px.)
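Equivalently, the same number falls out of the intrinsics directly: the focal length in pixels (the fx entry of the camera matrix) satisfies fx = image width / (2 * tan(horizontal FOV / 2)), so the scale is simply distance / fx. A small sketch of both routes, using the 50 mm full-frame numbers above:

```python
import math

hfov_deg = 39.6        # horizontal FOV of a 50 mm lens on full frame
distance_mm = 5000.0   # camera-to-plane distance
width_px = 4000        # image width in pixels (12 Mpx full frame)

# Route 1: trigonometry from the formula above
mm_per_px = 2 * math.tan(math.radians(hfov_deg) / 2) * distance_mm / width_px
print(f"{mm_per_px:.2f} mm/px")   # ≈ 0.90 mm/px

# Route 2: via the focal length in pixels (fx from the camera matrix)
fx = width_px / (2 * math.tan(math.radians(hfov_deg) / 2))
print(f"{distance_mm / fx:.2f} mm/px")   # the same ≈ 0.90 mm/px
```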

At very close distances, focusing changes the effective focal length, but at a focusing distance greater than about 5× the focal length this effect should be minor, so not many corrections are needed.

Distortion correction may reduce the field of view, but only if the software crops the image to avoid the areas that have been "compressed" radially, which would otherwise show up as empty regions "pulled in" from outside the original image.

I searched, and with OpenCV you are not forced to crop: here it states that "the cv2.getOptimalNewCameraMatrix() function will also return the region of interest, which can be used to crop the image", so if you don't want to, you don't need to. See also the official documentation for undistort and for the camera matrix. It looks to me like you should be using alpha=1 for the new camera matrix, and that's it.

Once you do, the resulting image should correspond exactly to the formula written above! It's the distortion (now no longer present) that made it deviate from it.
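Here is a minimal sketch of that OpenCV pipeline, assuming camera_matrix and dist_coeffs are the intrinsics read out from the camera (all numeric values below are placeholders):

```python
import cv2
import numpy as np

# Placeholder intrinsics; use the values read out from your camera.
camera_matrix = np.array([[2800.0,    0.0, 2000.0],
                          [   0.0, 2800.0, 1500.0],
                          [   0.0,    0.0,    1.0]])
dist_coeffs = np.array([-0.12, 0.03, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3

img = cv2.imread("plane.png", cv2.IMREAD_GRAYSCALE)
h, w = img.shape[:2]

# alpha=1: keep all source pixels, i.e. no cropping of "compressed" regions
new_matrix, roi = cv2.getOptimalNewCameraMatrix(
    camera_matrix, dist_coeffs, (w, h), 1, (w, h))

undistorted = cv2.undistort(img, camera_matrix, dist_coeffs, None, new_matrix)

# The undistorted image behaves like an ideal pinhole camera described by
# new_matrix, so on a plane parallel to the sensor at a known distance:
distance_mm = 5000.0                         # assumed distance to the plane
mm_per_px = distance_mm / new_matrix[0, 0]   # fx of the NEW matrix, in pixels
print(f"{mm_per_px:.3f} mm per pixel (horizontal)")
```

Note that with alpha=1 the new matrix's fx differs from the original one; that is exactly the FOV change discussed above, and reading fx from the matrix returned by getOptimalNewCameraMatrix accounts for it automatically.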

If greater accuracy is required, you need a reference tape in the photo itself.

Edit

As pointed out by Michael C, the magnification is not uniform across the field of view for rectilinear lenses and is higher at the corners. See Wikipedia. Depending on the focal length used, this may or may not be relevant: it should not be particularly relevant below a 140° field of view (see here). It is also not very relevant if the plane of the object (assuming a flat object) is parallel to the plane of the camera.
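To see why the parallel-plane case is benign, here is a quick sketch under the ideal pinhole model (an idealization; real lenses deviate from it, which is exactly Michael C's point): a point at horizontal position X on a plane parallel to the sensor at distance Z projects to pixel

u = fx * X / Z + cx

Since Z is the same for every point on the plane, the mapping from X to u is linear, so the scale Z / fx (mm per pixel) is the same everywhere in the frame.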

FarO
  • Hi, thanks for your answer. If the FOV is known, determining the pixel size is rather straightforward, as you mentioned. The problem is that the distortion correction influences the FOV. So far I have not figured out how to calculate the FOV of the undistorted image (it's smaller than the original FOV) – Zciurus Jun 23 '23 at 08:14
  • I think I completed the answer with what you needed :) Please let me know if it works!! – FarO Jun 23 '23 at 20:39
  • The problem is that magnification (which is a close corollary to focal length) is not constant across the field of view of most lenses. This is particularly the case with wide angle rectilinear projection lenses. For such lenses magnification at the corners may be significantly higher than magnification at the center of the field. For the magnification to be the same across the field, you must accept fisheye projection. – Michael C Jun 24 '23 at 20:18
  • @MichaelC true, I didn't think about it. That's why people get distorted and appear much bigger in the corners of (for example) smartphone selfies. – FarO Jun 26 '23 at 08:24
  • @MichaelC however, provided the plane of the subject is parallel to the plane of the sensor, no distortion should take place, right? Think about the photos of test patterns (squares) used to show pincushion/barrel: the squares are all the same size https://www.opticsforhire.com/blog/types-of-projections-in-wide-angle-lenses-part-1/ (grid distortion for various lenses) – FarO Jun 28 '23 at 07:11
  • @FarO Unless the lens is telecentric (that is, the lens is as large in diameter as the field of view), items at the edge of the field are farther from the lens than items at the center of the field. Thus they won't appear the same size if the magnification is constant across the field. If a test image shows pincushion or barrel distortion, then the squares on the test chart are not even square in the resulting image, much less the same size as the other squares in the image. – Michael C Jun 30 '23 at 23:04
  • @MichaelC magnification constant or not, a grid with uniform squares kept PARALLEL to the plane of the camera will appear uniform in the photo (except for minor imperfections in the correction), thanks to the rectilinear properties. Of course the magnification is different radially to achieve this, but the amount of difference is tuned to the different distance, and they compensate each other. See https://www.lenstip.com/541.6-Lens_review-Samyang_AF_14_mm_f_2.8_EF_Distortion.html for an example of a uniform grid which still appears as such in the photo. Or maybe I misunderstood your point? – FarO Jul 03 '23 at 10:51
  • @FarO If there were a perfectly rectilinear lens that wouldn't even be the case. And in reality there is no perfectly rectilinear lens, particularly at shorter focal lengths/wider angles of view. Rectilinear lenses make straight lines look straight at the expense of angles and areas. – Michael C Jul 04 '23 at 14:37