
I'm trying to measure sharpness with a script so that I can compare sharpness across different devices.

My methodology:

  • I have a test pattern with pure white and black stripes [image: striped test pattern]
  • I use HD printing, matte, no reflection or texture
  • I take a picture in exactly the same conditions

When I take the picture with a device, I then analyse the picture pixel by pixel. Of course, the picture will never be perfect, but as the testing conditions will always be the same, I should be able to compare sharpness.

So what I measure:

  • the number of pixels in pure white (usually close to 0)
  • the number of pixels in pure black (usually close to 0)
  • the number of pixels in grey
  • the number of pixels for everything else

For each of those measurements, I record the number of unique colors (in LAB format) as well as the total pixel count for each type of color.
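The classification step I use looks roughly like this (a minimal sketch — the function name and the white/black thresholds on the L channel are my own illustrative choices, not fixed values):

```python
import numpy as np

def classify_pixels(L, white_thresh=99.0, black_thresh=1.0):
    """Count pure-white, pure-black, and grey pixels from LAB lightness (L in 0..100)."""
    white = np.count_nonzero(L >= white_thresh)
    black = np.count_nonzero(L <= black_thresh)
    grey = L[(L > black_thresh) & (L < white_thresh)]
    return {
        "white": white,
        "black": black,
        "grey": grey.size,
        # distinct grey levels, quantized to 0.1 L to merge near-identical values
        "distinct_greys": np.unique(np.round(grey, 1)).size,
    }

# synthetic 1-D example: a black area, a blurred edge, and a white area
L = np.concatenate([np.zeros(100), np.linspace(0, 100, 20), np.full(100, 100.0)])
counts = classify_pixels(L)
```

In a real run the `L` array would come from converting the photo to LAB (for example with scikit-image's `color.rgb2lab`) and flattening the lightness channel.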

From what I see with my eyes and what the pixels are saying, I see some common trends but I also see different directions I could take.

Some discoveries:

  • the number of distinct greys seems to give a good indication of sharpness (fewer distinct greys = more sharpness)
  • the difference between light greys (L from LAB > 50) and dark greys (L from LAB < 50) also seems to hint at sharpness/contrast: when the difference between the two is large, sharpness is better
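The second observation above can be turned into a single number — the separation between the mean light grey and the mean dark grey. This is a sketch under my own assumptions (the 50-L split comes from the post; the function name and sample values are illustrative):

```python
import numpy as np

def grey_separation(L_grey):
    """Mean lightness of light greys (L > 50) minus mean lightness of dark greys (L < 50)."""
    light = L_grey[L_grey > 50]
    dark = L_grey[L_grey < 50]
    if light.size == 0 or dark.size == 0:
        return 0.0
    return float(light.mean() - dark.mean())

# a sharp edge leaves greys near the extremes; a soft edge smears them toward mid-grey
sharp = np.array([5.0, 10.0, 90.0, 95.0])
soft = np.array([35.0, 45.0, 55.0, 65.0])
assert grey_separation(sharp) > grey_separation(soft)
```

A larger separation means the transition pixels stay close to the black and white levels, which matches the intuition that sharper images produce fewer mid-greys.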

Do you have ideas about which criteria I should use to measure sharpness?

Thanks

Laurent
  • Do you measure both tangentially and sagittally? Most lenses will differ slightly in each direction. Some lenses differ significantly between the two, especially near the edges and corners of the frame. – Michael C Jun 06 '21 at 13:31
  • Are you measuring lenses in the abstract without a camera? If not, the reality is that the camera will also affect your measurements to some degree. If you are using an auto focus camera and lens combination, you can save yourself a lot of wheel re-inventing by using existing products that already do this. One relatively inexpensive product I've used is Reikan FoCal https://www.focal-iq.com . If you're still determined to do it yourself, take a look at some of their analysis results like https://www.focal-iq.com/lenses/Canon_EF_50mm_f_1.4_USM_Canon_EF_analysis.html#peak – user10216038 Jun 06 '21 at 15:57
  • @MichaelC I have horizontal & vertical bars but as I'm looking at pixels values programmatically I take the easy route by measuring any deviation from the original white and black stripes – Laurent Jun 06 '21 at 18:43
  • @user10216038 Thanks for the suggestion. I should have pointed out that I'm testing smartphone cameras, which means I can't run software on them the way I could for a DSLR. I need to somehow reverse engineer what the camera is measuring and compare with real-life observation until the script produces something OK. – Laurent Jun 06 '21 at 18:46
  • @Laurent It seems you're trying to reinvent the wheel that has been around and has already been optimized for a long, long time. MTF measurements and other methods used to measure acutance do not depend on the camera in question being a DSLR. They analyze the results (i.e. the photos produced) to determine numerical measurements. The entire contrast side of an MTF diagram is the scale that shows the percentage difference in brightness between the white and black lines. If the white line is fully white and the dark line is fully black, that's a 1.00 score (which never actually happens). – Michael C Jun 06 '21 at 22:28
  • If the average of the dark areas is at 25% brightness and the average of the light areas is at 75% brightness, that results in a score of 0.50, because the average difference between the dark and light areas is 50% of total possible difference between pure black and pure white. At the resolution limit of the camera/lens system, both areas are equally bright or dark and thus can not be discriminated from each other. – Michael C Jun 06 '21 at 22:30
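    The scoring Michael C describes in the two comments above can be sketched as follows (the function name and the 8-bit full scale are my assumptions, not part of any standard tool):

    ```python
    import numpy as np

    def contrast_score(light_vals, dark_vals, full_scale=255.0):
        """Average light/dark brightness difference as a fraction of full scale (0..1)."""
        return float((np.mean(light_vals) - np.mean(dark_vals)) / full_scale)

    # dark areas averaging 25% brightness and light areas averaging 75%
    # of an 8-bit scale give a score of 0.50, per the comment above
    score = contrast_score([191.25], [63.75])
    ```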
  • Do note that the accuracy of contrast (acutance - or sharpness) measurements is dependent upon the focus position of the camera/lens system. If the target is misfocused, the system will measure less sharp than when the lens is optimally focused. That's the most significant problem with testing systems such as camera phones, because focus often can't be set manually and then the distance between the camera and target altered in gradations of micrometers (µm) using a test bench to find the point of sharpest focus. – Michael C Jun 06 '21 at 22:36
  • Horizontal and vertical bars do not, strictly speaking, measure tangential and sagittal performance. Sagittal lines are like the spokes of a wheel spreading out from the center of the lens' field of view. Tangential lines are tangent to the circular surface of the wheel that rolls over the ground. – Michael C Jun 06 '21 at 22:43
  • @MichaelC Thanks for your extensive comments. I did not rule out MTF — I even tried to reproduce it, but I'm not there yet. I wanted to find something simpler for my blog, as most people do not understand what MTF means. I know it's a bit like cutting corners, but I'd like to arrive at a single score indicating how sharp a smartphone camera can be. I also know that lenses are sharper in the middle, but again, most normal smartphone users don't care about this. I think I'll create a specific thread about MTF to dig deeper into this. – Laurent Jun 07 '21 at 18:30
  • @MichaelC I had another question: do you know a reliable MTF program that I could use? Imatest is really too expensive for me, and MTF Mapper is giving weird results compared to Imatest. Maybe there is something in between that would fit my needs and budget. Thanks! – Laurent Jun 07 '21 at 18:31
  • @MichaelC I have opened another thread here: https://photo.stackexchange.com/questions/125089/how-to-calculate-mtf-programmatically-with-php-for-example – Laurent Jun 07 '21 at 19:16

0 Answers