I understand that shot noise depends only on the final photon count, and is thus invariant under area changes (shot noise = sqrt(total_photons)). Is the same true of final SNR, if we consider shot noise our only source of noise (which I believe is the theoretical lower limit on noise)?

I am of the opinion that final SNR is determined on a per-pixel basis (i.e., a greater photon count per pixel produces a better image), and is thus dependent upon area, assuming equal pixel densities. It does not make sense to me that two sensors of equal pixel density that both captured the same number of photons will have the same SNR if one sensor is 100x the size, its signal diluted 100x over the sensor area, while the other's is 100x more concentrated. My friend disagrees. I argue they will have the same noise, but the smaller sensor, with its higher light intensity per unit area, will have the greater SNR. In other words, I argue SNR is (photons/pixel) / sqrt(photons/pixel), while he argues it is (total_photons) / sqrt(total_photons).
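To make the disagreement concrete, here is a minimal Poisson simulation of the two claims (the photon counts are made-up illustrative values; NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(0)

def per_pixel_snr(photons_per_pixel, n_pixels):
    """Simulate shot noise (Poisson) and measure mean/std across pixels."""
    samples = rng.poisson(photons_per_pixel, size=n_pixels)
    return samples.mean() / samples.std()

# Same total photons (1e8) on both sensors, equal pixel density,
# one sensor 100x the area -> 100x fewer photons per pixel.
concentrated = per_pixel_snr(photons_per_pixel=10_000, n_pixels=10_000)
diluted      = per_pixel_snr(photons_per_pixel=100,    n_pixels=1_000_000)

print(f"small sensor SNR ~ {concentrated:.0f}  (sqrt(10000) = 100)")
print(f"large sensor SNR ~ {diluted:.0f}  (sqrt(100) = 10)")
```

If my per-pixel view is right, the concentrated sensor should measure roughly 10x the SNR despite identical total photon counts.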

Here is the exact scenario we're discussing: https://photographylife.com/wp-content/uploads/2014/11/Full-Frame-vs-APS-C-vs-M43-vs-CX.jpg

Two shooters with sensors of the same pixel density, but one 25% of the size (FF and M43 in the image above). Their focal lengths are identical (say, 12mm and 12mm), their apertures identical, and they are standing in the same spot, etc. The shooter with the smaller sensor shoots only the area enclosed by the 'M43' rectangle.

If both shooters gathered the same number of photons, would the SNR be the same or different? Similarly, if the photons gathered per pixel are the same, will the resultant SNR be the same when comparing only the area enclosed by the rectangle of the larger image (i.e., a 25% crop) to the image taken from the smaller sensor (which captured only the area of the rectangle)?

Edit: This is different from the suggested question. My question asks whether SNR is calculated on a per-pixel basis or simply from total photons.

    On a per pixel basis, there's no way of discriminating between signal and noise. There's only one number generated for each pixel. – Michael C Jan 10 '22 at 18:41
  • What photographic problem are you trying to solve? That is, what image are you having trouble producing that a correct answer to this question would allow you to make? – Michael C Jan 10 '22 at 18:43
  • It's purely a physical or conceptual problem. If one number is generated per pixel, wouldn't greater photons/pixel still help produce an image with greater SNR than one with fewer photons/pixel (but the same total photons)? – PoissonedPtard Jan 10 '22 at 19:24
  • Are the resulting images to be displayed in proportion to the size of each sensor? Or at the same display size? – Michael C Jan 10 '22 at 19:30
  • Please explain what you mean by "both shooters gathered the same number of photons." Is the shooter with the smaller sensor exposing for four times as long as the one with the larger sensor? – Michael C Jan 10 '22 at 19:32
  • There's no example image at your link. – Michael C Jan 10 '22 at 19:34
  • Proportionate size, I understand that's a crucial detail. So the 25% sensor is displayed at 25% size. My understanding is that, theoretically, the cropped portion of the upper right quadrant taken by the larger sensor camera should be identical to the 25% sensor image, if photons gathered per unit area are equal. – PoissonedPtard Jan 10 '22 at 19:36
  • "Please explain what you mean by "both shooters gathered the same number of photons? Is the shooter with the smaller senor exposing for four times as long as the one with the larger sensor?" - Yes, or shooting in different conditions. My friend seems to think SNR will be identical here, since total photon count is identical. – PoissonedPtard Jan 10 '22 at 19:36
  • Here's the thing. If light per unit area is the same for both, the larger sensor will collect more photons than the smaller sensor. The only way to "spread" the same amount of light from the same scene over a larger sensor is to increase focal length without increasing the entrance pupil, which necessarily increases the f-number and decreases the amount of light per unit area. – Michael C Jan 10 '22 at 19:37
  • If the shooter with the smaller sensor is exposing for four times as long, are any sensels reaching/exceeding full well capacity? Or are none of the sensels on the larger sensor exceeding one-quarter full well capacity? – Michael C Jan 10 '22 at 19:39
  • Yes, I understand that if light per unit area is equal, the larger-area sensor collects more light. What I'm wondering is, does a 25% sensor taking a 25% image (i.e., shooting only the upper right quadrant) perform as well as the larger sensor in only that upper right quadrant (as in, if you cropped the upper right quadrant of the larger image and compared)? Will SNR be identical when comparing the 25% sensor image to a 25% crop from a sensor 4 times larger? My understanding is yes; my friend's understanding is no - total photons collected in the larger image were greater, so SNR is greater. – PoissonedPtard Jan 10 '22 at 19:40
  • To keep things simple, we will say no they are not reaching full capacity. – PoissonedPtard Jan 10 '22 at 19:40
  • If you crop the image from the larger sensor to the same dimension as the smaller one and they both have the same number of sensels per unit area, there will be no difference (assuming both sensors use the same materials, technology, fabrication process, etc.). – Michael C Jan 10 '22 at 19:43
  • "i.e. shooting only the upper right quadrant" The upper right quadrant of what? Your link looks like the lens cap was left on, so there will be no signal at all. – Michael C Jan 10 '22 at 19:44
  • I say his understanding is wrong, because photons in each area should be equal, so it's on a per unit area basis. – PoissonedPtard Jan 10 '22 at 19:44
  • What's on a per unit area? – Michael C Jan 10 '22 at 19:45
  • "If you crop the image from the larger sensor to the same dimension as the smaller one and they both have the same number of sensels per unit area, there will be no difference (assuming both sensors use the same materials, technology, fabrication process, etc.)" - this was my understanding, thank you. I know it's a rather contrived scenario, but appreciate the clarity. – PoissonedPtard Jan 10 '22 at 19:45
  • "i.e. shooting only the upper right quadrant" The upper right quadrant of what? Your link looks like the lens cap was left on, so there will be no signal at all." - the upper right quadrant of whatever image the sensor 4x the size took. Imgur was not working for some reason, but I had inscribed a rectangle enclosing the upper right quadrant of a landscape image for demonstration purposes. – PoissonedPtard Jan 10 '22 at 19:46
  • As your question stands, without a usable image at the link, it's not clear what you're asking. If they're both shooting a totally dark scene, there will be no signal nor any shot noise. – Michael C Jan 10 '22 at 19:47
  • Unless the scene is perfectly uniform, the light won't be the same, though, so the photon count won't be the same between the larger and smaller image. They're taking a photo of two different scenes (unless there is no light in the scene to capture). – Michael C Jan 10 '22 at 19:49
  • Are you talking about taking an image of a scene, or taking an image of a two-dimensional image of a scene already produced by another camera and display technology? It's still not at all clear what you are asking. – Michael C Jan 10 '22 at 19:50
  • Regarding the image, perhaps we can use this: https://photographylife.com/wp-content/uploads/2014/11/Full-Frame-vs-APS-C-vs-M43-vs-CX.jpg, where the smaller sensor is M43, the larger is FF. And yes, we're assuming a perfectly uniform hypothetical scene. The entire scenario is purely theoretical. – PoissonedPtard Jan 10 '22 at 19:51
  • In case it helps, the scenario arose when I asked why smaller sensors are worse in low light, asking whether they don't instead simply take a smaller image, all else equal (not equivalent). I was under the impression that a 25% crop of a FF image should look the same as an M43 image taken with the same focal length, etc. My pal insisted that total photons alone determine SNR, so a photo with only 25% of the photons will have a lower SNR, even if it's 25% of the size (proportionately lower resolution and viewing size). – PoissonedPtard Jan 10 '22 at 19:53

2 Answers


Photon shot noise (PSN) is a function of photons per unit time per unit area. Assuming equivalent fill factors/efficiency, pixel size is irrelevant.

I think the part you have confused is that, for any given equivalent exposure (aperture/shutter speed), the smaller sensor will not have greater luminous flux density, nor will the larger sensor have less.

In the example you show, the extra light the larger sensor receives is simply discarded/cropped by the smaller sensor, but the photons per unit area remaining are equal. In the other scenario, using different focal length lenses with different sized entrance pupils (same f-number) in order to record the same composition, the larger sensor receives the same flux density (photons per area is again equal), but the image covers a larger area; i.e., the larger sensor receives more light in total, for a higher/better resultant SNR. That is, the larger sensor requires a longer lens with a larger entrance pupil, which transmits more light, in order to maintain the same flux density (exposure) over the larger sensor area.
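To put rough numbers on the crop scenario (a sketch; the photon density is an assumed placeholder and the sensor areas are approximate):

```python
# Hypothetical numbers (all values assumed; areas approximate):
photon_density = 10_000        # photons per mm^2 reaching both sensors
area_ff  = 36.0 * 24.0         # 864 mm^2, full frame
area_m43 = 17.3 * 13.0         # ~225 mm^2, Four Thirds

total_ff  = photon_density * area_ff     # more total light on FF...
total_m43 = photon_density * area_m43    # ...but identical light per area

# Cropping the FF capture to the M43 area keeps the same photon density,
# so the cropped region carries exactly the same signal and shot noise.
total_ff_cropped = photon_density * area_m43
print(total_ff / total_m43)              # ~3.84x total light on FF
print(total_ff_cropped == total_m43)     # True
```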

It only really matters on a per-pixel basis if there is a significant difference between fill efficiencies (unusual these days unless comparing extremes), or if you are measuring it on a per-pixel basis for some reason (i.e., not as an image).

Steven Kersting
  • @PoissonedPtard, if the FF sensor has 1/4 of the exposure, the SNR cannot be equal. An easy way to see that is to evaluate a night image with a large DR; you will find shadow areas are noisier than areas of the image that are lit more brightly... the SNR varies across an image with the #photons/area/time. The only variable here is the size of the area being evaluated. In the crop/crop-factor scenario, the SNR per area doesn't change; what does change is the total area and the resulting total SNR remaining. So yes, the quality would be equivalent. – Steven Kersting Jan 10 '22 at 23:46

Yes, if you fix total photon count, then you also fix the signal-to-noise ratio, which will be invariant under area changes. However, in many cases you don't fix total photon count; you fix photon density, photons per unit area.

An example: you have a 400mm f/5.6 lens and are shooting on full frame. You observe two features of the lens:

  1. It collects enough light during daylight hours, so you can use a fast exposure time, don't need image stabilization, and never have excessive noise
  2. The lens is just too short for your uses

...therefore, because of (2), you are considering switching to a 1.6x crop camera. The lens would then effectively be a 640mm lens. But can you rely on feature (1) with a crop camera?

The answer is: no, you can't. Total photon count is lower on the smaller sensor, because the sensor is smaller and can't see all the light the full frame sensor would see. Thus, the full frame sensor collects 1.6^2 = 2.56 times as much light as the crop sensor. Therefore, on the crop camera the lens behaves not like an effective 640mm f/5.6 lens but rather like a 640mm f/8.96 lens, where 8.96 = 1.6 * 5.6. (And before someone complains about equal ISOs and equal exposure: on full frame you can use 2.56x higher ISO and still get the same noise level, because full frame cameras collect more light.)
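Spelled out as arithmetic (a sketch using only the numbers already in this example):

```python
crop = 1.6
focal, f_number = 400, 5.6    # the full-frame lens from the example

# The entrance pupil is a physical property of the lens; it doesn't
# change when you mount the lens on a crop body.
pupil_mm = focal / f_number               # ~71.4 mm

light_ratio = crop ** 2                   # FF sees 2.56x as much light

equiv_focal = focal * crop                # 640 mm field of view
equiv_f_number = f_number * crop          # ~8.96 in total-light terms

print(round(pupil_mm, 1), light_ratio, equiv_focal, round(equiv_f_number, 2))
```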

Another example: you are choosing between a 20 megapixel and an 80 megapixel camera with equal sensor sizes, and you shoot in low light. Which camera has lower noise?

With no post-processing, the pixels of the 20 megapixel camera have larger areas. Thus, each pixel collects more light, since photon density is constant. Therefore, the 20 megapixel camera has lower per-pixel noise.

However, in this case the difference isn't so clear-cut, because the total amount of collected light across all pixels is constant. With suitable post-processing, using a noise removal algorithm that also considers neighboring pixels, it may be possible to combine information from neighboring pixels in a way that makes the 80 megapixel camera look more like the 20 megapixel camera. With such an algorithm you could use the 80 megapixel camera in low light and get equal results, while still enjoying the real benefits of the 80 megapixels whenever you shoot in good light.
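A small simulation of that idea: summing 2x2 blocks of a noisy high-resolution capture (software binning) doubles the measured SNR, mimicking a sensor with 4x the pixel area. The per-pixel photon count is an assumed value; NumPy assumed:

```python
import numpy as np

rng = np.random.default_rng(1)

# High-resolution sensor in low light: 100 photons per pixel (assumed).
hi_res = rng.poisson(100, size=(2000, 2000)).astype(float)

snr_per_pixel = hi_res.mean() / hi_res.std()      # ~ sqrt(100) = 10

# 2x2 binning sums four neighbors, mimicking a quarter-resolution
# sensor whose pixels each have 4x the area.
binned = hi_res.reshape(1000, 2, 1000, 2).sum(axis=(1, 3))

snr_binned = binned.mean() / binned.std()         # ~ sqrt(400) = 20
print(round(snr_per_pixel, 1), round(snr_binned, 1))
```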

As for your second question, it's exactly the same as my second example, which is a borderline case. In raw images, the 100x diluted sensor has more noise per pixel. However, suitable algorithms can make the 100x bigger sensor work like the small sensor, recovering the information from the noise; but if you use such algorithms, you can't enjoy the full resolution of the big sensor with its 100x pixel count.

The raw SNR in a non-processed image is indeed:

photons/pixel / sqrt(photons/pixel)

but by reducing the effective pixel count in the final post-processed image, you can make the SNR of the image behave more like:

total_photons / sqrt(total_photons)
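(Both expressions simplify, since N / sqrt(N) = sqrt(N): per-pixel SNR is sqrt(photons/pixel) and whole-image SNR is sqrt(total_photons). The ratio between them is sqrt(pixel count), which is exactly the factor such post-processing can recover.)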

As for 12mm f/something on MFT and 12mm f/something on full frame, we can observe:

  • Effective focal length is different, so the field of view will be different; the images won't be the same (the different aspect ratios alone already ensure that)
  • Physical aperture size (12mm/something) is the same so both collect the same amount of light at the lens level
  • MFT sensor is smaller so less of that light ends up being at the sensor
  • Both MFT and FF shooters need to use the same ISO to get the same exposure
  • ...but FF sensors suffer less from high ISO than small sensors (unless you are comparing a sensor from 25 years ago to today's state-of-the-art sensor; let's make this fair and compare two sensors of similar technology levels), so the FF shooter will have less noise if lack of light creates noise in the final image

If you made this 12mm f/something vs 24mm f/(2*something), then the images would be equivalent. The field of view would be the same. The FF sensor would also collect the same total amount of light, so noise would be the same. With 12mm f/something and 24mm f/something, the field of view would still be the same, but the FF sensor would collect more light. There would also be depth of field and background blur differences.
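The equivalence comes down to entrance pupil diameter (focal length divided by f-number). A sketch with f/2 standing in for 'something' (an assumed placeholder value):

```python
# Entrance pupil diameter = focal length / f-number (values assumed).
mft_focal, mft_fnum = 12, 2.0           # hypothetical 12mm f/2 on MFT
ff_focal = 24                            # same field of view at 2x crop

# Equivalent FF lens: double the focal length AND double the f-number.
ff_fnum_equiv = 2 * mft_fnum             # f/4

pupil_mft = mft_focal / mft_fnum         # 6 mm
pupil_ff  = ff_focal / ff_fnum_equiv     # 6 mm -> same total light

# Same field of view at the SAME f-number instead:
pupil_ff_same_fnum = ff_focal / mft_fnum # 12 mm -> 4x the light (area)

print(pupil_mft, pupil_ff, pupil_ff_same_fnum)
```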

Note that in the MFT vs FF example, light was NOT diluted. Light per unit area was the same. However, in your second example, light was diluted. It's a different example, then. In the MFT vs FF example, you don't need to have different pixel counts -- the FF sensor can very well have the same pixel count as the MFT sensor and still have lower noise.

So the MFT vs FF case is clear: FF is better. However, with a sensor 100x the size and 100x the pixel count, and with diluted light (meaning you use a lens with a higher f-number and a larger image circle to get the diluted light), the big sensor is worse ... until you post-process the images to take information from neighboring pixels into account, in which case the 100x sensor would no longer effectively use its 100x pixels as individual pixels, and would be equally good.

juhist