Recently upgraded to a newer LCD monitor that covers about 92% of the DCI-P3 gamut. I noticed that some games and photos look really oversaturated, almost garish. After some googling, I found a post suggesting I use the monitor's sRGB mode, which does make games look more "normal", if a little less colorful.
As I understand it, most content is designed on (and for) sRGB displays. When that content is rendered on a wider-gamut display like DCI-P3 without any conversion, the same code values get mapped onto more saturated primaries, so every color gets "stretched" outward and the image looks different from what the creator intended.
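Here's a quick Python sketch of my mental model, to make it concrete. The 3x3 matrices are the standard published ones for sRGB and Display P3 (both D65); the "naive" path is just my guess at what happens when there's no sRGB emulation and the sRGB values drive the P3 primaries directly:

```python
import numpy as np

# Linear sRGB -> CIE XYZ (standard sRGB matrix, D65 white point)
SRGB_TO_XYZ = np.array([
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
])
# CIE XYZ -> linear Display P3 (inverse of the published P3 -> XYZ matrix)
XYZ_TO_P3 = np.array([
    [ 2.4935, -0.9314, -0.4027],
    [-0.8295,  1.7627,  0.0236],
    [ 0.0358, -0.0762,  0.9569],
])

def srgb_eotf(v):
    """Decode sRGB-encoded code values to linear light."""
    v = np.asarray(v, dtype=float)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

# A saturated sRGB red, as 8-bit code values
srgb = np.array([230, 60, 50]) / 255.0
linear = srgb_eotf(srgb)

# Correct handling (what I assume sRGB mode approximates):
# map sRGB light into P3 coordinates. The red channel drops,
# because P3's red primary is "redder", so less of it is needed
# to reproduce the same color.
p3_correct = XYZ_TO_P3 @ (SRGB_TO_XYZ @ linear)

# Naive handling: feed the sRGB values straight to the P3 primaries.
# Same numbers, more saturated primaries -> oversaturated image.
p3_naive = linear

print("converted P3 drive levels:", np.round(p3_correct, 3))
print("naive P3 drive levels:    ", np.round(p3_naive, 3))
```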
This got me thinking: what does this "sRGB mode" actually do to make things look "correct"? And what aspect of the panel actually determines its gamut? Is it the quality of the liquid crystals, the backlight, the chipset driving the pixels?
I ask because it seems strange that this new monitor, which is technically better, has to effectively nerf itself to make things look normal. That seems counterproductive to me, and it's sent me down a rabbit hole of gamut, color range, calibration, etc. Any reading material on the subject would be appreciated.