Within the classical signal detection framework, the effect of bias is studied by keeping the noise-alone and signal-plus-noise stimuli identical across conditions, which should keep d' constant, while manipulating the instructions to the subjects or the payoff/cost matrix to change the bias. Under these conditions, d' is expected to remain constant because it is defined by the noise-alone and signal-plus-noise distributions. That said, changing the instructions or the payoff/cost matrix can change the underlying distributions even when the stimuli are fixed, because forcing someone to adopt an unnatural criterion can add extra noise, which would decrease d'. For fixed distributions, changing the criterion can only change d' (as estimated by $Z_D - Z_F$, rather than as defined by $(\mu_2 - \mu_1)/\sigma$) if the distributions are not Gaussian with equal variances.
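To make that last point concrete, here is a minimal sketch (the distribution parameters are assumed for illustration, not taken from any particular experiment) showing that $Z_D - Z_F$ does not change with the criterion when the two distributions are equal-variance Gaussians, but does drift with the criterion once the variances differ:

```python
# A minimal sketch (distribution parameters are assumed, not from the question).
# With equal-variance Gaussians, Z_D - Z_F is the same at every criterion;
# with unequal variances it drifts as the criterion moves.
from scipy.stats import norm

def zD_minus_zF(criterion, mu_n=0.0, sd_n=1.0, mu_s=1.5, sd_s=1.0):
    hit_rate = norm.sf(criterion, loc=mu_s, scale=sd_s)  # P("yes" | signal + noise)
    fa_rate = norm.sf(criterion, loc=mu_n, scale=sd_n)   # P("yes" | noise alone)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

for c in (0.25, 0.75, 1.25):
    equal = zD_minus_zF(c)               # equal variances: always 1.5
    unequal = zD_minus_zF(c, sd_s=1.5)   # unequal variances: changes with the criterion
    print(f"criterion {c:.2f}: equal-variance {equal:.3f}, unequal-variance {unequal:.3f}")
```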
From the attached graphic, it looks like the experiments manipulated the stimuli/paradigm by changing the inter-stimulus interval (ISI). This seems to have changed the underlying noise-alone and signal-plus-noise distributions, causing d' to increase as the bias decreases. Potentially, the longer ISI gives the subject more processing time, leading to an increased d'. Depending on how bias is calculated (cf. Two different values for criterion in signal detection theory?), the criterion could be staying constant relative to the noise-alone stimuli while the bias measure changes.
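As a rough illustration of that last point, the following sketch uses hypothetical hit and false-alarm rates (not read off the attached graphic) to compute two common bias measures: the criterion relative to the noise-alone distribution, $c_{\text{noise}} = -Z_F$, and the centered criterion $c = -(Z_D + Z_F)/2$ with $\ln\beta = c \cdot d'$. If the false-alarm rate stays fixed while d' grows across ISI conditions, $c_{\text{noise}}$ stays constant while $c$ and $\beta$ shift:

```python
# A sketch with hypothetical hit/false-alarm rates (not read off the graphic),
# comparing the criterion relative to the noise-alone distribution, c_noise = -Z_F,
# with the centered criterion c = -(Z_D + Z_F)/2 and ln(beta) = c * d'
# (equal-variance Gaussian assumption).
from scipy.stats import norm

def sdt_measures(hit_rate, fa_rate):
    zD, zF = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = zD - zF
    c_noise = -zF                    # criterion measured from the noise-alone mean
    c_centered = -(zD + zF) / 2.0    # criterion measured from the midpoint of the two distributions
    log_beta = c_centered * d_prime  # likelihood-ratio bias
    return d_prime, c_noise, c_centered, log_beta

# Fixed false-alarm rate, increasing hit rate (as if a longer ISI raised d'):
for H in (0.70, 0.80, 0.90):
    d, cn, cc, lb = sdt_measures(H, 0.20)
    print(f"H={H:.2f} F=0.20  d'={d:.2f}  c_noise={cn:.2f}  c={cc:+.2f}  ln(beta)={lb:+.2f}")
```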