Relation of Entropy and SNR: Based on this question and answer, another question struck me, and I am curious whether somebody can shed some light on the following situation: the receiver observes $y = \text{desired signal} + \text{noise}$ and forms the $\text{error} = y - \hat{y}$, where $\hat{y}$ is simulated at the receiver using guessed parameters and knowledge of the kind of model used at the transmitter.
For increasing SNR (i.e., attenuated noise), would the entropy of the error decrease? In the link provided, the entropy of the transmitted signal decreases with increasing SNR, since the uncertainty decreases as the noise is attenuated. But what about the entropy of the error? Should the error entropy increase with increasing SNR? I am confused about this.
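For context on what I mean by entropy tracking the noise level: if, for example, the samples are Gaussian with variance $\sigma^2$, their differential entropy is

$$h = \frac{1}{2}\log\!\left(2\pi e\,\sigma^2\right),$$

so a smaller variance (less noise) gives a smaller entropy. This is the intuition behind my question about the error term.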
EDIT: This is how I calculated the entropy of the error. Let the system model be AR(2): $y(t) = a_1 y(t-1) + b_1 y(t-2) + \text{noise}$. At the receiver end I form $z(t) = y(t) - \big(a_2 y(t-1) + b_2 y(t-2)\big)$, where $(a_2, b_2)$ are close guesses to $(a_1, b_1)$. I calculated two entropies: $H_1$ for $y(t)$ and $H_2$ for $z(t)$.
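For concreteness, a minimal sketch of this setup in Python; the coefficient values, series length, and noise level below are only illustrative, not necessarily the ones I actually used:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ar2(a1, b1, n, noise_std):
    """Simulate y(t) = a1*y(t-1) + b1*y(t-2) + noise."""
    y = np.zeros(n)
    noise = rng.normal(0.0, noise_std, n)
    for t in range(2, n):
        y[t] = a1 * y[t - 1] + b1 * y[t - 2] + noise[t]
    return y

def residual(y, a2, b2):
    """z(t) = y(t) - (a2*y(t-1) + b2*y(t-2)) using guessed coefficients."""
    return y[2:] - (a2 * y[1:-1] + b2 * y[:-2])

# Illustrative values: true coefficients (a1, b1) and close guesses (a2, b2)
y = simulate_ar2(a1=0.5, b1=-0.3, n=5000, noise_std=0.1)
z = residual(y, a2=0.48, b2=-0.29)
```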
$z(t)$ will be close to zero when $(a_2, b_2)$ equal $(a_1, b_1)$, in which case the entropy of $z(t)$ is the minimum among all the different $z(t)$ for a particular SNR value. So, for a particular noise level, I have 10 pairs of $(a_2, b_2)$, giving 10 values of $H_2(z(t))$, and I choose the minimum $H_2$ for that SNR level. However, across noise levels, as I increase the SNR at the transmitter end, this minimum entropy $H_2(z(t))$ increases with increasing SNR. I found that $H_1$ decreases with increasing SNR but the reverse happens for $H_2$. Is this trend correct?
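A sketch of the comparison procedure, reusing `simulate_ar2`, `residual`, and `rng` from above; the histogram-based entropy estimator and the particular $(a_2, b_2)$ grid are assumptions for illustration, not my exact implementation:

```python
def hist_entropy(x, bins=50):
    """Plug-in differential entropy estimate (nats) from a normalized histogram."""
    p, edges = np.histogram(x, bins=bins, density=True)
    widths = np.diff(edges)
    mask = p > 0
    return -np.sum(p[mask] * np.log(p[mask]) * widths[mask])

a1, b1 = 0.5, -0.3
# 10 candidate (a2, b2) pairs close to the true (a1, b1)
guesses = [(a1 + da, b1 + db) for da, db in rng.uniform(-0.05, 0.05, size=(10, 2))]

# Decreasing noise_std corresponds to increasing SNR at the transmitter
for noise_std in [1.0, 0.5, 0.1, 0.01]:
    y = simulate_ar2(a1, b1, n=5000, noise_std=noise_std)
    H1 = hist_entropy(y)
    H2 = min(hist_entropy(residual(y, a2, b2)) for a2, b2 in guesses)
    print(f"noise_std={noise_std:5.2f}  H1={H1:6.3f}  min H2={H2:6.3f}")
```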