I think I am misunderstanding the notion of mutual information of continuous variables. Could anyone help me clear up the following?
Let $X \sim N(0, \sigma^2) $ and $Y \sim N(0, \sigma^2) $ be jointly Gaussian random variables with correlation coefficient $\rho$. Then the mutual information between $X$ and $Y$ is given by (reference: https://en.wikipedia.org/wiki/Mutual_information)
\begin{equation} I(X; Y) = -\frac{1}{2} \log (1-\rho^2). \end{equation}
From this formula, I would expect $I(X; Y) \rightarrow \infty$ as $\rho \rightarrow 1$ (and for $X = Y$ we have $\rho = 1$).
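As a quick sanity check of this first argument, here is a minimal sketch (plain Python with NumPy; the particular values of $\rho$ are just ones I picked) that evaluates the formula as $\rho$ approaches 1:

```python
import numpy as np

# Evaluate I(X;Y) = -0.5 * log(1 - rho^2), in nats, as rho approaches 1.
for rho in [0.9, 0.99, 0.999, 0.999999]:
    mi = -0.5 * np.log(1.0 - rho**2)
    print(f"rho = {rho}: I(X;Y) = {mi:.3f} nats")

# The printed values grow without bound, consistent with I(X;Y) -> infinity as rho -> 1.
```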
I then looked at this another way and set $Y = X$. In this case, I would obtain $ I (X; Y) = H(Y) - H(Y|X) = H(X) $, since $Y = X$ and I assumed $H(Y|X) = H(X|X) = 0$.
For the Gaussian random variable $X$, the differential entropy is (reference: https://en.wikipedia.org/wiki/Differential_entropy) \begin{equation} H(X) = \frac{1}{2} \log ( 2 \pi e \sigma^2), \end{equation} which is finite.
Thus, $ I (X; Y) = H(X) = \frac{1}{2} \log ( 2 \pi e \sigma^2) < \infty$.
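For concreteness, with $\sigma^2 = 1$ this second argument gives a small finite number (same style of check as above; natural logarithms, so the result is in nats):

```python
import numpy as np

# Differential entropy of a Gaussian with variance sigma^2 = 1, in nats:
# H(X) = 0.5 * log(2 * pi * e * sigma^2).
sigma2 = 1.0
h_x = 0.5 * np.log(2.0 * np.pi * np.e * sigma2)
print(f"H(X) = {h_x:.4f} nats")  # roughly 1.42 nats

# So this argument caps I(X;Y) at about 1.42 nats, which seems to contradict
# the divergence of -0.5 * log(1 - rho^2) computed above.
```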
Here is my question: I obtained two contradictory results for $ I (X; Y)$ when $Y = X$. Where is the mistake in my understanding?
Thank you in advance.