4

I guessed the following statement about conditional expectations and tried, unsuccessfully, to prove it:

If $E(Y|X=x)$ is strictly increasing in $x$, then $E(X|Y=y)$ is strictly increasing in $y$.

Any hint? I also tried to find a counterexample, without success.

I thought that this statement must be true since it seemed to me to be the stochastic counterpart of the following statement: the inverse of an increasing function is also increasing. It is also closely related to the fact that the regression coefficient of $Y$ on $X$ has the same sign as the regression coefficient of $X$ on $Y$. But I have not been able to prove it yet. I also checked some parametric cases, such as the joint normal distribution, where the statement is true, and I consulted some standard textbooks and Google Scholar, but without success.
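For reference, here is a quick sketch of why the regression-coefficient fact holds, using the usual least-squares slope formulas:

$$\beta_{Y|X} = \frac{\operatorname{Cov}(X,Y)}{\operatorname{Var}(X)}, \qquad \beta_{X|Y} = \frac{\operatorname{Cov}(X,Y)}{\operatorname{Var}(Y)},$$

so both slopes carry the sign of $\operatorname{Cov}(X,Y)$, since the variances are positive.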

Appreciate any insight.

AnonA
  • 43
  • 2
    Welcome to MSE. Your question is phrased as an isolated problem, without any further information or context. This does not match many users' quality standards, so it may attract downvotes or be closed. To prevent that, please [edit] the question. This will help you recognise and resolve the issues. Concretely: please provide context, and include your work and thoughts on the problem. These changes can help in formulating more appropriate answers. – José Carlos Santos Jun 08 '21 at 05:41
  • 1
    Thanks for the help. My question is an abstract one. I have now tried to explain how I arrived at this "conjecture". The statement may be wrong, but I was not able to find a counterexample either. – AnonA Jun 08 '21 at 05:50
  • The additions are what we are looking for. Hopefully someone in that field will see it. – Alan Jun 08 '21 at 06:14
  • If I understand your example correctly, in that case both conditional expectations are decreasing. – AnonA Jun 08 '21 at 06:33
  • You are right, but another answer is given. – Gono Jun 08 '21 at 06:38

1 Answer

3

The following is a counterexample:

|       | Y = 0 | Y = 1 | Y = 100 |
|-------|-------|-------|---------|
| X = 0 | 0%    | 50%   | 0%      |
| X = 1 | 25%   | 0%    | 25%     |

Since $E(Y|X=0) = 1$ and $E(Y|X=1) = 50$, the hypothesis is satisfied ($E(Y|X=x)$ is strictly increasing in $x$), but since $E(X|Y=0) = 1$ and $E(X|Y=1) = 0$, the conclusion fails ($E(X|Y=y)$ is not increasing in $y$).
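As a sanity check, here is a small script (my own sketch; the variable and function names are just for illustration) that computes both conditional expectations directly from the joint table above:

```python
# Joint distribution from the table: keys are (x, y), values are probabilities.
joint = {
    (0, 0): 0.00, (0, 1): 0.50, (0, 100): 0.00,
    (1, 0): 0.25, (1, 1): 0.00, (1, 100): 0.25,
}

def cond_exp_Y_given_X(x):
    # E(Y | X = x) = sum_y y * P(X=x, Y=y) / P(X=x)
    px = sum(p for (xi, _), p in joint.items() if xi == x)
    return sum(y * p for (xi, y), p in joint.items() if xi == x) / px

def cond_exp_X_given_Y(y):
    # E(X | Y = y) = sum_x x * P(X=x, Y=y) / P(Y=y)
    py = sum(p for (_, yi), p in joint.items() if yi == y)
    return sum(x * p for (x, yi), p in joint.items() if yi == y) / py

print(cond_exp_Y_given_X(0), cond_exp_Y_given_X(1))  # 1.0 50.0  -> increasing in x
print(cond_exp_X_given_Y(0), cond_exp_X_given_Y(1))  # 1.0 0.0   -> not increasing in y
```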

The Zach Man
  • 550
  • 3
  • 13
  • Apologies for the awkward formatting; it seems table support is rather limited. – The Zach Man Jun 08 '21 at 06:36
  • Beautiful and simple! Thanks a lot. I am wondering what condition on the shape of the joint distribution makes the statement correct. Looking at your example, the joint CDF is concave. – AnonA Jun 08 '21 at 06:43
  • Yeah, that sounds right. I suspect that if you added the condition that each P(Y|X=x) and each P(X|Y=y) is a function with a single hump, then the theorem would hold. Of course, that limits you to talking about continuous random variables, for the most part. – The Zach Man Jun 08 '21 at 06:49
  • Super interesting. Any idea for a proof? It sounds to me that, given that single-hump means concavity or convexity, Jensen's inequality might not be useful. – AnonA Jun 08 '21 at 06:53
  • Actually, upon further thought, that's probably not correct either. I don't have a counterexample I can write down, since it's basically just a 3D surface that I'm visualizing, but I think you could come up with a counterexample that's all single-hump functions. – The Zach Man Jun 08 '21 at 07:02