Most textbooks on advanced probability theory start from a probability space $(\Omega, \mathcal{F}, P)$, introduce the corresponding measure theory, and then define a random variable as a measurable function. However, the probability space itself is usually left vague. When we define a new random variable, we typically define it directly by its density function or mass function, and at that point we can no longer see the connection between the probability space and the random variable.
My question is: can we always define a random variable explicitly from a probability space? For example, suppose I want to define a Bernoulli random variable. We can start from a probability space with $\Omega=\{H,T\}$, $P(H)=p$, $P(T)=1-p$, and define the measurable function from this space to $(\mathbb{R},\mathcal{B})$ by $X(T)=1$ and $X(H)=0$. Alternatively, take $\Omega=[0,1]$ with $P$ the Lebesgue measure and $\mathcal{F}$ the collection of Lebesgue measurable sets, and define $X(\omega)=1$ if $\omega<p$ and $X(\omega)=0$ otherwise.
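To make the second construction concrete, here is a minimal simulation sketch: we draw $\omega$ uniformly from $[0,1]$ (playing the role of the Lebesgue probability space) and apply the indicator map $X(\omega)=\mathbf{1}\{\omega<p\}$. The function name and sample sizes are my own choices for illustration.

```python
import random

def bernoulli_from_unit_interval(p, n_samples=100_000, seed=0):
    """Simulate the measurable map X(omega) = 1{omega < p} on the
    probability space ([0,1], Lebesgue measurable sets, Lebesgue measure).
    Drawing omega ~ Uniform[0,1] and applying X gives Bernoulli(p) samples."""
    rng = random.Random(seed)
    return [1 if rng.random() < p else 0 for _ in range(n_samples)]

samples = bernoulli_from_unit_interval(0.3)
print(sum(samples) / len(samples))  # empirical mean, close to p = 0.3
```

The point of the sketch is only that the "abstract" measurable-function definition and the usual "mass function" definition produce the same distribution.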
An exponentially distributed random variable is still fine: a uniformly distributed random variable can be defined by the identity function on the probability space $([0,1],\mathcal{F},P)$ above, and we can then apply the inverse of the exponential CDF to it to obtain an exponentially distributed random variable.
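The inverse-CDF step can be written out explicitly, since the exponential CDF $F(x)=1-e^{-\lambda x}$ has the closed-form inverse $F^{-1}(u)=-\ln(1-u)/\lambda$. A minimal sketch (function name and sample sizes are my own):

```python
import math
import random

def exponential_from_uniform(lam, n_samples=100_000, seed=1):
    """Inverse-CDF construction: with U the identity map on
    ([0,1], Lebesgue), X = F^{-1}(U) = -ln(1 - U)/lam has
    CDF F(x) = 1 - exp(-lam * x), i.e. X ~ Exponential(lam)."""
    rng = random.Random(seed)
    return [-math.log(1.0 - rng.random()) / lam for _ in range(n_samples)]

xs = exponential_from_uniform(2.0)
print(sum(xs) / len(xs))  # empirical mean, close to 1/lam = 0.5
```

Note that $1-U$ stays in $(0,1]$ because `random()` returns values in $[0,1)$, so the logarithm is always defined.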
However, if we go further and try to define a Gaussian random variable, can we follow a similar procedure? What I can imagine is invoking the CLT to obtain a Gaussian from $\Omega=\{H,T\}$ or $\Omega=[0,1]$, but this is not very clear to me. Also, if we continue this line of thinking, how would we construct a more complicated stochastic process?
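For what it's worth, both routes I have in mind can at least be simulated. The inverse-CDF procedure still works for the Gaussian even though $\Phi^{-1}$ has no closed form (below I use the numerical implementation in Python's `statistics.NormalDist.inv_cdf`), and the CLT route can be sketched by standardizing a sum of $\pm 1$ coin flips. Function names and sample sizes are my own choices; this is a numerical illustration, not a measure-theoretic construction.

```python
import math
import random
from statistics import NormalDist

def gaussian_via_inverse_cdf(n_samples=20_000, seed=2):
    """Inverse-transform route: if U is the identity map on
    ([0,1], Lebesgue), then Phi^{-1}(U) is standard normal.
    Phi^{-1} has no closed form, so we evaluate it numerically."""
    rng = random.Random(seed)
    nd = NormalDist()  # standard normal, mean 0 and sigma 1
    return [nd.inv_cdf(rng.random()) for _ in range(n_samples)]

def gaussian_via_clt(n_flips=1_000, n_samples=5_000, seed=3):
    """CLT route: each sample is the standardized sum of n_flips
    fair +-1 coin flips (mean 0, variance 1 per flip), which is
    approximately N(0, 1) for large n_flips."""
    rng = random.Random(seed)
    return [
        sum(1 if rng.random() < 0.5 else -1 for _ in range(n_flips))
        / math.sqrt(n_flips)
        for _ in range(n_samples)
    ]
```

Both empirical samples have mean close to $0$ and standard deviation close to $1$; the CLT version lives on a lattice for any finite `n_flips`, which is one reason it only gives a Gaussian in the limit, whereas the inverse-CDF map is exact in distribution.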