Although, as user452 pointed out, the MAD is a statistic less sensitive to outliers than the standard deviation, I think N. Taleb's perspective is different. In fact, it is quite the opposite.
First, within the domain of robust statistics, outliers have traditionally been treated as negative artifacts, and robust statistics, such as the median, are designed to avoid their distorting effect. N. Taleb, instead, considers outliers as carrying information about the tails of the probability distribution (e.g. the crash of 1929, Black Monday of 1987, the Dot-com bubble, etc.). So N. Taleb is not advocating the MAD in order to reduce the effect of outliers, but precisely in order to keep them as part of the analysis.
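As a quick numerical illustration of that sensitivity difference (sample size, seed, and the magnitude of the "tail event" are all arbitrary choices of mine; note Taleb's MAD is the *mean* absolute deviation, which is what is computed here):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=1000)  # a Gaussian sample

def mad(a):
    # mean absolute deviation around the sample mean
    return np.mean(np.abs(a - np.mean(a)))

# append one extreme observation, standing in for a "tail event"
x_out = np.append(x, 50.0)

print(np.std(x), np.std(x_out))  # the SD roughly doubles
print(mad(x), mad(x_out))        # the MAD barely moves
```

The single extreme point dominates the SD (because deviations are squared) while entering the MAD only linearly, so the MAD registers the event without being swamped by it.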
Second, the existence of the standard deviation requires the tail of the distribution (the survival function $P(X>x)$) to decay at least as $O(x^{-(2+\epsilon)})$, and the convergence of the corresponding sample estimator requires it to decay at least as $O(x^{-(4+\epsilon)})$ (i.e. the kurtosis of the distribution must exist). In fact, many known natural and man-made processes generate distributions with low tail exponents, also known as fat-tailed or heavy-tailed distributions. Therefore, the use of the standard deviation is technically dubious in many cases (and the same applies to correlation, PCA, etc.).
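This non-convergence is easy to see by simulation. A minimal sketch, using a Pareto distribution with tail index $\alpha = 1.5$ (so the mean exists but the variance does not; the sample size and seed are arbitrary): the running sample SD never settles, while the running MAD does.

```python
import numpy as np

rng = np.random.default_rng(1)
alpha = 1.5  # tail index: mean finite, variance infinite

# one long sample from a Pareto (type I) distribution with minimum 1
x = rng.pareto(alpha, size=10**6) + 1.0
n = np.arange(1, x.size + 1)

running_mean = np.cumsum(x) / n
running_sd = np.sqrt(np.cumsum(x**2) / n - running_mean**2)
# MAD around the full-sample mean, kept fixed for simplicity
running_mad = np.cumsum(np.abs(x - running_mean[-1])) / n

checkpoints = [10**3 - 1, 10**4 - 1, 10**5 - 1, 10**6 - 1]
print(running_sd[checkpoints])   # keeps drifting upward, jumps at tail events
print(running_mad[checkpoints])  # stabilizes
```

With $\alpha = 1.5$ the theoretical variance is infinite, so the sample SD is estimating a quantity that does not exist; the mean absolute deviation has a finite population value and the law of large numbers applies to it.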
Third, from an epistemological perspective, when the underlying process generating the data is not well known, it is incorrect to assume that the process is not fat-tailed (i.e. absence of evidence is not the same as evidence of absence).
Finally, in a decision-making scenario involving unbounded risks (e.g. financial ruin, wars, etc.), when the underlying process generating the data is not well known, we should take actions that avoid these unbounded risks altogether. It does not make sense merely to reduce the probability of occurrence, because any repeated exposure to a small probability will almost certainly end in catastrophe: the probability of at least one catastrophe after $N$ exposures is $p_N = 1-(1-p)^N$, which approaches 1 exponentially fast with the number of exposures.
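To make the repeated-exposure point concrete (the per-trial probability $p = 0.001$ is an arbitrary example value):

```python
# probability of at least one catastrophe in N independent exposures,
# each with per-trial catastrophe probability p
def p_catastrophe(p, N):
    return 1.0 - (1.0 - p) ** N

for N in (10, 100, 1000, 10000):
    print(N, p_catastrophe(0.001, N))
```

Even with a per-trial probability of one in a thousand, by $N = 10{,}000$ exposures the catastrophe is essentially certain, which is why reducing $p$ without bounding the exposure does not remove the risk.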
Given the above, the MAD is a better estimator than the standard deviation. However, for decisions involving unbounded risk under not-well-known processes, N. Taleb argues that even such estimates miss the point; instead, he advocates the precautionary principle.
Although still in draft form, a more mathematically oriented exposition of his ideas can be found in *Statistical Consequences of Fat Tails: Real World Preasymptotics, Epistemology, and Applications*.