I've recently come across two quotes in quite unrelated sources implying that for centuries, the English and their colonial offspring did not appreciate the role of vegetables in a healthy diet.
The well-known ‘green sickness’ in young women, to which contemporaries gave a sexual meaning, was chlorosis, anaemia produced by a lack of iron in the diet, stemming from upper-class disdain for fresh vegetables. The well-to-do ate too much meat and were frequently constipated . . . In the seventeenth century [the poor] may have escaped the gout and stone which plagued their betters, and may even have had better teeth from eating more vegetables.
From Jacksonian America: Society, Personality, and Politics:
During [the cholera epidemic of 1832], among the quaint notions that flourished was a belief in the therapeutic qualities of beef and mutton and the evil effects of vegetables. Before and after the epidemic many people considered tomatoes taboo.
I was already aware of anti-potato prejudice, but now it seems that disdain for vegetables was even more widespread. Even some vegetarians did not think much of the health benefits of vegetables, according to an editorial in The Index to Good Health (Michigan, 1899):
Nothing short of confusion is produced when a vegetarian declares to his friends that he does not eat vegetables . . . Generally he does [eat vegetables] when he first becomes a vegetarian, but very often as he progresses in the knowledge of the value of foods, he comes to a point where he thinks it economy of labor to live on fruits and grains and nuts, discarding vegetables as not worth the trouble of consumption and digestion.
When did members of the English-speaking world begin eating vegetables by choice instead of out of poverty? When did it become conventional wisdom that vegetables were, in fact, healthy? Was this change attributable to any kind of research?
