Why the Word Diet is Often Seen in a Negative Way

As early as the 1920s, the American Academy of Nutrition began referring to the changes you made in the foods you ate as your “diet.” That is, any change you made to improve your health was considered a good diet. Today, in more popular usage, we can often be […]