During the first several decades of the twentieth century, these beliefs were introduced to the United States as Germans settled around the country, some of whom opened the first health food stores.
In turn, young Americans adopted the beliefs and practices of the new immigrants. One group, called the "Nature Boys", took to the California desert, grew organic food, and espoused a back-to-nature lifestyle.