In October of 1953, the farmers of the Western hemisphere were busy toiling over harvested grain, either milling it into flour or prepping it for brewing. Meanwhile, a group of historians and anthropologists gathered to debate which of these two common grain uses humans mastered first—bread or beer?
The original question posed by Professor J. D. Sauer, of the University of Wisconsin’s Botany Department, was even more provocative. He wanted to know whether “thirst, rather than hunger, may have been the stimulus [for] grain agriculture.” In more scientific terms, the participants were asking: “Could the discovery that a mash of fermented grain yielded a palatable and nutritious beverage have acted as a greater stimulant toward the experimental selection and breeding of the cereals than the discovery of flour and bread making?”
Interestingly, the available archaeological evidence didn’t produce a definitive answer. The cereals and the tools used for planting and reaping, as well as the milling stones and various receptacles, could have been used to make either bread or beer. Nonetheless, the symposium, which ran under the title Did Man Once Live by Beer Alone?, featured plenty of discussion.
The proponents of the beer-before-bread idea noted that the earliest grains might have actually been more suitable for brewing than for baking. For example, some wild wheat and barley varieties had husks or chaff stuck to the grains. Without additional processing, such husk-enclosed grains were useless for making bread—but fit for brewing. Brewing fermented drinks may also have been easier than baking. Making bread is a fairly complex operation that necessitates milling grains and making dough, which in the case of leavened bread requires yeast. It also requires fire and ovens, or heated stones at the least.
On the other hand, as some attendees pointed out, brewing needs only a simple receptacle in which grain can ferment, a process that can be started in three different ways. Sprouting grain produces its own enzyme, diastase, which converts the grain’s starch into fermentable sugars. Various types of yeast naturally present in the environment can then ferment those sugars. Lastly, human saliva contains a similar starch-converting enzyme, which could have started a brewing process in partially chewed grain. South American tribes make a corn beer called chicha, as well as other fermented beverages, by chewing the seeds, roots, or flour to initiate the brewing process.
But those who believed in the “bread first, beer later” concept posed some important questions. If the ancient cereals weren’t used for food, what did their gatherers or growers actually eat? “Man cannot live on beer alone, and not too satisfactorily on beer and meat,” noted botanist and agronomist Paul Christoph Mangelsdorf. “And the addition of a few legumes, the wild peas and lentils of the Near East, would not have improved the situation appreciably. Additional carbohydrates were needed to balance the diet… Did these Neolithic farmers forego the extraordinary food values of the cereals in favor of alcohol, for which they had no physiological need?” He finished his statement with an even more provocative inquiry. “Are we to believe that the foundations of Western Civilization were laid by an ill-fed people living in a perpetual state of partial intoxication?” Another attendee said that proposing the idea of grain domestication for brewing was not unlike suggesting that cattle were “domesticated for making intoxicating beverages from the milk.”
In the end, the two camps met halfway. They agreed that our ancestors probably used cereal for food, but that food might have been in liquid rather than baked form. It’s likely that the earliest cereal dishes were prepared as gruel, a thinner, more liquid version of porridge that was long a dietary staple of Western peasants. But gruel could easily ferment. Anthropologist Ralph Linton, who chose to take “an intermediate position” in the beer vs. bread controversy, noted that beer “may have resulted from accidental souring of a thin gruel … which had been left standing in an open vessel.” So perhaps humankind indeed owes its effervescent bubbly beverage to some leftover mush gone bad thousands of years ago.
The post Did Humans Once Live by Beer Alone? An Oktoberfest Tale appeared first on JSTOR Daily.
In 1909, American agricultural scientist Franklin Hiram King embarked on a journey to Asia. Having been chief of the Division of Soil Management in the USDA Bureau of Soils from 1902 to 1904, King was concerned about the United States’ rapidly deteriorating soil health. He wasn’t alone. The 19th and 20th centuries were marked by an increasing demand for food from the growing populations in Europe and America, along with decreasing soil productivity. But in Asia, the situation was starkly different. Chinese farmers grew their crops on the same land year after year, but their soil never seemed to lose its fertility.
“At the time, the US was still a relatively young nation and there was already lots of concern about the loss of fertility of soils,” says Joseph Heckman, professor of soil science at Rutgers University. “King was concerned about it and he heard that things were different in Asia.” He hoped to learn how farmers there were able to keep soil healthy and productive over the long term.
King called the farming approach he observed in Asia “permanent agriculture,” emphasizing the fact that the soils remained fertile over thousands of years. Today we have a different name for it: sustainable farming. This term may sound like a buzzword of the last few decades, but farmers in China, Korea, and Japan had practiced this form of food production successfully for many generations. King actively advocated implementing similar ideas in the Western world. “They really had a form of sustainable agriculture, even though they never used that word,” Heckman says. If King’s ideas had taken hold, the US might have been farming sustainably throughout the entire last century. But a powerful force was already turning against him.
By the early 20th century, scientists already knew that plants needed several vital chemical elements to grow, such as nitrogen, potassium, and phosphorus. As plants grow, they take these nutrients from the soil to support their cellular tissues and processes. After a few years, these vital minerals get used up. Once soils become deficient in these elements, their productivity drops—they simply can’t give plants enough building materials. So chemists argued that adding these minerals to the soil would help restore yields. But while potassium and phosphorus could be mined in a number of places in the United States and Europe, nitrogen was a problem. Except for one source—Chilean saltpeter, which provided roughly 60 percent of the global supply throughout the 19th century—there was nowhere to mine it. As a gas, nitrogen is abundant in the atmosphere, but there was no easy way of capturing it from the air. At least not until two scientists in Germany, Fritz Haber and Carl Bosch, figured out how to do it.
It was Haber who first successfully produced the reaction in his lab. A very stable gas, atmospheric nitrogen doesn’t react with other elements easily, so Haber used high pressure and temperature, plus a special catalyst, to get the reaction going. At 500 degrees Celsius and under pressures of 150 to 200 atmospheres, Haber managed to combine nitrogen and hydrogen into ammonia, a fertilizer now used worldwide. Bosch, who was a chemist and an engineer, helped scale up the process, and in the early 1920s, synthetic ammonia use became widespread. Germany was producing 350,000 tons of nitrates annually, and was planning to reach 500,000 tons a year. There was no shortage of buyers. France alone consumed 70,000 tons of nitrogen a year, 80 percent of which was imported at a cost of 500,000,000 francs. Grabbing nitrogen out of thin air wasn’t just a boon for agriculture, it was also a lucrative business.
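For readers who want the chemistry spelled out, the overall reaction Haber demonstrated can be sketched in one equation; the reaction itself is standard chemistry, and the conditions are the ones reported in his early experiments:

```latex
% Haber–Bosch synthesis: stable atmospheric nitrogen and hydrogen combine,
% under high temperature and pressure over a catalyst, to form ammonia.
\[
  \mathrm{N_2} + 3\,\mathrm{H_2} \;\rightleftharpoons\; 2\,\mathrm{NH_3}
  \qquad (\text{ca. } 500\,^{\circ}\mathrm{C},\ 150\text{--}200\ \mathrm{atm},\ \text{catalyst})
\]
```

The double arrow matters: the reaction is an equilibrium, which is why the extreme pressure and a catalyst were needed to push it toward ammonia at a useful rate.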
Consequently, the agricultural community divided into two camps. The organic camp believed that soil nutrients should be replenished by returning the biological matter back to it. The chemical camp embraced the idea that saturating land with synthetically made fertilizer would be good enough.
The second camp gained popularity. Farming with synthetic nitrogen was easier. Plants grew faster and bore more fruit. Using factory-made nitrates was cleaner than using manure and less labor-intensive than recycling agricultural refuse. As a result, more and more farmers switched to inorganic fertilizers and demand grew. The ability to fix atmospheric nitrogen into usable substances was so crucial for feeding the earth’s growing population that Haber received a Nobel Prize for his invention in 1919. The Haber-Bosch process is credited as the method that helped humankind battle hunger and sparked the planet’s population explosion.
Nonetheless, some scientists viewed the idea of synthetic fertilizers as deeply flawed.
They argued that soil health—which influenced the health of the plants it grew and of those who ate them—was more complex than a mix of three fertilizing chemicals. In the 1940s, British scientist Albert Howard published several books arguing this point. In An Agricultural Testament, Howard introduced the “Law of Return,” arguing the importance of returning various waste materials to the soil to build and maintain its fertility. Synthetic fertilizers, he insisted, broke the Law of Return, and would eventually cause the soil to deteriorate. In another book, Farming and Gardening for Health or Disease, he argued that disease—whether in plants, animals, or humans—was caused by unhealthy soil and that using organic farming techniques would keep the soil, and those living on it, healthy.
In the United States, the organic movement enjoyed an unexpected boost during World War II. As the country needed more food, the federal government encouraged Americans to grow fruits and vegetables in their backyard gardens. About 80 percent of the population responded, and in 1943, these gardens produced 40 percent of American produce. During the war, there was little synthetic fertilizer available because the nitrates were used to make munitions, and food growers needed other types of soil nourishment. So when Jerome Rodale launched his Organic Gardening magazine, which encouraged growing food without chemicals, it quickly became popular, fueling interest in organic agriculture.
However, when the war ended, so did the shortage of nitrates. With no more explosives and other war materiel to produce, nitrogen manufacturers switched to making chemical fertilizers and pesticides, advertising the new, chemical ways of farming. To Howard, that looked like a declaration of war on the soil itself. In 1946, he published another book, The War in the Soil, in which he vehemently criticized the burgeoning big farming business. “The war in the soil is the result of a conflict between the birthright of humanity—fresh food from fertile soil—and the profits of a section of Big Business in the shape of the manufacturers of artificial fertilizers and … poison sprays to protect crops.” Nonetheless, the immediate benefits of fertilizers and pesticides, such as higher yields and convenience, overshadowed such concerns. Consequently, large-scale industrial agriculture took over not only in produce growing, but also in animal farming, leading to concentrated animal feeding operations, or CAFOs.
In the years that followed, humankind learned that heavy use of agricultural chemicals comes at a price—from plummeting bird populations to diminished food quality to human health issues. And CAFO operators received continuous criticism for polluting air and water, mistreating animals, and overusing antibiotics. CAFOs also caused more soil deterioration. Instead of putting animals on natural pasturelands, CAFOs feed them with corn and soy. But these crops are often grown with fertilizers and pesticides. And because they grow over a few short months, leaving the soil bare the rest of the year, they cause erosion rather than building soil organic matter—unlike the grasses that comprise pasturelands. The latest research finds that cows on pasture can help sequester carbon and thus help mitigate climate change.
Studies also found that food produced by animals raised on pastureland is more nutritious. Compared to industrially produced animal foods, organic milk has a better-balanced omega-3 to omega-6 ratio, and organic eggs contain more vitamins A and E, Heckman notes. “There’s quite a bit of evidence that organic foods from animals raised on pasture are uniquely different,” he says, adding that having animals on pasture also helps maintain healthy soils, replenishing organic matter and preventing erosion.
In the early 20th century, Franklin Hiram King lost the battle against chemical agriculture, but can it be fought again and won in the 21st? It’s a matter of government policies, Heckman says—and often the policies favor big business while limiting small organic farms’ ability to reach consumers. “When certain government policies make it difficult for farmers and consumers to have access to these foods,” Heckman says, “it essentially shuts down those kinds of good farms. And by doing so, it shuts down good soil management, too.”
The post Chinese Peasants Taught the USDA to Farm Organically in 1909 appeared first on JSTOR Daily.