Over the course of my life I have seen several large, dangerous fires. Unfortunate animals often perish in these, especially slower ones such as turtles. Surely it cannot have been difficult for humans to invent cooking in the broad sense, that is, heating food over a fire, at some point over the course of a few millennia.
Now, boiling food should be an entirely different question, a mental leap that I don't see as obvious at all, nor do I think it likely to occur by accident. So, what are some early archeological clues for humans doing it?
I realise it might be difficult to find archaeological evidence, but implements such as pots, spoons, or tripods for setting such vessels over the fire are probably very old. One text that made me think archaeologists might know much more than I imagined was Native American Cookery. Said article contains the passage:
Boiling could be done in skin or bark utensils, or even on a clay bed, by filling with cold water, dropping in the meat and then heating with hot stones taken from a near-by fire. It was safer to boil in a bark dish than in a clay pot, because of the ease with which the pot was broken. One hot stone gives off a great deal of heat, and a dozen or so used in this manner soon finishes the task of hot-stone cooking.
If this applies generally, or if many people used their kettles for pot roasting, we might have great difficulty finding these implements or establishing their use. Yet shapes, materials, or grease traces might tell archaeologists much more than they would tell me.
Even if our estimates have to be very conservative, I am still interested.
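For what it's worth, the hot-stone arithmetic in the quoted passage is easy to sanity-check. A rough heat balance (all figures below are my own assumptions, not numbers from the article) suggests a couple of fire-heated stones could indeed bring a small vessel of water to the boil:

```python
# Back-of-envelope heat balance for hot-stone boiling.
# All values are assumptions chosen for illustration.
SPECIFIC_HEAT_WATER = 4.18   # J/(g*K)
SPECIFIC_HEAT_STONE = 0.79   # J/(g*K), typical for granite
STONE_MASS = 1000.0          # g, a fist-sized stone
STONE_TEMP = 500.0           # deg C, plausible for a stone pulled from a fire
WATER_MASS = 2000.0          # g, about 2 litres
WATER_START = 20.0           # deg C, cold water

# Heat one stone gives up while cooling from fire temperature to 100 deg C:
heat_per_stone = STONE_MASS * SPECIFIC_HEAT_STONE * (STONE_TEMP - 100.0)

# Heat needed to bring the water from cold to the boiling point:
heat_to_boil = WATER_MASS * SPECIFIC_HEAT_WATER * (100.0 - WATER_START)

stones_needed = heat_to_boil / heat_per_stone
print(f"{heat_per_stone / 1000:.0f} kJ per stone, "
      f"{heat_to_boil / 1000:.0f} kJ to reach a boil, "
      f"~{stones_needed:.1f} stones just to reach boiling")
```

Under these assumptions, two or three stones reach the boil, and the "dozen or so" of the quoted passage leaves ample margin for heat lost to the vessel, the air, and evaporation while simmering.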
I can't speak to boiling in particular, but if it's just evidence you're asking about, there is evidence of Homo erectus (a human ancestor) using controlled fires about a million years ago.
Archeologists are actually involved in a raging debate about human cooking right now. The old-date faction argues, based on artifacts and skeletal features such as dentition, that cooking was an innate behavior of Homo erectus, which would make it about 1.8 million years old. The young-date faction argues that man certainly "harvested" wildfires when available, but didn't master the ability to create fire at will until about 12,000 years ago (which would make it a part of the Neolithic Revolution).
Here are a few points I got from a book on the Ancient Indus, by Rita P. Wright.
- Using a pot to boil is preferred, as the contents can be maintained at boiling point for long periods, making the food more palatable.
- Before this, people used stone boiling. A variety of containers could be used: stone bowls, pottery, baskets lined with bitumen. The stones would be heated in a fire, then dropped into the container with water and the food to be boiled.
Archaeologically, you can look for burnt stones. Here for example is a mention from a site from around 25K BP: http://prfdec.natur.cuni.cz/~kfggsekr/rggg/pdf/Svoboda_etal_09.pdf
Cooking requires pots, and thus is linked to the invention of pottery.
The oldest pot I could find a reference to, discovered in the Chinese province of Jiangxi, dates back to about 20,000 BCE (the late Stone Age), which would mean pottery and cooking predate agriculture by some margin.
Note that the linked articles (dated 2012) describe the Jiangxi find as noteworthy because previous finds dated to about 10,000 BCE, i.e. after the invention of agriculture.
Our Early Ancestors May Have Boiled Their Food in Hot Springs Long Before Learning to Control Fire
Some of the oldest remains of early human ancestors have been unearthed in Olduvai Gorge, a rift valley setting in northern Tanzania where anthropologists have discovered fossils of hominids that existed 1.8 million years ago. The region has preserved many fossils and stone tools, indicating that early humans settled and hunted there.
Now a team led by researchers at MIT and the University of Alcalá in Spain has discovered evidence that hot springs may have existed in Olduvai Gorge around that time, near early human archaeological sites. The proximity of these hydrothermal features raises the possibility that early humans could have used hot springs as a cooking resource, for instance to boil fresh kills, long before humans are thought to have used fire as a controlled source for cooking.
“As far as we can tell, this is the first time researchers have put forth concrete evidence for the possibility that people were using hydrothermal environments as a resource, where animals would’ve been gathering, and where the potential to cook was available,” says Roger Summons, the Schlumberger Professor of Geobiology in MIT’s Department of Earth, Atmospheric, and Planetary Sciences (EAPS).
Summons and his colleagues published their findings on September 15, 2020, in the Proceedings of the National Academy of Sciences. The study’s lead author is Ainara Sistiaga, a Marie Skłodowska-Curie fellow based at MIT and the University of Copenhagen. The team includes Fatima Husain, a graduate student in EAPS, along with archaeologists, geologists, and geochemists from the University of Alcalá and the University of Valladolid, in Spain; the University of Dar es Salaam, in Tanzania; and Pennsylvania State University.
An unexpected reconstruction
In 2016, Sistiaga joined an archaeological expedition to Olduvai Gorge, where researchers with the Olduvai Paleoanthropology and Paleoecology Project were collecting sediments from a 3-kilometer-long layer of exposed rock that was deposited around 1.7 million years ago. This geologic layer was striking because its sandy composition was markedly different from the dark clay layer just below, which was deposited 1.8 million years ago.
“Something was changing in the environment, so we wanted to understand what happened and how that impacted humans,” says Sistiaga, who had originally planned to analyze the sediments to see how the landscape changed in response to climate and how these changes may have affected the way early humans lived in the region.
Ainara Sistiaga taking samples at Olduvai Gorge, a rift valley setting in northern Tanzania where anthropologists have discovered fossils of hominids that existed 1.8 million years ago. Credit: Ainara Sistiaga
It’s thought that around 1.7 million years ago, East Africa underwent a gradual aridification, moving from a wetter, tree-populated climate to dryer, grassier terrain. Sistiaga brought back sandy rocks collected from the Olduvai Gorge layer and began to analyze them in Summons’ lab for signs of certain lipids that can contain residue of leaf waxes, offering clues to the kind of vegetation present at the time.
“You can reconstruct something about the plants that were there by the carbon numbers and the isotopes, and that’s what our lab specializes in, and why Ainara was doing it in our lab,” Summons says. “But then she discovered other classes of compounds that were totally unexpected.”
An unambiguous sign
Within the sediments she brought back, Sistiaga came across lipids that looked completely different from the plant-derived lipids she knew. She took the data to Summons, who realized that they were a close match with lipids produced not by plants, but by specific groups of bacteria that he and his colleagues had reported on, in a completely different context, nearly 20 years ago.
The lipids that Sistiaga extracted from sediments deposited 1.7 million years ago in Tanzania were the same lipids produced by modern bacteria that Summons and his colleagues had previously studied in the United States, in the hot springs of Yellowstone National Park.
The team digging in a region of Olduvai Gorge, an archaeological site in Tanzania where remains of early human settlements have been previously unearthed. Credit: Courtesy of Fernando Diez-Martin
One specific bacterium, Thermocrinis ruber, is a hyperthermophilic organism that will only thrive in very hot waters, such as those found in the outflow channels of boiling hot springs.
“They won’t even grow unless the temperature is above 80 degrees Celsius [176 degrees Fahrenheit],” Summons says. “Some of the samples Ainara brought back from this sandy layer in Olduvai Gorge had these same assemblages of bacterial lipids that we think are unambiguously indicative of high-temperature water.”
That is, it appears that heat-loving bacteria similar to those Summons had worked on more than 20 years ago in Yellowstone may also have lived in Olduvai Gorge 1.7 million years ago. By extension, the team proposes, high-temperature features such as hot springs and hydrothermal waters could also have been present.
“It’s not a crazy idea that, with all this tectonic activity in the middle of the rift system, there could have been extrusion of hydrothermal fluids,” notes Sistiaga, who says that Olduvai Gorge is a geologically active tectonic region that has upheaved volcanoes over millions of years — activity that could also have boiled up groundwater to form hot springs at the surface.
The region where the team collected the sediments is adjacent to sites of early human habitation featuring stone tools, along with animal bones. It is possible, then, that nearby hot springs may have enabled hominins to cook food such as meat and certain tough tubers and roots.
Ainara Sistiaga in her lab. Credit: Angel Mojarro
“The authors’ comprehensive analyses paint a vivid picture of the ancient Olduvai Gorge ecosystem and landscape, including the first compelling evidence for ancient hydrothermal springs,” says Richard Pancost, a professor of biogeochemistry at the University of Bristol, who was not involved in the study. “This introduces the fascinating possibility that such springs could have been used by early hominins to cook food.”
“Why wouldn’t you eat it?”
Exactly how early humans may have cooked with hot springs is still an open question. They could have butchered animals and dipped the meat in hot springs to make them more palatable. In a similar way, they could have boiled roots and tubers, much like cooking raw potatoes, to make them more easily digestible. Animals could have also met their demise while falling into the hydrothermal waters, where early humans could have fished them out as a precooked meal.
“If there was a wildebeest that fell into the water and was cooked, why wouldn’t you eat it?” Sistiaga poses.
While there is currently no sure-fire way to establish whether early humans indeed used hot springs to cook, the team plans to look for similar lipids, and signs of hydrothermal reservoirs, in other layers and locations throughout Olduvai Gorge, as well as near other sites in the world where human settlements have been found.
“We can prove in other sites that maybe hot springs were present, but we would still lack evidence of how humans interacted with them. That’s a question of behavior, and understanding the behavior of extinct species almost 2 million years ago is very difficult,” Sistiaga says. “I hope we can find other evidence that supports at least the presence of this resource in other important sites for human evolution.”
Reference: “Microbial biomarkers reveal a hydrothermally active landscape at Olduvai Gorge at the dawn of the Acheulean, 1.7 Ma” by Ainara Sistiaga, Fatima Husain, David Uribelarrea, David M. Martín-Perea, Troy Ferland, Katherine H. Freeman, Fernando Diez-Martín, Enrique Baquedano, Audax Mabulla, Manuel Domínguez-Rodrigo and Roger E. Summons, 15 September 2020, Proceedings of the National Academy of Sciences.
This research was supported, in part, by the European Commission (MSCA-GF), the NASA Astrobiology Institute, and the Government of Spain.
Ancient bacteria reveals the critical food early humans ate
Mineralized dental plaque preserved ancient microbial information.
The gut is the darling of human microbiome research. But with over 700 species of bacteria, the human mouth hosts the second largest and most diverse microbiome in the body.
According to new research, understanding ancient oral microbe communities can help modern-day scientists better understand major events in human evolution and even treat health conditions in the future.
“The microbes that live on your body are just as important to your body as an organ like your liver or heart, so making sure we keep these microbes happy and healthy is just as important. But to do that, we really need to understand how they work,” James Fellows Yates tells Inverse.
Fellows Yates is a biomolecular archaeology Ph.D. student at the Max Planck Institute for the Science of Human History and co-author of a paper published Monday in the journal Proceedings of the National Academy of Sciences examining the evolution of the oral microbiome.
Most of what we understand about the oral microbiome has come from samples taken from industrialized societies that have incorporated mouthwash and floss into daily life. Far less is known about the global diversity of oral microbes in unindustrialized and ancient humans.
This is important because the oral microbiome is often implicated in mouth diseases. Research like this new study suggests the presence or absence of certain microbes likely aren’t to blame. Instead, it may be driven by an imbalance in the microbial ecosystem.
“There is a lot we don’t know about these microbial communities and we need to explore them in a lot of different species, including ourselves, to get an understanding of that microbial diversity,” Fellows Yates tells Inverse.
What to know first — All microbiomes include a core microbe, which is more or less the same in all members of a species, and a variable microbiome, which varies between individuals based on what they eat, where they live, and physiological differences.
Fellows Yates and colleagues discovered that a set of core oral microbes have remained consistent throughout African hominid evolution, and are shared today by modern humans, gorillas, and howler monkeys.
Like the microbes living in the human gut, scientists are just beginning to understand how each of these oral microbe species evolved to live here and what they mean for the health of modern-day humans.
Examining the bacteria living in dental tartar — plaque that mineralizes onto teeth — is also a reliable way to collect clues about ancient life. Teeth don’t break down like soft organs do, so fossilized microbes stay preserved for millennia. The gut microbiome, by contrast, is extremely susceptible to change, whether from antibiotics, diet, lifestyle, or external factors.
Plaque is the opposite. The oral microbiomes that live in tooth build-up are typically quite stable, especially among all the species in a genus.
What’s new — Fellows Yates and his team made two important findings.
First, a group of modern and ancient primates that includes gorillas, Neanderthals, and humans appears to have the same core oral microbe. The team identified 10 core bacterial genera that have remained consistent throughout African hominid evolution, which are also found in howler monkeys.
This preservation suggests that this specific microbial make-up may have played an important role in health for more than 40 million years, Fellows Yates explains.
But the human core biome was different in one key way. Compared with non-human hominids, humans uniquely have an abundance of Streptococcus species, which are responsible for converting starches to sugars.
“These starch and sugar-rich diets allowed hominids to have a bigger brain and evolve as we have,” Fellows Yates says. The human brain, for its part, uses up to one-quarter of the body's energy and as much as 60 percent of blood glucose. The team views this finding as evidence of an ancient behavior that led to this bigger brain.
How they did it — The team identified and analyzed 89 dental plaque samples from different primates, including Neanderthals, howler monkeys, chimpanzees, and modern-day humans. To do this, they used the largest database of sequenced human diversity as a reference and compared the ancient DNA to known samples. That’s how they were able to match the broken pieces of ancient microbial DNA to the full strands needed for identification.
This revealed the 10 bacterial genera that have remained consistent deep into human history and that may hold important information about modern-day human health.
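The reference-based matching described above can be illustrated with a toy sketch. The sequences and genus names below are invented placeholders, and real ancient-DNA pipelines use probabilistic aligners and damage-aware models rather than exact substring search, but the core idea — assigning short fragments to the full-length reference they fit into — looks like this:

```python
# Toy illustration of reference-based identification: short "ancient" DNA
# fragments are matched against full-length reference sequences.
# All sequences and names are invented for illustration only.
references = {
    "Streptococcus_sp": "ATGGCTAAGCTTGGACGTTACCGGATC",
    "Fusobacterium_sp": "ATGCCGTTAGGCACTTAAGGCCTAGGA",
}

# Short, fragmentary reads, as recovered from mineralized plaque.
ancient_fragments = ["AAGCTTGGACG", "GGCACTTAAGG", "TTTTTTTTTTT"]

for frag in ancient_fragments:
    # A fragment is "identified" if it occurs inside a reference sequence.
    hits = [name for name, seq in references.items() if frag in seq]
    print(frag, "->", hits if hits else "no match in reference database")
```

The last fragment deliberately matches nothing, mirroring the limitation Fellows Yates describes: anything absent from the reference database simply cannot be identified this way.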
When it comes to better understanding what ancient hominid diets consisted of, and when they changed, preserved tartar holds a record of plants that have long since disappeared. The team found that humans who lived in the Pleistocene Epoch, which ended about 11,700 years ago, ate diets that varied as much as modern humans’ diets do today, although this variation would look different spread across a dinner table. Around this time, ancient humans all ate a mix of starchy tubers (such as yams and potatoes), fruits, and rhizomes.
Meanwhile, dental plaque sourced from ancient humans living in South Africa revealed they ate starchy grass seeds as far back as 170,000 years ago — this was also found in samples from Skhul, Israel that date to about 120,000 years ago. This study suggests an adaptation for eating starch-rich foods began earlier than previously thought.
Why it matters — This study suggests previous attempts to understand how the oral microbiome influences diseases have been limited by population samples taken only from people living in modern, industrialized societies.
According to Fellows Yates, this research method — comparing bits of ancient DNA to sequences of known DNA housed in a database of samples mostly taken from industrialized countries — was just an initial step toward course correcting.
“This reference-based approach is a big thing at the moment and we know we need to get away from that,” he says. But that’s easier said than done since the methods used to reconstruct modern DNA can’t be used for ancient DNA.
Working only with a database that focuses on samples taken from the Western world creates bias and doesn’t allow palaeogeneticists to identify currently unknown ancient microbes, Fellows Yates explains. And if palaeogeneticists can figure out a way to identify ancient microbes without requiring a reference, they might be able to identify extinct microbes.
In regard to oral health, modern oral hygiene practices might be hiding the root causes of disease. Moving away from the Western-focused model will allow health treatments to be more equitable among all people, regardless of where they live.
“We know that the microbiome plays a huge role in health, and if we want to make sure that healthcare treatments are designed to help everyone, we need to understand how the microbial communities might be different in different microbiomes and groups of people,” Fellows Yates says.
Why (and How, Exactly) Did Early Humans Start Cooking?
Clearly, the controlled use of fire to cook food was an extremely important element in the biological and social evolution of early humans, whether it started 400,000 or 2 million years ago. The lack of physical evidence suggests early humans did little to modify the control and use of fire for cooking for hundreds of thousands of years, which is quite surprising, given that they developed fairly elaborate tools for hunting during this time, as well as creating some of the first examples of cave art about 64,000 years ago. Physical evidence shows that cooking food on hot stones may have been the only adaptation during the earliest phases of cooking.
Then, about 30,000 years ago, “earth ovens” were developed in central Europe. These were large pits dug in the ground and lined with stones. The pits were filled with hot coals and ashes to heat the stones; food, presumably wrapped in leaves, was placed on top of the ashes; everything was covered with earth; and the food was allowed to roast very slowly. The bones of many types of animals, including large mammoths, have been found in and around ancient earth ovens. This was clearly an improvement over rapidly roasting meat by fire, as slow cooking gives time for the collagen in tough connective tissue to break down to gelatin; this process takes at least several hours, and often much longer, depending on the age of the animal and where the meat comes from in the animal. The shoulders and hindquarters of animals are involved in more muscular action and thus contain more connective tissue than the tenderloin near the ribs. Breaking down tough connective tissue makes the meat easier to chew and digest. Like today’s barbecue methods, cooking meat slowly in earth ovens made it very tender and flavorful.
After dry roasting with fire and heating on hot stones, the next true advance in very early cooking technology appears to have been the development of wet cooking, in which food is boiled in water. Boiling food would certainly be an advantage when cooking starchy root tubers and rendering fat from meat. Many archeologists believe the smaller earth ovens lined with hot stones were used to boil water in the pit for cooking meat or root vegetables as early as 30,000 years ago (during the Upper Paleolithic period). Others believe it is likely that water was first boiled for cooking in perishable containers, either over the fire or directly on hot ashes or stones, well before this time.
Unfortunately, no direct archeological evidence has survived to support this conclusion. Yet we know that even a flammable container can be heated above an open flame as long as there is liquid in the container to remove the heat as the liquid evaporates. Thus containers made of bark or wood or animal hides could have been used for boiling food well before the Upper Paleolithic period. No physical evidence of sophisticated utensils for cooking food appears until about 20,000 years ago, when the first pieces of fired clay pottery appear. Using sensitive chemical methods, scientists have determined that shards of pottery found in Japan contain fatty acids from marine sources such as fish and shellfish. These heat-resistant pots may have been used to boil seafood.
The development of simple clay ovens did not occur until at least 10,000 years later. If cooking has had such a profound effect on the evolution of humans, why is there little evidence from earlier periods of the development of more sophisticated methods of cooking than simply roasting in a hot pit or boiling in water with hot stones?
Jacob Bronowski may have answered that question in his enlightening book The Ascent of Man. The life of early nomads, such as the hunter-gatherers who existed for several million years or more, was a constant search for food. They were always on the move, following the wild herds. “Every night is the end of a day like the last, and every morning will be the beginning of a journey like the day before,” he wrote. It was a matter of survival. There simply was no time for them to innovate and create new methods of cooking. Being constantly on the move, they couldn’t pack up and carry heavy cooking utensils every day, even if they had invented them. Then, about 10,000 years before the last ice age ended, creativity and innovation finally began to flourish in spite of the restrictions of nomadic life. Early humans were finding that food was becoming more abundant due to warming weather, so they could gather it more easily without needing to move constantly.
With the end of the last ice age and the beginning of the Neolithic period, about 12,000 years ago, everything changed. Everything! It was the dawn of the agricultural revolution, when wandering nomads began to settle and turn into villagers. What made this possible? The discovery that seeds from new varieties of wild grasses that emerged after the end of the ice age, such as emmer wheat and two-row barley, could be gathered, saved, planted, and harvested the following season. This occurred first in an area known as the Fertile Crescent (Jordan, Syria, Lebanon, Iraq, Israel, and part of Iran). Enough food could now be harvested in 3 weeks to last an entire year!
Being able to harvest large quantities of food at one time meant these early farmers could no longer move from place to place; they had to build immovable structures for storing and protecting all the food, and this resulted in the creation of permanent settlements. The agricultural revolution then spread to other parts of the world over several thousand years.
Thanks to the pioneering research of the Russian scientist Nikolai Vavilov in the 1930s and the American scientist Robert Braidwood in the 1940s, we now know that over several thousand years people living in seven independent regions of the world domesticated crops and animals indigenous to that region. Unfortunately, Vavilov’s studies were prematurely ended when he was imprisoned in 1940 by the Stalinist government for his revolutionary views on evolution.
As the ice age was coming to an end around 12,000 years ago, early humans were harvesting wild wheat and barley in quantity in the Fertile Crescent, but there was no evidence of domesticated plants and animals. By domesticated, I mean plants and animals deliberately raised for food by humans rather than wild plants and animals gathered in the forests and fields. Then within a period of roughly 300 years, between 10,000 and 9,700 years ago, the first evidence of domesticated plants and animals began to appear in the southern Jordan Valley around the ancient settlement of Jericho.
In this relatively brief time period, the seeds of plants like wheat and barley became larger while the bones of animals became smaller. That’s how archeologists in the field can tell the difference—and it makes sense. As early humans began to select seeds to plant, they chose the larger seeds, which were storing more of the nutrients required for faster growth. The resulting crops grew faster to outcompete the wild weeds and provided higher yields—and in turn produced still larger seeds.
These early humans also selected wheat plants with terminal clusters of seeds that retained the kernels during harvest instead of allowing them to scatter in the wind like the wild varieties. The rachis, the short stalk that holds the seed to the plant, became shorter and thicker with time. DNA analysis confirms that the physical differences observed between domesticated and wild seeds originate in the plant’s genome. All these changes occurred as a result of human selection of plants with more desirable traits. These are the first plants to be genetically modified through human intervention. Similarly, domesticated goats and sheep were selected to be more docile and adaptable to living in a confined pen and feeding off the scraps of food left by their keepers. Thus they became smaller. These physical changes in domesticated plants and animals began to take shape as humans started to produce their own food.
The development of new foods and methods of cooking in the few thousand years following the emergence of agriculture illustrates how important this period was for the advancement of humans. The change from a nomadic life to a sedentary life in more secure settlements was critical, as it allowed humans to make significant achievements in technology and other areas. Within a few thousand years, small farming villages grew into large permanent settlements and then small cities. Jericho is perhaps the oldest permanent settlement, providing an accurate record of agricultural development between 10,000 and 9,700 years ago. Hunter-gatherers first settled there around 11,000 years ago in order to be near a constant source of water, a spring-fed oasis. Archeological excavations of the oldest buried sections of Jericho, which cover an area of a little less than ¼ acre (0.1 hectares), did not reveal any signs of domesticated seeds or animal bones.
By 9,700 years ago, the first domesticated seeds of emmer wheat and barley began to appear in higher levels of soil, and the earliest farming settlement had grown to an area of about 6 acres (2.5 hectares) with perhaps 300 people living in mud brick houses. By 8,000 years ago, Jericho was home to a permanent agricultural settlement of approximately 3,000 people occupying an area of 8–10 acres (3.2–4 hectares). About this same time, emmer wheat hybridized with a wild grass to produce bread wheat, which contained higher levels of the gluten-forming proteins required for making leavened bread. Wheat had finally emerged in the form in which it is still grown and used today around much of the world.
Excerpted from Cook, Taste, Learn: How the Evolution of Science Transformed the Art of Cooking © 2019 Guy Crosby. Used by arrangement with Columbia University Press. All rights reserved.
Hot Stew in the Ice Age? Evidence Shows Neanderthals Boiled Food
An ancient diet expert suggests our early cousins knew how to boil their meals.
Neanderthal cooking likely wouldn't have won any prizes on Top Chef, but an archaeologist suggests that our ancient cousins knew how to cook a mean stew, without even a stone pot to their name.
"I think it's pretty likely the Neanderthals boiled," said University of Michigan archaeologist John Speth at a recent meeting of the Society for American Archaeology in Austin, Texas. "They were around for a long time, and they were very clever with fire."
Neanderthals were a species of early humans who lived in Europe and the Near East until about 30,000 years ago. Conventional wisdom holds that boiling to soften food or render fat from bones may have been one of the advantages that allowed Homo sapiens to thrive, while Neanderthals died out. (Related: "Surprise! 20 Percent of Neanderthal Genome Lives on in Modern Humans, Scientists Find.")
But based on evidence from ancient bones, spears, and porridge, Speth believes our Stone Age cousins likely boiled their food. He suggests that Neanderthals boiled using only a skin bag or a birch bark tray by relying on a trick of chemistry: Water will boil at a temperature below the ignition point of almost any container, even flammable bark or hides.
"You can boil in just about anything as long as you take it off the flame pretty quickly," Speth says. His presentation included video of water boiling in a paper cup (the water keeps the paper from reaching its ignition temperature) and mention of scenes in Jean Auel's 1980 novel, Clan of the Cave Bear (later a movie), in which Neanderthals boiled stews in hide pouches.
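The trick Speth demonstrates rests on a simple comparison: as long as liquid water remains, evaporation pins the vessel wall near the boiling point, below the ignition temperature of common container materials. A minimal sketch of that comparison, using rough, commonly cited ignition figures (my assumptions, not values from Speth's talk):

```python
# Why a flammable vessel can boil water without burning: evaporating water
# holds the wetted wall near the boiling point, which sits below the
# ignition temperature of the container. Ignition figures are rough,
# commonly cited approximations, assumed here for illustration.
BOILING_POINT = 100.0  # deg C, water at sea level

ignition_temps = {
    "paper": 230.0,       # often quoted near 230 deg C
    "birch bark": 250.0,  # rough assumption
    "rawhide": 200.0,     # rough assumption; tends to char, not flame
}

for material, t_ignite in ignition_temps.items():
    margin = t_ignite - BOILING_POINT
    print(f"{material}: ignites near {t_ignite:.0f} C, "
          f"about {margin:.0f} C above boiling water")
```

The margin only holds where the container stays wet, which is why Speth stresses taking the vessel "off the flame pretty quickly": a dry rim above the waterline has no evaporative cooling and can still scorch.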
"This wasn't an invention of some brainy modern people," Speth says. (Related: "Neanderthals Lived in Small, Isolated Populations, Gene Analysis Shows.")
While conceding that Neanderthals were handy with wood and fire, archaeologists such as Mary Stiner of the University of Arizona in Tucson want to let Speth's idea simmer for a while before they swallow it.
"Whether they went as far as boiling stuff in birch bark containers or in hides is harder to evaluate," Stiner says. "I am not convinced."
The use of fire by humans goes back more than 300,000 years in Europe, where evidence is seen in Neanderthal hearths. (Related: "Oldest Known Hearth Found in Israel Cave.")
But most research has supported the idea that Stone Age boiling, which relied on heating stones in fire pits and dropping them into water, arrived on the scene too late for Neanderthals.
Evidence of cracked "boiling stones" in caves used by early modern humans, for example, goes back only about 26,000 years, too recent for Neanderthals. And pottery for more conventional boiling appears to be only about 20,000 years old.
But who needs boiling stones or pots? Speth suggests that Neanderthals boiled foods in birch bark twisted into trays, a technology that prehistoric people used to boil maple syrup from tree sap.
Archaeologists have demonstrated that Neanderthals relied on birch tar as an adhesive for hafting spear points as long as 200,000 years ago. Making birch tar requires clever cooking in an oxygen-free container, says paleontologist Michael Bisson of Canada's McGill University.
"I've burned myself trying to do it," Bisson says, adding that Neanderthals were plenty clever when it came to manipulating birch. They likely ignited rolled-up birch bark "cigars" and plunged them into holes to cook the tar in an oxygen-free environment.
If the tar is exposed to oxygen in the air as it cooks, "it explodes," Bisson adds.
Supporting the boiling idea, Speth said that animal bones found in Neanderthal settings are 98 percent free of scavengers' gnawing marks, which he says suggests the fat had been cooked off.
And some grains found in the teeth of a Neanderthal buried at Iraq's Shanidar Cave site appear to have been cooked, according to a 2011 report in the Proceedings of the National Academy of Sciences.
"It is speculative, but I think it is pretty likely that they knew how to boil," Speth says.
In a separate talk at the meeting, University of Michigan paleontologist Andrew White noted recent evidence that Neanderthal mothers weaned their children at an earlier age than human mothers typically do. He said the early transition from milk to food supports the theory that Neanderthals boiled their youngsters' food to make it more digestible.
The idea that Neanderthals could probably boil their food first came to Speth as he watched an episode of the TV show Survivorman. Stuck in East Africa with only dirty water to drink, host Les Stroud sterilized the muddy liquid by boiling it in a plastic bag.
"Who says you can't learn anything from TV?" says Speth. "I figured if we could boil water in a plastic bag, then Neanderthals could do it in a birch tray."
Correction: The disciplines of two experts mentioned in the story, Dr. Speth and Dr. Stiner, have been corrected.
Evolution of cookware, from basic tools to modern utensils
Over the last couple of centuries, a variety of metals such as iron, copper and aluminum have entered the marketplace. Stainless steel is now a favorite cookware material for many because of its shiny luster and because it is non-reactive to acidic foods.
Each metal or ceramic product has its advantages and disadvantages. Some pans heat up quicker than others; some distribute the heat more evenly. Whatever your need, there is cookware for everyone, and we can thank our ancestors for that.
We prefer the traditional ways our grandparents cooked: in cast iron and ceramics. However, there are so many different choices. For us, the most important factor is the enjoyment you get from cooking.
Hopefully, I can share some knowledge on the development of Plate Stoves, and how they influenced the cookware we use today.
We saw the development of plate stoves, or ranges, purchased by wealthy customers, hotels and businesses from the mid-to-late 1700s. It was towards the end of the 1700s that boiler holes (or stove eyes) appeared, and the change from a heating source to a means of preparing food started to take root. The early cookware made to take advantage of the new stove design was kettles and pots with a pit bottom, and the first cooktop spiders went the same route.
These pans rested in the stove hole; some had a round bottom and others flat, with either short tab feet or legs up to 3/4 inch. They also had a wide top rim (generally 3/4 inch around) that most people assume was for a cover, BUT since there was no standard for cookware sizes, the wide rim allowed, for example, a 6-inch pan to rest in a 7 1/2-inch opening. Lots of innovation happening…patents gone wild. It would be around the 1830s that pans finally made their way out of the stove hole to sit level with or above the cooktop.
Old photographs are a great resource, but you really need to see and touch a lot of them to start noticing small details, and how New England and southeastern PA have so much in common. VT/NY west to Ontario and out towards Wisconsin show their own style.
Thanks Boonie, you may also want to research Count Rumford, he had a huge impact on the development of cookware.
Thank you ever so much for sharing your expertise on early stoves, and how stove innovation impacted cookware design. It certainly is appreciated and adds to the resource. You’re very fortunate to see many different models and makes. I’m a New Zealander and our stove history is rather limited, with Shacklock Stoves being the most well-known.
I’ll have to check out Count Rumford and his inventions.
It looks like Scott has come over from a Facebook group called The Iron Works! Collectors of Early Iron! If you are interested in early cast-iron cookware, you may want to check out the group.
Hi, I was wondering if you knew anything about a manufacturing company using the term Neverbreak in relation to carbon steel skillets. Neverbreak is stamped on the top of the handle along with a #8, etc., depending on the diameter.
Unfortunately, I’m unaware of the maker of your skillet, but you have a neat piece of American cookware history. These pans are known as cowboy skillets and were designed as lightweight, sturdy pans for traveling and for use over an open flame. Cowboy skillets tend to warp, but this won’t affect the cooking ability, and it will make you feel like a real pioneer. Sounds like fun!
Enjoy your little piece of history.
Do you have any information about the baking materials in the late 1800s? Of course, cast iron was available for nearly anything, including gems and muffins. I’m trying to determine when tin or aluminum was used for baking. Any insights?
Thanks for getting in touch.
It’s a really interesting era for baking. I see a lot of antiques from the Victorian era made from tin and enamelware, such as flour tins, biscuit tins, and candle molds. But the transition between cast iron, tinned copper, and earthenware is not so clear-cut.
I notice in the mid 19th century cookery books, the authors gave instructions to “bake in pans”. However, in the late 1800s and early 1900s, that term had completely changed to “bake in a buttered tin”.
In Britain, as wealth grew in the late 1800s and early 1900s, bakers introduced oddly shaped bread tins to display in the front windows of their bakeries to entice customers.
As for aluminum, that’s completely out of my league, and I consider it mid 20th century cookware. You might come across old advertising baking sheets from flour companies such as PY-O-My Pastry Mix. Ekcology, I believe, was another well-known mid 20th century brand that produced aluminum bakeware.
I checked out your website btw, and it’s looking good, well done.
Anyway, I hope this was of some interest to you.
I have my mother’s 1930s to 1940s Korean cast iron Dutch ovens and skillets. One Dutch oven seems to have a finish that is very smooth, and the other seems to have some sort of black finish that is worn off towards the bottom half. They are not pitted, although the skillet has a few pits.
I want to clean and use them. Could they possibly be made of aluminum? Anything you can tell me about Korean cast iron products in that time frame would be very helpful. Thanks
Thanks for getting in touch.
It’s fantastic that you want to restore your mother’s old cookware.
Korean manufacturers ramped up production of Western-style cast-iron cookware in the 1960s. Much of this early cast-iron cookware consisted of copies of Lodge skillets and Dutch ovens, putting low-cost Asian manufacturers from Japan and Korea in direct competition with local manufacturers.
If you have older cookware I’m sure it would be well made and smooth. It also sounds like your cast-iron is in great condition for the age, and shallow pitting is known as “flea bites”.
Aluminum cookware has been used since the late 1800s, so it is possible. However, unseasoned cast iron is also shiny, and its color can range from grey to silver, which is often a surprise to those new to cast-iron cookware. Use a magnet to test your cookware: a magnet will stick to an iron pan but will fall off aluminum cookware.
Interesting site. I never thought of looking back further than the pioneering days. It’s amazing that we seem to be returning to the way we used to cook, for example stoneware and iron, and using modern cookware bonded with traditional cooking surfaces such as ceramic.
I see you’re from New Zealand. It’s great to have readers from all over the world, including my homeland. I hope to include the history of the British ironware that early Kiwi settlers used in the late 1800s to early 1900s.
Evolution of Human Innovation
The early roots of stone tool innovation, exchange between distant hominin groups, and the use of coloring material are reported in three papers in the journal Science on March 15, 2018. These milestones in the technological, ecological, and social evolution of the human species date back to 320,000 years ago, roughly coinciding with the oldest ages for fossils attributed to Homo sapiens, and 120,000 years earlier than the oldest fossils of our species in eastern Africa. The discoveries include clues to environmental change at the time of these milestones and to the possible factors behind these key developments in human evolution. The publications stem from research in the Olorgesailie Basin, southern Kenya, a multi-decade project of the Smithsonian Institution’s Human Origins Program in collaboration with the National Museums of Kenya.
© Copyright Human Origins Program, Smithsonian Institution
The Olorgesailie project is led by Dr. Rick Potts, director of the Smithsonian’s Human Origins Program, National Museum of Natural History, Washington, DC. Potts co-authored the three papers with long-term collaborators Dr. Alison Brooks (George Washington University and the NMNH Human Origins Program), Dr. Alan Deino (Berkeley Geochronology Center), Dr. Kay Behrensmeyer (NMNH), Dr. John Yellen (National Science Foundation and the Human Origins Program) and 19 additional researchers.
The onset of the MSA coincided with a major (roughly 85%) change in mammal species and a prolonged period of strong climatic and landscape change. A unique combination of geological, geochemical, paleobotanical, and faunal evidence suggests that the earliest MSA in southern Kenya took hold during an era of pronounced resource instability and episodes of scarcity. In settings of unstable resources, the expansion of human social networks may have become an important survival tool.
(Figure caption: the lower sediments shown are about 103,000 years old, while the light tan sediments in the upper right are about 50,000 years old. The sediments and the dates thus show geologically rapid changes in the landscape where MSA artifacts were made and used by early Homo.)
The research teams for the three studies published in Science include collaborators from the following institutions: the Smithsonian Institution, the National Museums of Kenya, George Washington University, the Berkeley Geochronology Center, the National Science Foundation, the University of Illinois at Urbana-Champaign, the University of Missouri, the University of Bordeaux (Centre National de la Recherche Scientifique), the University of Utah, Harvard University, Santa Monica College, the University of Michigan, the University of Connecticut, Emory University, the University of Bergen, Hong Kong Baptist University and the University of Saskatchewan.
Funding for this research was provided by the Smithsonian, the National Science Foundation, and George Washington University.
Summary of the papers:
The paper by Alison S. Brooks, John E. Yellen, Richard Potts, and 12 coauthors announces the oldest known evidence of the technology and behaviors linked to the emergence of the human species. The article focuses on early evidence of resource exchange, or trade, between distant groups of ancestral humans, and the use of coloring materials, which is a form of symbolic behavior typical of our species.
Brooks, A.S., Yellen, J.E., Potts, R., Behrensmeyer, A.K., Deino, A.L., Leslie, D.E., Ambrose, S.H., Ferguson, J., d’Errico, F., Zipkin, A.M., Whittaker, S., Post, J., Veatch, E.G., Foecke, K., Clark, J.B., 2018. Long-distance stone transport and pigment use in the earliest Middle Stone Age, Science. http://science.sciencemag.org/cgi/doi/10.1126/science.aao2646
The paper by Richard Potts, Anna K. Behrensmeyer, and 13 coauthors identifies the adaptive challenges during this critical phase in African human evolution. Integrating diverse sources of environmental data, the article advances the idea that changing landscapes and climate throughout the region prompted the evolutionary shift by favoring technological innovation, longer distance movements, and greater connectivity among social groups as a means of adjusting to scarce and unpredictable resources.
Potts, R., Behrensmeyer, A.K., Faith, J.T., Tryon, C.A., Brooks, A.S., Yellen, J.E., Deino, A.L., Kinyanjui, R., Clark, J.B., Haradon, C., Levin, N.E., Meijer, H.J.M., Veatch, E.G., Owen, R.B., Renaut, R.W., 2018. Environmental dynamics during the onset of the Middle Stone Age in eastern Africa, Science. http://science.sciencemag.org/cgi/doi/10.1126/science.aao2200
The paper by Anna K. Behrensmeyer, Alan Deino and Richard Potts presents the results of more than 15 years of field research on the last 500 thousand years of geological history in the southern Kenya rift system. The team worked together to integrate the geology, the absolute ages, and the archeological sites to synthesize a detailed history of rapid environmental changes that affected the landscape inhabited by early populations of our genus, Homo.
Behrensmeyer, A.K., Potts, R., Deino, A., 2018. The Oltulelei Formation of the southern Kenyan Rift Valley: A chronicle of rapid landscape transformation over the last 500 k.y., Geological Society of America Bulletin. https://pubs.geoscienceworld.org/gsa/gsabulletin/article/529628/the-oltulelei-formation-of-the-southern-kenyan
The paper by Alan Deino and 5 coauthors provides the chronology for the discoveries described in the accompanying papers, and documents one of the oldest known and most securely dated sequences for the African Middle Stone Age, between 320,000 and 295,000 years ago. The article relies on the latest developments in 40Ar/39Ar dating, integrates U-series analyses carried out at the Berkeley Geochronology Center, and offers a synthesis of dates for late Acheulean and early Middle Stone Age archeological sites throughout Africa.
Bones Filled with Marrow Served as Prehistoric Humans' 'Cans of Soup'
People who lived hundreds of thousands of years ago may not have had pantries or supermarkets, but they stocked up on food when they could, researchers recently discovered.
Evidence from a cave in Israel dating back more than 400,000 years suggests that after butchering their animal prey, Paleolithic humans didn't eat everything immediately. Rather, they stored bones packed with fat and tasty, nutrient-rich marrow to crack open and eat later — much as people today might open and enjoy a can of soup.
These are the earliest clues about food storage in ancient human societies, hinting that their survival was not as hand-to-mouth as once thought, according to a new study.
"Bone marrow constitutes a significant source of nutrition and, as such, was long featured in the prehistoric diet," study co-author Ran Barkai, a senior lecturer in archaeology at Tel Aviv University (TAU), said in a statement. Fats were especially important to people who were hunter-gatherers, as they relied "almost exclusively" on animals for their diet and did not have access to carbohydrates, the study authors reported.
"Until now, evidence has pointed to immediate consumption of marrow following the procurement and removal of soft tissues," Barkai said. "In our paper, we present evidence of storage and delayed consumption of bone marrow."
Archaeologists examined more than 80,000 animal bones and remains found at the Qesem Cave near Tel Aviv; the location dates to between 420,000 and 200,000 years ago, according to the study. Animals that were butchered and eaten by people who lived in the region at the time included hoofed mammals, tortoises, birds and even a few carnivores; their most common prey was Persian fallow deer (Dama dama mesopotamica).
Not all of the deers' bones were brought back to the cave; most were left behind where the animal was butchered, save for the skulls and the long leg bones. What's more, the leg bones showed cut marks on the shafts that differed from those resulting from the butchering of the animals. The scientists suspected that these cuts were made later, to remove dried skin that had been wrapped around the bones to preserve the marrow for future meals.
Experiments helped the researchers to test their hypothesis. First, they wrapped long animal bones called metapodials in skin, and set them aside to see if that would preserve the edible nutrients inside. Weeks later, they sliced off the skin and broke the bones open, comparing the cut marks to the ones found in the ancient bones from the cave.
"We discovered that preserving the bone along with the skin for a period that could last for many weeks enabled early humans to break the bone when necessary and eat the still nutritious bone marrow," lead study author Ruth Blasco, a researcher with TAU's Department of Archaeology and Ancient Near Eastern Civilizations, said in the statement.
"The bones were used as 'cans' that preserved the bone marrow for a long period until it was time to take off the dry skin, shatter the bone and eat the marrow," Barkai added.
Around the middle of the Pleistocene epoch, the geological period that began around 2.6 million years ago and lasted until around 11,700 years ago, human communities underwent "economic, social and cognitive transformations," the study authors wrote. These so-called marrow cans used by Stone Age humans are signs of that change, setting the stage for even more dramatic shifts in human adaptation to come in the millennia that followed, the researchers said.
The findings were published online Oct. 9 in the journal Science Advances.
Stone Age Stew? Soup Making May Be Older Than We'd Thought
The tradition of making soup is probably at least 25,000 years old, says one archaeologist.
Soup comes in many variations — chicken noodle, creamy tomato, potato and leek, to name a few. But through much of human history, soup was much simpler, requiring nothing more than boiling a haunch of meat or other chunk of food in water to create a warm, nourishing broth.
So who concocted that first bowl of soup?
Most sources state that soup making did not become commonplace until somewhere between 5,000 and 9,000 years ago. The Oxford Encyclopedia of Food and Drink in America says, for example, "boiling was not a commonly used cooking technique until the invention of waterproof and heatproof containers about five thousand years ago."
That's probably wrong — by at least 15,000 years.
It now looks like waterproof and heatproof containers were invented much earlier than previously thought. Harvard University archaeologist Ofer Bar-Yosef and colleagues reported last year in Science on their finding of 20,000-year-old pottery from a cave in China. "When you look at the pots, you can see that they were in a fire," Bar-Yosef says.
Their discovery is possibly the world's oldest-known cookware, but exactly what its users were brewing up isn't certain. Perhaps it was alcohol, or maybe it was soup. Whatever it was, the discovery shows that waterproof, heatproof containers are far older than a mere 5,000 years.
That kind of container, though, isn't even necessary for boiling. An ancient soup maker could have simply dug a pit, lined it with animal skin or gut, filled his "pot" with water and dropped in some hot rocks.
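The heat budget of that hot-rock method is easy to sketch. With assumed round numbers (a one-kilogram granite stone pulled from the fire at about 500°C, a litre of cold water, handbook specific heats), a single stone supplies most of the energy needed to reach a boil:

```python
# Back-of-the-envelope heat budget for hot-stone boiling.
# All values are assumed round numbers for illustration only.
C_WATER = 4.19   # specific heat of water, J/(g*K)
C_STONE = 0.79   # specific heat of granite, J/(g*K), approximate

water_g, water_start_c = 1000, 10    # one litre of cold water
stone_g, stone_start_c = 1000, 500   # one sizeable stone from the fire

# Energy needed to raise the water to 100 C
q_needed = C_WATER * water_g * (100 - water_start_c)

# Energy one stone releases while cooling to the water's boiling point
q_stone = C_STONE * stone_g * (stone_start_c - 100)

print(f"water needs ~{q_needed / 1000:.0f} kJ to reach a boil")
print(f"one stone delivers ~{q_stone / 1000:.0f} kJ")
```

On those assumptions, one stone covers roughly four-fifths of the requirement before losses, which fits period accounts of a handful of stones, added in succession, bringing a skin or bark vessel to a rolling boil.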
The power of the expanding steam cracks the rocks, a distinct characteristic that first shows up in the archaeological record around 25,000 years ago in Western Europe, says archaeologist John Speth, an emeritus professor of anthropology at the University of Michigan in Ann Arbor.
But Speth says boiling, and soup, could be even older.
He started thinking about ancient boiling after watching an episode of the television show "Survivorman," in which host Les Stroud boils water in a plastic container. "You can boil without using heated stones," Speth realized. All you need is a waterproof container suspended over a fire — the water inside keeps the material from burning.
Long-ago cooks could have fashioned such a container from tree bark or the hide of an animal, Speth says. Finding evidence of such boiling, though, would be incredibly difficult because those types of materials usually don't get preserved in the archaeological record.
Speth has argued that Neanderthals, ancient human relatives that lived from around 200,000 to 28,000 years ago, would have needed boiling technology to render fat from animal bones to supplement their diet of lean meat, so that they could have avoided death by protein poisoning.
The kidneys and liver are limited in how much protein they can process in a day — when more than that amount is consumed, ammonia or urea levels in the blood can increase, leading to headaches, fatigue and even death. So humans must get more than half their calories from fat and carbohydrates.
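The arithmetic behind that ceiling can be sketched with assumed round numbers; the 250-gram daily processing limit and 2,400-kcal energy budget below are illustrative figures, not clinical values:

```python
# Illustrative arithmetic for the "protein ceiling": if the liver and
# kidneys can safely process only so much protein per day, the rest of
# the day's calories must come from fat and carbohydrate.
# All figures are assumed round numbers for the sake of the example.
PROTEIN_LIMIT_G = 250      # assumed daily processing ceiling, grams
KCAL_PER_G_PROTEIN = 4

daily_kcal_budget = 2400   # assumed active-adult energy need

max_protein_kcal = PROTEIN_LIMIT_G * KCAL_PER_G_PROTEIN
protein_share = max_protein_kcal / daily_kcal_budget

print(f"protein can supply at most ~{max_protein_kcal} kcal "
      f"({protein_share:.0%} of the daily budget)")
```

On those assumptions, well over half the day's energy has to come from fat or carbohydrate, which is why a diet of lean meat alone becomes dangerous.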
If Neanderthals were boiling bones to obtain the fat, they could have drunk the resulting broth, Speth says.
Neanderthals were probably cooking in some way, scientists have concluded. A 2011 study from the Proceedings of the National Academy of Sciences found evidence of cooked starch grains embedded in 46,000-year-old fossil Neanderthal teeth from Iraq.
"This doesn't prove that they were making soups or stews," Speth says — some have suggested the meal would have resembled oatmeal — "but I would say it's quite likely."
Putting a date on the world's first bowl of soup is probably impossible. Anthropologists haven't been able to determine for certain when man was first able to control fire, or when cooking itself was invented (though it was likely more than 300,000 years ago, before Homo sapiens first emerged, Harvard primatologist Richard Wrangham says in his book Catching Fire).
And the story is probably different for people in different parts of the world. It appears that pottery was invented in eastern Asia thousands of years before it emerged in western Asia, Bar-Yosef notes. "Maybe boiling wasn't so important because you had bread" in the West to balance out all that protein, he says.
Other parts of the world never had any tradition of boiling food. "A lot of hunter-gatherers didn't use containers at all," Speth says. In places like Tanzania and the Kalahari, there are tribes that didn't boil water until after Europeans arrived.
Speth says, though, it's very likely that humans were concocting soup at least 25,000 years ago in some places. Whether our ancestors were boiling up broth before that — well, we'll just have to wait and see what the archaeologists dig up.