The CDC reports that 36 percent of American adults eat fast food every day, but what exactly does it consider to be "fast food"?
Many in the health industry, including our very own team of nutritionists, were dismayed to hear of a new Centers for Disease Control and Prevention report released earlier this month suggesting that more than one third of American adults eat fast food daily. The study, conducted between 2013 and 2016, found that a whopping 44 percent of those between the ages of 20 and 39 were purchasing food from fast-food restaurants, and that people with higher incomes had an increased likelihood of regular consumption.
While many prominent media outlets covered the report and its illuminating statistics, nearly all of them failed to fully identify which restaurants were included in the CDC's study.
Leah Douglas, an associate editor at the Food and Environment Reporting Network, took a closer look at which kinds of restaurants were considered to be serving "fast food" by the federal agency's standards. She pinpointed a few chains that could help put into perspective why the numbers seem so high; it turns out that the study considered much more than just McDonald's, Burger King, Wendy's, and their ilk. Many fast-casual chains, even those with reputable commitments to healthier items and non-processed ingredients, qualified for the CDC's report, including Sweetgreen, the hyper-local salad chain where most meals are packed with fresh fruit and vegetables and come in around 520 calories.
The CDC included data from places like Kentucky Fried Chicken and Taco Bell, of course, but it also counted input from coffee shops, bagel stores, and even ice cream parlors. It's a broader definition than most would associate with fast food, and for some experts, it raises concern over the effect the CDC's report will have on the health industry.
After all, Douglas makes the point that the general public could be under the impression that one in three Americans eats a Big Mac every day.
The key to understanding the study in question is to know that the study's participants self-reported whether they had eaten fast food, including items from "carry out" restaurants. When Douglas pressed Cheryl Fryar, the study's lead author, on the definition of "fast food" used in her research, Fryar confirmed that fast-casual chains were included in data collection, including Sweetgreen, as well as the likes of Panera Bread, Chipotle, and even Mediterranean-chain Zoe's Kitchen.
Marion Nestle, a longtime nutrition and food studies professor at New York University, told Douglas that the team behind the CDC's study "did not assess diet quality for this report." Another CDC-commissioned study, published in 2013, suggested that 11 percent of Americans' caloric intake came from fast food; that study used a very similar self-reported definition of fast food.
The bottom line: It's clear that eating fast-food meals regularly can be extremely problematic for health and holistic diets, but the CDC may need to work on identifying which kinds of restaurants are truly playing a role in America's obesity rates.
CDC reveals just how much fast food American kids eat each day
One-third of U.S. kids eat fast food on a typical day, according to a new report from the Centers for Disease Control and Prevention.
More than one in three American kids will eat fast food today, a new government report says.
The same will be true tomorrow, and the next day, and the day after that.
On any given day, 34.3% of U.S. children and teens between the ages of 2 and 19 eat pizza, fried chicken, tacos or some other dish prepared in a fast-food restaurant, according to data collected by the Centers for Disease Control and Prevention.
More specifically, 12.1% of these young diners will get more than 40% of their daily calories in the form of fast food. An additional 10.7% will trace 25% to 40% of their daily calories to a fast-food joint, and 11.6% will get fewer than 25% of their calories from one of these dining establishments.
When you average it all out, the youth of America get 12.4% of their calories on a bun, out of a deep fryer or from another quintessentially fast-food source every single day.
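The bracket shares above imply the overall average. As a rough sanity check, here is a back-of-the-envelope weighted average in Python; the midpoint calorie share assumed for each bracket is illustrative, not a figure published in the CDC report.

```python
# Share of kids in each fast-food calorie bracket (from the CDC report),
# paired with an ASSUMED midpoint calorie share for that bracket.
# Kids who ate no fast food that day contribute zero calories.
buckets = [
    (0.121, 0.50),   # 12.1% of kids got >40% of calories (assume ~50%)
    (0.107, 0.325),  # 10.7% got 25%-40% (midpoint 32.5%)
    (0.116, 0.125),  # 11.6% got <25% (assume ~12.5%)
]

# Weighted average over ALL kids, as a share of daily calories.
avg = sum(kids * cals for kids, cals in buckets)
print(f"estimated average: {avg:.1%}")  # → estimated average: 11.0%
```

With these assumed midpoints the estimate lands within about a point and a half of the reported 12.4% average, which is as close as rough midpoints can be expected to get.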
It doesn’t matter whether these diners are boys or girls. From toddlers to teenagers, the proportion of daily calories obtained from fast food was statistically equivalent for both genders, according to the report published Tuesday by the CDC’s National Center for Health Statistics.
Nor did it matter whether diners were rich or poor. Kids from families who were close to the poverty line counted on fast food for 11.5% of their daily calories, on average. Kids at the other end of the economic spectrum averaged 13% of their daily calories from fast food. That gap wasn’t big enough to be considered statistically significant, the report said.
Even weight status had little bearing on the appetite for fast food. Children and teens who were underweight or had a normal weight averaged 12.2% of their daily calories in the form of fast food. That was slightly higher than the 11.6% for overweight kids and slightly lower than the 14.6% for obese kids. Again, those differences weren’t big enough for the researchers to say they were real.
There was a significant difference in fast-food consumption according to race and ethnicity. Asian American kids averaged fewer calories from fast food than their peers, getting only 8% on any given day, on average. That compared with 11.2% for Latino kids, 13.1% for white kids and 13.9% for African American kids. (The differences among non-Asian kids weren’t statistically significant.)
FOR THE RECORD
Sept. 18, 3:18 p.m.: An earlier version of this story said Asian American children and teens were less likely than their peers to eat fast food on any given day, instead of saying that they averaged fewer daily calories in the form of fast food. It also said younger children were less likely than teens to eat fast food on a typical day, instead of saying that fast food accounted for fewer of their daily calories.
The researchers speculated that fast food hadn’t caught on as much in Asian American households because these families weren’t as assimilated into the U.S. lifestyle, including its eating habits. Fully 27.4% of Asian children in the United States were born overseas, compared with 19.7% of Latino children, 2.5% of whites and 1.9% of blacks.
The other significant difference had to do with age. Overall, children between the ages of 2 and 11 ate less fast food than adolescents between the ages of 12 and 19. On a typical day, fast food accounted for 8.7% of the calories eaten by younger kids, compared with 16.9% for older children. That pattern was seen regardless of gender, race or ethnicity, weight status or family income, the researchers found.
The report was based on data from the CDC’s 2011-2012 National Health and Nutrition Examination Survey.
This Is the Most Unpopular Restaurant Chain in America
Though it topped our Dissatisfaction Index, customers keep going back to this iconic brand.
Restaurants have had a rough year, considering they're one of the few places patrons can't enjoy with a mask on. But many diners have still turned to fast-food joints and carryout options throughout 2020 and now in 2021. And through the challenges, we've been able to see which restaurant chains could take the heat. With that in mind, we set out to find which major restaurants have the worst customer service, according to diners themselves. Yes, it's time to crown the most unpopular restaurant chain in America.
First, we used the 2020 American Customer Satisfaction Index (ACSI) report as a jumping-off point for our list of 31 fast-food and casual-dining restaurants. We also factored in the report's customer satisfaction rating of the 31 contenders, with 100 being the highest level of satisfaction. In addition to that score, we gathered data from the consumer review and rating sites Trustpilot and Pissed Consumer to find out where customers were the most and least satisfied. Both sites use a five-star rating system.
Finally, we gave each of these metrics a weighted value before running them through our exclusive algorithm to see how each chain scored on our 100-point Dissatisfaction Index, a test no restaurant wants to score highly on.
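The article doesn't publish its exact formula, so the following is only a minimal sketch of how a weighted dissatisfaction score like this could work. The weights and the normalization here are assumptions for illustration, not the article's actual algorithm.

```python
def dissatisfaction_index(acsi, trustpilot, pissed_consumer,
                          weights=(0.4, 0.3, 0.3)):
    """Combine three ratings into a 0-100 dissatisfaction score.

    acsi: 0-100 satisfaction score (higher = more satisfied).
    trustpilot, pissed_consumer: 0-5 star ratings.
    weights: assumed weighting; the article does not disclose its values.
    """
    # Convert every input onto a common 0-100 *dissatisfaction* scale.
    parts = [
        100 - acsi,                   # invert satisfaction
        (5 - trustpilot) / 5 * 100,   # invert and rescale stars
        (5 - pissed_consumer) / 5 * 100,
    ]
    return sum(w * p for w, p in zip(weights, parts))

# A hypothetical chain with an ACSI of 70, 2.0 stars on Trustpilot,
# and 1.5 stars on Pissed Consumer:
score = dissatisfaction_index(70, 2.0, 1.5)
print(round(score, 1))  # → 51.0 (higher means more dissatisfied customers)
```

Note the design choice: because ACSI and the review sites use different scales, each input is normalized to 0-100 before weighting, so no single source dominates by virtue of its units.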
Generally, consumers surveyed for the ACSI report were much more positive about their experiences than internet commenters, who were less satisfied (not a huge surprise there). You also have to factor in that people tend to remember bad experiences more strongly than good ones, and are therefore more likely to write a negative review than a positive one.
Nevertheless, the end results of our number crunching may surprise you, as the most hated restaurant chain is a stalwart of American culture that won't be going anywhere, no matter how much customers complain. Read on to discover the most unpopular restaurant chain in America and to see how your favorite eateries ranked.
How you can get more information about strychnine
You can contact one of the following:
- Regional poison control center: 1-800-222-1222
- Centers for Disease Control and Prevention
- Public Response Hotline (CDC)
- 888-232-6348 (TTY)
The Centers for Disease Control and Prevention (CDC) protects people's health and safety by preventing and controlling diseases and injuries, enhances health decisions by providing credible information on critical health issues, and promotes healthy living through strong partnerships with local, national, and international organizations.
For Fast-Food Restaurants
You can eat sensibly at fast-food restaurants by choosing lower-fat foods instead of "the usual."
- Instead of a Danish, try a small bagel.
- Instead of a jumbo cheeseburger, try a grilled chicken sandwich, a sliced meat sandwich or a regular hamburger on a bun with lettuce, tomato and onion.
- Instead of fried chicken, try grilled chicken and a side salad.
- Instead of fried chicken pieces, try a grilled chicken sandwich.
- Instead of french fries, try a baked potato with vegetables and/or low-fat or fat-free sour cream or margarine on the side.
Written by American Heart Association editorial staff and reviewed by science and medicine advisers. See our editorial policies and staff.
Tim Hortons fills a similar space to Dunkin', serving coffee, baked goods, and a range of sandwiches and snacks. The "Double Double" (two creams and two sugars in your coffee) isn't officially printed on the menu, yet it's hands-down one of Tim's most popular orders. Not only was this confirmed in a 2012 tweet by the brand, but its popularity could also be gauged by the fact that the specific order was added to the Canadian Oxford Dictionary in 2004.
In traditional tribal societies, the gathering of shellfish, wild plants, berries and seeds is often done by women. Bison have traditionally been an important source of food for the Plains Indians in the area between the Mississippi River and the Rocky Mountains.
Recipes were initially passed down through oral tradition. Over a period of hundreds of years, some tribes migrated into different climate zones, so by the time European settlers recorded these recipes the cuisine had probably adapted to use local ingredients. Some anthropologists propose that the southwestern Eastern Pueblo, Hopi and Zuni may have retained more of the original elements. 
Country food
Country food, in Canada, refers to the traditional diets of the Indigenous peoples in Canada (known in Canada as First Nations, Métis, and Inuit), especially in remote northern regions where Western food is an expensive import and traditional foods are still relied upon.
The Government of the Northwest Territories estimated in 2015 that nearly half of Northwest Territories residents in smaller communities relied on country food for 75% of their meat and fish intake; in larger communities the percentage was lower, with the lowest reliance (4%) found in Yellowknife, the capital and only "large community".
The most common country foods in the Northwest Territories' area include mammals and birds (caribou, moose, ducks, geese, seals, hare, grouse, ptarmigan), fish (lake trout, char, inconnu (coney), whitefish, pike, burbot) and berries (blueberries, cranberries, blackberries, cloudberries). 
In the eastern Canadian Arctic, Inuit consume a diet of foods that are fished, hunted, and gathered locally. This may include caribou, walrus, ringed seal, bearded seal, beluga whale, polar bear, berries, and fireweed.
The cultural value attached to certain game species, and to certain parts, varies. For example, in the James Bay region, a 1982 study found that beluga whale meat was principally used as dog food, whereas the blubber, or muktuk, was a "valued delicacy". Value also varies by age, with Inuit preferring younger ringed seals and often using the older ones for dog food.
Contaminants in country foods are a public health concern in Northern Canada; volunteers are tested to track the spread of industrial chemicals from emitters (usually in the South) into the northern food web via the air and water.
In 2017, the Government of the Northwest Territories committed to using country foods in the soon-to-open Stanton Territorial Hospital, despite the challenges of obtaining, inspecting, and preparing sufficient quantities of wild game and plants. 
In Southern Canada, wild foods (especially meats) are relatively rare in restaurants, due to wildlife conservation rules against selling hunted meat, as well as strict meat inspection rules. There is a cultural divide between rural and remote communities that rely on wild foods, and urban Canadians (the majority), who have little or no experience with them. 
Eastern Native American cuisine
The essential staple foods of the Indigenous peoples of the Eastern Woodlands have traditionally been corn (also known as maize), beans, and squash, known as "The Three Sisters" because they were planted interdependently: the beans grew up the tall stalks of the corn, while the squash spread out at the base of the three plants and provided protection and support for the root systems.
Maple syrup is another essential food staple of the Eastern Woodlands peoples. Tree sap is collected from sugar maple trees during the beginning of springtime when the nights are still cold.  Birch bark containers are used in the process of making maple syrup, maple cakes, maple sugar, and maple taffy. When the sap is boiled to a certain temperature, the different variations of maple food products are created. When the sap starts to thicken, it can be poured into the snow to make taffy. 
Since the first colonists of New England had to adapt their foods to the local crops and resources, the Native influences of Southern New England Algonquian cuisine form a significant part of New England cuisine with dishes such as cornbread, succotash and Johnnycakes and ingredients such as corn, cranberries and local species of clam still enjoyed in the region today. 
The Wabanaki tribal nations and other eastern woodlands peoples have made nut milk and an infant formula from nuts and cornmeal.
Southern Native American cuisine
Southeastern Native American culture has formed the cornerstone of Southern cuisine from its origins till the present day. From Southeastern Native American culture came one of the main staples of the Southern diet: corn (maize), either ground into meal or limed with an alkaline salt to make hominy, using a Native American technique known as nixtamalization.  Corn is used to make all kinds of dishes from the familiar cornbread and grits to liquors such as whiskey, which has been an important trade item, historically.
Though a less important staple, potatoes were also adopted from Native American cuisine and have been used in many ways similar to corn. Native Americans introduced the first non-Native American Southerners to many other vegetables still familiar on southern tables. Squash, pumpkin, many types of beans, tomatoes, many types of peppers, and sassafras all came to the settlers via Indigenous peoples. The Virginia Algonquian word pawcohiccora means hickory-nut meat or a nut milk drink made from it.
Many fruits are available in this region. Muscadines, blackberries, raspberries, and many other wild berries were part of Southern Native Americans' diet.
To a far greater degree than anyone realizes, several of the most important food dishes of the Southeastern Indians live on today in the "soul food" eaten by both black and white Southerners. Hominy, for example, is still eaten ... Sofkee live on as grits ... cornbread [is] used by Southern cooks ... Indian fritters ... variously known as "hoe cake" ... or "Johnny cake" ... Indians' boiled cornbread is present in Southern cuisine as "corn meal dumplings" ... and as "hush puppies" ... Southerners cook their beans and field peas by boiling them, as did the Indians ... like the Indians they cure their meat and smoke it over hickory coals.
Southeastern Native Americans traditionally supplement their diets with meats derived from the hunting of native game. Venison has always been an important meat staple, due to the abundance of white-tailed deer in the area. Rabbits, squirrels, opossums, and raccoons are also common.
Livestock, adopted from Europeans, in the form of hogs and cattle, are also kept. Aside from the more commonly consumed parts of the animal, it is traditional to also eat organ meats such as liver, brains, and intestines. This tradition remains today in hallmark dishes like chitterlings (commonly called chitlins), which are the fried large intestines of hogs; livermush, a common dish in the Carolinas made from hog liver; and pork brains and eggs. The fat of the animals, particularly of hogs, is traditionally rendered and used for cooking and frying. Many of the early settlers were taught Southeastern Native American cooking methods.
Selected dishes
- Chitterlings (chitlins): usually made from the large intestines of a hog
- Hominy: coarsely ground corn, used to make grits
- Hushpuppies: small, savory, deep-fried round balls made from cornmeal-based batter
- Hoecake: an Indian fritter
- Livermush: pig liver, parts of pig heads, cornmeal and spices
- Sofkee: a sour corn soup or drink
- Myth #1: Food is a delivery vehicle for nutrients. What really matters isn't broccoli but its fiber and antioxidants. If we get that right, we get our diet right. Foods kind of get in the way.
- Myth #2: We need experts to tell us how to eat. Nutrients are invisible and mysterious. "It is a little like religion," Pollan said. "If a powerful entity is invisible, you need a priesthood to mediate your relation with food."
- Myth #3: The whole point of eating is to maintain and promote bodily health. "You are either improving or ruining your health when you eat -- that is a very American idea," Pollan says. "But there are many other reasons to eat food: pleasure, social community, identity, and ritual. Health is not the only thing going on on our plates."
- Myth #4: There are evil foods and good foods. "At any given time there is an evil nutrient we try to drive like Satan from the food supply -- first it was saturated fats, then it was trans fat," Pollan says. "Then there is the evil nutrient's doppelganger, the blessed nutrient. If we get enough of that, we will be healthy and maybe live forever. It's funny through history how the good and bad guys keep changing."
Great Plains Native American cuisine
Indigenous peoples of the Great Plains and Canadian Prairies or Plains Indians have historically relied heavily on American bison (American buffalo) as a staple food source. One traditional method of preparation is to cut the meat into thin slices then dry it, either over a slow fire or in the hot sun, until it is hard and brittle. In this form it can last for months, making it a main ingredient to be combined with other foods, or eaten on its own.
One such use could be pemmican, a concentrated mixture of fat and protein, and fruits such as cranberries, Saskatoon berries, blueberries, cherries, chokecherries, and currants are sometimes added. Many parts of the bison were utilized and prepared in numerous ways, including: "boiled meat, tripe soup perhaps thickened with brains, roasted intestines, jerked/smoked meat, and raw kidneys, liver, tongue sprinkled with gall or bile were eaten immediately after a kill." 
The animals that Great Plains Indians consumed, like bison, deer, and antelope, were grazing animals. Due to this, they were high in omega-3 fatty acids, an essential acid that many diets lack. 
When asked to name traditional staple foods, a group of Plains elders identified prairie turnips, fruits (chokecherries, June berries, plums, blueberries, cranberries, strawberries, buffalo berries, gooseberries), potatoes, squash, dried meats (venison, buffalo, jack rabbit, pheasant, and prairie chicken), wild rice, and pemmican.
Western Native American cuisine
In the Pacific Northwest, traditional diets include salmon and other fish, seafood, mushrooms, berries, and meats such as deer, duck, and rabbit.
In contrast to the Easterners, the Northwestern peoples were traditionally primarily hunter-gatherers. The generally mild climate led to the development of an economy based on year-round abundant food supplies rather than on seasonal agriculture.
In what is now California, acorns can be ground into a flour that has at times served as the principal foodstuff for about 75 percent of the population,  and dried meats can be prepared during the dry season. 
Southwestern Native American cuisine
Ancestral Puebloans of the present-day Four Corners region of the United States (comprising Arizona, Colorado, New Mexico, and Utah) initially practiced subsistence agriculture, cultivating maize, beans, squash, sunflower seeds, and pine nuts from the pinyon pine. Game meat, including venison and rabbit raised through cuniculture, and freshwater fish such as Rio Grande cutthroat trout and rainbow trout are also traditional foods in the region.
Ancestral Puebloans are also known for their basketry and pottery, indicating both an agricultural surplus that needed to be carried and stored, and clay pot cooking. Grinding stones have been used to grind maize into meal for cooking. Archaeological digs indicate a very early domestication of turkeys for food.
New Mexican cuisine is heavily rooted in both Pueblo and Hispano food traditions, and it is prevalent throughout the American Southwest, especially in New Mexico.
The 2002 cookbook Foods of the Southwest Indian Nations won a James Beard Award, the first Native American cookbook so honored. Publishers had told the author, Lois Ellen Frank, that there was no such thing as Native American cuisine.
Alaskan native cuisine
Alaska native cuisine consists of nutrient-dense foods such as seal, fish (salmon), and moose. Along with these, berries (huckleberries) and bird eggs are traditionally consumed by Alaska Natives. 
Seal, walruses, and polar bear are the large game that Alaska Natives hunt. Smaller game includes whitefish, Arctic char, Arctic hare, and ptarmigan.
Due to weather, edible plants like berries are only available to be consumed in the summer, so the people have a diet very high in fat and protein, but low in carbohydrates.
The game that is hunted is also used for clothing. The intestines of large mammals are used to make waterproof clothing and caribou fur is used to make warm clothing. 
Food Insecurity In The U.S. By The Numbers
Food Bank For New York City hosts a pop-up food pantry during Hunger Action Month at Lincoln Center on September 24, 2020.
With COVID-19 continuing to spread, and millions of Americans still out of work, one of the nation's most urgent problems has only grown worse: hunger.
In communities across the country, the lines at food pantries are stretching longer and longer, and there's no clear end in sight. Before the pandemic, the number of families experiencing food insecurity — defined as a lack of consistent access to enough food for an active, healthy life — had been steadily falling. But now, as economic instability and a health crisis take hold, new estimates point to some of the worst rates of food insecurity in the United States in years.
"COVID has just wreaked havoc on so many things: on public health, on economic stability and obviously on food insecurity," said Luis Guardia, the president of the Food, Research and Action Center.
It's a crisis that's testing families, communities and the social safety net in ways that may have seemed unthinkable before the pandemic began. Here's a closer look at the landscape:
Nearly 1 in 4 households have experienced food insecurity this year
Even before the pandemic hit, some 13.7 million households, or 10.5% of all U.S. households, experienced food insecurity at some point during 2019, according to data from the U.S. Department of Agriculture. That works out to more than 35 million Americans who were either unable to acquire enough food to meet their needs, or uncertain of where their next meal might come from, last year.
For about a third of these households, access to food was so limited that their eating patterns were disrupted and food intake was reduced. The rest were able to obtain enough food to avoid completely disrupting their eating patterns, but had to cope by eating less varied diets or utilizing food assistance programs.
The coronavirus pandemic has only worsened the problem. According to one estimate by researchers at Northwestern University, food insecurity more than doubled as a result of the economic crisis brought on by the outbreak, hitting as many as 23% of households earlier this year.
Millions more children are experiencing food insecurity
In non-pandemic times, households with children were nearly 1.5 times more likely to experience food insecurity than households without children, according to the USDA, which reported that 13.6% of households with children experienced food insecurity last year. More than 5 million children lived in these homes.
Then came the coronavirus. An analysis by the Brookings Institution conducted earlier this summer found that in late June, 27.5% of households with children were food insecure — meaning some 13.9 million children lived in a household characterized by child food insecurity. A separate analysis by researchers at Northwestern found insecurity has more than tripled among households with children to 29.5%.
School lunch programs were already struggling to meet rising demand before the pandemic. With COVID-19 now keeping children out of school, many don't have access to school lunches at all.
"The other thing that COVID has done is it's really affected kids a lot in terms of food insecurity," Guardia said. "One of the things we've noticed across the board is that households with children are more food insecure. And we believe that also has to do with school closures. So a lot of kids get their nutrition from school meals, and that's been disrupted."
Black families are twice as likely as whites to face food insecurity
The data shows that food insecurity is more likely to wreak havoc on some communities than others.
Black and Hispanic Americans are particularly disproportionately affected. According to USDA data, 19.1% of Black households and 15.6% of Hispanic households experienced food insecurity in 2019. White Americans fell below the national average, with 7.9% experiencing food insecurity.
College graduates experienced food insecurity at a rate of just 5% last year. For those without a high school degree, the rate skyrocketed to 27%. Adults who have a disability — in particular adults who have a disability and are not in the work force — also experience more than two times the rate of food insecurity as adults who do not have a disability.
19 million Americans live in food deserts
Location is another factor at play. People who live in food deserts are often more likely to experience food insecurity because food is harder to obtain where they live. About 19 million people, or roughly 6% of the population, lived in a food desert and 2.1 million households both lived in a food desert and lacked access to a vehicle in 2015, according to the USDA.
Food can also be costlier in these areas. A 2010 estimate from the USDA found that groceries sold in food deserts can cost significantly more than groceries sold in suburban markets, meaning people in low-income communities impacted by food insecurity often pay more for their food. Milk prices, for example, were about 5% higher in some spots, while cereal prices were sometimes 25% higher.
The definition of a food desert can change depending on where you live. In urban settings, you need to live more than a mile away from a supermarket to be considered inside a food desert; for rural areas, it's greater than 10 miles. Rural areas are slightly more likely to be food deserts than urban areas; according to Feeding America, while rural counties make up just 63% of counties in the country, they account for 87% of counties with the highest rates of food insecurity.
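The distance test described above can be captured in a small helper. This is a sketch only, using the 1-mile urban and 10-mile rural cutoffs from the paragraph; the real USDA classification also factors in income and census-tract data, which are omitted here.

```python
def in_food_desert(miles_to_supermarket: float, urban: bool) -> bool:
    """Apply the distance test described above: more than 1 mile from
    a supermarket in urban areas, more than 10 miles in rural areas.
    (Simplified: the USDA's full definition also considers income.)"""
    threshold = 1.0 if urban else 10.0
    return miles_to_supermarket > threshold

print(in_food_desert(1.5, urban=True))    # → True
print(in_food_desert(5.0, urban=False))   # → False
```

The asymmetry in thresholds reflects the assumption that rural residents routinely travel farther for groceries, which is why a 5-mile trip counts as a food desert in a city but not in the countryside.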
38 million people used SNAP in 2019
One in nine people in the U.S. used SNAP — the Supplemental Nutrition Assistance Program (also known as food stamps) — in 2019, according to the Center on Budget and Policy Priorities. SNAP benefits vary depending on the need of the participant, but the average SNAP benefit for each member of a household was $129 per month in fiscal year 2019.
SNAP is the largest food assistance program for low-income Americans in the nation, and because of COVID-19, demand for the program has been growing. In March, when the Families First Act passed as part of the government's emergency response to the pandemic, the maximum benefit for SNAP recipients was temporarily expanded by an estimated 40%. An analysis from the New York Times shows that SNAP grew by 17% from February 2020 to May 2020, three times faster than in any previous three-month period.
Yet even with that expanded food aid, the program hasn't managed to meet the nation's food security needs. Congressional Democrats have sought to increase funding for SNAP and other nutrition assistance benefits, but prospects appear uncertain.
COVID-19 could double the number of people experiencing food insecurity globally
The problem is hardly unique to the U.S. According to the United Nations World Food Program, the global pandemic could double the number of people experiencing acute food insecurity, from 135 million in 2019 to 265 million in 2020.
“COVID-19 is potentially catastrophic for millions who are already hanging by a thread," the program's chief economist, Arif Husain, said in a statement published this spring. "It is a hammer blow for millions more who can only eat if they earn a wage. Lockdowns and global economic recession have already decimated their nest eggs. It only takes one more shock — like COVID-19 — to push them over the edge. We must collectively act now to mitigate the impact of this global catastrophe."
If You're Layering These Masks, the CDC Says to Stop Immediately
You should steer clear of these face coverings if you're trying to double mask.
Masks have been encouraged since early in the pandemic to offer protection against the fast-spreading coronavirus. But as new variants of the virus started spreading at more alarming rates, people went looking for further ways to ensure that they were fully protected. The Centers for Disease Control and Prevention (CDC) recently updated their mask guidelines to endorse double masking — the practice of wearing two masks, one layered over the other, for better protection against COVID. However, the CDC did make some stipulations in their updated guidelines: when it comes to layering your masks, there are some types of masks that shouldn't be used. Read on to find out which masks you can't double mask with, and for more essential mask guidance, If You See This on Your Mask, the FDA Says Toss It Immediately.
When layering masks, the CDC explicitly says you should not use two disposable masks. Why? It won't actually help protect you more. According to the CDC, "disposable masks are not designed to fit tightly and wearing more than one will not improve fit." For a mask to fit tightly, it must sit over your nose, under your chin, and against your cheeks without any gaps. And for more mask recommendations to heed, The CDC Warns Against Using These 6 Face Masks.
You can layer disposable masks with other masks, however. The CDC recommends people double mask by wearing one disposable mask underneath a cloth mask. When doing this, "the second mask should push the edges of the inner mask against your face," the agency explained. Earlier research has endorsed this particular order of layering as well.
A Jan. 21 study published in the journal Cell said that people should "wear a cloth mask tightly on top of a surgical mask where the surgical mask acts as a filter, and the cloth mask provides an additional layer of filtration while improving the fit." As long as both masks fit well, the researchers found that this combination could stop the spread of the coronavirus with more than 90 percent efficacy. At the same time, the researchers clarified that reversing this mask order could change the fit and make it less effective. And for more ways your mask may not be as protective, If Your Mask Doesn't Have These 4 Things, Get a New One, Doctor Says.
The CDC also warns against double masking with KN95 masks. According to the agency, you should not combine a KN95 mask with any other mask, nor should you layer another KN95 mask on top of a KN95 mask. Best Life reached out to the CDC for further explanation on this stipulation, but has not yet received a response. And for more coronavirus news, The CDC Says You Don't Have to Do This Anymore Once You're Vaccinated.
Paul Hickey, president of PuraVita Medical, a company that manufactures KN95 masks, said layering KN95 masks could be dangerous. "KN95 masks are designed to be a respirator," he said. "A respirator is designed to create an airtight seal around your face so you only breathe air that comes in through the respirator material and the air you breathe out only goes through the respirator material. Because of this, if you double layer a KN95 respirator or an N95 respirator you will have a difficult time breathing and so yes, it could be dangerous. Respirators are designed to only be one layer and not double stacked."
On the other hand, Sam Barone, MD, chief medical officer of BioPharma and president of Halodine, said layering another mask with a KN95 may also negatively affect the KN95 mask's fit—which is already designed to be extraordinarily effective. "N95 and KN95 masks are designed to filter 95 percent of aerosol particulates through a tight fit provided by proper wear. Placing another mask over top of these masks would alter the fit and could result in them being less effective," Barone explained. And for more on the spread of coronavirus, This Is Where You're Most Likely to Catch COVID, New Study Says.
Alongside these updated guidelines to endorse double masking, the CDC published a study on this method on Feb. 10. According to the study, when someone is double masking by wearing a cloth mask over a disposable surgical mask, they reduce their exposure to possibly contaminated aerosol particles by around 90 percent. In comparison, a surgical mask on its own only blocks 56 percent of the particles when subjected to a simulated cough, while a cloth mask on its own only blocks 51 percent. And for more up-to-date information, sign up for our daily newsletter.
Eat Foods, Not Nutrients
Pollan, author of In Defense of Food: An Eater's Manifesto and The Omnivore's Dilemma: A Natural History of Four Meals, is professor of science and environmental journalism at the University of California, Berkeley.
Pollan says that where we've gone wrong is by focusing on the invisible nutrients in foods instead of on foods themselves. He calls this "nutritionism" -- an ideology that's lost track of the science on which it was based.
It's good for scientists to look at why carrots are good for us, and to explore the possible benefits of, say, substance X found in a carrot.
What happens next is that well-meaning experts tell us we should eat more foods with substance X. But the next thing you know, the food industry is selling us a food enriched with substance X. We may not know whether substance X, when not in a carrot, is good or bad for us. And we may be so impressed with the new substance-X-filled product that we buy it and eat it -- even though it may contain unhealthy ingredients, such as high-fructose corn syrup and salt.
Pollan identifies four myths behind this kind of thinking:
Pollan remembers that when fats were declared to be evil, his mother switched the family to stick margarine. His grandmother predicted that some day stick margarine would be the evil food. Today, we know she was right: margarine was made with trans fats.
The trouble with the whole notion of "evil" and "blessed" ingredients is that it helps the food industry sell us processed foods that are free of the evil thing or full of the blessed one. We buy them, not realizing they may contain many other ingredients that aren't good for us.
Collins agrees with Pollan's central theme that whole foods are vastly better for us than processed foods. But our food system makes it hard for many Americans to get whole foods.
"If our food system made more whole foods at lower cost and made them more available, that would help with our public health," Collins says. "We need full-service groceries in urban centers, where people can get to them. Unfortunately, urban centers are getting filled with fast food stores and liquor stores. Pollan's rules are good, and it is one thing to eat by his rules, but making our environment such that people can live by the rules is not always easy."
Will the CDC be pushing for these kinds of changes? Yes, suggested Anne Haddix, chief policy officer at the CDC's Office of Strategy and Innovation, during the panel discussion following Pollan's remarks to the CDC.
"How we go forward on this will take some very different types of thinking than we have done in the past," Haddix said. "We have an opening we have not had for years. ... Of the federal agencies trying to address food issues, CDC is uniquely positioned. We have to step out as leaders. ... Now is the time to ramp up our efforts and reach out to people who make us uncomfortable and go for it."
Michael Pollan, Knight Professor of Journalism, University of California, Berkeley.
Janet Collins, PhD, director, National Center for Chronic Disease Prevention and Health Promotion, CDC, Atlanta.
Michael Pollan lecture and panel discussion, March 20, 2009, with: Janet Collins, PhD, director, National Center for Chronic Disease Prevention and Health Promotion, CDC, Atlanta; Howard Frumkin, MD, MPH, director, National Center for Environmental Health and Agency for Toxic Substances and Disease Registry, CDC, Atlanta; Anne Haddix, MD, chief policy officer, Office of Strategy and Innovation, CDC, Atlanta; Arthur Liang, MD, MPH, director, National Center for Zoonotic, Vector-Borne & Enteric Diseases, CDC, Atlanta.