Could policy changes be responsible for changing patterns of peanut allergy?
Policy changes altered official advice about early exposure to peanuts – what were the consequences, and what can we learn?
This is adapted from a recent article I published in The Conversation, but with some added context and detail.
Peanut allergy is one of the most common food allergies, affecting between 1% and 2% of people living in the West. And, for many years, its prevalence has been rising.
But a recent study out of the US shows that the rate of peanut allergy diagnoses in infants has actually declined. It appears this decline may be due to changes in allergy guidelines – highlighting the importance of introducing this common allergen early on.
What is a food allergy?
A food allergy occurs when your immune system reacts inappropriately to something it should ignore – in this case, a particular food. The same kind of misdirected reaction to pollen causes hayfever, the most common allergic condition; true food allergies are much rarer. There can, however, be some confusion as to what counts as a food allergy. For example, a report by the UK Food Standards Agency found that although over 30% of adults reported some type of adverse reaction to foods, only 6% were confirmed to have a true food allergy (using clinical criteria such as the presence of food-reactive antibodies). It is likely that most of the adults experiencing adverse reactions to foods actually had food intolerances, which do not involve an immune-mediated attack. Of the true food allergies, peanut allergy is the most common – and also the most common cause of fatal food reactions.
How common are food allergies?
The proportion of people with food allergies in England more than doubled between 2008 and 2018, and similar data from the US showed that the number of people with a food allergy more than tripled between 1997 and 2008. The reasons for these increases are complex and involve many factors, including exposure to environmental pollutants, alterations in the gut microbiome and genetic predisposition. There also appears to be a link between certain inflammatory health conditions (such as atopic dermatitis) and an infant's likelihood of developing a food allergy. This latest study shows that the US appears to have deviated from this overall trend, with peanut allergies actually falling in infants.
What are the new findings?
The study examined changes in the rates of peanut allergy since 2015 – the year US allergy guidelines changed to encourage infants considered most at risk of food allergy (such as those with atopic dermatitis) to be introduced to peanuts early in life. Previous research had shown that these guideline changes led to more parents introducing peanuts into their child's diet by one year of age. The research team wanted to assess whether this had had any effect on peanut allergy rates, too.
They enrolled almost 39,000 children during the pre-guidelines phase (when advice was to avoid peanuts) and around 47,000 in the post-guidelines phase (after 2015). Allergy incidence in both groups was tracked for one to two years.
The research showed that the total rate of peanut allergy decreased from almost 0.8% to 0.5%. This meant fewer at-risk infants developed a peanut allergy following the guideline change.
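To put those percentages in perspective, here is a minimal back-of-the-envelope sketch using only the rounded rates quoted above (not the study's exact figures); everything in it is illustrative:

```python
# Illustrative only: rounded rates as quoted in this article
# (~0.8% of infants diagnosed before the 2015 guideline change, ~0.5% after).
pre_rate = 0.008   # proportion of infants with a peanut allergy, pre-guidelines
post_rate = 0.005  # proportion post-guidelines

absolute_reduction = pre_rate - post_rate           # 0.003, i.e. 0.3 percentage points
relative_reduction = absolute_reduction / pre_rate  # ~0.375, i.e. over a third lower

print(f"Absolute reduction: {absolute_reduction:.1%}")
print(f"Relative reduction: {relative_reduction:.0%}")
```

In other words, on these rounded numbers the diagnosis rate fell by around 0.3 percentage points – a relative drop of a little over a third.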
These findings mirror earlier UK research showing that exposure to peanuts early in life, before the age of five, was linked to a reduced likelihood of developing an allergy.
Food allergy guidelines
In the late 1990s and early 2000s, the rising incidence of food allergies and their life-threatening implications prompted sweeping policy changes in many western countries. For example, in the UK in 1998 and the US in 2000, guidelines changed to recommend that high-risk allergens (such as peanuts) be completely avoided by pregnant women, breastfeeding mothers and infants considered at high risk of allergy. But here's the thing: these guidelines were made in the absence of any rigorous studies showing that peanut avoidance would have a positive effect. Indeed, animal studies had suggested there might be no benefit – showing instead that eating potential allergens early in life invokes an important phenomenon called oral tolerance. Oral tolerance is where the immune system learns to ignore a potential allergen after it has been introduced to the gut through the diet. How oral tolerance develops is complex, involving several mechanisms that effectively "switch off" immune cells so they don't mistake certain foods for a threat. But despite the change in advice to avoid peanuts, rates of peanut allergy did not fall.
A major UK review conducted in 2008 subsequently found no clear evidence that eating or avoiding peanuts (or foods containing peanuts) during pregnancy, while breastfeeding or in early childhood had any effect on the chances of a child developing a peanut allergy. As a result, the UK advice to avoid peanuts (and eggs) during pregnancy and early childhood was reversed in 2009. A randomised trial conducted after this policy change showed that, among infants considered at high risk of allergy, consistent consumption of peanuts from 11 months of age resulted in an over 80% lower rate of peanut allergy by the age of five compared with children who had avoided peanuts. Other studies confirmed these findings, which subsequently led to the US guidelines changing in 2015.
Questions still remain
It’s now increasingly clear that the early introduction of potentially allergenic foods may actually benefit us and reduce our risk of developing a life-changing allergy. Nonetheless, there’s much we still don’t understand.
For example, while the mechanisms underpinning oral tolerance are being elucidated, we still don’t know the best age window for safely invoking it. One line of research involves studies in children that use low doses of peanut or peanut antigens (a product called Palforzia) to reverse existing peanut allergy. Palforzia is introduced into the diet in gradually increasing doses under strict clinical supervision, re-educating the immune system so that it ignores peanut. The results are impressive: 67% of patients receiving the treatment were able to tolerate 600 mg of peanut, whereas only 4% of patients taking a placebo could tolerate the same dose. We also need to understand how long this protection lasts and why some patients do not respond.
Another puzzling question is why infants with atopic dermatitis are most at risk of developing a food allergy. One hypothesis is that early exposure to food proteins through a disrupted skin barrier is what leads to allergy, as the immune system becomes sensitised to the food. This is part of a whole area of research investigating the gut-skin axis – why some gut conditions are linked to some skin conditions, and vice versa.
It’s also important to note that, overall, the incidence of food allergies is still increasing. While this recent US study offers hope for preventing some types of food allergy, open questions remain. For example, some people develop food allergies in adolescence or adulthood, and more must be done to understand why this happens.
There are also still barriers impeding access to diagnosis for severe food allergies. This means many at-risk patients have not been diagnosed, so they also have not been prescribed potentially life-saving treatments. These trends are magnified for people living in more deprived areas of the country.
Much more needs to be done to answer these questions and tackle food allergies more broadly.


