Food allergy “testing” is usually a bad idea
© 2015 Roy Benaroch, MD
People like tests. You get numbers, and maybe a printout, and there’s science and blood and things just feel more… serious when testing is done. You can picture Marcus Welby (or perhaps a more modern physician) looking solemn, declaring “We’d better run some tests.”
Are medical tests magical and mysterious, and can they unlock the secrets of life? Usually, no. And among the worst and most misunderstood tests we do are food allergy tests.
A few recent studies illustrate this well. A review of about 800 patients referred to an allergy clinic found that almost 90% of children who had been told to avoid foods based on allergy testing could in fact eat them safely. The study, bluntly titled “Food allergen panel testing often results in misdiagnosis of food allergy,” also found that the positive predictive value of food allergy blood tests—the chance that a positive test accurately predicted real allergy—was 2.2%. That’s much, much worse than the odds if you flipped a coin, and much, much worse than your odds of winning at a casino. If someone told you that a positive test was only correct 2% of the time, would you even do the test?
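The arithmetic behind a number like that is worth seeing. Here’s a minimal sketch, using Bayes’ theorem with made-up round numbers for sensitivity, specificity, and prevalence (not the study’s actual figures), of why a positive test can be nearly worthless when true food allergy is rare among the people being tested:

```python
# Hypothetical illustration: positive predictive value (PPV) of a
# screening test. All numbers below are invented for illustration,
# not taken from the study discussed above.

def positive_predictive_value(sensitivity, specificity, prevalence):
    """PPV = P(truly allergic | positive test), by Bayes' theorem."""
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# Even an accurate-sounding test (95% sensitive, 80% specific) does
# badly when only 1% of those tested are truly allergic:
ppv = positive_predictive_value(sensitivity=0.95, specificity=0.80,
                                prevalence=0.01)
print(f"PPV: {ppv:.1%}")  # → PPV: 4.6%
```

Under these assumed numbers, fewer than 1 in 20 positive results reflect a real allergy; the low prevalence means false positives swamp true positives, which is the same mechanism behind the study’s 2.2% figure.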
What about the other way of food allergy testing, with skin scratch or prick tests? A recent study about peanut allergy made big news when it showed that early peanut exposure can prevent allergy. (This isn’t new news, by the way—I’ve written about that before. But I get fewer readers than the New England Journal of Medicine.) But hidden in the methods and statistics of that paper was another gem. The authors tested all of the enrolled babies for peanut allergy at the beginning of the study. And most of the babies who “tested positive,” whether or not they then ate peanuts, did not turn out to be allergic. A true statement from the data from that study would be: If your baby tests positive for peanut allergy, your child is probably not allergic to peanuts.
Read that sentence again. Kind of makes your brain hurt, doesn’t it?
It is true that positive-tested kids were more likely than negative-tested kids to be allergic—among the group with more allergies later (those who avoided peanuts), 35% of those who had tested positive developed allergy, versus 14% of those who had tested negative. But still, in either case, most of the kids who tested positive did not turn out to be allergic, whatever they ate or did.
The fundamental problem, I think, is that doctors either don’t understand or can’t seem to explain the difference between sensitization and allergy. None of these tests can actually test for allergy—they test for sensitization, which is different. We gloss over that distinction, and end up giving out bad advice. People should not be told to avoid food based on the results of allergy testing alone.
Bottom line: if your child eats a food without having a reaction, he or she is not allergic, and you should not do any testing for that food as a potential allergen. You should never do broad panels of “allergy tests”—they’re much more likely to mislead and confuse than to give useful information. Any food allergy testing that is done should only look at foods that seem to have caused reactions in the past, and even then any positive test should be confirmed by what’s called an “open challenge.” Under safe conditions, usually under an allergist’s care, give the child some of the food to eat to see what happens. That’s the only real way to “test” for allergy.