I’ve noticed that a few anti-vaccine bloggers have lately taken to questioning whether “herd immunity” (i.e., the principle, borrowed from veterinary medicine, that a high vaccination rate in a community also protects unvaccinated individuals) actually exists. I’m not sure how that one got started, as there’s ample evidence that herd immunity is real. It makes intuitive sense: if more people are vaccinated against measles, say, measles rates in the community drop, which in turn makes it harder for unvaccinated people to catch measles, too.
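For readers who like to see the arithmetic behind that intuition, epidemiologists often express herd immunity as a threshold: roughly 1 − 1/R0 of the population needs to be immune to stop sustained spread, where R0 is the average number of people one infected person infects in a fully susceptible population. (The formula and the example R0 values below are standard textbook figures, not numbers from the studies discussed in this post.) A minimal sketch:

```python
# Illustrative only: the classic herd immunity threshold, 1 - 1/R0.
# R0 values here are common textbook approximations for measles,
# not data from the studies cited in this post.

def herd_immunity_threshold(r0: float) -> float:
    """Fraction of a population that must be immune to halt spread."""
    return 1 - 1 / r0

# Measles is often cited with an R0 somewhere around 12-18,
# which is why its herd immunity threshold is so high.
for r0 in (12, 15, 18):
    print(f"R0 = {r0}: threshold ~ {herd_immunity_threshold(r0):.0%}")
```

The takeaway matches the studies below: the higher a disease’s R0, the more of the community has to be vaccinated before the unvaccinated start seeing indirect protection.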
Here are a few studies that show herd immunity in action:
1) The first, recently published in the journal Pediatrics, looked at trends in chickenpox among infants under a year of age since 1995, the year the chickenpox vaccine was first offered in the U.S. (Infants can’t receive the chickenpox vaccine until after their first birthday.)
There has been close to a 90% drop in the rate of chickenpox among infants from 1995 to 2008, even though they are not eligible for the vaccine.
“Vaccinating children aged 12 months and older protects infants who are too young to be vaccinated,” says study researcher Adriana S. Lopez, MHS, an epidemiologist at the CDC in Atlanta.
2) Then there’s this summary of a 2005 study on pneumococcal vaccine, as described by Dr. Paul Offit from Children’s Hospital of Philadelphia:
“After American children began receiving the pneumococcal conjugate vaccine in 2000, for instance, the incidence of pneumococcus caused by the strains of bacteria in the vaccine fell by 55% among adults ages 50 and older, a group that didn’t even get the vaccine, according to a 2005 study in the Journal of the American Medical Association.”
3) And finally, see my post from September, which described how older, unvaccinated siblings in Boston benefited from their younger sibs’ influenza vaccination.