Will I be more prone to developing other autoimmune diseases because I have celiac disease?

Yes. The risk of specific autoimmune diseases such as type 1 diabetes and autoimmune thyroid and liver disorders is much higher in people with celiac disease (text and table below from 1, figure below from 2).

“Good evidence exists for the increased prevalence of CD in first-degree relatives of patients with CD, patients with autoimmune diseases such as T1DM and autoimmune thyroid disease (22), in some chromosomal aberration disorders, and in selective IgA deficiency (Table 2). The prevalence of CD in T1DM has been investigated extensively and is 3% to 12%. The AHRQ report and the corresponding paper included 21 studies on T1DM with biopsy-proven CD, each with 50 participants (10). Two additional papers regarding children with T1DM have appeared: 1 reported 12% with CD (23) and 1 longitudinal study reported 7% (24). In addition, CD occurs more frequently than expected by chance in children with Turner syndrome (25) or Down syndrome. A 10- to 20-fold increase in CD prevalence has been reported in subjects with selective IgA deficiency (26).”

However, such generic risk estimation is defined at a population level, not an individual one, through epidemiological studies. This is why current guidelines from the British Society of Gastroenterology and the American College of Gastroenterology recommend screening for other autoimmune diseases, especially type 1 diabetes and Hashimoto thyroiditis, in those diagnosed with celiac disease (3, 4). Individual risk estimation requires testing at the individual level.


1. Husby, S., et al. “European Society for Pediatric Gastroenterology, Hepatology, and Nutrition guidelines for the diagnosis of coeliac disease.” Journal of pediatric gastroenterology and nutrition 54.1 (2012): 136-160. http://www.medicel.unina.it/00_materiali/diagnosis/Guidelines_ESPGHAN_2012.pdf

2. Lundin, Knut EA, and Cisca Wijmenga. “Coeliac disease and autoimmune disease—genetic overlap and screening.” Nature reviews Gastroenterology & hepatology 12.9 (2015): 507. https://www.researchgate.net/profile/Knut_Lundin/publication/281485641_Coeliac_disease_and_autoimmune_disease_-_Genetic_overlap_and_screening/links/5979f8c8aca272e8cc0d3509/Coeliac-disease-and-autoimmune-disease-Genetic-overlap-and-screening.pdf

3. Ludvigsson, Jonas F., et al. “Diagnosis and management of adult coeliac disease: guidelines from the British Society of Gastroenterology.” Gut 63.8 (2014): 1210-1228. https://gut.bmj.com/content/gutjnl/63/8/1210.full.pdf

4. Rubio-Tapia, Alberto, et al. “ACG clinical guidelines: diagnosis and management of celiac disease.” The American journal of gastroenterology 108.5 (2013): 656. https://journals.lww.com/ajg/Fulltext/2013/05000/ACG_Clinical_Guidelines__Diagnosis_and_Management.7.aspx#pdf-link


A person who was exposed to Hep-B but never got sick or has recovered (HBsAg=negative, HBsAb = positive, HBcAb=positive) is not allowed to give blood. Is this because the HBsAg result is not 100% reliable? What is the percentage of escape?



Yes, a negative HBsAg (hepatitis B surface antigen) blood test result isn't 100% reliable, and this answer explains how that could happen. Escape is the outcome of multiple dynamic donor and recipient variables that intersect with each other in a highly unpredictable manner (below from 1), so pinning down its percentage in hard numbers isn't feasible.

Occult hepatitis B infection describes cases where HBsAg is undetectable while replication-competent HBV (hepatitis B virus) DNA may or may not be detectable in the blood (text and table below from 1; text reformatted as bullet points by me).

“We have identified 25 published cases of recipient HBV infection acquired from donors in the HBsAg-negative window period and have summarized these data in Table 10. Cases (usually published as part of a case series) have been reported from the United Kingdom, Japan, and Germany and have been found either through donor-initiated lookback or via recipient-based surveillance systems that have captured and investigated cases of reported post-transfusion HBV infection.105-111

  • In the large majority of cases reported in Table 10, the transmitting donation was found to be positive for HBV DNA if tested by ID NAT [Individual donation Nucleic Acid Amplification Technology];
    • however, in three cases (Table 10, Case 1, and a single case in Case Series 3 and 5), HBV ID NAT was negative.
    • In two of these cases (Table 10, Case 1, and Table 10, Case Series 3), transmission was proven by sequence homology between donor and recipient isolates.
    • In one of these cases (Table 10, Case 1),105 the donor tested positive for HBsAg and HBV DNA 2 months after index donation. The recipient was under treatment for leukemia and when evaluated 12 weeks after transfusion tested HBV DNA positive despite the presence of passively transfused anti-HBs (from a HBV-vaccinated PLT [platelet] donor) at a concentration of 58 IU/L.
    • In the other case (Table 10, Case Series 3),107 the donor was positive for HBsAg 6 weeks after index donation but HBV DNA results were not reported.
  • In addition to the transmitting cases, several of these publications also included cases (one case in Case Series 2 and 11 cases in Case Series 4)106,108 in which HBV DNA–positive components did not transmit infection to the recipient.”

“OBI [occult Hepatitis B infection] is defined as the presence of HBV DNA in the absence of HBsAg in patients who are not in the HBsAg negative window phase of acute infection.38,40 Almost all such donors have detectable anti-HBc [anti-HBV core antibody] and about half have anti-HBs [anti-HBV surface antibody]. The majority of these OBIs are thought to represent past HBV infections that have been controlled but not completely cleared by the immune system, but some may represent chronic HBV carriers that lost detectable HBsAg over time. In either case, HBV DNA is generally present in low concentration (<100 IU/mL) and can fluctuate over time, being detectable only intermittently even with the use of highly sensitive NAT assays. Recent studies of OBI donors in Europe have shown that, at least in this location, most OBI donors harbored mutated HBV strains.47-49 Many of these mutations occurred in the S gene encoding the major immunodominant region of HBsAg and it is possible that in some OBI cases, an altered form of HBsAg could be present but not detectable by at least some commercial HBsAg assays.”

In essence, OBI means HBV persists in the liver but at much lower levels, with both its DNA and its surface antigen, HBsAg, practically undetectable in blood (figures below from 2).

A more recent 2019 study (3) better clarifies how transmission could be possible under such circumstances. It examined 3 repeat donors from Slovenia who were not only HBsAg negative but also had undetectable HBV DNA by highly sensitive NAT and yet transmitted HBV to 9 recipients who were transfused with various blood components from these donors.

Long-term evaluation of these OBI donors showed that they alternated between phases where viremia (detectable virus in blood) was apparently absent and phases with very low but detectable virus loads. Blood products from these 3 OBI donors had been transfused into 47 patients and follow-up data was retrieved from 31 of them.

  • Seven of them (22.5%) had anti-HBs antibodies and weren’t infected.
  • Nine (29%) were infected: transmission was proven by >99% donor–recipient sequence homology in 5, probable in 3, and possible in 1.

Based on the specifics of the blood products transfused in these cases, the study revealed that OBI blood products, such as ~200 ml of fresh frozen plasma presumably containing ~3,200 virions, or red blood cells containing ~20 ml of plasma (~320 virions), could transmit HBV to naive recipients. The study concluded:

  • Transmission was possible with a minimal infectious dose as low as ~3.0 IU/ml (~16 copies/ml) of HBV DNA, far lower than the previously assumed infectious dose of 20 IU/ml (~107 copies/ml).
  • The NAT sensitivity threshold necessary to prevent transmission needs to be lowered from the current 3.4 IU/ml to an even lower 0.15 IU/ml (or 0.8 genome equivalent/ml).
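The dose arithmetic above can be sketched in a few lines. Note that the ~5.3 copies-per-IU conversion factor is an assumption back-calculated from the quoted figures (3.0 IU/ml ≈ 16 copies/ml), not a value stated independently by the study; actual IU-to-copies factors vary somewhat by assay.

```python
# Sketch of the infectious-dose arithmetic quoted above. COPIES_PER_IU is an
# assumption back-calculated from the quoted figures (3.0 IU/ml ~ 16 copies/ml).

COPIES_PER_IU = 16 / 3.0  # ~5.3 copies of HBV DNA per IU (assumed)

def iu_to_copies_per_ml(iu_per_ml):
    """Convert an HBV DNA concentration from IU/ml to copies/ml."""
    return iu_per_ml * COPIES_PER_IU

def virions_transfused(plasma_volume_ml, copies_per_ml):
    """Total virions in a component = plasma volume x viral concentration."""
    return plasma_volume_ml * copies_per_ml

minimal_dose = iu_to_copies_per_ml(3.0)  # revised minimal infectious dose, ~16 copies/ml

ffp = virions_transfused(200, minimal_dose)  # ~200 ml fresh frozen plasma -> ~3200 virions
rbc = virions_transfused(20, minimal_dose)   # red cells with ~20 ml plasma -> ~320 virions

print(round(minimal_dose), round(ffp), round(rbc))  # 16 3200 320
```

With the same assumed factor, the current 3.4 IU/ml NAT threshold works out to roughly 18 copies/ml, i.e. above the revised minimal infectious dose, which is why the study argues for lowering it.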

Lowering these cut-off thresholds won't be easy to implement, nor will it be cost-free, since it necessitates testing much larger volumes of single donations and would preclude minipool testing (testing a small pooled sample derived from a large group of blood donors).

Another approach would be to treat blood donation components with specific pathogen reduction steps, such as treatment with amotosalen, a psoralen, plus UVA light.


1. Kleinman, Steven H., Nico Lelie, and Michael P. Busch. “Infectivity of human immunodeficiency virus‐1, hepatitis C virus, and hepatitis B virus and risk of transmission by transfusion.” Transfusion 49.11 (2009): 2454-2489. http://www.academia.edu/download/46065816/Infectivity_of_human_immunodeficiency_vi20160530-32615-16m0xmb.pdf

2. Raimondo, Giovanni, et al. “Update of the statements on biology and clinical impact of occult hepatitis B virus infection.” Journal of hepatology (2019).

3. Candotti, Daniel, et al. “Multiple HBV transfusion transmissions from undetected occult infections: revising the minimal infectious dose.” Gut 68.2 (2019): 313-321. https://dl.uswr.ac.ir/bitstream/Hannan/57032/1/2019%20GUT%20Volume%2068%20Issue%202%20February%20%2817%29.pdf


What is the function of the appendix in our body?


The human GI tract is largely a long tube stretching from mouth to anus. Largely, because the enduring puzzle of the worm-like (vermiform) human appendix mars this tubular regularity near the ileo-cecal junction as a dead-end extension that projects off the cecum, the bulbous, fleshy front end of the colon (below from 1).

This answer briefly explains

  • The recently described notion that the healthy appendix could be a bacterial ‘safe house’ that helps to quickly re-populate the colon with beneficial microbiota lost during a diarrheal purge.
  • How the twinned appearance of Darwin’s idea of vestigiality and the industrialization-associated appendicitis epidemic helped embed the notion that the expendability of the human appendix is cost-free.

The healthy appendix could be a bacterial ‘safe house’ that helps to quickly re-populate the colon with beneficial microbiota lost during a diarrheal purge.

What possible digestive function could this relatively small finger-like structure possess, filled as it is with lymphoid tissue that seems ever so busy as to make it practically impossible to distinguish between its normal and inflamed states, especially considering (below from 1),

“The internal diameter of the appendix, when open, has been compared to the size of a matchstick. The small opening to the appendix eventually closes in most people by middle age.”

Though Leonardo da Vinci famously illustrated the human appendix in 1492, his drawings weren't published until the 18th century, which is why the history of anatomy attributes the discovery of the human appendix to Jacopo Berengario da Carpi in 1521. In the centuries since, biologists have scratched around in vain trying to decipher the function of this intriguing organ even as it consistently failed to catch a break.

An implicit consensus about the presumed “uselessness” of the appendix prevailed unchallenged until 2007 when, in the Journal of Theoretical Biology, a group led by William Parker at Duke University suggested an entirely plausible function for the human appendix, namely that (below from 2, figure below from 3),

“the human appendix is well suited as a ‘‘safe house’’ for commensal bacteria, providing support for bacterial growth and potentially facilitating re-inoculation of the colon in the event that the contents of the intestinal tract are purged following exposure to a pathogen”

Since then, a few other scientists have accepted the plausibility of this idea about the human appendix, namely that it (below from 4),

“may serve as a sort of bacterial “safe house,” allowing for the survival of symbiotic flora during severe bouts of diarrhea.”

Below from 5,

“Current evidence supports the hypothesis that the appendix is more than just an evolutionary vestige. Its location in the intestinal tract, but shielded from peristalsis and transiently passing contaminants in the fecal stream, make the appendix an ideal safe house for commensal bacteria (Figure 1). If the colon is purged following pathogen exposure, infection, and antibiotic treatment, the appendix could aid in reseeding the colon and reinstating a healthy microbiota. The biofilm in this vermiform appendage is thought to protect its members from colonization with pathogens [16,56]. Recent research also pointed towards the close contact between the appendix and lymphatic tissue, rendering the appendix an important secondary immune organ promoting growth of some types of beneficial gut bacteria [56].”

Diarrhea is the way the GI tract deals with perturbations to its equilibrium. The problem is that sloughing off its contents in this precipitous manner rids the colon of not just the bad bacteria, the troublemakers du jour, but the good ones as well. How does it replenish its beneficial gut microbiota post-diarrhea? The default and tacit assumption was that replacements would come in from the outside, as they did originally; start over from scratch, in other words.

Since the human microbiota remained under-studied, or even unstudied, until just recently, not much attention was paid to how it takes shape post-birth, nor to whether this community would or should start over from scratch following each diarrheal episode over the course of a lifetime, a notion that on its face would seem antithetical from nature's point of view. Surely a lifelong association as essential as one's gut microbiota couldn't be left to the whims of chance?

In addition, that 2007 study (2), as well as others since (6, 7), notes that bacterial biofilms are more abundant in the appendix than anywhere else in the human colon. This further reinforces the idea that the healthy human appendix plays an important role in actively seeding and shaping the gut microbiota composition.

The twinned appearance of Darwin’s idea of vestigiality and the industrialization-associated appendicitis epidemic helped embed the notion that the expendability of the human appendix is cost-free.

Darwin himself proposed that the appendix was vestigial. During his time, only the great apes and humans were known to have an appendix. Darwin suggested that a larger cecum and associated appendix were specialized for breaking down plant tissues such as leaves, a process now described as colonic/cecal fermentation, and that when human ancestors switched to a more easily digested fruit-based diet, the cecum shrank and the appendix became unnecessary, leaving it an evolutionary remnant from a leaf-eating primate ancestor (8).

Any wonder then that in my schooldays, the appendix featured prominently among the list of vestigial organs we dutifully memorized, structures deemed to have lost their ancestral function. We were given to understand that they were essentially useless remnants of the past, still hanging around who knows why, even though the commonly used definitions of vestigiality are more nuanced (below from 9, emphasis mine).

“Vestigiality, in the biological sense, refers to the situation in which organs or organisms have lost all of their original function in a species, but nevertheless have been retained through evolution (they have not been de-selected).”

Wikipedia re-writes this definition as (emphasis mine),

“Vestigiality is the retention during the process of sexual reproduction of genetically determined structures or attributes that have lost some or all of their ancestral function in a given species.”

Note the terms 'original' and 'ancestral' in these definitions, which should give pause and instill curiosity as to whether such structures might have since gone on to acquire new attributes and functions. Wouldn't that better fit nature's design, given its well-known proclivity for parsimony? Instead, the presumed “uselessness” of the appendix gained free rein in popular consciousness, helped along as appendectomies became routine over the course of the 20th century; even today, appendectomy remains the most common emergency surgical procedure in countries such as the US.

Inflamed appendix? Snip it out, suture and done. The appendix has clearly been presumed an expendable organ since at least the late 19th century, when outbreaks of appendicitis started becoming widely prevalent in rapidly industrializing countries such as the US, something that only helped cement the notion of the vestigial nature of this organ. Yet a careful accounting would have suggested that such outbreaks pointed less to the appendix's apparent lack of utility per se and more to its usefulness as a harbinger of widespread digestive distress in response to the unprecedented changes in living conditions that emerged in quick succession in the 19th and 20th centuries: chlorinated water supplies, indoor plumbing, flush toilets and sanitation, antibiotics, and urban lifestyles that ever more completely precluded contact with nature, to mention but a few of the important ones. Too many rapid changes simply overwhelmed the adaptive capacity of our physiology.

This happenstance quirk of history – the coincidental appearance of Darwin’s vestigiality proposal and appendicitis outbreaks in rapidly industrializing countries – allowed the idea about the expendability of the appendix to get embedded into public consciousness so much so that (below from 1),

“Its major importance would appear to be financial support of the surgical profession.” Alfred Sherwood Romer and Thomas S. Parsons, The Vertebrate Body (1986), p. 389.

No surprise then that a 1990 study noted a lifetime appendicitis risk of 8.6% for males and 6.7% for females in the US, while lifetime risk of appendectomy was considerably higher at 12% for males and more than three times as high at 23% for females (10). That is as clear a sign as any that the ‘snip, snip, out with the appendix’ formula has remained unchanged, at least in the US, since the days of William Osler in the late 19th century, when surgeons first started figuring out how to perform this operation safely (11). After all, Osler himself noted how crucial the press of his day was in helping spread the mythical scourge of the so-called appendicitis epidemic (below from 11),

“In Osler’s time the press had extensively publicized appendicitis, extolling the marvelous wonders of surgery, something Osler commented on in the 1896 edition: “There is a well-marked appendicular hypochondriasis (italics mine). Through the pernicious influence of the daily press, appendicitis has become a sort of fad, and the physician has often to deal with patients who have a sort of fixed idea that they have the disease. The worst cases of this class which I have seen have been in members of our profession…”

The easily discarded appendix is thus perfectly emblematic of the throwaway culture we now find ourselves in.

Unfortunately, as we now grasp all too well, since Darwin’s time, a suite of inflammatory diseases (allergies, autoimmunities, cancers, inflammatory disorders such as IBS and IBD to name some prominent examples) have indeed become ever more commonplace in post-industrial societies and their link to the microbiota, specifically to its depletion within our bodies, remained unappreciated until fairly recently.

It took nearly a century to realize that most of these conveniences of present-day life, so beguilingly addictive that they lend themselves to all too quick and unthinking adoption by people the world over, have also mediated “an epidemic of absence”, to quote Moises Velasquez-Manoff. Unbeknownst to us, a large part of the microbiota that lived within our bodies generation after generation over evolutionary time started disappearing over the course of the 20th century, a process that's only gathering pace as this way of living spreads to other rapidly industrializing countries, an idea embodied by the hygiene hypothesis.

In like fashion, it took nearly a century for epidemiologists to uncover data (12, 13) suggesting that the rapid-fire changes in living conditions that accompanied industrialization may also be the quite unnatural impetus (below from 14, emphasis mine) for the now more-than-a-century-old appendicitis epidemics in countries such as the US and the UK, the industrializing pioneers.

“Barker noticed that epidemics of appendicitis followed the introduction of indoor plumbing into various communities. This observation was followed by epidemiologic studies showing that appendicitis is associated with developed but not with developing countries. Almost at the same time, another epidemiologist, Strachan[15], found that a hyper-active immune system is a consequence of the hygienic environment following the industrial revolution[15]. Strachan’s observations point toward the idea that appendicitis, like many other allergic, autoimmune, and inflammatory diseases, is a result of biome depletion, a consequence of industrialization.”

Any wonder that the appendix figures in public consciousness mainly in terms of what goes wrong with it so often in industrialized countries – excruciating pain that drives a person to seek emergency services and often ends in an emergency appendectomy? Doctors in the post-Darwin era are all too familiar with an appendix that can get inflamed in a life-threatening manner even as knowledge of the healthy appendix remains truly vestigial (pun intended).

Coming full circle on the need to overhaul age-old ideas about the human appendix, more recent studies also suggest Darwin was mistaken about the basis for its supposed vestigiality. Comparative analysis suggests no simple, direct relationship between cecum and appendix in terms of presence or size (below from 3 and 1). Moreover, in a series of studies (6, 15), Parker and his colleagues plotted the diets of 361 living mammals on a mammalian evolutionary tree and found the 50 appendix-bearing species to be so widely dispersed as to suggest the appendix may have evolved independently as many as 32 to 38 times (16). If true, the fact that nature saw fit to evolve the appendix so many times over suggests the question of its function remains wide open.

Finally, in another quirk of fate that can only make one shake one's head in exasperation, today the research mouse is perhaps the biggest obstacle to an improved understanding of the human appendix. Mice are the mainstay of biomedical research and yet they can yield no insight into the appendix since they entirely lack one, just one entry in an ever-increasing list of fundamental differences that refute the laughably simplistic notion that the mouse is in any way, shape or form an appropriate or reasonable surrogate for human physiology.


1. Vestigiality of the human appendix

2. Bollinger, R. Randal, et al. “Biofilms in the large bowel suggest an apparent function of the human vermiform appendix.” Journal of theoretical biology 249.4 (2007): 826-831. http://www.mbio.ncsu.edu/mjc/old/20072008/Trent_paper.pdf

3. Laurin, Michel, Mary Lou Everett, and William Parker. “The cecal appendix: one more immune component with a function disturbed by post‐industrial culture.” The Anatomical Record: Advances in Integrative Anatomy and Evolutionary Biology 294.4 (2011): 567-579. https://onlinelibrary.wiley.com/doi/pdf/10.1002/ar.21357

4. Barlow, Andrew, et al. “The vermiform appendix: A review.” Clinical Anatomy 26.7 (2013): 833-842.

5. Tytgat, Hanne LP, et al. “Bowel biofilms: tipping points between a healthy and compromised gut?.” Trends in microbiology (2018).

6. Smith, H. F., et al. “Comparative anatomy and phylogenetic distribution of the mammalian cecal appendix.” Journal of evolutionary biology 22.10 (2009): 1984-1999. https://onlinelibrary.wiley.com/doi/pdf/10.1111/j.1420-9101.2009.01809.x

7. Im, Gene Y., et al. “The appendix may protect against Clostridium difficile recurrence.” Clinical gastroenterology and hepatology 9.12 (2011): 1072-1077.

8. Darwin, Charles. The descent of man and selection in relation to sex. Vol. 1. D. Appleton, 1896.

9. Grimson, William, and Mike Murphy. “An evolutionary perspective on engineering design.” (2009). https://arrow.dit.ie/cgi/viewcontent.cgi?article=1002&context=engineducbks

10. Addiss, David G., et al. “The epidemiology of appendicitis and appendectomy in the United States.” American journal of epidemiology 132.5 (1990): 910-925. https://pdfs.semanticscholar.org/0cfc/1ff3a15ff1a226918375489a0870614bbd98.pdf

11. History of Medicine

12. Barker, D. J. P., and Julie Morris. “Acute appendicitis, bathrooms, and diet in Britain and Ireland.” Br Med J (Clin Res Ed) 296.6627 (1988): 953-955. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2545432/pdf/bmj00279-0009.pdf

13. Barker, D. J., et al. “Appendicitis epidemic following introduction of piped water to Anglesey.” Journal of Epidemiology & Community Health 42.2 (1988): 144-148. https://jech.bmj.com/content/jech/42/2/144.full.pdf

14. Sanders, Nathan L., et al. “Appendectomy and Clostridium difficile colitis: relationships revealed by clinical observations and immunology.” World Journal of Gastroenterology: WJG 19.34 (2013): 5607.

15. Smith, Heather F., et al. “Morphological evolution of the mammalian cecum and cecal appendix.” Comptes Rendus Palevol 16.1 (2017): 39-57. http://www.academia.edu/download/50751849/Smith_et_al._2017_CRP.pdf

16. Smith, Heather F., et al. “Multiple independent appearances of the cecal appendix in mammalian evolution and an investigation of related ecological and anatomical factors.” Comptes Rendus Palevol 12.6 (2013): 339-354.


Why has nobody found a cost-effective way to synthesize horseshoe crab blood for drug production?



The phrasing of this question suggests some misunderstanding of the issue at hand, so this answer delves briefly into

  • Some basics about the horseshoe crab.
  • How horseshoe crabs became an indispensable linchpin for safety testing in present-day biomedicine.
  • How the practically unregulated biomedical use of horseshoe crabs harms them and endangers their very future.
  • How synthetic alternatives to horseshoe crab blood exist but lack regulatory approval, largely owing to inertia.

Horseshoe crab blood is an integral component in ensuring the safety of biomedical products intended for contact with our circulating blood and other bodily fluids. A synthetic alternative, recombinant factor C, has existed for years, one that the European Pharmacopoeia saw fit to recommend in 2016. Now if only the US Pharmacopeia and FDA followed suit, things would be golden indeed for the horseshoe crab, a surefire and much welcome change of pace for this now extremely beleaguered ancient creature.

Some Basics About The Horseshoe Crab

Actually more closely allied to scorpions and spiders than to true crabs, horseshoe crabs are lucky to make it to maturity by 10 to 12 years of age: a variety of marine organisms prey on their juveniles, and only ~3 out of 100,000 are estimated to survive past one year (1, 2). That's less dire than it sounds, since each female spawns ~100,000 eggs at a time, ~20 times per year.
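Taken at face value, the figures quoted above imply a rough per-female balance sheet. A back-of-the-envelope sketch, using only the estimates in this paragraph (population-level approximations, not measurements):

```python
# Back-of-the-envelope arithmetic using only the estimates quoted above.

eggs_per_clutch = 100_000             # eggs spawned at a time
clutches_per_year = 20                # spawning events per female per year
survival_past_one_year = 3 / 100_000  # ~3 in 100,000 survive past one year

eggs_per_year = eggs_per_clutch * clutches_per_year   # 2,000,000 eggs per female
survivors = eggs_per_year * survival_past_one_year    # ~60 juveniles per female per year

print(eggs_per_year, round(survivors))  # 2000000 60
```

So even at that brutal attrition rate, each female would, on these estimates, leave dozens of year-old juveniles annually, which is how the species sustained itself for eons before industrial-scale harvesting.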

Left to its own devices, an adult horseshoe crab, which has few predators other than humans, would live out its life peacefully mucking about in deeper waters along the Atlantic coastline and around Southeast Asia (below from 3), pretty much as nature intended, arriving on shore in great numbers every summer to breed and spawn its eggs.

In turn, migratory shorebirds such as the endangered red knot feed on these eggs, at Delaware Bay for example, for sustenance during their annual 9,000-mile marathon migrations from the southernmost tip of South America to the Canadian Arctic. Such shorebird predation on their eggs helps keep horseshoe crab populations in check, an integrated process, as is the wont when the natural order of things prevails.

One might surmise that things must have proceeded in this vein for eons. After all, the fossil record suggests that this living fossil has remained virtually unchanged for hundreds of millions of years, since the days of the dinosaurs in fact, an odd little factoid that's deeply humbling, or at least it ought to be.

Unfortunately, two ruthless and deadly forms of human predation on horseshoe crabs showed up: one is indeed present-day biomedicine, the other whelk and eel fishers who use the crabs as bait, voraciously so at one point, to the tune of millions annually in the US alone (4). Don't we already know the beginning, middle and end of the epitaph for planet earth, the one that goes, “human, the apex predator”? Ask the horseshoe crab. It knows, since several of its species, lushly abundant just a few generations back, are now listed as either endangered or critically endangered.

How Horseshoe Crab Blood Became An Indispensable Linchpin For Safety Testing In Present-Day Biomedicine

Many unpleasant realities abound all around us that each one of us doesn't necessarily have to take on and wrestle with personally. Human society routinely throws up plentiful horrors, predicaments of such colossal scale that glossing over them as best one can seems the easiest option, since the effort to manifestly change such systems is simply too overwhelming for any single individual. For example, a vegan could say they've made an ethical choice, so the unbearable abomination and pain of factory farming is on others. The problem is that there's no escaping the horseshoe crab issue for anyone.

Like it or not, horseshoe crab blood helps ensure the safety of virtually every injectable drug and biomedical device available today. How we treat these ancient denizens is everyone’s business because we’ve made them indispensable for our biomedical safety these days and the life of any one of us may one day depend on this involuntary munificence of the horseshoe crab whether we acknowledge it or not. Here’s how.

Go to the doctor, get tested, get treatment. Needle, pacemaker, catheter, injectable drug, all sorts of surgeries, all sterile of course – at this point in our history we just take these and so many more relatively recent medical innovations for granted. We’ve come to expect any sort of injectable that comes in contact with our blood as it circulates within our body to be safe and to not send us into a death spiral of inflammation from an ingrained over-reaction to bacterial contamination.

The predicament with such an expectation is that it's practically unrealistic, since we live in a microbial world: not only do microbes live all around and even within us, getting rid of their products is practically impossible as well. One product in particular, lipopolysaccharide (LPS), pieces of the outer skin of gram-negative bacteria if you will, is everywhere and stubbornly disinclined to easy riddance.

After all, this world belongs to microbes; we came along much later, with a pronounced technological bent that automatically lends itself to tinkering and solutioneering of a sort that's simply antithetical to the way nature operates. Taking blood out and putting it back in, injecting all manner of substances into it, invasive surgeries, transplants: in the blink of an eye, we humans invented, and now take for granted, procedures and products that are categorically outlandish from nature's perspective, and yet many if not most among us expect them as a matter of course without really understanding the contortions necessary to ensure their safety.

Even though LPS is also called endotoxin or, in lay terms, pyrogen (fever-inducing), it isn’t a classic toxin in the conventional sense. After all, plenty of LPS-laden gram-negative bacteria live lifelong within our guts. It’s just a quirk of nature that, when in the blood, LPS and many other bacterial products of its ilk provoke the sort of extremely strong immune response that can itself be life-threatening, the sort that often kills people from septic shock, for example. It’s not that such bacterial components themselves cause terrible damage when in blood; rather, our immune system tends to stereotypically over-react when they wind up in circulating blood in sufficient amounts.

How, then, to ensure that whatever we inject into our blood or stick into our bodies has levels of LPS far below those that trigger such cataclysmic responses? That’s where the unwitting horseshoe crab, and more specifically its distinctive blue blood, comes in (below from 5).

Limulus amebocyte lysate (LAL) is used to detect extremely small amounts of LPS, which is ubiquitous in the environment and yet lethal if it gets into the bloodstream in sufficient doses. Turns out the horseshoe crab’s blood cells, amebocytes, are exquisitely sensitive LPS detectors, containing enzymes that clot around and immobilize endotoxins (6, 7). This is why the FDA requires injectables and medical equipment to be tested for safety with LAL tests (below from 8),

“The purified LAL has the capability of detecting one millionth of a billionth of a gram of endotoxin in less than 1 h (Mikkelsen, 1988).”
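
The quoted sensitivity works out to a femtogram, a useful unit to keep in mind. A quick conversion (my arithmetic, not from the sources):

```python
# "One millionth of a billionth of a gram" = 1e-6 * 1e-9 g = 1e-15 g,
# i.e. one femtogram -- the rough detection limit quoted for purified LAL.
grams = 1e-6 * 1e-9
femtograms = grams / 1e-15
print(grams, femtograms)
```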

Over recent decades, as global healthcare needs ramped up steeply, the process of using horseshoe crab blood for safety testing spread from the US to Asia, and LAL tests morphed into LAL/TAL/CAL, named after the respective horseshoe crab species: Limulus polyphemus for LAL in the US; Tachypleus tridentatus and Tachypleus gigas for TAL, and Carcinoscorpius rotundicauda for CAL, in Asia.

How The Practically Unregulated Biomedical Use Of Horseshoe Crabs Harms Them And Endangers Their Very Future

Predation long ago spilled over into depredation: the entirely one-sided relationship of humans with horseshoe crabs has proven all too toxic for them, even though it began by happenstance, as such things often do. The numbers speak for themselves (below from 9),

“Ever since the FDA’s authorization, the use of horseshoe crabs for LAL production by the pharmaceutical industry has increased progressively. The number of crabs being bled increased from 130,000 in 1989 to 260,000 in 1997. By 2010, over half a million crabs were bled annually. Horseshoe crab blood is estimated a value of $15,000 per quart, according to the National Oceanic and Atmospheric Administration.

The simplicity, accuracy and sensitivity of its antimicrobial and anticarcinogenic reaction have made the horseshoe crab an exceptional species for biotechnology research. The expediency of this crab is demonstrated by its widespread application for responsive sensing of endotoxins, uropathogens, bacteriuria, fungal infections and sepsis in the health care industry.”

And indeed the value of horseshoe crab blood has increased even more since that 2015 assessment (below from 10),

“Such is the demand that processed lysate from the crab’s blood is now, gram for gram, one of the most valuable liquids on Earth, with a reported price between $35,000 and $60,000 per gallon.”
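
For what it’s worth, the two quoted price figures are mutually consistent at the top of the range, since a US gallon holds four quarts. A quick check (my arithmetic; the dollar figures are from the quotes above):

```python
# Price figures quoted above: $15,000/quart (ref 9, ~2015 NOAA estimate)
# and $35,000-$60,000/gallon (ref 10). 4 quarts to a US gallon.
quarts_per_gallon = 4
price_per_quart = 15_000
implied_price_per_gallon = price_per_quart * quarts_per_gallon
print(implied_price_per_gallon)  # 60000, the top of the later quoted range
```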

Adult horseshoe crabs are typically bled of ~30% of their blood at a time and then released back into the water. While US sourcing companies claim average mortality rates below ~15% from these bleeds, skepticism about such self-reporting is warranted: the bleeding process for biomedical use is essentially unregulated in many if not most parts of the world, and actual rates may run as high as 50 to 75%, or even 100% (below from 11).

“Few people understand how deeply the TAL/LAL industry affects the lives of nearly every man, woman, child and domestic animal in the world, who are dependent upon medical service for their health. The safety of much of the world’s pharmaceutical and medical devices must be tested for the presence of life-threatening endotoxins prior to public use, and the most reliable endotoxin detection test currently available is TAL/LAL. There is no indication that the world’s human and animal population will become less dependent on medical services in the years to come. In fact, as our global population expands, ages, and medical advancements improve and/or prolong life, we expect to become more, not less reliant upon endotoxin detection methodologies, which currently means TAL or LAL. It is questionable whether current harvesting levels for TAL/LAL can be sustained, much less meet the projected future demands of this rapidly growing market, particularly if Asian horseshoe crab species are harvested to functional extinction… Approximately 25% of the medical device market is currently dependent upon TAL/LAL for endotoxin detection.”

Desperate for a silver lining at this point? I certainly am. Remember fishermen, the other apex human predators of horseshoe crabs? Turns out US commercial harvesters sued the US government to continue harvesting horseshoe crabs in protected waters and lost. As a result harvesting for bait has declined precipitously in the US (below from 12, figure below from 13),

“…the government’s win resulted in the prohibition of horseshoe crab harvest for any reason in the National Seashore and in the Monomoy National Wildlife Refuge (actually a ban was instituted for the refuge until a new compatibility study could be completed) (Compatibility Determination Eastern Massachusetts National Wildlife Refuge Complex 2002). The ruling eventually resulted in a ban for the harvesting of horseshoe crabs in all federal waters (James-Pirri 2012).”

Problem is this stricture left the LAL industry specifically untouched (below from 12).

“To further protect the LAL industry that used far fewer crabs than the bait industry, and since mortality from bleeding was considered insignificant (most bled crabs were returned to their environment alive), the biomedical industry was exempt from restrictions on harvesting horseshoe crabs with the exception of a requirement to report the number of horseshoe crabs bled (ASMFC 1998; Novitsky 2009). The extremely small number of horseshoe crabs harvested specifically for research is considered inconsequential and taking for research purposes is completely exempt…

Today, there are four companies operating in the United States that produce LAL from Limulus polyphemus harvested from various locations along the Atlantic Coast. It should be noted that there exists a similar industry in Southeast Asia where other species of horseshoe crabs, namely Tachypleus tridentatus, Tachypleus gigas, and Carcinoscorpius rotundicauda are, or can potentially be used to make an [sic] LAL equivalent, Tachypleus amebocyte lysate (TAL) and Carcinoscorpius amebocyte lysate (CAL).”

Problem also because regulatory oversight of horseshoe crab use in Southeast Asia is practically non-existent (below from 11).

“Presently, the growth of the global healthcare industry is entirely dependent upon the harvest and collection of blood from live horseshoe crabs to produce TAL/LAL. Although direct mortality of horseshoe crabs due to LAL production is estimated to be relatively low, 8–15 % (Rudloe 1983 ; Walls and Berkson 2003), the mortality associated with TAL production is 100 % because after bleeding, the animals are sold to secondary markets for food and chitin production.”
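
Even taking the lower, self-reported mortality range at face value, the figures quoted above imply a substantial annual toll. A rough back-of-the-envelope sketch (my arithmetic, using numbers quoted above from refs 9 and 11):

```python
# Rough annual LAL-bleeding mortality implied by the quoted figures:
# "over half a million" crabs bled annually (ref 9), with 8-15% mortality
# estimates (Rudloe 1983; Walls and Berkson 2003, via ref 11).
crabs_bled_per_year = 500_000
mortality_range = (0.08, 0.15)
deaths = [round(crabs_bled_per_year * rate) for rate in mortality_range]
print(deaths)  # tens of thousands of crabs per year from LAL bleeding alone
# For TAL, mortality is effectively 100%: bled crabs are resold for food
# and chitin production rather than released.
```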

Problem also because horseshoe crabs harvested for biomedical bleeding are no longer inconsequential in number. In fact, harvesting for bleeding has expanded in the US to now roughly equal harvesting for bait (below from 5),

“…new oversight agencies were established to mediate the risks from over-harvesting, and restrictions were placed on the number of horseshoe crabs collected for bait in order to regulate populations. These agencies further generated programs for stock management, developed state quota regulations, and established best practices for biomedical harvesting. In 2015, 583,208 horseshoe crabs were harvested as bait for eel and whelk (Atlantic States Marine Fisheries Commission, 2016), a significant reduction from the millions that were once harvested (Atlantic States Marine Fisheries Commission, 2013)…

The Atlantic States Marine Fisheries Commission (ASMFC) reported that in 2015, 559,903 horseshoe crabs were transported to biomedical facilities for the production of LAL (Atlantic States Marine Fisheries Commission, 2016).”
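
The two 2015 ASMFC figures quoted above make that near-parity concrete (my arithmetic):

```python
# 2015 ASMFC harvest figures quoted above (ref 5).
bait_harvest = 583_208        # harvested as bait for eel and whelk
biomedical_harvest = 559_903  # transported to biomedical facilities for LAL
ratio = biomedical_harvest / bait_harvest
print(f"{ratio:.2f}")  # 0.96: the biomedical harvest now nearly equals bait
```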

Problem also because the biomedical industry has been strikingly incurious about the impact of bleeding on long-term horseshoe crab health. Catch ’em, bleed ’em, throw ’em back – that’s been the expedient motto of this coerced, wantonly cruel and vampiric transaction (below from 14).

Vitality of crabs released after they’re bled isn’t guaranteed at all. In fact, long-term harm was suspected for many years and only confirmed by definitive studies since 2010, among them extremely skewed population sex ratios in favor of males (15, below from 12; tables from 11 and 16). Female crabs are much bigger than males, meaning much larger blood volumes, which is why they get bled predominantly.

“Early studies indicated that it took at least a week for the crab to regain blood volume and several weeks to regain baseline amebocyte counts (Anderson et al. 2013). Since it is impractical to maintain crabs in holding ponds until they regain blood volumes and cell count, a practice of one bleed per year became the norm. Although bled crabs were seldom tagged, a fresh scar or needle puncture mark on the arthrodial membrane was quite apparent so that even if a bled crab was recaptured in the same year, a trained technician could avoid a second bleeding if a scar was in evidence. However, there is no provision in the proposed BMPs [Best Management Practices] for preventing crabs being bled twice or more during a single season, and the effect on crab mortality is unknown. Likewise, due to the design of the horseshoe crab’s circulatory system (open, i.e. no separate veins with capillaries connected to the arteries to circulate hemolymph back to the cardiac sinus), once the cardiac sinus (large tubular heart) and 11 major arteries (Shuster 2003) are empty, the blood flow slows to a drip or stops completely. It has been estimated that no more than 30% of the entire blood of an individual crab is ever removed using a gravity flow as opposed to vacuum aspiration (Novitsky 1991). This type of bleeding, i.e. using gravity flow, appears to have become an industry standard (Levin et al. 2003), but due to the secrecy associated with the biomedical industry and a lack of a provision in the BMPs, it remains unclear whether this method is used universally.”

The fewer the females that survive bleeds, the fewer the progeny in succeeding years, so no wonder horseshoe crabs decline not just from harvesting for bait but also from long-term morbidity and mortality from bleeds.

To be useful is a good thing, we are taught. Trust us to then turn around and ensure that too much of a good thing is an unfortunate fate as well, since that is precisely the curse we humans have visited upon the poor horseshoe crab. When it comes to unbridled ambition and avarice, we can be counted on to never disappoint, as this poor creature has discovered to its peril: a literal blue blood literally paying with its life for its blue-bloodedness, courtesy of us capricious humans.

How Synthetic Alternatives To Horseshoe Crab Blood Exist But Lack Regulatory Approval, Largely Owing To Inertia

(below from 12).

“ there is no reason why a synthetic replacement for LAL cannot be designed. In fact, one LAL manufacturer currently sells a synthetic substitute along with a traditional LAL (Lonza 2014). This synthetic substitute was invented by scientists using one of the horseshoe crab genes responsible for the main enzyme component of LAL to engineer a reagent produced in yeast (Ding et al. 1977). According to the FDA however, this synthetic reagent is not by definition, LAL, i.e. a lysate (L) of Limulus (L) amebocytes (A) and thus cannot be licensed. The major users of LAL, the pharmaceutical industry (various lots of intravenous solutions, biologics, and medical devices are required to be tested with FDA-licensed LAL prior to release for distribution and use) do not have a choice of using a synthetic substitute until the FDA changes regulations. It is interesting to note that LAL may be one of only a few diagnostic reagents (if not the only one) that is regulated on its composition (LAL) rather than what it detects (endotoxin). As endotoxin has been standardized as to its toxicity (pyrogenic dose in humans), and an official reference standard is commercially available and accepted by several pharmacopeias and the FDA, any reagent that can accurately and routinely detect the pyrogenic dose of endotoxin, i.e. by testing with the reference standard, should be a ready substitute for LAL. The PyroGene TM synthetic reagent already does this (Lonza 2014), as do some other endotoxin tests currently under development or that have been described in the literature, such as the in vitro pyrogen test (Daneshian et al. 2006). 
Thus all those concerned with horseshoe crab conservation, especially state agencies responsible for the regulation of horseshoe crab harvest and the ASMFC [Atlantic States Marine Fisheries Commission], should actively encourage the FDA to allow LAL substitutes as long as the substitutes can be properly validated (i.e., shown to detect a pyrogenic level of endotoxin in an actual pharmaceutical drug and device).”

Ready for another silver lining at this point? I know I certainly am. Relatively meager and inadequate though it may yet be, the regulatory oversight that has developed in the US for horseshoe crab harvesting since the 1990s suggests that sustained effort to educate and create awareness (below from 17, 18) could go a long way toward helping reverse their currently routine, expedient, ruthless and unsustainable exploitation. The need of the hour now is widespread regulatory approval for synthetic alternatives to LAL/TAL/CAL, which would really do the trick in stemming at least the harm from increasingly unsustainable biomedical overuse.


1. Botton, Mark L., Robert E. Loveland, and Athena Tiwari. “Distribution, abundance, and survivorship of young-of-the-year in a commercially exploited population of horseshoe crabs Limulus polyphemus.” Marine Ecology Progress Series 265 (2003): 175-184. https://www.int-res.com/articles/meps2003/265/m265p175.pdf

2. Carmichael, Ruth H., Deborah Rutecki, and Ivan Valiela. “Abundance and population structure of the Atlantic horseshoe crab Limulus polyphemus in Pleasant Bay, Cape Cod.” Marine Ecology Progress Series 246 (2003): 225-239. http://www.int-res.com/articles/meps2003/246/m246p225.pdf

3. Horseshoe Crab – Barrier Island Ecology UNCW

4. Smith, David R., Michael J. Millard, and Ruth H. Carmichael. “Comparative status and assessment of Limulus polyphemus with emphasis on the New England and Delaware Bay populations.” Biology and conservation of horseshoe crabs. Springer, Boston, MA, 2009. 361-386. https://www.researchgate.net/profile/John_Tanacredi/publication/291233650_Biology_and_Conservation_of_Horseshoe_Crabs/links/59c913560f7e9bd2c01a4c20/Biology-and-Conservation-of-Horseshoe-Crabs.pdf#page=366

5. Krisfalusi-Gannon, Jordan, et al. “The role of horseshoe crabs in the biomedical industry and recent trends impacting species sustainability.” Frontiers in Marine Science 5 (2018): 185. The Role of Horseshoe Crabs in the Biomedical Industry and Recent Trends Impacting Species Sustainability

6. Bang, Frederick B., and J. L. Frost. “The toxic effect of a marine bacterium on Limulus and the formation of blood clots.” Biological Bulletin. Vol. 105. No. 2. 7 MBL ST, WOODS HOLE, MA 02543: MARINE BIOLOGICAL LABORATORY, 1953.

7. Levin, J., and F. B. Bang. “Clottable protein in Limulus: its localization and kinetics of its coagulation by endotoxin.” Thrombosis and Haemostasis 19.01 (1968): 186-197. https://www.researchgate.net/profile/Jack_Levin3/publication/17496086_Clottable_Protein_in_Limulus_Its_Localization_and_Kinetics_of_Its_Coagulation_by_Endotoxin/links/5cc88bef4585156cd7bd93a8/Clottable-Protein-in-Limulus-Its-Localization-and-Kinetics-of-Its-Coagulation-by-Endotoxin.pdf

8. Walls, Elizabeth A., Jim Berkson, and Stephen A. Smith. “The horseshoe crab, Limulus polyphemus: 200 million years of existence, 100 years of study.” Reviews in Fisheries Science 10.1 (2002): 39-73. https://www.researchgate.net/profile/Stephen_Smith42/publication/252068789_The_Horseshoe_Crab_Limulus_polyphemus_200_Million_Years_of_Existence_100_Years_of_Study/links/0deec531ccf00733b6000000/The-Horseshoe-Crab-Limulus-polyphemus-200-Million-Years-of-Existence-100-Years-of-Study.pdf

9. Das, A. P., B. Bal, and P. S. Mahapatra. “Horseshoe Crabs in Modern Day Biotechnological Applications.” Changing Global Perspectives on Horseshoe Crab Biology, Conservation and Management. Springer, Cham, 2015. 463-474.

10. This crab could save your life – if humans don’t wipe it out first

11. Gauvry, Glenn. “Current horseshoe crab harvesting practices cannot support global demand for TAL/LAL: The pharmaceutical and medical device industries’ role in the sustainability of horseshoe crabs.” Changing global perspectives on horseshoe crab biology, conservation and management. Springer, Cham, 2015. 475-482. http://horseshoecrab.org/press/2018/11/Current-Horseshoe-Crab-Harvesting-Practices-Cannot-Support-Global-Demand-for-TAL-LAL.pdf

12. Novitsky, Thomas J. “Biomedical implications for managing the Limulus polyphemus harvest along the northeast coast of the United States.” Changing Global Perspectives on Horseshoe Crab Biology, Conservation and Management. Springer, Cham, 2015. 483-500.

13. Kreamer, Gary, and Stewart Michels. “History of horseshoe crab harvest on Delaware Bay.” Biology and conservation of horseshoe crabs. Springer, Boston, MA, 2009. 299-313. https://www.researchgate.net/profile/John_Tanacredi/publication/291233650_Biology_and_Conservation_of_Horseshoe_Crabs/links/59c913560f7e9bd2c01a4c20/Biology-and-Conservation-of-Horseshoe-Crabs.pdf#page=306

14. The Last Days of the Blue-Blood Harvest

15. The Blood of the Crab

16. Owings, Meghan. “Effects of the biomedical bleeding process on the behavior and physiology of the American horseshoe crab, Limulus polyphemus.” (2017). https://scholars.unh.edu/cgi/viewcontent.cgi?article=2152&context=thesis

17. Kreamer, Gary, and Sharon W. Kreamer. “Green Eggs & Sand, Team Limulus, and More: Educating for Horseshoe Crab Conservation in the United States.” Changing Global Perspectives on Horseshoe Crab Biology, Conservation and Management. Springer, Cham, 2015. 557-574.

18. Gauvry, Glenn, and Ruth H. Carmichael. “Young Voices: Through the Arts, Future Environmental Stewards Have a Global Voice.” Changing Global Perspectives on Horseshoe Crab Biology, Conservation and Management. Springer, Cham, 2015. 587-593.


Given that immunoglobulin class switching is irreversible, is it true that central memory B cells are only incompletely spliced, so that they can give rise to more than one type of effector cell (plasma cells)?


Not sure there is such a thing as a central memory B cell.

Historically, the attributes of somatic hypermutation (SHM) and immunoglobulin class switching (class switch recombination, CSR) were thought to separate a memory B cell from its naive counterpart.

Experiments have now confirmed that a B cell can be a memory cell even without CSR. Specifically, memory B cells have been found to include unswitched, IgM-expressing B cells, which by definition haven’t yet undergone CSR. Such IgM+ B cells were, however, found to have undergone SHM.

A brief primer on naive versus memory B cells may be useful at this point.

  • Naive B cells secrete IgM – naive here means a mature B cell that hasn’t yet interacted with its antigen.
  • This IgM is a low affinity germline antibody – affinity here refers to strength of interaction with antigen while germline means the product of a B cell that hasn’t yet undergone SHM, the process that typically exponentially increases an antibody’s affinity for its antigen.
  • Being low affinity and germline, IgM was historically considered not part of a memory immune response. Implicit in this assumption was the idea that SHM goes hand in hand with CSR, which meant that a memory B cell response couldn’t be IgM since it is pre-CSR.
  • SHM consists of mutations in the variable region of the immunoglobulin (antibody) molecule, a process that improves an antibody’s antigen binding specificity and therefore its affinity for that antigen while CSR entails replacement of the Fc portion of the IgM immunoglobulin (antibody) molecule with the Fc portion of other antibody isotypes such as the various IgGs (IgG1, 2, 3 or 4 in humans) or IgA or IgE.

The first time around, a naive B cell’s antibody response to its antigen is slow, consisting of low affinity germline IgM; a full-blown, high affinity, class-switched antibody response takes 2 to 3 weeks on average to mature to full bloom.

  • In this maturation process, which usually takes place within specialized compartments called germinal centers in lymph nodes and the spleen, the number of B cells responding to a particular antigen typically increases several hundred- to several thousand-fold, sometimes even tens of thousands-fold.
  • Most of these activated cells eventually die; a few survive to become memory B cells, while others differentiate into plasmablasts that eventually become plasma cells – cells that lose their B cell receptors and live mostly in the bone marrow as antibody-secreting factories, often for decades on end.

By comparison, when memory B cells reencounter their antigens, they get up to speed in spewing out highly specific and high affinity antibodies in a matter of days.

Germline IgM is a blunt instrument, in other words: good enough at the beginning of an immune response to keep harm in abeyance while a more specific, sensitive, finely honed and targeted antibody response, in the form of higher affinity, class-switched IgGs, IgA and/or IgE, gets going as more information percolates through the tiers of cells that make up the immune system and shapes the developing response. Such information takes the form of more antigens as well as secreted molecules such as cytokines and chemokines, and cell-surface molecules such as receptors and ligands.

Both SHM and CSR were considered necessary hallmarks of differentiation into memory until research starting in the late 1990s began to show subsets of memory B cells that expressed IgM on their cell surface as well as secreted it. That wasn’t all: this IgM wasn’t germline but somatically hypermutated (1, 2, 3, 4). Some of this IgM memory is T cell-dependent (1, 2), some not (3, 4).

Thus SHM but not CSR is now considered a necessary feature of memory B cell response.

That said, much about memory B cells still remains unknown.

  • For example, though the cell surface molecule CD27 was at one point considered a reliable marker of human memory B cells, it is now better understood as one that marks post-activation B cells.
  • Lack of unique markers that reliably single out memory B cells means studies are forever tussling with the question of whether they dealt with bona fide memory B cells or with memory-like B cells.
  • What are the rules that determine memory B cell formation – what separates the activated B cells destined to die from those destined for memory fate? Why do some activated B cells stay IgM+ while others switch to other isotypes (IgGs, IgA or IgE)?
  • Upon reencounter with their antigen, do IgM+ memory B cells proliferate to give rise to only clonal progeny or can they give rise to B cells that secrete other antibodies? What rules determine which outcome prevails?
  • Do IgM+ memory B cells differentiate into plasmablasts (and eventually plasma cells)?
  • What determines the share of IgM+ memory B cells in any given immune response?

A table and figures from 5 and 6 summarize recent classifications of memory B cells, how they appear to develop and the basics of competing models that attempt to explain how they arise and how they’re maintained.


1. Klein, Ulf, Ralf Küppers, and Klaus Rajewsky. “Evidence for a large compartment of IgM-expressing memory B cells in humans.” Blood 89.4 (1997): 1288-1298.

2. Klein, Ulf, Klaus Rajewsky, and Ralf Küppers. “Human immunoglobulin (Ig) M+ IgD+ peripheral blood B cells expressing the CD27 cell surface antigen carry somatically mutated variable region genes: CD27 as a general marker for somatically mutated (memory) B cells.” Journal of Experimental Medicine 188.9 (1998): 1679-1689.

3. Kruetzmann, Stephanie, et al. “Human immunoglobulin M memory B cells controlling Streptococcus pneumoniae infections are generated in the spleen.” Journal of Experimental Medicine 197.7 (2003): 939-945.

4. Weller, Sandra, et al. “Human blood IgM “memory” B cells are circulating splenic marginal zone B cells harboring a prediversified immunoglobulin repertoire.” Blood 104.12 (2004): 3647-3654.

5. Seifert, M., and R. Küppers. “Human memory B cells.” Leukemia 30.12 (2016): 2283.

6. Shlomchik, Mark J., and Florian Weisel. “Germinal center selection and the development of memory B and plasma cells.” Immunological reviews 247.1 (2012): 52-63.


Is it possible to find out if I’m really allergic to various common medications without putting myself at risk?

Allergy tests should only be performed using well-established clinical protocols by a knowledgeable, properly trained and credentialed medical professional in a medical setting equipped with the tools of the trade necessary for medical emergencies (below from 1).

That aside, diagnosing drug allergy preemptively is possible but far from easy. Which drug? Which component(s)? Which test(s)? Depending on the number of medications and the complexity of modern drug manufacturing, the quest could end up as the search for the proverbial needle in a haystack.

The test part is the more straightforward bit so let’s cover that first.

  • Tests could be done immediately upon drug exposure or later.
  • Such tests include
    • Checking for mast cell enzymes, complement components or serum albumin in the blood.
    • Measuring basophils or drug-specific IgE (first figure below from 2) in the blood, or
    • Skin prick or intradermal tests (second figure as well as table below from 3).
    • Finally, the double-blind placebo-controlled challenge represents confirmation of drug allergy – the proverbial gold standard.

Allergic reactions are classified into 4 types based on their response time plus the cell types and cellular products primarily responsible for the symptoms (figure below from 4).

Type I: Immediate; IgE, with mast cells or basophils.

Type II: Delayed; IgG/IgM antibodies causing cell death.

Type III: Delayed; immune complexes (antigen bound to antibody) activating complement.

Type IV: Delayed; CD4 T cells plus macrophages/monocytes, eosinophils, cytotoxic CD8 T cells or neutrophils.
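
For quick reference, the four types above can be captured in a small lookup table. A minimal sketch (the structure and field order are mine, not from ref 4):

```python
# Gell-Coombs hypersensitivity types as summarized above:
# type -> (onset, principal mediators).
HYPERSENSITIVITY_TYPES = {
    "I":   ("immediate", "IgE, mast cells or basophils"),
    "II":  ("delayed", "IgG/IgM antibodies, cell death"),
    "III": ("delayed", "immune complexes, activated complement"),
    "IV":  ("delayed", "CD4 T cells plus macrophages/monocytes, "
                       "eosinophils, cytotoxic CD8 T cells or neutrophils"),
}
onset, mediators = HYPERSENSITIVITY_TYPES["I"]
print(onset)  # immediate
```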

The reason for increasing complexity in drug allergy testing is that though drugs typically have only two main classes of components, the active pharmaceutical ingredient (API) and the inactive ingredients or excipients, recent years have seen an increase in the number of these so-called inactive ingredients in even the most common drug formulations. Hand in glove with this development, reports of allergies or reactions (irritations) to such excipients have also been steadily increasing (below from 5).

This means it would be prudent to pay attention to not just the active ingredients in common medications but also to the inactive ones and avoid formulations containing ingredients that a person suspects they may be sensitive to.


1. Kattan, Jacob D. “Allergy to Non‐Antibiotic Drugs.” Allergy and Clinical Immunology (2015): 301-307.

2. Baldo, Brian. “IgE and drug allergy: antibody recognition of ‘small’molecules of widely varying structures and activities.” Antibodies 3.1 (2014): 56-91.

3. Chiriac, Anca Mirela, Jean Bousquet, and Pascal Demoly. “Principles of Allergy Diagnosis.” Middleton’s Allergy Essentials. Elsevier, 2017. 117-131.

4. Pichler, Werner J., et al. “Drug hypersensitivity reactions: pathomechanism and clinical symptoms.” Medical Clinics 94.4 (2010): 645-664.

5. Reker, Daniel, et al. ““Inactive” ingredients in oral medications.” Science translational medicine 11.483 (2019): eaau6753.

How are children’s immunizations tested for safety, prior to being approved safe and distributed to the public?


Since vaccines are meant for use in the healthy (below from 1), safety is even more their cornerstone than it is for drugs, which are meant for use in the sick.

According to the US Code of Federal Regulations Title 21, Section 600.3 (from 2),

“The word safety means the relative freedom from harmful effect to persons affected, directly or indirectly, by a product when prudently administered, taking into consideration the character of the product in relation to the condition of the recipient at the time.”

Relative is the key word to keep in mind. When sick, just about any risk seems acceptable for the chance to get and feel better. When healthy, even a remote chance of injury looms large in the consciousness. This is why comparing vaccines to drugs is necessary for gaining an accurate perspective on vaccine safety.

  • Since drugs are meant to make those already sick better, their target populations tend to be small. Since vaccines are meant to prevent disease in the healthy, their target populations are large. A large, healthy target population naturally imposes an even greater safety burden on vaccines.
  • Drugs tend to be fairly simple molecules, and their pharmacology is the main concern. Vaccines, on the other hand, are complex biologicals, making it necessary to understand and account for not just straightforward pharmacology but also extremely complicated immunology. Since making and storing vaccines is much more complicated (first figure below from 4), lot-by-lot surveillance becomes necessary, which naturally imposes higher costs on vaccine makers.
  • Since vaccines are meant to prevent future diseases, their benefits aren’t as readily apparent as those of drugs. A full picture of vaccine risk and benefit only emerges years later from post-licensure data after entire populations have been vaccinated (second figure below from 3).

Far easier to appreciate the value of a drug that rids the body of an immediate visible disease or symptom than of a vaccine that invisibly prevents occurrence of a future disease.

  • On the one hand, vaccines need to be safer than drugs since they are meant for the healthy.
  • On the other hand, they need to be able to drive robust and long-lasting immune responses that can effectively deter a future infection by a specific pathogen, all without any way of predicting if and when any given individual within the vaccinated population would ever even encounter it.

Hit or miss, the dueling imperatives of safety and effectiveness impose inherently contradictory demands on a vaccine, which only makes vaccines’ very existence and undeniable track record of effectiveness all the more remarkable (below from 5 using data from 6 and 7; 8), considering that knowledge of how the immune system functions is still incomplete today and was even more so back when many of these vaccines were developed.

Today’s vaccines are thus an unwitting casualty of their own past success, at least in more developed countries.

  • The greater the visible societal cost (disease and death) of a vaccine-preventable infectious disease, the easier it is to appreciate a vaccine’s benefit.
  • The lower the rate of a vaccine-preventable infectious disease, the lower its associated morbidity and mortality, and hence the lower the perceived utility of a vaccine against it, even as rare adverse events attributable to the vaccine become a major focus of the public’s attention.

Three main factors influence a vaccine’s risk-benefit profile,

  • Disease and death risk from infection versus adverse effects from the vaccine – this is the big one that decides whether or not a vaccine is even feasible (below from 3).
  • Risk versus benefit profiles of various vaccine options for a given disease. For example, live versus inactivated versions of the same vaccine.
  • Intended target population for a given vaccine. For example, young, healthy children versus older adults with underlying health conditions.

To sum up,

  • Vaccines have to clear a much higher safety bar than drugs.
  • Vaccines take longer to develop and are much more expensive to bring to market than drugs.
  • No surprise, then, that there are far fewer vaccine makers than drug makers (9). Especially in many developed countries, you’ve got to have a strong stomach, and maybe a hole or two in the head, to get into the vaccine business these days.


1. Why Aren’t Vaccines Regulated like Drugs? – VAXOPEDIA

2. CFR – Code of Federal Regulations Title 21

3. World Health Organization. “Vaccine safety basics learning manual.” Geneva: WHO (2013). https://www.who.int/vaccine_safety/initiative/tech_support/Vaccine-safety-E-course-manual.pdf

4. Zimmermann, Petra, and Nigel Curtis. “Factors that influence the immune response to vaccination.” Clinical microbiology reviews 32.2 (2019): e00084-18. http://doc.rero.ch/record/324687/files/zim_fii.pdf

5. Schuchat, Anne. “The state of immunization 2013: we are the world.” South Dakota Medicine (2013). http://www.sdsma.org/docs/pdfs-new_site/Journal/2013/SDMSpecial%20Issue2013l.pdf#page=29

6. Roush SW, Murphy TV, Vaccine-Preventable Disease Table Working Group. “Historical comparisons of morbidity and mortality for vaccine-preventable diseases in the United States.” JAMA 298.18 (2007): 2155-2163.

7. Centers for Disease Control and Prevention. Notifiable Diseases and Mortality Tables. MMWR. August 17 2012;61(32):624-637. https://www.cdc.gov/mmwr/PDF/wk/mm6132.pdf

8. Robertson, Corwin A. “The science of vaccination: establishing safety and efficacy.” South Dakota Medicine (2013). http://www.sdsma.org/docs/pdfs-new_site/journal/2013/sdmspecial%20issue2013l.pdf#page=40

9. Francis, Donald P., Yu-Ping Du, and Alexander R. Precioso. “Global vaccine supply. The increasing role of manufacturers from middle income countries.” Vaccine 32.41 (2014): 5259-5265.


Shouldn’t we be immune to pollen and the allergies it comes with?

If ‘immune to pollen’ means making benign rather than pathological anti-pollen immune responses, then yes, those without pollen allergy are immune to pollen and therefore resistant to the allergies it can induce. Allergens draw attention because of the immune responses they induce in those with allergies. But what about the allergen-specific immune responses of those without allergies? Clearly, such responses exist.

  • For one, many allergens are practically unavoidable since they are all around us; some, such as pollen and house dust mite, are ubiquitous. It thus stands to reason that, allergic or not, all of us are exposed to many allergens and therefore make immune responses to them.
  • For another, not everyone is allergic, and even someone with an allergy isn’t allergic to every single allergen but rather to only one or a few structurally similar molecules. This means even those with allergies make non-allergic immune responses to the allergens to which they have no allergy.

Consider Sherlock Holmes in The Adventure of Silver Blaze when he notices the “curious incident of the dog in the night-time,” which proves to hold the key to solving the mystery.

“Gregory (Scotland Yard detective): “Is there any other point to which you would wish to draw my attention?”

Holmes: “To the curious incident of the dog in the night-time.”

Gregory: “The dog did nothing in the night-time.”

Holmes: “That was the curious incident.””

Did the dog respond or not? A dog that barks catches one’s attention easily. A dog that should have barked but didn’t escapes the notice of others, but not of Holmes. There was a midnight visitor, so the dog should have barked, but it didn’t. From this Holmes deduces that the dog did respond, but the way it would to a friend or an acquaintance, and he infers that the midnight visitor was no stranger to the dog.

How we define issues guides how we tend to study them. An allergen induces allergic responses, but only in someone with an allergy to it. The definition of an allergen is therefore not absolute but inherently contingent on whether or not it elicits an allergic response in a given individual.

How allergies are studied suggests that this contextual aspect of defining allergens is easy to forget. Just as Detective Gregory’s implicit expectation that the dog should have barked led him to overlook its silence, so too do allergic immune responses to allergens capture disproportionate attention, not just among the public but also among researchers.

Since those without allergies don’t ‘see a stranger’ when they encounter an allergen, their immune responses to these molecules are benign and not pathological. Such benign immune responses pass without notice even though they occur all the time – like the proverbial dog that didn’t bark.

This is why the literature on allergen-specific immune responses in those without allergies is comparatively meager and why allergic immune responses to allergens are the proverbial tip of the allergy iceberg that cannot reveal its full picture.

In recent years, allergy researchers have woken up to this fact, and studies show that healthy, non-allergic individuals respond to the same bits of allergens as do those with allergies. What differs between those with and without an allergy is the type of immune response they make to the same allergen(s). Below are a few such references for information.

1. Carballido, José M., et al. “Bee venom phospholipase A2‐specific T cell clones from human allergic and non‐allergic individuals: cytokine patterns change in response to the antigen concentration.” European journal of immunology 22.6 (1992): 1357-1363.

2. Ebner, Christof, et al. “Nonallergic individuals recognize the same T cell epitopes of Bet v 1, the major birch pollen allergen, as atopic patients.” The Journal of Immunology 154.4 (1995): 1932-1940.

3. Akdis, Mübeccel, et al. “Immune responses in healthy and allergic individuals are characterized by a fine balance between allergen-specific T regulatory 1 and T helper 2 cells.” Journal of Experimental Medicine 199.11 (2004): 1567-1575. http://jem.rupress.org/content/jem/199/11/1567.full.pdf

4. Van Overtvelt, Laurence, et al. “Assessment of Bet v 1-specific CD4+ T cell responses in allergic and nonallergic individuals using MHC class II peptide tetramers.” The Journal of Immunology 180.7 (2008): 4514-4522. https://www.jimmunol.org/content/jimmunol/180/7/4514.full.pdf

5. Hinz, Denise, et al. “Lack of allergy to timothy grass pollen is not a passive phenomenon but associated with the allergen‐specific modulation of immune reactivity.” Clinical & Experimental Allergy 46.5 (2016): 705-719.

6. Kurtaj, Almedina, et al. “Natural protective immunity against grass pollen allergy is maintained by a diverse spectrum of response types.” Journal of Allergy and Clinical Immunology 140.6 (2017): 1746-1749.

7. Sette, Alessandro, and Véronique Schulten. “It’s a lot of work to be nonallergic.” Journal of Allergy and Clinical Immunology 139.3 (2017): 769-770.

8. Ahuja, Sunil K., et al. “Preservation of epithelial cell barrier function and muted inflammation in resistance to allergic rhinoconjunctivitis from house dust mite challenge.” Journal of Allergy and Clinical Immunology 139.3 (2017): 844-854.

9. Grifoni, Alba, et al. “Characterization and epitope identification of the T cell response in non-allergic individuals exposed to mouse allergen.” World Allergy Organization Journal 12.4 (2019): 100026.


If we know the placebo effect exists, why don’t doctors always give placebos to their patients first to avoid side effects?

Short answer: the placebo effect is presently impossible to trigger on cue, since not enough is known about how it works, or about when and how strongly it will manifest in any given patient.

In order to understand the placebo effect, it needs to be separated from the placebo itself. Confusing but necessary. The most important aspects of the placebo effect are that

  • It isn’t confined to placebos such as inert sugar pills but rather applies to each and every aspect of any type of medical intervention.
  • Its manifestation and effect size aren’t predictable, at least not yet. In fact, some patients may even experience harm through the nocebo effect, the counterpart of the placebo effect.

Whether we like it or not, the placebo effect and its evil twin, the nocebo effect, are at play in each and every interaction between doctor and patient. In fact, the placebo effect is an integral part of medical practice itself. Thus, even bona fide drugs and other medical interventions such as surgeries depend to varying degrees on the placebo effect for their action.

For example, even the look, shape, size and color of bona fide drugs play a role in their effect, a fact long capitalized upon by generic drug makers (see below a 1978 cover of the medical trade journal Private Practice, from 1).

Research shows that the doctor’s manner as well as the patient’s expectations inextricably and unpredictably combine to yield the placebo effect. Likewise, the rituals (e.g., the doctor’s office) and symbols (e.g., the doctor’s white coat) surrounding all aspects of medical treatment are also part of the placebo effect (below from 2, 3). The placebo itself is thus but one part of the placebo effect even as the effect itself operates even in the absence of the conventional placebo pill.

Empirical research shows that this effect is variably present in different patients, where it serves to alleviate (placebo effect) or exaggerate (nocebo effect) certain symptoms of illness, specifically symptoms prone to subjective interpretation. Examples range from various types of pain (migraine, diabetic neuropathic pain, dental pain, fibromyalgia) to neuropsychiatric conditions (depression, Parkinson’s disease) to gastrointestinal complaints (duodenal ulcer, reflux disease, Crohn’s disease, ulcerative colitis, IBS) to various types of itch.


1. Greene, Jeremy A. “The materiality of the brand: Form, function, and the pharmaceutical trademark.” History and Technology 29.2 (2013): 210-226.

2. Wager, Tor D., and Lauren Y. Atlas. “The neuroscience of placebo effects: connecting context, learning and health.” Nature Reviews Neuroscience 16.7 (2015): 403.

3. Benedetti, Fabrizio. “Placebo and the new physiology of the doctor-patient relationship.” Physiological reviews 93.3 (2013): 1207-1246.


What was the first major public vaccination campaign?


Smallpox was the target not just of the first public vaccination campaign but of sustained campaigns spanning centuries (1). This was because

  • Knowledge of the smallpox virus is the oldest of any virus. For example, scientists visualized variola, cowpox and vaccinia viruses long before any other viruses.
  • Smallpox was also the first communicable disease for which a vaccine was discovered and later developed.

A deadly scourge, smallpox had, historical evidence suggests, been spreading sporadically out of Asia/Eurasia since at least the 1st century CE. Back then, dense populations capable of sustaining repeat epidemics were few and far between.

In subsequent centuries, maritime trade enabled global contact and helped establish new settlements in far-off places, while industrialization emptied the countryside even as it swelled cities into densely populated, unhygienic and unsanitary urban conglomerations. These social changes unfortunately also helped spread smallpox globally.

Thus, starting in the 16th century and accelerating by the 18th, epidemic waves of smallpox began to periodically roil the rapidly industrializing Old World as well as its many newly discovered and conquered far-flung colonies in the New World, sometimes as frequently as every 2 to 5 years, with waves that could wipe out thousands to tens of thousands at a time.

Variolation was the early effort at smallpox control. A dangerous practice, it entailed inoculating material taken from the scab of a smallpox-infected person.

  • Despite the high risk of disease, variolation became an accepted practice based on the observation that pockmarked individuals, that is, those with incontrovertible signs of previous smallpox, never got it a second time. Surviving a first exposure to this infection thus meant lifelong protection from a highly lethal disease.
  • However, probably because it was so risky, variolation never became a widespread public health measure and remained a cottage industry of sorts.

Variolation was eventually replaced by vaccination after Edward Jenner discovered that deliberate exposure to the relatively harmless cowpox virus, as frequently occurred among milkmaids, could durably and safely protect humans from its far more lethal smallpox counterpart.

In modern terms, Jenner’s vaccination substituted the lethal smallpox virus with the non-lethal but antigenically related cowpox/vaccinia virus.

Between 1807 and 1821, Bavaria (1807), Denmark (1810), Norway (1811), Bohemia and Russia (1812), Sweden (1816) and Hanover (1821) passed smallpox vaccination laws that made vaccination, usually of infants, legally compulsory.

In Great Britain (below from 2),

“The Vaccination Acts of 1840, 1841 and 1853 . . . made [vaccination] successively universal, free, non-pauperising and, finally, compulsory. The Acts of 1861, 1867 and 1871 made vaccination enforceable by the appointment of Vaccination Officers, and finally compelled enforcement by making such appointments mandatory . . . the Act of 1867 permitted parents to be fined repeatedly until the child was vaccinated . . . the Act of 1871 . . . made negligent parents liable both for non compliance with the Act and for disobedience of a court order. In default of fines and costs, parents were sometimes committed to gaol and household goods were distrained for sale.”

While such laws helped initially control and eventually limit the spread of smallpox, they also engendered fierce and determined anti-vaccination movements that continue to this day in one form or another in several countries, having morphed from resistance to smallpox vaccination to resistance to other vaccines as well.

Figures below from 1 and 2 illustrate the timeline of early smallpox vaccination programs as well as some of their features and their effectiveness.


1. Plotkin, Stanley. “History of vaccination.” Proceedings of the National Academy of Sciences 111.34 (2014): 12283-12287. https://www.pnas.org/content/pnas/111/34/12283.full.pdf

2. Fenner, Frank, et al. Smallpox and its eradication. Vol. 6. Geneva: World Health Organization, 1988. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2491071/pdf/bullwho000