Does cosmic radiation often cause cancer in humans?



Two things first.

  • The causes of cancer are far from settled and continue to be vigorously debated (1).
  • It is not radiation per se but rather Ionizing radiation – Wikipedia that is considered damaging to tissues.

With respect to the cancer risk from ionizing cosmic radiation, this answer briefly summarizes

  • Broad outlines of the dispute between competing models.
  • As an illustrative example, epidemiological data suggesting that flight personnel, a unique population group routinely exposed occupationally to higher levels of ionizing cosmic radiation, may have a two-fold increased risk for melanoma, though, surprisingly, cosmic ionizing radiation may not be the relevant risk factor.

Ionizing Radiation & Cancer Risk: Brief Consideration Of Competing Models

The relationship between the dose and frequency of ionizing radiation on one hand and cancer on the other is neither straightforward nor settled; indeed, it remains a matter of considerable controversy and dispute.

A mainstay through much of the 20th century, the LNT model (Linear no-threshold model – Wikipedia) presumes a strictly linear relationship between substance (radiation, potential toxin) dose and toxicity: the higher the dose, the higher the risk of damage, with no dose considered damage-free.

Some trenchant, even downright cutting, analyses suggest (2, 3, 4, 5) that unquestioning acquiescence to the LNT model has been a travesty when defining cancer risk as well as for toxicology and public health policy in general.

The article (5) by Edward Calabrese – Wikipedia in particular is a highly entertaining foray into the genesis of the LNT in the mid-20th century: how it was contrived as an attempt to explain the mechanism of evolution, and how, though it failed in that purpose early in its life, it got a second wind ‘in service of and application for environmental risk assessment’.

OTOH stands the concept of Hormesis – Wikipedia (from the Greek word meaning ‘to stimulate’), which favors a biphasic response, where the possibility exists that sometimes lower doses might even confer health benefits while doses above a certain threshold would be increasingly damaging.

The principle of hormesis holds that homeostatic feedback mechanisms in complex organisms allow for over-compensation in response to low doses of various stressors, a result of inevitable time lags that different tissues and organ systems take to communicate with each other (6).

A modification of the hormesis model expands the idea further, ascribing beneficial effects to a hormetic zone, with doses both below and above it considered toxic (see below figures comparing the LNT, threshold and hormesis models, from a guest post by Dr. William Sacks on the blog maintained by Barry Brook (scientist) – Wikipedia (7), and a modified version of the hormesis model from 8).
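For intuition, the competing models described above can be written as simple excess-risk-versus-dose functions. Below is a minimal sketch; all parameter values are hypothetical and chosen only to show the qualitative shapes, not to model real radiobiology:

```python
# Illustrative dose-response shapes for excess cancer risk vs. dose.
# Parameter values (k, d0, benefit) are hypothetical.

def lnt(dose, k=1.0):
    """Linear no-threshold: risk rises linearly from zero dose; no dose is damage-free."""
    return k * dose

def threshold(dose, k=1.0, d0=2.0):
    """Threshold model: no excess risk until dose exceeds the threshold d0."""
    return max(0.0, k * (dose - d0))

def hormesis(dose, k=1.0, d0=2.0, benefit=0.5):
    """Hormesis (biphasic/J-shaped): a small benefit (negative excess risk)
    below d0, increasing risk above it."""
    if dose <= d0:
        return -benefit * dose * (d0 - dose) / d0  # dips below baseline at low doses
    return k * (dose - d0)

for d in [0.0, 1.0, 2.0, 4.0]:
    print(f"dose={d}: LNT={lnt(d):+.2f}  threshold={threshold(d):+.2f}  hormesis={hormesis(d):+.2f}")
```

Note that the three curves differ only at low doses, below the putative threshold, which is precisely the region where epidemiological data are noisiest and the dispute between the models is hardest to settle.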

In the radiation field, the hormesis idea is called Radiation hormesis – Wikipedia (9): contrary to the assumption of a strictly linear relationship between radiation dose and cancer potential, below a certain threshold low radiation doses might reduce cancer incidence, while doses above that threshold would increase it. Mechanisms by which low radiation doses might help stave off cancer include several possibilities, not necessarily mutually exclusive, such as

  • More effective or appropriate activation of DNA and other repair enzymes.
  • Potentially pre-cancerous cells more effectively triggered to commit suicide.
  • Immune system activated more effectively or differently to be able to kill off pre-cancerous cells more efficiently.

All that said, as far as I’m aware, consensus statements from major scientific organizations currently do not accept radiation hormesis (10).

Ionizing Radiation & Cancer Risk: Brief Consideration Of Empirical Data On A High-Risk Group

Going from theory to empirical data, unsurprisingly, the higher the altitude, the higher the level of cosmic radiation (11). Frequent fliers, particularly flight personnel, are thus an obvious target population for studying the relation, if any, between the dose and frequency of exposure to ionizing cosmic radiation and the risk for cancer.

As anticipated, many epidemiological studies of varying quality, on varying numbers of individuals (pilots, cockpit crew, flight attendants, other frequent fliers), have examined the link between cancers and flying (12). The problem is that it’s difficult if not impossible to separate out the role of occupational factors versus equally relevant confounding lifestyle factors, which could serve to either increase or decrease cancer risk.

  • For one, Circadian rhythm – Wikipedia disturbance, an integral component of flying, could by itself increase cancer risk.
    • As epidemiological studies probe the deleterious effect of night shift work on health, circadian rhythm disturbance is becoming a well-known risk factor for cancer in its own stead (13). For example, a 2018 meta-analysis of 61 studies concluded (see below from 14).

‘confirmed the positive association between night shift work and the risks of several common cancers in women. We identified that cancer risk of women increased with accumulating years of night shift work, which might help establish and implement effective measures to protect female night shifters.’

  • For another, flight personnel tend to be healthier compared to the general population with lower rates of hypertension, obesity and smoking, and higher levels of physical activity (15, 16). This decreases their cancer risk.

While such confounding factors make it difficult to separate signal from noise, several meta-analyses (Meta-analysis – Wikipedia) have found increased risk for various skin cancers among flight personnel. A 2015 meta-analysis, considered the largest as well as among the most robust of such studies thus far, examined 19 studies with a total of 266,431 participants and concluded (see below from 11)

‘Pilots and cabin crew have approximately twice the incidence of melanoma compared with the general population. Further research on mechanisms and optimal occupational protection is needed.’

More importantly, both this study (11) and a previous one (17) link the increased melanoma risk not to ionizing radiation but rather to ultraviolet light, specifically to UVA. However, while UVA could go some way toward explaining the higher risk for pilots, it does not do so for cabin crew. An accompanying editorial explains how ionizing cosmic radiation is unlikely to be relevant, and how implicating UVA as the relevant risk factor also fails to fully explain the observations (see below from 18).

‘Passengers and crewmembers are exposed to higher levels of radiation depending on cruising altitude, latitude, and solar activity.4 Pilots and flight crew receive an additional annual dose of 2 to 9 mSv [milli Sievert – Wikipedia], 5 which is significantly below the current exposure limit of 20 mSv per year according to the International Commission on Radiological Protection. 3 For comparison, a single chest x-ray provides roughly 0.02 mSv and a computed tomographic scan of the chest, abdomen, and pelvis provides approximately 8 mSv. The lifetime risk of dying of cancer among US adults is approximately 220 in 1000 and would be expected to increase modestly to approximately 223 in 1000 after a 20-year career as an airline crewmember (assuming 80 mSv of aggregate radiation exposure). 4 In addition to the fact that this modest dose of cosmic radiation should have little effect on cancer risk more generally, there is also no known relationship between even relatively high doses of ionizing (cosmic-type) radiation and melanoma. 5 Thus, it seems unlikely that cosmic radiation exposure is relevant in the observed increase in melanoma in this population…

Episodic UV exposure is also associated with an increased risk of developing melanoma, whereas frequent low to moderate UV exposure is more associated with risk for nonmelanoma skin cancers. As discussed by Sanlorenzo et al, 3 a pilot’s exposure to UV-B (280-320 nm) radiation through the windshield is minimal because it is blocked by glass, but more than half of UV-A (320-380 nm) radiation penetrates glass. 3 The additional UV-A exposure for pilots offers a possible explanation for their increased incidence of melanoma but does not account for the increased risk for the cabin crew.’
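The editorial's 220-to-~223-per-1000 figure follows from straightforward LNT arithmetic. As a back-of-envelope check, assume a nominal lifetime excess cancer risk coefficient of ~4% per Sv (published nominal coefficients vary, roughly 4-5.5% per Sv; the exact value used here is an assumption chosen for illustration):

```python
# Back-of-envelope LNT check of the editorial's 220 -> ~223 per 1000 estimate.
baseline_per_1000 = 220      # approximate US adult lifetime risk of dying of cancer
career_dose_sv = 0.080       # 80 mSv aggregate exposure over a 20-year career
risk_per_sv = 0.04           # assumed nominal LNT risk coefficient (~4%/Sv)

excess_per_1000 = career_dose_sv * risk_per_sv * 1000
print(f"excess deaths per 1000: {excess_per_1000:.1f}")
print(f"projected lifetime risk: {baseline_per_1000 + excess_per_1000:.0f} in 1000")
```

The product works out to roughly 3 extra deaths per 1000, consistent with the editorial's modest projected increase, which is why even a strict LNT view implies little melanoma-specific risk from cosmic radiation at these doses.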

Bottom line, it appears flight personnel are at higher risk for melanoma, though the exact reason(s) remain unclear.


1. Tomasetti, Cristian, Lu Li, and Bert Vogelstein. “Stem cell divisions, somatic mutations, cancer etiology, and cancer prevention.” Science 355.6331 (2017): 1330-1334.…

2. Stebbing, A. R. D. “A mechanism for hormesis—a problem in the wrong discipline.” Critical reviews in toxicology 33.3-4 (2003): 463-467.

3. Doss, Mohan. “Correcting systemic deficiencies in our scientific infrastructure.” Dose-Response 12.2 (2014): dose-response.…

4. Bus, James S. ““The dose makes the poison”: Key implications for mode of action (mechanistic) research in a 21st century toxicology paradigm.” Current Opinion in Toxicology 3 (2017): 87-91.

5. Calabrese, Edward J. “Obituary notice: LNT dead at 89 years, a life in the spotlight.” Environmental research 155 (2017): 276-278.…

6. Stebbing, A. R. “Growth hormesis: a by-product of control.” Health Physics 52.5 (1987): 543-547.

7. Lessons about nuclear energy from the Japanese quake and tsunami

8. Lee, Duk-Hee, and David R. Jacobs. “Hormesis and public health: can glutathione depletion and mitochondrial dysfunction due to very low-dose chronic exposure to persistent organic pollutants be mitigated?.” J Epidemiol Community Health (2014): jech-2014.…

9. Vaiserman, Alexander M. “Radiation hormesis: historical perspective and implications for low-dose cancer risk assessment.” Dose-Response 8.2 (2010): dose-response.…

10. Valentin, Jack. The 2007 recommendations of the international commission on radiological protection. Oxford: Elsevier, 2007.….

11. Sanlorenzo, Martina, et al. “The risk of melanoma in airline pilots and cabin crew: a meta-analysis.” JAMA dermatology 151.1 (2015): 51-58.…

12. Di Trolio, Rossella, et al. “Cosmic radiation and cancer: is there a link?.” Future Oncology 11.7 (2015): 1123-1135.

13. He, Chunla, et al. “Circadian disrupting exposures and breast cancer risk: a meta-analysis.” International archives of occupational and environmental health 88.5 (2015): 533-547.…

14. Yuan, Xia, et al. “Night Shift Work Increases the Risks of Multiple Primary Cancers in Women: A Systematic Review and Meta-analysis of 61 Articles.” Cancer Epidemiology and Prevention Biomarkers 27.1 (2018): 25-40.

15. Pizzi, Costanza, et al. “Lifestyle of UK commercial aircrews relative to air traffic controllers and the general population.” Aviation, space, and environmental medicine 79.10 (2008): 964-974.

16. dos Santos Silva, Isabel, et al. “Cancer incidence in professional flight crew and air traffic control officers: disentangling the effect of occupational versus lifestyle exposures.” International journal of cancer 132.2 (2013): 374-384.

17. Hammer, Gaël P., Maria Blettner, and Hajo Zeeb. “Epidemiological studies of cancer in aircrew.” Radiation protection dosimetry 136.4 (2009): 232-239.

18. Shantha, Erica, Chris Lewis, and Paul Nghiem. “Why do airline pilots and flight crews have an increased incidence of melanoma?.” JAMA oncology 1.6 (2015): 829-830.…


What are the long term effects of the Zika virus?


This answer summarizes

  • Zika infection consequences in adults: largely mild symptoms, but autoimmune Guillain–Barré syndrome in some.
  • Tragic long term consequences of severe birth defects from Zika infection during pregnancy.

Evidence from both groups bolsters the notion that Zika is a neurotropic virus.

Zika Infection Consequences In Adults

While the Zika virus was first identified in 1947 and might have triggered earlier outbreaks, the first such reports began to emerge only around 2007, too short a period of time to be certain of all of its long term effects.

In most Zika outbreaks, infection in adults typically appeared so mild that many may not even have known they had it. Overall, it appears only ~20% of Zika-infected adults show clinical symptoms, relatively mild ones such as headache, joint pain (arthralgia), rash and conjunctivitis (1).

In some recent Zika virus outbreaks, particularly the one in French Polynesia in 2013-2014, some Zika-infected adults were observed to develop Guillain–Barré syndrome – Wikipedia, an autoimmune condition caused by immune system attack of the peripheral nervous system (2).

Tragic Long Term Consequences To Newborns Of Zika Infection During Pregnancy

The 2015-2016 Zika outbreak in Brazil made clear Zika’s potentially destructive impact on the developing nervous system. Also observed in the 2016-2017 Zika outbreak in Colombia (3), and now named Congenital Zika Syndrome (CZS), it is thus far the most consequential and devastating long term effect of the Zika virus, the outcome of infection of a pregnant woman and transplacental infection of the developing fetus (4, see below from 5). Infection during the first trimester and strong clinical signs of Zika infection in the mother appear to augur the greatest risk of CZS to the fetus.

A prominent feature of CZS observed in Brazil’s Zika outbreak was Microcephaly – Wikipedia, where the baby’s head is much smaller than that of babies of similar age and sex (see below from 6). This outbreak witnessed a ~20-fold increase in microcephaly incidence in the region.

Even Zika-infected newborns who were apparently healthy at birth were observed to develop brain abnormalities later (see below from 7). This series of patients included several born with normal head circumference at birth who went on to have poor head growth later. Such infants had other neurological problems such as Hypertonia – Wikipedia, Dystonia – Wikipedia, Hemiparesis – Wikipedia and Epilepsy – Wikipedia as well.

Arthrogryposis – Wikipedia, joint malformation, particularly of the hands and feet, has also been observed in CZS (see below from 8).

Both microcephaly and arthrogryposis have also been observed together in CZS (see below from 4).

Brain atrophy and calcifications are some of the other brain abnormalities observed in CZS. Some CZS children born without any obvious brain abnormalities went on to develop Hydrocephalus – Wikipedia (accumulation of cerebrospinal fluid in the brain) at 3 to 12 months of age (4, 9).

Retina and optic nerve damage have also been observed in some CZS cases (10), which can occur even in the absence of microcephaly at birth (11).

The US CDC set up the US Zika Pregnancy Registry to coordinate with state, tribal, local and territorial health departments to track Zika-related pregnancy and birth outcomes. Its most recent report (12), on outcomes for 1297 pregnancies with suspected Zika infections reported from 44 states from December 2015 to December 2016, concluded that ~1 in 10 lab-confirmed Zika infections resulted in Zika virus-associated birth defects.

Analyzing the data trove since the 2015 outbreak in Brazil, several groups including the CDC have concluded there’s a causal link between Zika infection and microcephaly and other birth defects (13, 14, 15), thus adding Zika to the long list of vertically transmitted infections (Vertically transmitted infection – Wikipedia). However, debate on the topic continues in the scientific literature (16, 17, 18).

Definitive risk factors for CZS are still largely unknown (19). Candidates include co-infection with Dengue virus – Wikipedia and other Arbovirus – Wikipedia, genetic polymorphisms, as well as differences in virulence between different Zika virus strains.


1. Petersen, Lyle R., et al. “Zika virus.” New England Journal of Medicine 374.16 (2016): 1552-1563.…

2. Cao-Lormeau, Van-Mai, et al. “Guillain-Barré Syndrome outbreak associated with Zika virus infection in French Polynesia: a case-control study.” The Lancet 387.10027 (2016): 1531-1539.…

3. Alvarado-Socarras, Jorge L., et al. “Congenital microcephaly: A diagnostic challenge during Zika epidemics.” Travel medicine and infectious disease (2018).…

4. Miranda-Filho, Demócrito de Barros, et al. “Initial description of the presumed congenital Zika syndrome.” American journal of public health 106.4 (2016): 598-600.

5. Britt, William J. “Adverse outcomes of pregnancy-associated Zika virus infection.” Seminars in perinatology. WB Saunders, 2018.

6. Moore, Cynthia A., et al. “Characterizing the pattern of anomalies in congenital Zika syndrome for pediatric clinicians.” JAMA pediatrics 171.3 (2017): 288-295.

7. van der Linden, Vanessa. “Description of 13 infants born during October 2015–January 2016 with congenital Zika virus infection without microcephaly at birth—Brazil.” MMWR. Morbidity and mortality weekly report 65 (2016).…

8. Aragao, M. F. V. V., et al. “Spectrum of spinal cord, spinal root, and brain MRI abnormalities in congenital Zika syndrome with and without arthrogryposis.” American Journal of Neuroradiology 38.5 (2017): 1045-1053.…

9. Chimelli, Leila, et al. “The spectrum of neuropathological changes associated with congenital Zika virus infection.” Acta neuropathologica 133.6 (2017): 983-999.

10. Ventura, Camila V., et al. “Risk factors associated with the ophthalmoscopic findings identified in infants with presumed Zika virus congenital infection.” JAMA ophthalmology 134.8 (2016): 912-918.

11. Ventura, Camila V., et al. “Zika: neurological and ocular findings in infant without microcephaly.” The Lancet 387.10037 (2016): 2502.…

12. Reynolds, Megan R., et al. “Vital Signs: Update on Zika Virus-Associated Birth Defects and Evaluation of All US Infants with Congenital Zika Virus Exposure-US Zika Pregnancy Registry, 2016.” MMWR. Morbidity and mortality weekly report 66.13 (2017): 366-373.…

13. Tang, Bor Luen. “Zika virus as a causative agent for primary microencephaly: the evidence so far.” Archives of microbiology 198.7 (2016): 595-601.…

14. Rasmussen, Sonja A., et al. “Zika virus and birth defects—reviewing the evidence for causality.” New England Journal of Medicine 374.20 (2016): 1981-1987.…

15. CDC Press Releases

16. Frank, Christina, Mirko Faber, and Klaus Stark. “Causal or not: applying the Bradford Hill aspects of evidence to the association between Zika virus and microcephaly.” EMBO molecular medicine (2016): e201506058.…

17. Ribeiro, Bruno Niemeyer de Freitas, et al. “Congenital Zika syndrome and neuroimaging findings: what do we know so far?.” Radiologia brasileira 50.5 (2017): 314-322.…

18. Joob, B., and V. Wiwanitkit. “Spinal Cord, Spinal Root, and Brain MRI Abnormalities in Congenital Zika Syndrome.” American Journal of Neuroradiology 38.10 (2017): E77-E77.

19. Soriano-Arandes, Antoni, et al. “What we know and what we don’t know about perinatal Zika virus infection: a systematic review.” Expert review of anti-infective therapy 16.3 (2018): 243-254.

What parts of the body does the norovirus affect and in what ways?



This answer summarizes

  • Current (circa 2018) understanding of how human norovirus gets transmitted.
  • The sites and cell types human norovirus is currently known to infect.
  • The major obstacles that currently stand in the way of accelerating human norovirus research.

Human Norovirus Transmission

Albert Kapikian – Wikipedia first identified Norovirus – Wikipedia in 1972. While it became increasingly known as a bane for cruise ship passengers in recent years, norovirus is today also one of the most common causes of acute gastroenteritis the world over (1).

Spread through fecal-oral or oral-oral routes, where vomiting and toilet flushing generate droplets and aerosols (2), norovirus (see below from 3)

  • Can spread either directly or indirectly.
  • Is highly infectious, a human volunteer challenge study having estimated that even a single infectious viral particle can establish infection in as many as ~50% of those exposed (4).
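That striking per-particle infectivity can be expressed with a simple single-hit dose-response model. This is only a sketch (the cited study actually fit a more elaborate hit model); the per-virion parameter is chosen purely to match the ~50% single-particle figure:

```python
import math

def p_infection(n_virions, r=math.log(2)):
    """Single-hit dose-response sketch: infection probability saturates with dose.
    Choosing r = ln(2) makes a single virion infectious to ~50% of hosts."""
    return 1.0 - math.exp(-r * n_virions)

for n in [1, 2, 10]:
    print(f"{n} virion(s): P(infection) ~ {p_infection(n):.2f}")
```

The model shows why norovirus outbreaks are so explosive: if one particle infects ~half of those exposed, the handful of particles in a single aerosolized droplet pushes the probability of infection toward certainty.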

Human Norovirus Infection

An enteric virus, norovirus is known to infect or be capable of infecting various cells in the gastro-intestinal (GI) tract, including Enterocyte – Wikipedia (intestinal epithelial cell), Microfold cell – Wikipedia (M cell), Dendritic cell – Wikipedia, Macrophage – Wikipedia, B cell – Wikipedia and T cell – Wikipedia (see below from 5).

Obstacles in Human Norovirus Research

Progress in understanding human norovirus infections has been slow. Two major constraints that impede human norovirus research have been

  • Lack of purified human norovirus stocks. Because virus stocks prepared from human stool samples are usually contaminated with bacterial products such as LPS or with other enteric viruses such as rotavirus, it is difficult to ascribe cell-damage effects to norovirus alone.
  • Lack of reliable in vitro human cell culture systems to study human norovirus entry and replication.
    • Human strains of norovirus have proven notoriously difficult to maintain in cell culture. For many years, research efforts to grow them in human intestinal epithelial cells proved unsuccessful.
    • While scientists had greater success culturing mouse norovirus strains in mouse macrophages and dendritic cells (6), such efforts failed to translate to studies using human macrophages and dendritic cells (7), suggesting mouse and human noroviruses may have different replication needs and counseling caution against directly extrapolating from mouse norovirus studies.
    • Though one group finally succeeded in showing human norovirus replication in the human intestinal epithelial cell lines INT-407 and Caco-2 using a 3-D culture system (8, 9, 10), doubts about the feasibility of this approach arose when other groups failed to replicate these results (11, 12).

In 2015, a collaborative study reported a breakthrough by managing to culture a human norovirus strain in simple human cell cultures (13, 14). The various novelties that helped accomplish this feat included,

  • Using a more pathogenic human norovirus strain, GII.4 Sydney 2012, in contrast to the more commonly used but less pathogenic strain, GI.1 Norwalk.
  • Using neither intestinal epithelial cells nor macrophages but instead two human B cell lines, Raji and BJAB.
  • Previous studies used filtered virions or filtered norovirus-positive stool sample to inoculate the cell cultures. These scientists instead used unfiltered, unprocessed stool in what turns out to have been an inspired choice.
    • It is already well-known by now that noroviruses use histo-blood group antigens (HBGA) to help facilitate their entry into host cells.
    • In this novel culture system using human B cell lines, HBGA-expressing bacteria present in the unprocessed, unfiltered stool sample helped facilitate human norovirus entry into B cells.
    • The study specifically showed that human but not mouse norovirus entry into the tested human B cell lines required the presence of the commensal bacterium, Enterobacter cloacae.
    • Other enteric viruses such as mammary tumor virus (15), polio (16, 17) and reovirus (16) have also been reported to require bacteria or bacterial products to facilitate their entry into and replication within host cells.

As is usually the case with scientific advances, this finding opens the door to more questions,

  • How do HBGA-positive bacteria such as E. cloacae facilitate human norovirus entry into host cells such as B cells?
  • Is human B cell infection by human norovirus reproducible?
    • While a recent mouse model study (18) reported immune cells in the gut-associated lymphoid tissue to be a major target of human norovirus, a 2018 human study (19) couldn’t get multiple human norovirus strains to replicate in their cell culture system using a different human B cell line (JVM-2).
  • Is human B cell infection by human norovirus necessary for a productive infection?
    • A 2016 study (20) compared norovirus-infected immunodeficient pediatric patients who lacked an intact B cell compartment (n = 9) to those who did (n = 10) and found a similar ~60% detection rate for norovirus in stool.
  • How does human B cell infection affect effective anti-norovirus antibody responses?
    • A previous study (21) suggested that B cell antibody responses were required for controlling human norovirus replication.


1. Lopman, Benjamin A., et al. “The vast and varied global burden of norovirus: prospects for prevention and control.” PLoS medicine 13.4 (2016): e1001999.…

2. Kirby, Amy E., Ashleigh Streby, and Christine L. Moe. “Vomiting as a symptom and transmission risk in norovirus illness: evidence from human challenge studies.” PloS one 11.4 (2016): e0143759.…

3. de Graaf, Miranda, Nele Villabruna, and Marion PG Koopmans. “Capturing norovirus transmission.” Current opinion in virology 22 (2017): 64-70.…

4. Teunis, Peter FM, et al. “Norwalk virus: how infectious is it?.” Journal of medical virology 80.8 (2008): 1468-1476.

5. Karst, Stephanie M., and Christiane E. Wobus. “A working model of how noroviruses infect the intestine.” PLoS pathogens 11.2 (2015): e1004626.…

6. Wobus, Christiane E., et al. “Replication of Norovirus in cell culture reveals a tropism for dendritic cells and macrophages.” PLoS biology 2.12 (2004): e432.…

7. Lay, Margarita K., et al. “Norwalk virus does not replicate in human macrophages or dendritic cells derived from the peripheral blood of susceptible humans.” Virology 406.1 (2010): 1-11.…

8. Straub, Timothy M., et al. “In vitro cell culture infectivity assay for human noroviruses.” Emerging infectious diseases 13.3 (2007): 396.…

9. Straub, Timothy M., et al. “Human norovirus infection of Caco-2 cells grown as a three-dimensional tissue structure.” Journal of water and health 9.2 (2011): 225-240.…

10. Straub, Tim M., et al. “Defining cell culture conditions to improve human norovirus infectivity assays.” Water science and technology 67.4 (2013): 863-868.…

11. Herbst-Kralovetz, Melissa M., et al. “Lack of norovirus replication and histo-blood group antigen expression in 3-dimensional intestinal epithelial cells.” Emerging infectious diseases 19.3 (2013): 431.…

12. Takanashi, Sayaka, et al. “Failure of propagation of human norovirus in intestinal epithelial cells with microvilli grown in three-dimensional cultures.” Archives of virology 159.2 (2014): 257-266.…

13. Jones, Melissa K., et al. “Enteric bacteria promote human and mouse norovirus infection of B cells.” Science 346.6210 (2014): 755-759.…

14. Jones, Melissa K., et al. “Human norovirus culture in B cells.” Nature protocols 10.12 (2015): 1939.…

15. Kane, Melissa, et al. “Successful transmission of a retrovirus depends on the commensal microbiota.” Science 334.6053 (2011): 245-249.…

16. Kuss, Sharon K., et al. “Intestinal microbiota promote enteric virus replication and systemic pathogenesis.” Science 334.6053 (2011): 249-252.…

17. Robinson, Christopher M., Palmy R. Jesudhasan, and Julie K. Pfeiffer. “Bacterial lipopolysaccharide binding enhances virion stability and promotes environmental fitness of an enteric virus.” Cell host & microbe 15.1 (2014): 36-46.…

18. Grau, Katrina R., et al. “The major targets of acute norovirus infection are immune cells in the gut-associated lymphoid tissue.” Nature microbiology 2.12 (2017): 1586.

19. Oka, Tomoichiro, et al. “Attempts to grow human noroviruses, a sapovirus, and a bovine norovirus in vitro.” PloS one 13.2 (2018): e0178157.…

20. Brown, Julianne R., Kimberly Gilmour, and Judith Breuer. “Norovirus infections occur in B-cell–deficient patients.” Clinical Infectious Diseases 62.9 (2016): 1136-1138.

21. Zhu, Shu, et al. “Identification of immune and viral correlates of norovirus protective immunity through comparative study of intra-cluster norovirus strains.” PLoS pathogens 9.9 (2013): e1003592.…

What is the difference between “follicular dendritic cells” and “interdigitating dendritic cells”?

Rather than Follicular dendritic cells – Wikipedia (FDCs) versus interdigitating dendritic cells (IDCs), the key comparison is between FDCs and DCs (Dendritic cell – Wikipedia), since IDCs are a type of DC and properties ascribed to DCs apply to them as well.

FDC are DCs in name and appearance only, the confusion stemming from appearance since both FDC and DC have the characteristic dendrites that got them their names in the first place. However, there are at least four fundamental differences between FDC and DC,

  • Location: FDC are found in follicles, Lymph node – Wikipedia follicles to be precise.
    • Anatomically, FDCs appear to form a cellular network within secondary lymphoid tissues.
    • DCs OTOH are widely distributed throughout the body, resident DC populations being found not only in lymph nodes but also in practically every tissue and organ in the body.
  • Hallmark trait: FDC are radio-resistant while DC aren’t. Radioresistance – Wikipedia in this particular instance means (mouse) FDCs have been found able to withstand radiation doses that kill off DCs.
  • Ontogeny: FDC ontogeny remained long disputed. While DC originate from bone marrow-derived precursors, FDC don’t appear to be bone marrow-derived and are presently considered to arise from specialized vascular Mural cell – Wikipedia and have a singular need for Lymphotoxin – Wikipedia.
  • Function: FDC present intact antigen directly to B cells while DCs present digested antigen bits (epitopes) indirectly (within MHC molecules) to T cells.
    • Unlike DC, FDC are known to retain intact Antigen – Wikipedia on their surface, specifically within the folds of their dendrites in the form of antigen-antibody complexes (Immune complex – Wikipedia) that disappear slowly over time.
    • The Germinal center – Wikipedia or GC is a specialized site within lymph nodes crucial not only for B cell proliferation and memory formation but also for generation of long-lived Plasma cell – Wikipedia from B cells.
    • In what seems to be mutual dependence, while FDC ability to retain intact antigen for long periods of time seems essential for formation and maintenance of GCs, the lymph node FDC network seems to depend on presence of B cells.
    • OTOH, DCs are considered the prototypical Antigen-presenting cell – Wikipedia that kickstart an adaptive immune response by presenting digested bits (epitopes) of antigen within Major histocompatibility complex – Wikipedia molecules to naive (antigen-inexperienced) T cells.

For a long time, FDCs remained a mysterious subset of cells studied only by a niche group of immunologists. Even today, the study of FDCs remains a niche field. Key aspects of FDC biology were reviewed in 2014 (see below from Heesters, Balthasar A., Riley C. Myers, and Michael C. Carroll. “Follicular dendritic cells: dynamic antigen libraries.” Nature Reviews Immunology 14.7 (2014): 495.…).

Given that some blood cancers like lymphoma and myeloma also form “masses”, then why are they regarded as liquid tumors? Does the tumor microenvironment have an impact on the efficacy of CAR-T therapy?



Tumors that originate from blood-derived cells tend to be called liquid tumors.

CAR-T therapy has so far had the most promising results in targeting CD19+ B cells. Even though blood cells such as CD19+ B cells can form masses, such masses are still fundamentally different from solid tumors.

  • Blood cell tumor masses are typically homogeneous, usually being composed of one or a few related cell types. OTOH solid tumors are composed of multiple unrelated cell types, including fibroblasts (CAF, Cancer-Associated Fibroblasts), neutrophils (TAN, Tumor-Associated Neutrophils), mesenchymal cells and Myeloid-derived suppressor cell – Wikipedia, to name a few.
  • Blood cell tumor masses tend to be simple collections that can aggregate and detach relatively easily. OTOH, solid tumors have characteristic tissue-like Extracellular matrix – Wikipedia which functions as a formidable physical barrier to incoming anti-tumor immune cells.

In the context of CAR-T technology targeting CD19 – Wikipedia, a molecule expressed by cells of the B cell lineage, it’s important to keep in mind that B and T cells evolved to interact with each other, something I’ve emphasized previously as an important yet often-overlooked feature that could help explain the exceptional success of anti-CD19 CAR-T cells (1).

Be they Cell surface receptor – Wikipedia and their ligands (e.g., CD40-CD40L), Cytokine – Wikipedia and their receptors (e.g., IL-4 and its receptor IL-4R), or Chemokine – Wikipedia and their receptors (e.g., CXCR5-CXCL13), B and T cells express many lock-and-key pairs of each of these types of interacting molecules that are extremely important in mediating their interactions with each other.

This means that simply as part of its normal physiology, an anti-CD19 CAR-T cell would be able to leverage such intrinsic features of B cell-T cell biology to interact with CD19+ B cell tumors. This would be on top of the cell-surface anti-CD19 molecule that it’s been engineered to express to help it bind specifically to cell-surface CD19 expressed by B cells.

As illustrative examples, this answer shares compelling CT scan and MRI data from three published studies, two clinical trials from one NIH group (2, 3) and one case report from Mass General (4). They show anti-CD19 CAR-T cells to be capable of eliminating CD19+ B cell tumor masses, even when they’re located in the brain (4).



2. Kochenderfer, James N., et al. “Donor-derived CD19-targeted T cells cause regression of malignancy persisting after allogeneic hematopoietic stem cell transplantation.” Blood 122.25 (2013): 4129-4139.…

3. Brudno, Jennifer N., et al. “Allogeneic T cells that express an anti-CD19 chimeric antigen receptor induce remissions of B-cell malignancies that progress after allogeneic hematopoietic stem-cell transplantation without causing graft-versus-host disease.” Journal of clinical oncology 34.10 (2016): 1112.…

4. Abramson, Jeremy S., et al. “Anti-CD19 CAR T cells in CNS diffuse large-B-cell lymphoma.” New England Journal of Medicine 377.8 (2017): 783-784.…

Will anti-malarial bednets for children in developing countries significantly decrease resilience against malaria? Is preventing infection really a better social solution than adaptable immune systems?



Short answer: Malaria is truly prodigious in its parasitic capabilities: with >100 malaria parasite species capable of infecting a stupendous range of animals, from primates to rodents to birds, even lizards and snakes, the parasite is estimated to have co-evolved with humans for ~half a million years, cutting a swath through human populations all through history (1).

Given this history, framing one approach (human-made interventions) against another (natural anti-malaria immunity) fails to come to grips with how to manage such an ancient and wily parasite. It’s also important to keep in mind that natural resistance to malaria builds up only over time, after multiple infections.

Maternal malaria during any stage of pregnancy can and often does adversely impact the fetus and newborn while babies are disproportionately prone to malaria death. This is why a major focus of public health interventions is preventing malaria during pregnancy and in babies.

Insecticide-treated bed nets (ITNs) are deployed to break the malaria transmission cycle by not just preventing infection but also killing mosquitoes. ITN isn’t the only large scale public health intervention in malaria-endemic African countries, which are presently the principal site of malaria morbidity and mortality. Rather, it is part of a four-pronged approach with the other three being Artemisinin-based combination therapy (ACT), intermittent preventive treatment in pregnancy (IPTp) and indoor residual spraying (IRS).

The problem is that only a single class of insecticide (pyrethroids) is approved for ITN use, while the prohibitive cost of the other three classes of insecticide restricts IRS mainly to pyrethroids as well. Such single-insecticide use has led to the development of insecticide resistance in Anopheles mosquitoes, resistance that is being reported from more and more countries. It also appears to drive mosquito behavior modification, shifting feeding from indoors to outdoors.

Longer answer expands on the known pros & cons of ITNs.

Important to note that currently, Africa, specifically sub-Saharan Africa, remains the main redoubt for malaria morbidity and mortality (see below from 2). For this reason, sub-Saharan Africa is the epicenter of malaria control.

Insecticide-treated Bed Nets (ITNs): Pros

Cheap and durable, commercially manufactured ITNs are considered a cost-effective public health intervention capable of serving as both a physical barrier and a mosquito killer. The WHO recommends using long-lasting commercial ITNs in which the insecticide is either coated around or even incorporated into the net fibers, allowing the insecticide’s biological activity to withstand multiple washes (3). Obviously, ITNs are most effective in areas where the mosquitoes feed indoors and where maximal biting occurs at night.

ITNs break the malaria transmission cycle by preventing both a malaria-laden mosquito from biting a malaria-free human host and a malaria-free mosquito from biting a malaria-infected human host. Studies in the 1980s showing that pyrethroids could repel and kill mosquitoes while remaining safe for humans (4) opened the door to their widespread use.

For these reasons the WHO has recommended widespread ITN use across malaria-endemic regions since at least 2000 (5). Antenatal clinics made large numbers of ITNs available to pregnant women at greatly subsidized prices as part of mass distribution campaigns launched by the ministries of health in various malaria-endemic countries (6).

Starting in 2003, a stream of studies has reported ITNs being able to kill substantial numbers of anopheline mosquitoes and reduce malaria transmission (4, 7, 8, 9, 10). Given such results and the large-scale push by both the WHO and various health ministries, the proportion of households owning at least one such bed net in sub-Saharan Africa increased dramatically in just a handful of years (see below left from 11). However, such a bird’s-eye analysis glosses over the extremely wide variation in ITN coverage between different countries and even between regions within the same country (see below right from 11).

Insecticide-treated Bed Nets (ITNs): Cons

On its face, a commendable aim is being achieved through economical means within the reach of millions across malaria-endemic African countries. The problem is that only one class of insecticide, the Pyrethroid – Wikipedia, has been approved for use in ITNs. TB and HIV have already shown time and again that antimicrobial monotherapy is a gateway to drug resistance. Likewise, intense monotherapy with a single insecticide class imposes strong selection pressure leading to mosquito insecticide resistance, which has skyrocketed in recent years (12, 13, 14, 15, 16, 17, see below from 11), the biggest danger stemming from reports of mosquito populations found to be resistant to all four classes of insecticide currently available for mosquito control (18, 19).

With the other three insecticide classes being prohibitively expensive (2, 20), they remain beyond the reach of most individuals in malaria-endemic countries, and indeed even beyond the reach of local public health authorities.

In its 2015 report, the WHO notes that only 52 of 97 countries with ITN and IRS vector control programs reported insecticide resistance in 2014 (11). Many malaria-endemic countries failing to conduct routine malaria vector surveillance means insecticide resistance can be easily missed.

Insecticide resistance develops not abruptly but gradually over a number of years, and so can be and indeed is easily missed. This happened in Mexico, where sentinel sites reported very low levels of resistance between 2000 and 2003 and yet recorded >80% resistance frequency by 2007, which is why some authors argue insecticide resistance entails a tipping point (see below from 20).
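Why resistance can look like a sudden tipping point falls out of basic population genetics: under constant selection, a rare resistance allele rises roughly logistically, sitting below detection thresholds for many generations before climbing steeply. A minimal sketch with invented fitness values (not measured rates):

```python
# Toy single-locus selection model illustrating why insecticide resistance can
# look like a "tipping point": a resistance allele under constant selection
# rises slowly while rare, then steeply once common. Fitness values invented.

def next_freq(p, w_res=1.5, w_sus=1.0):
    """One generation of selection on a resistance allele at frequency p."""
    mean_fitness = p * w_res + (1 - p) * w_sus
    return p * w_res / mean_fitness

p = 0.001            # resistance allele starts out rare
trajectory = [p]
for _ in range(30):  # 30 generations of constant insecticide pressure
    p = next_freq(p)
    trajectory.append(p)

# The allele sits below a typical 5% detection threshold for its first ten
# generations, yet is nearly fixed by generation 30.
below_detection = sum(1 for q in trajectory if q < 0.05)
print(below_detection, round(trajectory[-1], 2))  # → 10 0.99
```

A surveillance program sampling during the flat early phase would see "very low levels of resistance", exactly as the Mexican sentinel sites did.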

Research also shows mosquito behavior modification in response to ITNs, with mosquitoes previously known to be indoor feeders increasingly showing up to feed outdoors (21).

Given the high likelihood that monotherapy would lead to resistance, widespread deployment of pyrethroid-based ITNs was clearly ill-conceived. A scramble to push out new classes of insecticide is underway. Whether it will undo the damage already done remains an open question.


1. Mosquito: The Story of Man’s Deadliest Foe (9780786886678): Andrew Spielman Sc.D., Michael D’Antonio: Books

2. World Health Organization. “Global plan for insecticide resistance management in malaria vectors.” (2012).…

3. Hamel, Mary J., and Umberto D’Alessandro. “Control of malaria during pregnancy: preventive strategies. Intermittent preventive treatment and insecticide-treated nets.” Encyclopedia of Malaria (2014): 1-10.

4. Lengeler, Christian. “Insecticide‐treated bed nets and curtains for preventing malaria.” The Cochrane Library (2004).…

5. World Health Organization. “Achieving and maintaining universal coverage with long-lasting insecticidal nets for malaria control.” (2017).…

6. Bhatt, Samir, et al. “The effect of malaria control on Plasmodium falciparum in Africa between 2000 and 2015.” Nature 526.7572 (2015): 207.…

7. Hawley, William A., et al. “Community-wide effects of permethrin-treated bed nets on child mortality and malaria morbidity in western Kenya.” The American journal of tropical medicine and hygiene 68.4_suppl (2003): 121-127.…

8. Feng, Gaoqian, et al. “Decreasing burden of malaria in pregnancy in Malawian women and its relationship to use of intermittent preventive therapy or bed nets.” PloS one 5.8 (2010): e12012.…

9. Boudová, Sarah, et al. “The prevalence of malaria at first antenatal visit in Blantyre, Malawi declined following a universal bed net campaign.” Malaria journal 14.1 (2015): 422. https://malariajournal.biomedcen…

10. Escamilla, Veronica, et al. “Effects of community-level bed net coverage on malaria morbidity in Lilongwe, Malawi.” Malaria journal 16.1 (2017): 142. https://malariajournal.biomedcen…

11. World Health Organization. “World malaria report 2015.” (2015).…

12. Glunt, Katey D., et al. “Long-lasting insecticidal nets no longer effectively kill the highly resistant Anopheles funestus of southern Mozambique.” Malaria journal 14.1 (2015): 298. https://malariajournal.biomedcen…

13. Bass, Chris, and Christopher M. Jones. “Mosquitoes boost body armor to resist insecticide attack.” Proceedings of the National Academy of Sciences 113.33 (2016): 9145-9147.…

14. Churcher, Thomas S., et al. “The impact of pyrethroid resistance on the efficacy and effectiveness of bednets for malaria control in Africa.” Elife 5 (2016).…

15. Glunt, Katey D., et al. “Empirical and theoretical investigation into the potential impacts of insecticide resistance on the effectiveness of insecticide‐treated bed nets.” Evolutionary Applications.…

16. Alout, Haoues, et al. “Consequences of insecticide resistance on malaria transmission.” PLoS pathogens 13.9 (2017): e1006499.…

17. Fouet, Caroline, Peter Atkinson, and Colince Kamdem. “Human Interventions: Driving Forces of Mosquito Evolution.” Trends in parasitology (2018).

18. Kisinza, William N., et al. “Multiple insecticide resistance in Anopheles gambiae from Tanzania: a major concern for malaria vector control.” Malaria journal 16.1 (2017): 439. https://malariajournal.biomedcen…

19. Ranson, Hilary, and Natalie Lissenden. “Insecticide resistance in African Anopheles mosquitoes: a worsening situation that needs urgent action to maintain malaria control.” Trends in parasitology 32.3 (2016): 187-196.

20. McCoy, Amanda, et al. “Towards an economics policy framework to combat malaria, in an era of insecticide resistance.” (2017).…

21. Russell, Tanya L., et al. “Increased proportions of outdoor feeding among residual malaria vector populations following increased use of insecticide-treated nets in rural Tanzania.” Malaria journal 10.1 (2011): 80. https://malariajournal.biomedcen…

How does a blood test for autoimmune diseases work?


A blood test in an autoimmune disease context is used to identify presence, type and quantity of Autoantibody – Wikipedia as well as of disease-associated or disease-specific biochemical markers, which are then used in combination with clinical, histopathological and other laboratory assessments to make an autoimmune diagnosis (see below from 1).

Consider Primary biliary cholangitis – Wikipedia or Primary Biliary Cirrhosis which is a liver-specific autoimmune disease. While anti-mitochondrial antibody (AMA) targeting a specific mitochondrial protein, PDC-E2 (Pyruvate dehydrogenase complex – Wikipedia), is considered one of its hallmarks, it can be found for many years before clinical symptoms appear.

Elevated levels of the liver enzyme, Alkaline phosphatase – Wikipedia (ALP) and histology showing liver destruction, specifically Ascending cholangitis – Wikipedia and bile duct destruction, are additional signs needed to make a PBC diagnosis. Note too that Elevated alkaline phosphatase – Wikipedia is PBC-associated, not PBC-specific, being observed in numerous diseases and thus insufficient on its own to make a definitive diagnosis of PBC.

Similarly, other organ-, tissue-, cell- or organelle-specific autoantibodies appear before clinical symptoms in Rheumatoid arthritis – Wikipedia (RA), Multiple sclerosis – Wikipedia (MS) and Diabetes mellitus type 1 – Wikipedia (T1D), meaning autoantibody on its own is not confirmatory of an autoimmune disease.

Rather, diagnosis of each autoimmune disease consists of a complex matrix of clinical signs and tissue- or organ-specific pathology in addition to specific autoantibodies and biochemical markers (see below from 1, note that this is not an exhaustive list of autoimmune diseases, merely of some of the most common ones).


1. Wang, Lifeng, Fu‐Sheng Wang, and M. Eric Gershwin. “Human autoimmune diseases: a comprehensive update.” Journal of internal medicine 278.4 (2015): 369-395.…

Is it possible to diagnose infections by analyzing the cytokine spectrum?




Rather than the cytokine spectrum alone, blood gene signatures comprising not just cytokine genes but other genes as well are being mooted as diagnostic possibilities for infections. For this approach to become a practical reality, initial promising reports from some basic research labs need to be replicated by others using different, larger patient data-sets, and one or more such approaches need to be spun out to commercial entities with the resources necessary to accelerate their translation to clinical medicine. Currently, this type of research is still in the earliest stages of discovery work.

The irony of posting this answer right after the US SEC charged Theranos, Elizabeth Holmes and Sunny Balwani with fraud (1, 2) only serves to emphasize just how important and lucrative it’s becoming to rapidly and accurately diagnose not just infections but also other diseases using just a drop of blood.

Potential moneymaking scope isn’t the only reason though. Increasingly irresponsible antibiotic usage over recent decades has made responsible antibiotic stewardship an urgent medical priority, where rapidly diagnosing an infection as viral rather than bacterial as early as possible could help minimize unnecessary antibiotic use (3, 4, 5).

As illustrative examples of proof of principle, this answer focuses on the work being done in the lab of Purvesh Khatri, a computational immunologist at Stanford University who has in recent years published a few high-profile studies on this topic (6). The approach is something we’re going to see more of in the coming years, scouring publicly available large genetic data-sets to discern clinically important patterns.

Khatri’s group examined such data-sets obtained from either whole blood or circulating blood cells (PBMC) from healthy individuals as well as from those with bacterial or viral infections or with other types of inflammatory conditions. Specifically, his group examined public Microarray – Wikipedia gene expression databases such as the National Center for Biotechnology Information – Wikipedia (NCBI) GEO (Gene Expression Omnibus), looking to see if gene expression patterns map to specific infections.

Microarray analysis attempts to find differentially expressed genes between different sets of samples while examining thousands and even tens of thousands of genes simultaneously.
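In spirit, the analysis runs gene by gene: compare expression between the two sample groups and rank genes by a test statistic. A minimal sketch with invented values and placeholder gene names (real pipelines such as limma add moderated statistics and multiple-testing correction):

```python
# Minimal sketch of microarray-style differential expression: compare each
# gene's expression between two sample groups and rank genes by a two-sample
# t statistic. All values and group assignments are invented for illustration.
from statistics import mean, variance

def t_statistic(group_a, group_b):
    """Welch-style t statistic between two small sample groups."""
    se = (variance(group_a) / len(group_a) + variance(group_b) / len(group_b)) ** 0.5
    return (mean(group_a) - mean(group_b)) / se

# Hypothetical log2 intensities: (viral samples, bacterial samples) per gene.
data = {
    "GENE_V": ([9.8, 10.1, 9.9, 10.2], [6.1, 5.8, 6.3, 6.0]),  # viral-high
    "GENE_B": ([5.2, 5.0, 5.3, 5.1], [8.9, 9.2, 8.8, 9.1]),    # bacterial-high
    "GENE_H": ([7.0, 7.2, 6.9, 7.1], [7.1, 6.9, 7.2, 7.0]),    # housekeeping
}

ranked = sorted(data, key=lambda g: abs(t_statistic(*data[g])), reverse=True)
print(ranked)  # the two differentially expressed genes outrank the flat one
```

Scaled up to tens of thousands of genes, the same per-gene comparison, plus correction for the huge number of simultaneous tests, yields the candidate signatures described below.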

In their 2015 study, Khatri’s group analyzed 205 samples in 3 data-sets that included healthy controls as well as patients with respiratory viral or bacterial infections (7),

  • They identified 396 genes as being differentially expressed in bacterial versus viral respiratory infections.
  • They identified a unique signature for flu that set it apart from other viral infections.
  • They could even separately identify asymptomatic flu patients who were shedding the flu virus, as well as those with flu-like symptoms who weren’t infected with it.

In their 2016 study (8),

  • They identified 7 genes they claimed could differentiate bacterial from viral infections.
  • They validated these gene signatures in a group of 96 critically ill children.
  • They identified a 3-gene signature they claim could distinguish patients with active tuberculosis from those with latent tuberculosis, another infection or no infection. Specifically, they claim this test is far better at identifying those without TB, unlike standard TB tests, which often miss it in those patients unable to cough up the sufficient amount of Sputum – Wikipedia needed for diagnosis.
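Studies of this kind typically collapse a small signature into a single score per patient. One common construction, a hedged sketch here rather than necessarily the exact method of these papers, takes the difference between the geometric means of the signature's up- and down-regulated genes:

```python
# Hedged sketch: collapsing a small gene signature into one score per patient
# as the difference between geometric means of the signature's up- and
# down-regulated genes. Gene names and expression values are invented.
from math import prod

def signature_score(expr, up_genes, down_genes):
    """Geometric mean of 'up' genes minus geometric mean of 'down' genes."""
    gmean = lambda genes: prod(expr[g] for g in genes) ** (1 / len(genes))
    return gmean(up_genes) - gmean(down_genes)

up, down = ["GENE_A", "GENE_B"], ["GENE_C"]

bacterial_patient = {"GENE_A": 9.0, "GENE_B": 8.5, "GENE_C": 4.0}
viral_patient     = {"GENE_A": 5.0, "GENE_B": 4.5, "GENE_C": 9.0}

b = signature_score(bacterial_patient, up, down)
v = signature_score(viral_patient, up, down)
print(b > 0 > v)  # a simple threshold on the score separates the two classes
```

The appeal of such a score is that a clinical assay then only needs to measure a handful of transcripts, not a whole microarray.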

While such basic research data appears promising at first blush, using microarray data for clinical diagnosis comes with its own set of limitations that require mitigation in order to make such an approach practical.

  • Need a robust approach to distinguish true from false positives.
  • Data reproducibility between data-sets. Since it can be and often is the case that results range all the way from a given gene being found significantly differentially expressed, to borderline significance, to not significant at all, effect sizes are of greater importance. All the more reason for biologists to wean themselves away from the unfortunate tendency to focus on significance (p value) to the exclusion of more biologically meaningful measures.
  • Meta-analysis – Wikipedia of some publicly available human microarray data-sets suggest there’s greater value not in single large studies for a given disease but rather in larger numbers of smaller studies that are moderately powered (Statistical power – Wikipedia) (9).
  • Not all human microarray data studies make all their data publicly available. This is a science policy issue, not a research matter, and one that can only be remedied by funders and regulators.
  • Researchers tend to focus their attention on better-annotated genes, which are usually simply those that started being studied earlier, rather than on those with the strongest evidence supporting their role in a given disease. Such a stereotypical ‘looking under the lamp-post’ attitude stymies discovery of the most relevant genetic signatures associated with a particular disease (see below from 10).
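On the effect-size and reproducibility points, the core arithmetic of combining studies is worth seeing: each study contributes a standardized mean difference weighted by its precision, as in a fixed-effect meta-analysis. A sketch with invented numbers:

```python
# Sketch of combining effect sizes across studies: each study contributes a
# standardized mean difference (Cohen's d) weighted by the inverse of its
# variance, the core of a fixed-effect meta-analysis. Numbers are invented.

def cohens_d(mean_cases, mean_controls, pooled_sd):
    return (mean_cases - mean_controls) / pooled_sd

def combined_effect(studies):
    """Inverse-variance weighted summary effect over (effect, variance) pairs."""
    weights = [1 / var for _, var in studies]
    weighted = sum(w * eff for (eff, _), w in zip(studies, weights))
    return weighted / sum(weights)

# Three small studies of one gene: consistent direction, varying precision.
studies = [(cohens_d(8.2, 7.1, 1.5), 0.12),
           (cohens_d(8.0, 7.3, 1.4), 0.20),
           (cohens_d(8.4, 7.0, 1.6), 0.09)]

print(round(combined_effect(studies), 2))  # → 0.75
```

Several moderately powered studies pointing in the same direction can thus yield a more trustworthy summary effect than any single study's p value, which is the argument of reference 9.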

‘Collectively, our results provide an evidence of a strong research bias in literature that focuses on well-annotated genes instead of those with the most significant disease relationship in terms of both expression and genetic variation. We show that the inequality follows a “rich-getting-richer” pattern, where annotation growth is biased towards genes that were richly annotated in the initial versions of GO [Gene Ontology]. We believe this stems from the typical experimental design. To illustrate this, consider an omics experiment that generates a list of hundreds or thousands of interesting genes. To interpret these genes, researchers use GO and pathway analysis tools. The researchers then generate targeted hypotheses for validation by interpreting the list of significant GO terms, focusing the genes or proteins annotated with that GO term. The researchers learn more about those targeted genes, leading to additional GO annotations for the already annotated genes. In this process, the list of unannotated genes is simply ignored because pathway analysis tools cannot map them to any GO terms. Hence, the self-perpetuating cycle of inequality continues.

While focusing research on the best characterized genes may be natural because it is easy to formulate a mechanistic hypothesis of the gene’s function in disease, we propose that the researchers in the era of omics should instead allow data to drive their hypotheses. We have repeatedly shown that expanding research outside of the streetlight of well characterized genes identifies novel disease-gene relationships35–37, identifies FDA-approved drugs that can be repurposed for other diseases27, and identifies clinically translatable diagnostic and prognostic disease signatures27,30–34,39. For example, we have previously identified PTK7 as causally involved in non-small cell lung cancer37. At the time of publication, PTK7 was labelled as an orphan tyrosine kinase receptor. In a very short span, this finding was transformed into an antibody-drug conjugate targeting PTK7 that induced sustained tumor regression, outperformed standard-of-care chemotherapy, and reduced frequency of tumor-initiating cells in a preclinical study45. A Phase 1 clinical trial (NCT02222922) of PTK7 antibody drug conjugate, PF-06647020, has already completed with acceptable and manageable safety profile, and is now being considered for further clinical development. To enable researchers to pursue data-driven hypotheses, we have made our rigorously validated gene expression multicohort analysis data publicly available (MetaSignature) where it may be explored based on either diseases or genes of interest29,46. Focusing on genes with the strongest molecular evidence instead of the most annotations would enable researchers to break the self-perpetuating annotation inequality cycle that results in research bias.’


1. Elizabeth Holmes, Theranos C.E.O. and Silicon Valley Star, Accused of Fraud


3. Commitments to Responsible Use of Antimicrobials in Humans

4. MacDougall, Conan, and Ron E. Polk. “Antimicrobial stewardship programs in health care systems.” Clinical microbiology reviews 18.4 (2005): 638-656. Antimicrobial Stewardship Programs in Health Care Systems

5. Morency-Potvin, Philippe, David N. Schwartz, and Robert A. Weinstein. “Antimicrobial stewardship: how the microbiology laboratory can right the ship.” Clinical microbiology reviews 30.1 (2017): 381-407. How the Microbiology Laboratory Can Right the Ship

6. SOHN, EMILY. “Frontiers in blood testing.”…

7. Andres-Terre, Marta, et al. “Integrated, multi-cohort analysis identifies conserved transcriptional signatures across multiple respiratory viruses.” Immunity 43.6 (2015): 1199-1211.…

8. Sweeney, Timothy E., Hector R. Wong, and Purvesh Khatri. “Robust classification of bacterial and viral infections via integrated host gene expression diagnostics.” Science translational medicine 8.346 (2016): 346ra91-346ra91. Robust classification of bacterial and viral infections via integrated host gene expression diagnostics

9. Sweeney, Timothy E., et al. “Methods to increase reproducibility in differential gene expression via meta-analysis.” Nucleic acids research 45.1 (2016): e1-e1.…

10. Haynes, Winston A., Aurelie Tomczak, and Purvesh Khatri. “Gene annotation bias impedes biomedical research.” Scientific Reports 8.1 (2018): 1362.…

Are CCP antibodies, C-reactive proteins, and TH2 the same thing? If not, then how do they differ from one another?



Cats, dogs, rabbits. Obviously animals but are they the same thing? Of course not. Similarly, neither are CCP antibodies, C-reactive protein and Th2. In fact, a Venn diagram – Wikipedia of the three wouldn’t overlap directly.

C-reactive protein – Wikipedia (CRP) is a product of the Innate immune system – Wikipedia while CCP antibodies and Th2 are products of the Adaptive immune system – Wikipedia.


C-reactive Protein (CRP)

The liver secretes CRP in response to dramatic physiological alterations such as acute injury, infection or Inflammation – Wikipedia. Persistently elevated CRP level is simply a sign of some type of underlying inflammatory disorder. By itself elevated CRP cannot pinpoint what could be wrong and thus is used in concert with other tests to try and diagnose the exact health issue (see below from 1).

CCP Antibodies

The adaptive immune system consists of B cell – Wikipedia and T cell – Wikipedia, which express specific receptors on their surface. Antigen – Wikipedia is simply any molecule a portion of which can specifically bind to a B or T cell receptor.

The secreted form of the B cell receptor is Antibody – Wikipedia.

Anti–citrullinated protein antibody – Wikipedia (ACPA), or antibodies against cyclic citrullinated peptides (CCP antibodies) (Citrullination – Wikipedia), are implicated in a variety of autoimmunities, particularly Rheumatoid arthritis – Wikipedia (RA), though >40 years of studies have as yet been unable to provide evidence for cause and effect (2).

Th2 Helper T cell

Th2 is a type of T helper cell – Wikipedia that secretes an abundance of the cytokine IL-4. T helper cells are the master orchestrators of the immune system, coordinating its many activities, including helping B cells secrete antibodies against specific antigens. For example, B cells that secrete CCP antibodies do so after receiving help from T cells that express receptors specific for the same citrullinated proteins these B cells target. T cell help consists of both cell-surface molecules that bind specific counterparts on the surface of the B cells they’re helping and secreted molecules such as Cytokine – Wikipedia.

Highly controlled and artifactual experimental mouse models provide the bulk of the evidence for defined T helper cell subsets such as Th2 (3). However, evidence for their physiological existence in humans is far weaker.


1. Rhodes, Benjamin, Barbara G. Fürnrohr, and Timothy J. Vyse. “C-reactive protein in rheumatology: biology and genetics.” Nature Reviews Rheumatology 7.5 (2011): 282.…

2. Malmström, Vivianne, Anca I. Catrina, and Lars Klareskog. “The immunopathogenesis of seropositive rheumatoid arthritis: from triggering to targeting.” Nature Reviews Immunology 17.1 (2017): 60.

3. Walker, Jennifer A., and Andrew NJ McKenzie. “T H 2 cell development and function.” Nature Reviews Immunology (2017).

Are “super viruses” a concern with widespread usage of vaccines (similar to “super bacteria” resistant to antibiotics, from widespread usage of antibiotics)? Why are the two different?


This answer explains

  • Differences between drug (antibiotic and antiviral) and vaccine resistance.
  • Why a super virus is less likely to emerge in response to an antiviral vaccine.
  • How natural bottlenecks in the human-virus dynamic prevail to limit the scope for a super virus.
  • How intense artificial selection pressures imposed by unwitting or willful animal husbandry practices in industrial livestock production increase the scope for a super virus. The Marek’s disease – Wikipedia (MDV) vaccine in chickens offers an illustrative example.

Differences Between Drug (antibiotic and antiviral) & Vaccine Resistance

Be they antibiotics against bacteria or antivirals against viruses, studies show drug resistance tends to emerge fairly rapidly. OTOH, be the target a bacterium or a virus, vaccine resistance occurs only rarely and typically takes many more years to emerge (see below from 1).

Note two interesting features of this comparison,

  • Vaccines compared in this study are human ones. The situation could be quite different with veterinary vaccines, as this answer shares with the illustrative example of the Marek’s disease – Wikipedia (MDV) vaccine in chickens.
  • Pertussis, pneumococcal and hepatitis B are human vaccines with documented data on resistance. All three are relatively simple sub-unit vaccines that offer limited numbers of targets. There may be something to the idea that the more complex a vaccine, the greater the number of targets it offers the host’s immune system, the more robust and comprehensive the control it elicits, and the stronger its capability to prevent not just disease but also infection and transmission.

The authors of this study (1) propose the reason for this difference between drug and vaccine resistance is two-fold,

  • Vaccines tend to be given prophylactically to healthy people, before they get the infection, whereas drugs are given therapeutically to an infected person, typically at a time when they harbor large, even enormous, numbers of the disease-causing organism.
    • Having already expanded and mutated during the infection’s incubation period, the greater their number, the greater the chance for mutations in the disease-causing organism. Mutated organisms are also likely to have spread to new hosts even before the index host begins drug treatment.
    • In other words, the scale is typically already tipped against an antibiotic or antiviral. Indeed, studies show the greater their number at the time of Rx, the more likely drug resistance (1).
    • Being given prophylactically means vaccine-induced immune responses have much greater scope for preventing a disease-causing organism from even gaining a foothold in the first place, let alone expanding, mutating and then spreading. A difference in kind from drugs, in other words.
  • Drugs typically target one or few molecules whereas even relatively simple vaccines offer a multitude of targets to the host’s immune system.
    • Both TB and HIV specialists learned the hard way that using a single drug almost invariably leads to drug resistance while combination therapy or multiple drugs that target different pathways and use different mechanisms reduces its chances.
    • OTOH, vaccines typically offer multiple targets to the immune system. Even a single protein offers multiple epitopes as targets to T and B cells. B cells also undergo Somatic hypermutation – Wikipedia with the help of T cells. Post-vaccination antibody repertoires are thus typically broad and varied, even varying substantially between individuals. That makes it much more difficult for a mutated organism to evade immune responses. What may evade immunity in one person may not in another, whereas blanket evasion is very much the norm against a given drug.

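The pathogen-load point above can be put in back-of-envelope terms: if each replication carries a small probability mu of producing a resistance mutation, the chance that at least one resistant variant already exists grows steeply with population size. The value of mu and the population sizes below are illustrative assumptions, not measured rates:

```python
# Back-of-envelope version of the pathogen-load argument: with a
# per-replication resistance-mutation probability mu, the chance that at
# least one resistant variant already exists rises steeply with population
# size N. mu is an illustrative value, not a measured rate.
mu = 1e-8

def p_resistant_present(n_organisms):
    return 1 - (1 - mu) ** n_organisms

early = p_resistant_present(1e4)   # small population met by primed, vaccine-induced immunity
late = p_resistant_present(1e10)   # large population met by a drug at treatment time
print(f"{early:.4f} vs {late:.4f}")  # → 0.0001 vs 1.0000
```

By the time a drug is given, a resistant variant is all but guaranteed to be present, whereas vaccine-primed immunity confronts populations small enough that one likely never arose.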
Why A Super Virus Is Less Likely To Emerge In Response To An Antiviral Vaccine

A super virus arising from an antiviral vaccine would be inherently different from the super bacteria that develop in response to excess antibiotic usage in both humans and livestock. Antibiotics act not just on the target organism but also on our and animals’ microbiota. They also seep into soil and water from livestock-operation effluents and thus can act on all sorts of environmental bacteria as well. The scope of antibiotic selection pressure on bacteria is thus enormous. OTOH, an antiviral vaccine, even one given to millions of humans or billions of livestock, is designed to target one single virus or a narrow set of related viruses. Its scope, in the form of selection pressure that could yield a super virus, is thus inherently much more limited.

Whether one could emerge from vaccines requires that we first consider what a ‘super virus’ could be. Typically, super is prefixed to microorganisms that have acquired frightening new capabilities, often as a result of mutations or genetic exchanges, acquisition of virulence genes being a case in point for the latter. Such considerations theoretically limit the chances of a super virus to mutations or to in-kind exchanges, i.e., between similar types of viruses, given that some are RNA viruses and others DNA.

Having considered what a super virus could entail, we can see that its development is a two-sides-of-the-same-coin issue.

  • One side is the capacity of organisms such as viruses and bacteria to replicate so much faster than us, which in turn enables them to adapt much more rapidly to selection pressures such as host immune responses to vaccines. Such adaptations however come with inbuilt constraints, since bacteria and viruses cannot adapt away from features where doing so entails extreme fitness costs such as death. Such features, usually traits such as coat proteins or the receptors used to invade cells, thus tend to end up as durable targets of host immune responses for precisely this reason. In other words, a mutually reinforcing cycle between virus and host tends towards a detente that limits the potential for a super virus to develop. This side of the coin, the normal human-virus dynamic, limits the chance for a super virus.
  • The problem comes from the other side of the coin which consists of intense artificial selection pressures that we humans ourselves unwittingly or willfully foist on viruses that could instigate the development of a super virus.

Let’s consider two examples that help illustrate the two sides of this coin: influenza, and MDV, the virus behind an economically important disease in chickens.

  • Natural bottlenecks in influenza-human dynamic limit the scope for a super virus.
  • A ‘leaky’ veterinary vaccine such as the MDV vaccine may increase the scope for a more virulent virus to emerge in chickens.

Natural Bottlenecks in Influenza-Human Dynamic Limit Scope for Super Virus

Influenza, an RNA virus, serves as a useful illustrative example: a 2018 review explored how, contrary to the preconception that flu viruses evolve rapidly, they actually evolve quite slowly (see below from 2).

The authors note that (see below from 2),

‘New antigenic variants of A/H3N2 viruses appear every 3–5 years, whereas new antigenic variants of A/H1N1 and influenza B viruses appear less frequently (2–5 years for A/H3N2 viruses compared with 3–8 years for A/H1N1 and influenza B viruses)12,23–25. Given that seasonal influenza viruses cause epidemics worldwide, infecting hundreds of millions of people each year1, and that each human is likely to be infected multiple times over their lifetime26,27, it is surprising that new antigenic variants appear so infrequently.’

‘Leaky’ Marek’s disease – Wikipedia (MDV) Vaccine May Increase Scope for More Virulent Virus

Efforts to control MDV in commercial chickens represent the other side of the coin: the artificial selection pressure imposed by a ‘leaky vaccine’ often given to the billions upon billions of livestock that many humans now consider it their birthright to consume without restraint, whenever and however much they want, a 20th-century innovation enabled by the industrialization of agriculture and refrigeration.

What does it take to stock supermarket after supermarket across the length and breadth of a vast country like the US with an overabundance of neatly packaged, pristine-looking meat, not to mention supply the prodigious meat consumption at tens of thousands of fast food joints? Overabundance not because it is essential for human health, far from it, and not even solely because it’s convenient. Rather, because it’s something that 20th-century technology made doable. The rest of the modern meat-consuming ecosystem and the cultural practices that followed flow from what is true of so much of modern life: technology makes possible the previously unthinkable.

Economic rather than scientific or humane considerations underlie CAFOs (Concentrated animal feeding operation – Wikipedia), where livestock are densely packed in substandard, largely unhygienic conditions through their increasingly truncated, miserable lives, coincidentally the same conditions likely to engender the emergence of new, more deadly viruses.

Being more prone to infections under such living conditions in turn necessitates more invasive measures, such as prophylactic vaccines, to control them: a human-made selection pressure, on an unprecedented global scale, on viruses harbored by livestock. But that is what it takes to have an overabundance of relatively cheap meat be just a quick drive to the local supermarket in an increasing number of countries. Any surprise then that such conditions could encourage the development and spread of more virulent viruses, some of which might even be harmful to humans (3)?

While most longstanding human vaccines appear to be sterilizing, the same doesn’t appear to be the case with veterinary vaccines (4).

MDV causes a frequent viral disease in chickens, producing tumors and eventually death. Infected birds spread the virus in the dander they shed from their feather follicles. After this disease began to impact commercial chicken operations in the 1960s (5, 6), the USDA began a vaccination program with a vaccine that later studies showed was ‘leaky’, i.e., protecting the vaccinated chickens from the disease but unable to prevent virus transmission. As the years passed, vaccinated chickens were found to shed increasingly more virulent virus.

A controversial idea first mooted by a mathematical model in 2001 (7) suggested that imperfect vaccines that do not prevent infection but keep hosts alive might help more virulent pathogens to circulate since normally, such virulent organisms would take themselves out of circulation by killing their hosts. In contrast to a sterilizing vaccine, a ‘leaky vaccine’ is one capable of protecting the host while still allowing the transmission of the disease-causing organism. A ‘leaky vaccine’ is thus an imperfect vaccine.
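
The intuition behind that idea can be sketched with a toy calculation. To be clear, this is not the 2001 model itself; the functional forms and every parameter value below are illustrative assumptions: shedding rises with virulence, the infectious period of an unprotected host collapses as virulence rises (the host dies quickly), and a leaky vaccine keeps the host alive for a long, fixed infectious period without stopping shedding.

```python
# Toy sketch of the imperfect-vaccine idea, NOT the actual 2001 model.
# All functional forms and parameter values are illustrative assumptions.

def r0(virulence, vaccinated):
    """Expected onward transmissions from one infected host."""
    shedding_rate = 0.3 * virulence              # more virulent -> sheds more
    if vaccinated:
        infectious_period = 30.0                 # vaccine delays death, not shedding
    else:
        infectious_period = 10.0 / (1.0 + virulence ** 2)  # virulence kills fast
    return shedding_rate * infectious_period

for label, v in [("mild strain", 0.5), ("hot strain", 9.0)]:
    for vax in (False, True):
        host = "vaccinated" if vax else "unvaccinated"
        fate = "persists" if r0(v, vax) >= 1.0 else "dies out"
        print(f"{label} in {host} hosts: R0 = {r0(v, vax):.2f} -> {fate}")
```

In this toy, the hot strain has R0 below 1 among unvaccinated hosts (it kills them too fast to spread) but R0 well above 1 among vaccinated ones, mirroring the later experimental conclusion that vaccination can maintain strains too lethal to persist in unvaccinated populations.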

Hypothesizing that the ‘leaky’ vaccine might be implicated in increasing virulence, a 2015 study (4) first infected both vaccinated and unvaccinated chickens with MDV and then mixed these infected chickens with unvaccinated, uninfected chickens.

  • Unvaccinated, uninfected chickens exposed to unvaccinated infected chickens did not become sick.
  • Unvaccinated, uninfected chickens exposed to vaccinated infected chickens died.

The authors concluded the ‘leaky’ vaccine was somehow implicated in the vaccinated infected chickens transmitting a more virulent MDV to the unvaccinated, uninfected chickens.

Note that the authors could not and did not conclude that vaccination was responsible for the increase in virulence. Rather, they were careful to conclude only what their data could support: that vaccination was sufficient to maintain more pathogenic strains in the chickens they tested (see below from 4),

‘MDV became increasingly virulent over the second half of the 20th century [19,21–24]. Until the 1950s, strains of MDV circulating on poultry farms caused a mildly paralytic disease, with lesions largely restricted to peripheral nervous tissue. Death was relatively rare. Today, hyperpathogenic strains are present worldwide. These strains induce lymphomas in a wide range of organs and mortality rates of up to 100% in unvaccinated birds. So far as we are aware, no one has been able to isolate non-lethal MDV strains from today’s commercial (vaccinated) poultry operations [19,23]. Quite what promoted this viral evolution is unclear. The observation that successively more efficacious vaccines have been overcome by successively more virulent viral strains has prompted many MDV specialists to suggest that vaccination might be a key driver [19–24,34–37], though identifying the evolutionary pressures involved has proved challenging. There is no evidence in Marek’s disease that vaccine breakthrough by more virulent strains has anything to do with overcoming strain-specific immunity (e.g., epitope evolution); genetic and immunological comparisons of strains varying in virulence suggest that candidate virulence determinants are associated with host–cell interactions and viral replication, not antigens [19]. The imperfect-vaccine hypothesis was suggested as an evolutionary mechanism by which immunization might drive MDV virulence evolution [2], but there has been no experimental confirmation. Our data provide that: by enhancing host survival but not preventing viral shedding, MDV vaccination of hens or offspring greatly prolongs the infectious periods of hyperpathogenic strains, and hence the amount of virus they shed into the environment.

Our data do not demonstrate that vaccination was responsible for the evolution of hyperpathogenic strains of MDV, and we may never know for sure why they evolved in the first place. Clearly, many potentially relevant ecological pressures on virulence have changed with the intensification of the poultry industry. For instance, as the industry has expanded, broilers have become a much larger part of the industry, and broiler lifespans have halved with advances in animal genetics and husbandry; all else being equal, this would favour more virulent strains [28], so too might greater genetic homogeneity in flocks [38] or high-density rearing conditions [13], or indeed increased frequencies of maternally derived antibody if natural MDV infections became more common as the industry intensified in the pre-vaccine era (Fig 3) [39]. But whatever was responsible for the evolution of more virulent strains in the first place (and there may be many causes), our data show that vaccination is sufficient to maintain hyperpathogenic strains in poultry flocks today. By keeping infected birds alive, vaccination substantially enhances the transmission success and hence spread of virus strains too lethal to persist in unvaccinated populations, which would therefore have been removed by natural selection in the pre-vaccine era.’


MDV being a bird-specific virus, humans need not fear getting sick from it. However, the same is not true for other viruses that livestock such as chickens and pigs harbor. Some bird and swine flu strains can and indeed do infect humans. Flu strains circulating between birds, swine and humans could mix (reassort) in new ways through a process called Antigenic shift – Wikipedia to create entirely new strains against which human immune responses might provide little or no protection, creating the specter of frightening new global pandemics. The 2009 flu pandemic – Wikipedia is a recent example of such a phenomenon, where bird, swine and human flu viruses appear to have reassorted and then combined with a Eurasian swine flu virus.
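
The scale reassortment can reach is easy to quantify with a standard back-of-envelope calculation (not taken from the cited sources): influenza A carries its genome on 8 separate RNA segments, so a cell co-infected by two strains can, in principle, package each segment from either parent.

```python
from itertools import product

# Influenza A genome: 8 RNA segments (PB2, PB1, PA, HA, NP, NA, M, NS).
# In a cell co-infected by strains A and B, each segment of a progeny
# virion can in principle come from either parent.
SEGMENTS = ("PB2", "PB1", "PA", "HA", "NP", "NA", "M", "NS")

genotypes = list(product("AB", repeat=len(SEGMENTS)))
print(len(genotypes))        # 2**8 = 256 possible segment combinations
print(len(genotypes) - 2)    # 254 after excluding the two pure parental strains
```

A single co-infection event thus offers hundreds of candidate genotypes for selection to act on, which is why antigenic shift can produce strains far outside prior human immune experience.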

That different countries use different disease control measures in livestock adds to such selection pressure. For example, the US and Europe cull chickens that get bird flu, which stops further evolution of the virus in its tracks, while southeast Asian countries use ‘leaky’ vaccines, which do not (see below from 8).

“The most-virulent strain of avian influenza now decimating poultry flocks worldwide can kill unvaccinated birds in just under three days,” Read said. The vaccine against avian influenza is a leaky vaccine, according to Read. “In the United States and Europe, the birds that get avian influenza are culled, so no further evolution of the virus is possible,” Read said. “But instead of controlling the disease by culling infected birds, farmers in Southeast Asia use vaccines that leak — so evolution of the avian influenza virus toward greater virulence could happen.”

Given that bird flu can and indeed has jumped to humans, such measures could thus end up adversely impacting human health as well.


1. Kennedy, David A., and Andrew F. Read. “Why does drug resistance readily evolve but vaccine resistance does not?.” Proc. R. Soc. B. Vol. 284. No. 1851. The Royal Society, 2017. http://rspb.royalsocietypublishi…

2. Petrova, Velislava N., and Colin A. Russell. “The evolution of seasonal influenza viruses.” Nature Reviews Microbiology 16.1 (2018): 47.

3. Leibler, Jessica H., et al. “Industrial food animal production and global health risks: exploring the ecosystems and economics of avian influenza.” Ecohealth 6.1 (2009): 58-70.…

4. Read, Andrew F., et al. “Imperfect vaccination can enhance the transmission of highly virulent pathogens.” PLoS Biology 13.7 (2015): e1002198.…

5. Purchase, H. Graham, and E. Fred Schultz. “The economics of marek’s disease control in the United States.” World’s poultry science journal 34.4 (1978): 199-204.…

6. Leaky vaccines could lead to more virulent pathogens | PLOS ECR Community

7. Gandon, Sylvain, et al. “Imperfect vaccines and the evolution of pathogen virulence.” Nature 414.6865 (2001): 751.…

8. Some vaccines support evolution of more-virulent viruses