A Tale of Two Cholines
In which a distasteful subject provides
useful lessons about health and disease
By Will Block

I was sitting one day and thinking about
cannibalism, because that’s what guys like me do,
and I thought, suppose a guy was washed up on a
rocky island—how much of himself could he eat?

— Stephen King


Salvador Dali’s conception of cannibalism.
In the catalog of human taboos, cannibalism ranks number one. The thought of it makes our flesh crawl—the very flesh that some other human being might actually … eat. The thought is so repugnant, the taboo so strong, that about a decade ago, a debate raged among academic anthropologists over whether systematic cannibalism (isolated instances aside) had ever actually occurred. One faction argued that virtually all reports of it throughout history were suspect, for a variety of reasons, and they suggested that the whole idea was just a pervasive myth from the darkest recesses of our collective mind. Their argument did not hold up.

Human cannibalism is unquestionably real—it may even be occurring somewhere in the world as you read this. (On a more benign note, when we’re engaged in repair work, it’s common for us to “cannibalize” an unused car or computer, e.g., for needed parts.) It’s also sobering to remember that cannibalism is not uncommon in many other species in the natural world. It is thus, by definition, natural, at least in those species.

Is Something Eating You?

But there is yet another kind of cannibalism of which we humans are indisputably “guilty.” It’s called autocannibalism and has nothing to do with cars. It’s the process by which a substance that’s minding its own business somewhere in the body is attacked and “eaten” away from its existing site so that it can be transported elsewhere and used in another manner for which the need is more urgent. “Elsewhere” could be far removed from the source, or it could be close by—within the very same cell, for example.

In this article we’ll look at a particular kind of brain cell in which autocannibalism plays an important and, ultimately, destructive role, and we’ll see what we might do to help counteract that. (Hint: it involves a nutritional supplement.) To begin, we need to know what cell membranes are made of.

Choline Has a Dual Use in Brain Neurons …

Cell membranes—of brain cells, liver cells, skin cells, etc.—consist of a lipid bilayer, i.e., a two-molecule-thick sheet of lipid (fatty) molecules, which surrounds and contains the cell’s aqueous contents. (The bilayer is studded with countless protein molecules that act as receptors of various kinds, but that’s not important here.) Despite their extreme thinness, lipid bilayers are amazingly tough and resilient. Their two major constituents are cholesterol and a class of compounds called phosphoglycerides. Among the latter, one of the most prevalent is phosphatidylcholine.

Note the choline part of that name—it’s what we’re interested in here, in terms of both autocannibalism and nutritional supplementation. Choline, a nitrogenous alcohol, is an essential nutrient found (in decreasing order of concentration) in beef liver, chicken liver, eggs, wheat germ, bacon, dried soybeans, and pork.1 As a precursor to phosphatidylcholine (as well as to sphingomyelin, a related cell-membrane molecule of lesser importance), it’s a major contributor to cell-membrane synthesis throughout the body—and that’s where the great bulk of our daily choline intake goes.

But choline is also a precursor to another molecule of vital importance in the nervous system: the neurotransmitter acetylcholine, which facilitates much of our brain function as well as all of our muscle movements. Acetylcholine is the defining feature of the cholinergic system of neurons in the central and peripheral nervous systems. The cholinergic system is unique in its dual use of choline—a situation that is, in a sense, its undoing.

. . . Which Is a Setup for Autocannibalism

Acetylcholine provides neurotransmission in certain regions of the brain that govern cognitive functions, such as memory, learning, and attention. These are the brain regions—and hence the brain functions—that are damaged and, ultimately, destroyed by Alzheimer’s disease. Two of the principal hallmarks of the disease, in fact, are drastically reduced acetylcholine activity and neurodegeneration, i.e., deterioration of brain cells, leading eventually to their death.

Do you see where this is going? There’s a common denominator in these two processes: it’s the choline molecule, needed for: (1) acetylcholine, which enables nerve impulses to be conveyed from one neuron to the next, and (2) phosphatidylcholine, which enables the neurons to exist in the first place by maintaining the structural integrity of the cell membranes. This dual role of choline is the setup, as it were, for cholinergic autocannibalism, a lose-lose proposition if there ever was one.

Robbing Peter to Pay Paul

When acetylcholine levels are being depleted in Alzheimer’s disease—or during the years before the disease becomes evident—the cholinergic neurons’ natural response is to seek precursor molecules from which to synthesize more acetylcholine. This is an example of homeostasis, the tendency of all cells and organisms to try to maintain balance in the levels and activities of their internal systems. In this case, the neurons use up whatever free choline may be floating around and then look to the only other major source: the phosphatidylcholine molecules of their own cell membranes.* Peter is about to get robbed to pay Paul!


*There is evidence that the membranes of the neurons’ mitochondria—tiny organelles that act as the “powerhouses” for cellular energy metabolism—are even more vulnerable to this process than the cell membranes.2 Considering the vital role that mitochondria play in sustaining life, this too is bad news for the neurons.


Needless to say, “eating” the choline out of the phosphatidylcholine molecules in the cell membranes in order to convert it to acetylcholine doesn’t do the membranes any good—quite the contrary. Neurotransmission may benefit for a while, but at the gradual expense of the neurons’ structural integrity. If the process continues long enough, the neurons will eventually disintegrate and die.

Why Choline Supplementation Is Important

Thus it may be that the dual use of choline by cholinergic neurons—and the autocannibalism that it engenders—is what makes these neurons selectively vulnerable in Alzheimer’s disease, and perhaps in other cholinergic disorders as well.2 (It’s important to note, however, that cholinergic deficits occur to some extent with normal aging. Significant cognitive impairment and, especially, Alzheimer’s disease are signs that these deficits have become much worse.)

This suggests the possibility that supplementation with choline might improve neuronal choline metabolism by reducing, at least, the need for autocannibalism. The trick is to get enough choline into the brain, and that depends on two things: (1) the concentration of choline in the blood, and (2) the proportion of choline that crosses the blood-brain barrier.

With Choline, Age Matters

The Institute of Medicine (IOM) of the National Academy of Sciences has concluded that an adequate daily intake of choline is 550 mg for men aged 19 or older, and 425 mg for women aged 19 or older.3 The normal diet of adult Americans is estimated to contain more than 550 mg/day.1 Thus there would seem to be no need for supplementation, especially as choline’s bioavailability does not decrease with age, according to the results of one study that examined this question in two groups of subjects (average ages 32 and 73).4

But there is a catch: although the bioavailability of choline to the blood remained unchanged with age, the bioavailability to the brain decreased drastically—by 73% in that study—apparently owing to an age-related impairment of choline’s ability to cross the blood-brain barrier. Amazingly, the IOM did not take this factor into account.

Age matters! In order to boost the amount of choline that enters the aging brain, it’s necessary to boost the amount of choline in the aging blood, which requires a much larger daily intake—about 2 to 3 g. This can easily be achieved through supplementation, and there is no known toxicity with these amounts. (Insufficient choline intake, on the other hand, is known to cause fatty deposits in the liver, because one of choline’s roles in the human body is to facilitate the transport of fats from the liver into the bloodstream.)
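The arithmetic behind that 2–3 g figure can be sketched with a back-of-envelope calculation. The sketch below is purely illustrative: it assumes brain delivery scales roughly linearly with intake, which is a simplifying assumption of ours, not something the study established.

```python
# Back-of-envelope sketch, not clinical guidance. Assumption: brain choline
# delivery scales roughly linearly with daily intake (our simplification).
ADEQUATE_INTAKE_MG = 550      # IOM adequate intake for men aged 19+
BRAIN_UPTAKE_DECLINE = 0.73   # age-related drop in blood-to-brain delivery
                              # reported in the study cited above

def intake_to_match_young_brain(baseline_mg: float, decline: float) -> float:
    """Intake (mg/day) needed so that the remaining (1 - decline) fraction
    delivers as much choline to the brain as baseline_mg did before the
    age-related decline."""
    return baseline_mg / (1.0 - decline)

needed = intake_to_match_young_brain(ADEQUATE_INTAKE_MG, BRAIN_UPTAKE_DECLINE)
print(f"{needed:.0f} mg/day")  # about 2 g/day, in line with the 2-3 g above
```

Under this linear assumption, 550 mg ÷ 0.27 comes out to roughly 2,000 mg per day, which is consistent with the 2–3 g supplementation range mentioned above.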

From Stone Age Cannibals to Nobel Prizes

In 1957, Vincent Zigas, an Australian physician, discovered something odd in the remote highlands of Papua New Guinea. The Fore, a tribe of former cannibals whom he was studying there, were succumbing in appalling numbers to a neurological disease they called kuru, which was their word for trembling or fear. It was a wasting disease that induced trembling and a progressive loss of coordination, as well as occasional outbursts of wild laughter for no apparent reason. And it often led to dementia.

Kuru, also called “the laughing death,” was invariably fatal within about a year, and the overwhelming majority of its victims were women and children. Nothing in medical knowledge could account for the bizarre symptoms and patterns of this malady.

Zigas was baffled and called upon D. Carleton Gajdusek, a pediatrician and virologist with the U.S. National Institutes of Health, for help. A great medical detective story began, engaging the efforts of scientists in many disciplines for more than a decade before the mystery was finally solved.1 Many hypotheses were proposed—genetic abnormalities, nutritional deficiencies, toxic substances, psychosomatic disorders, etc.—but none held up.

Oddly, even as the investigations were proceeding, the incidence of kuru was falling (despite the lack of any viable treatment), after having reached its devastating peak in the 1950s. Then cultural anthropologists learned from tribal elders that both cannibalism and kuru had been relatively new to the Fore—they had begun the former around 1910, and the latter began appearing some years later. They had abandoned cannibalism in the 1950s (at the urging of Australian government officials, missionaries, and scientists). These were important clues.

The Fore had practiced ritual cannibalism of their own dead, with elaborate rules regarding who could eat whom, and what parts of the body. It turned out that the grisly repasts had been enjoyed largely by the women and children. The men had generally avoided cannibalism because they believed it would make them vulnerable to their enemies’ arrows, and they refused to eat women in any case, for fear of being polluted (the Fore women had not yet discovered feminism).

Meanwhile, Gajdusek was making progress in the laboratory. His experiments with chimpanzees showed that kuru could be transmitted (with incubation periods averaging 20 months) via extracts from the brains of women who had died from the disease. Kuru, it turned out, was transmitted from human to human when the Fore women and children ate the brains of their dead. When this practice ceased, the disease itself began to die out—but it took about three decades because of the extraordinarily long incubation period (up to 20 years) in humans.

That puzzling aspect of the disease led to a new concept in medicine: extremely slow-acting viruses. It was the only plausible explanation at the time, even though the virus itself was never found. That’s because it wasn’t a virus at all, but a prion, a bizarre kind of protein whose discovery in 1982 by the American neurologist Stanley Prusiner astounded the scientific world. Prions are infectious agents (incredibly tough ones) even though, unlike microbes—bacteria, viruses, fungi, and protozoa—they contain no DNA!

We now know that prions are responsible not only for kuru but also for a variety of other fatal diseases that destroy brain tissue, notably scrapie (in sheep), bovine spongiform encephalopathy (mad cow disease), and Creutzfeldt-Jakob disease (the human form of mad cow disease, and a close relative of kuru).

The knowledge gained from studying one exotic disease in a faraway place has expanded our understanding of other diseases much closer to home, including Alzheimer’s disease and multiple sclerosis. For his work in elucidating the mechanism of kuru, Gajdusek was awarded the Nobel Prize in Physiology or Medicine in 1976. And for his discovery of prions and the way in which they function, Prusiner won the Nobel Prize in 1997.

  1. McElroy A, Townsend PK. Medical Anthropology in Ecological Perspective, 2nd ed. Westview Press, Boulder, CO, 1989.

The Benefits Are Real

The potential benefits of supplemental choline are not theoretical, but real. In a pair of related studies in 1978, e.g., researchers found that a single 10-g dose of choline improved certain aspects of memory function in healthy young volunteers.5,6 This strongly suggests that their brain acetylcholine levels were temporarily increased, resulting in improved cholinergic neurotransmission.* For those of us who are no longer youngsters, maintaining healthy neurotransmission becomes ever more important as the years go by—and it would be reassuring to know that our brains have no need for (yuck!) autocannibalism.


*For more on the benefits of choline—and of its derivative CDP-choline, which is also an important brain nutrient that has proved to be helpful in elderly adults with cognitive impairment or Alzheimer’s disease—see “What’s Old with Acetylcholine Is New to Us” (May 2001), “Alzheimer’s Patients Benefit from CDP-Choline” (December 2004), “CDP-Choline Helps with Memory” (March 2005), and “Choline Battles Homocysteine” (April 2005).


References

  1. Zeisel SH. Nutritional importance of choline for brain development. J Am Coll Nutr 2004;23(6):621S-626S.
  2. Wurtman RJ. Choline metabolism as a basis for the selective vulnerability of cholinergic neurons. Trends Neurosci 1992;15(4):117-22.
  3. Hendler SS, Rorvik D, eds. PDR for Nutritional Supplements. Medical Economics Co., Montvale, NJ, 2001.
  4. Cohen BM, Renshaw PF, Stoll AL, Wurtman RJ, Yurgelun-Todd D, Babb SM. Decreased brain choline uptake in older adults. JAMA 1995;274(11):902-7.
  5. Sitaram N, Weingartner H, Caine ED, Gillin JC. Choline: selective enhancement of serial learning and encoding of low-imagery words in man. Life Sci 1978;22:1555-60.
  6. Sitaram N, Weingartner H, Gillin JC. Human serial learning: enhancement with arecholine and choline and impairment with scopolamine. Science 1978;201:274-6.


Will Block is the publisher and editorial director of Life Enhancement magazine.
