Archive for the 'Health' Category

The latest head transplant news

Wrap your head around this!

Italian doc: I’ve found the key to head transplants

An Italian scientist has claimed that head transplants could be possible, after what he says is a major breakthrough in the technique. But another expert told The Local that the whole idea was potentially unethical.

Er, gee, you think?

This stuff isn’t as crazy as it sounds. A Russian scientist active in the 20th century did achieve some success with dog head transplants.

Vladimir Petrovich Demikhov was a real Soviet scientist. And he did the weirdest experiments with dog heads, keeping them alive separated from their bodies and transplanting them to other dog bodies.

He also attached heads and other parts to different dogs, resulting in weird hybrids that survived for only a few months. This research inspired the American doctor Robert White, who followed the Soviet lead and performed the same experiments with rhesus monkeys.

Truth is, the science still seems to be in its infancy. Don’t expect people with transplanted heads walking among us anytime soon.

However, the fact that it’s even considered feasible is pretty astounding. And it raises all sorts of potential dramas. Will dying, mega-wealthy tycoons attempt to transplant their heads onto the bodies of teenage runaways? Will people be able to extend their lives a few decades by living as mechanically supported human heads in a jar?

One can only hope.

Programming hunger

Last summer I had an experience that got me thinking about how much food we need to eat. I was in Paris with my mom, and found that even though we were walking around most of the day, we only ate a couple of meals per day. We had breakfast, mostly of bread (you know the frogs and their bread), and then a regular meal in the afternoon. It was less than I would normally eat at home, yet I was never hungry.

The New Yorker blog has a post that connects to this, noting that why we get hungry is often unconnected to our need for energy.

More often than not, we eat because we want to eat—not because we need to. Recent studies show that our physical level of hunger, in fact, does not correlate strongly with how much hunger we say that we feel or how much food we go on to consume.

Even if you’ve had an unusually late or large breakfast, your body is used to its lunch slot and will begin to release certain chemicals, such as insulin in your blood and ghrelin in your stomach, in anticipation of your typical habits, whether or not you’re actually calorie-depleted.

This probably doesn’t surprise anyone; indeed, I think we all observe it. You’re not hungry at all, but a plate of fried chicken passes before you and whammo—pig out city!

The article states that we start to see part of our environment as cues to eat. We have a great snack on our favorite couch and we become conditioned—like Pavlov’s dogs—to associate that couch with snacking. We drive to the dentist and are reminded how there’s a great donut shop nearby and we start to crave donuts. I think this is partly why I experienced so little food craving in Paris—it is a city unfamiliar to me and I had not programmed in the environmental cues to stimulate hunger.

On a side note, I recall reading about a very ineffective campaign against drug use that was set up by some city. (I read about this a while back; can’t recall the location.) The government placed billboards in ghetto neighborhoods saying things like “Cocaine: It’s Evil” and showing a big pile of cocaine. Of course the result was rehabbing drug users saw these signs and thought, “Oh man I would love to snort a pile of coke like that right now!”


Today in my readings I came across mention of something I’d never heard of: breatharians. These are people who believe they can live without food by subsisting on air and sunlight. It sounds insane, of course, but a Google search reveals plenty of conversation about the topic. How do they do it? Well, for the most part they don’t.

In 1983, most of the leadership of the movement in California resigned when Wiley Brooks, a notable proponent of breatharianism, was caught sneaking into a hotel and ordering a chicken pie.

Mmmm… chicken pie.

Also note:

Under controlled conditions where people are actually watched to see if they can do it without sneaking some real food, they fail. The name most commonly associated with breatharianism is Australia’s Jasmuheen (born Ellen Greve), who claims to live only on a cup of tea and a biscuit every few days. However, the only supervised test of this claim (by the Australian edition of 60 Minutes in 1999) left her suffering from severe dehydration[4] and the trial was halted after four days, despite Greve’s insistence she was happy to continue. She claimed the failure was due to being near a city road, leading to her being forced to breathe “bad air”. She continued this excuse even after she was moved to the middle of nowhere.

The various forms of human insanity seem to have no limits.

The downsides of mindfulness

I’ve read a bit about the practice called mindfulness which, for lack of a better description, is a kind of focused attention on your surroundings. By paying close attention to your sensory experience of the moment, you can, the argument goes, transcend a lot of your worries and break the limiting tether to your ego or self. I’ve made passable stabs at mindfulness, often at a park or in nature, and it can be quite refreshing—a sort of mental reset button.

Part of the idea of mindfulness is that you focus on a specific thing, say your breathing. If a disruptive thought comes in, say, “I have to do my taxes” (Shit! I DO have to do my taxes!!!), you recognize it and let it dissipate, then return your focus to the now. As you train your mind in this practice, you experience fewer disruptive thoughts.

I’ve wondered if there’s a potential downside to this. Much of creative thought is of the sort that pops into your head while you are thinking about something else. Wouldn’t mindfulness, with its focused approach (albeit a rather gentle focus), eliminate these moments of inspiration? The answer, according to this NY Times article, appears to be yes.

But one of the most surprising findings of recent mindfulness studies is that it could have unwanted side effects. Raising roadblocks to the mind’s peregrinations could, after all, prevent the very sort of mental vacations that lead to epiphanies. In 2012, Jonathan Schooler, who runs a lab investigating mindfulness and creativity at the University of California, Santa Barbara, published a study titled “Inspired by Distraction: Mind Wandering Facilitates Creative Incubation.” In it, he found that having participants spend a brief period of time on an undemanding task that maximizes mind wandering improved their subsequent performance on a test of creativity. In a follow-up study, he reported that physicists and writers alike came up with their most insightful ideas while spacing out.

“A third of the creative ideas they had during a two-week period came when their minds were wandering,” Schooler said. “And those ideas were more likely to be characterized as ‘aha’ insights that overcame an impasse.”

And that’s not all…

Another potential drawback to mindfulness has been identified by researchers at Georgetown University. In a study presented at the Society for Neuroscience annual meeting in November, they found that the higher adults scored on a measurement of mindfulness, the worse they performed on tests of implicit learning — the kind that underlies all sorts of acquired skills and habits but that occurs without conscious awareness. In the study, participants were shown a long sequence of items and repeatedly challenged to guess which one would come next. Although supposedly random, it contained a hidden pattern that made some items more likely to appear than others. The more mindful participants were worse at intuiting the correct answers.

“There’s so much our brain is doing when we’re not aware of it,” said the study’s leader, Chelsea Stillman, a doctoral candidate. “We know that being mindful is really good for a lot of explicit cognitive functions. But it might not be so useful when you want to form new habits.” Learning to ride a bicycle, speak grammatically or interpret the meaning of people’s facial expressions are three examples of knowledge we acquire through implicit learning — as if by osmosis, without our being able to describe how we did it. (Few of us can recite the rules of grammar, though most of us follow them when we speak.)
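The Georgetown task described above can be sketched as a toy simulation. Everything here is my own invented stand-in, not the study’s actual design: a “random-looking” stream of items hides a pattern that makes some items more likely to follow others, and a guesser that simply tracks transition counts (the kind of statistic implicit learning is thought to pick up) beats chance:

```python
import random

def make_sequence(n=1000, items="ABCD", bias=0.7, seed=0):
    """Build a sequence with a hidden pattern: after each item, one
    specific successor appears with probability `bias`; otherwise the
    next item is drawn uniformly. Pattern and bias are hypothetical."""
    rng = random.Random(seed)
    follows = {"A": "B", "B": "C", "C": "D", "D": "A"}  # hidden pattern
    seq = [rng.choice(items)]
    for _ in range(n - 1):
        if rng.random() < bias:
            seq.append(follows[seq[-1]])
        else:
            seq.append(rng.choice(items))
    return seq

def implicit_guesser(seq, items="ABCD"):
    """Guess each next item from running transition counts and return
    the fraction guessed correctly (chance level here is 0.25)."""
    counts = {a: {b: 1 for b in items} for a in items}  # smoothed counts
    correct = 0
    for prev, nxt in zip(seq, seq[1:]):
        guess = max(counts[prev], key=counts[prev].get)
        correct += (guess == nxt)
        counts[prev][nxt] += 1
    return correct / (len(seq) - 1)

accuracy = implicit_guesser(make_sequence())
print(f"accuracy: {accuracy:.2f}")  # well above the 0.25 chance level
```

The point of the sketch is just that picking up the pattern requires nothing but passive frequency tracking, which is presumably what the more mindful participants were suppressing.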

The solution is probably moderation in all things, including mindfulness.

Definitely don’t count your chickens

I’m a big fan of highlighting sentences that are so waffly and filled with caveats that they become meaningless. This one, from a recent Scientific American article on autism, is a current favorite.

If such futuristic scenarios ever materialize, we may one day be able to say that we are nearing a cure for children such as Adrianna and Jermaine’s young Jayden.

So IF these “futuristic” scenarios materialize, can we say we’ve defeated autism? Well, no, but at that point we MAY be able to say we’re NEARING a cure.

It’s practically a done deal.

How to innovate! (Don’t be too innovative.)

As a society, or species (or whatever we are), we tend to laud forward-thinking creative geniuses. When we find one, we hoist them onto a pedestal and treat them as (to quote this article) an “Übermensch [who] stands apart from the masses, pulling them forcibly into a future that they can neither understand nor appreciate.” This is true across all disciplines. Think of Einstein, Beethoven, Picasso, on and on.

So how does one become a genius? Clearly you have to innovate, to do something no one else has done. But there’s a catch here. You can’t be too innovative. You can’t be so ahead of the curve that nobody can really grasp what you’re saying or doing.

Let me propose a thought experiment. Jimi Hendrix travels to Vienna around 1750 and plays his music. Would he be lauded as a genius? Would his guitar playing be heard as the obvious evolution of current trends in music? No, he’d probably be regarded as an idiot making hideous noise and he might be burned at the stake.

But let music evolve for around 220 years and yes, Jimi is rightfully regarded as a genius. His sound was made palatable by those who came before him, mainly the electric blues guitar players of the 50s and 60s. (Obviously there are a lot of other factors (like race and class and sex) relevant to who gets crowned a genius, but I’m painting in broad strokes here.)

So the trick to being a genius is to be ahead of your time but not too ahead. The world of science and medicine is filled with examples. Gregor Mendel famously discovered that physical traits could be passed from one generation of life to another. In what was a major breakthrough in our understanding of biology, he theorized what we came to call genes. He published his results and was met with pretty much total indifference. It wasn’t until his work was rediscovered decades later that his insights were applied. Mendel was too ahead of his time.

The book “The Mind’s I” notes the mathematician Giovanni Girolamo Saccheri, who contributed to the discovery of non-Euclidean geometry. His ideas were so controversial that even Saccheri himself rejected them! (At least he did according to the book; there seems to be some debate on this. See the last paragraph of the Saccheri wiki page.) Talk about being too ahead of your time.

But perhaps the best example of this sort of thing is Ignaz Semmelweis. The Hungarian physician…

…discovered that the incidence of puerperal fever could be drastically cut by the use of hand disinfection in obstetrical clinics.

That’s right, he basically came up with the crazy idea that doctors should wash their hands after touching sick people. Unfortunately…

Despite various publications of results where hand-washing reduced mortality to below 1%, Semmelweis’s observations conflicted with the established scientific and medical opinions of the time and his ideas were rejected by the medical community. Some doctors were offended at the suggestion that they should wash their hands and Semmelweis could offer no acceptable scientific explanation for his findings. Semmelweis’s practice earned widespread acceptance only years after his death, when Louis Pasteur confirmed the germ theory and Joseph Lister, acting on the French microbiologist’s research, practiced and operated, using hygienic methods, with great success.

Oh well. Semmelweis probably still had a great career and life right?

Umm, no.

In 1865, Semmelweis was committed to an asylum, where he died at age 47 after being beaten by the guards, only 14 days after he was committed.

Don’t be too ahead of the curve folks.

Cortisol: friend or enemy?!!

A while back I mentioned an event that occurred when I was trying to go to sleep one night. Why don’t I let my past self describe it?

…a couple nights ago I was lying half asleep and heard a clicking sound of the sort that houses often make — the wood of the frame settling a bit or something. My mind said, “Oh, an annoying sound… whatever,” but my body had a noticeable reaction. I felt that feeling of “tinglies” traveling from my neck down to my body. I presume these “tinglies” to be adrenaline released to the body from the brain and I presume the individual “pricks of tingliness” to be this adrenaline stimulating the synapses of various muscle nerves.

I doubt this is uncommon and I suspect most of us experience it. I, personally, experience it more in the morning. I’ll be lying there, pleasantly adrift, and a thought or sound will occur that jostles me from sleep. As much as I would like to go back to my reverie, I can feel my body waking up, against my wishes.

I’ve long suspected that the reason for this must have something to do with hormone production. Basically, we go to sleep and our body spends the night replenishing spent chemicals (like adrenaline and glucocorticoids, which, while I don’t fully understand them, are similar to adrenaline in function). In the morning we’ve got a vast reserve of these chemicals and it doesn’t take much to get them fired off.

Recently I was reading an article on coffee and it hinted at the same idea. We shouldn’t drink coffee in the morning, it said, because our body is already rich with excitatory cortisol. (Cortisol is a glucocorticoid which is a type of hormone.)

…if we are drinking caffeine at a time when your cortisol concentration in the blood is at its peak, you probably should not be drinking it. This is because cortisol production is strongly related to your level of alertness and it just so happens that cortisol peaks for your 24-hour rhythm between 8 a.m. and 9 a.m. on average (Debono et al., 2009). Therefore, you are drinking caffeine at a time when you are already approaching your maximal level of alertness naturally.

This jibes with my sense that some excitatory hormone is at its peak during the morning. So I was pretty much right about everything, as usual.

I’m not going to stop drinking coffee in the morning though.

The girls of Le Roy

A while back, I mentioned the girls of Le Roy, New York, who, several years ago, began showing what appear to be—at least to the experts—symptoms of mass hysteria. The girls began experiencing Tourette’s-like body tics for which no environmental or biological cause could be found. This sort of thing is not that uncommon and has been reported throughout history.

The Atlantic has an interesting follow-up on the story alleging that one woman who experienced the symptoms effectively caught them off Facebook – she did not personally know any of the girls but “absorbed” their symptoms by reading about them. The article touches on two themes I often report on here: 1) The subconscious can cause bizarre, debilitating physical symptoms, and 2) We—society, humanity, etc.—are becoming overwhelmed with information. To wit…

Bartholomew said that mass hysteria spreads through sight and sound, and historically, one person would have to be in the same room as somebody exhibiting symptoms to be at risk of “catching” the illness. “Not anymore,” he says, noting that social media—“extensions of our eyes and ears”—speeds and extends the reach of mass hysteria. In a paper, he wrote, “Epidemic hysterias that in earlier periods were self-limited in geography now have free and wide access to the globe in seconds.” He says, “It’s a belief, that’s the power here, and the technology just amplifies the belief, and helps it spread more readily.”

Here’s a Dr. Drew episode featuring some of the girls.

I’ve got a gut feeling

Online mag The Verge has been doing some interesting stories lately including this one which notes that the future of psychiatry may be inside your stomach.

Her parents were running out of hope. Their teenage daughter, Mary, had been diagnosed with a severe case of obsessive–compulsive disorder (OCD), as well as ADHD. They had dragged her to clinics around the country in an effort to thwart the scary, intrusive thoughts and the repetitive behaviors that Mary felt compelled to perform. Even a litany of psychotropic medications didn’t make much difference. It seemed like nothing could stop the relentless nature of Mary’s disorder.

Their last hope for Mary was Boston-area psychiatrist James Greenblatt. Arriving at his office in Waltham, MA, her parents had only one request: help us help Mary.

Greenblatt also prescribed Mary a twice-daily dose of probiotics, the array of helpful bacteria that lives in our gut. The change in Mary was nothing short of miraculous: within six months, her symptoms had greatly diminished. One year after the probiotic prescription, there was no sign that Mary had ever been ill.

We all recognize there’s a relationship between our mood and our gut. Anxiety often causes gastrointestinal issues, and depression can result in a dull ache in the stomach. But we largely presume this relationship to be one-way: our mind affecting our stomach.

For Greenblatt, this radical treatment protocol has actually been decades in the making. Even during his psychiatric residency at George Washington University, he was perplexed by the way mental disorders were treated. It was as if, he said, the brain was totally separate from the body. More than 20 years of work treating eating disorders emphasized Greenblatt’s hunch: that the connection between body and mind was more important than conventional psychiatry assumed. “Each year, I get more and more impressed at how important the GI tract is for healthy mood and the controlling of behavior,” Greenblatt said. Among eating disorder patients, Greenblatt found that more than half of psychiatric complaints were associated with problems in the gut — and in some patients, he says he has remedied both using solely high-dose probiotics, along with normalizing eating.

I’m reminded of a segment in one of Malcolm Gladwell’s books where he described the work of some scientists studying facial expressions. These two fellows were spending all their time making faces of different emotions at each other. After two weeks of making sad or depressed faces, one of the guys reported that he was actually feeling miserable. The relationship between mood and face was a two-way street. Maybe the same is true of the stomach.

3D printing and drug hacking?

First, a little update: I’m now back in the U.S. and back to blogging.

On my brief vacay there, I continued to contemplate the advent of 3D printers and their possible effects on the future. Today I started thinking about whether 3D printers could produce drugs (via a method similar to how such printers could potentially print food). Specifically, I was wondering whether drug hackers could reverse engineer existing, often costly, prescription drugs and upload recipes for these drugs onto the web. These recipes could be downloaded, and certain printers could “print” the drugs. (People doing the printing would still need the raw materials for the drugs, of course; the printer would just assemble the materials. But these pirated drugs would clearly be much cheaper than the legal, patented versions.)

As is usually the case with these ruminations, I found that many people had already been thinking about the topic. I offer the podium to the libertarian:

Now imagine a not-too-distant future world where 3D printers of all types are becoming more prevalent. It is easy to imagine machines designed to fabricate pharmaceuticals. If a new life-saving drug hits the market and costs thousands of dollars per year (due to the combination of the patent monopoly, the FDA system, medical licensure of doctors, government regulation of prescription drugs, and other state interventions), some consumers may prefer to “make their own” generic version, using reverse-engineered “recipes” floating around the web programmed into their own, or a friend’s, 3D drug printer. Just as the hacker community quickly cracks new iOS releases on the iPhone, say, it is not hard to imagine the drug-hacker community reverse engineering the composition and manufacturing method of pharmaceuticals–especially in this near future world with increasingly sophisticated and cheap analyzing and related equipment.

Now, these home-made generic pharmaceuticals might not be as good as the official ones. They might even be more dangerous. But to save thousands of dollars a year, many people might turn to this.

From there the post examines legal and enforcement issues related to drug hacking. Interesting stuff.