Archive for the 'Health' Category

Hooray for salt?

You may have heard the recent allegation that saturated fat, long thought to be evil, is actually fine. (This NY Times op-ed has details.) Along with red wine, coffee and chocolate, saturated fat seems to be another substance that the medical and diet industries got wrong for years.

When the revised opinion of saturated fats hit the news, I passed it on to several people in conversation. They would usually say something like, “Oh, so it’s ok for me to eat pepperoni pizza?” I would have to warn them, “That food is high in salt and salt is still bad.”

Except, maybe not. Peruse this NY Times editorial.

The current average sodium consumption in the United States is about 3,400 milligrams per day. This is mostly ingested in processed foods and is equivalent to the amount of sodium in about 1 1/2 teaspoons of salt. Dietary guidelines endorsed by the federal government and leading medical groups recommend reducing the average to 2,300 milligrams for the general population and 1,500 for groups deemed at greater risk, like adults older than 50, African-Americans, people with high blood pressure and diabetics, among others.

There is considerable evidence that lowering sodium can reduce blood pressure, but there is scant evidence that reducing blood pressure from levels that are not clearly high will necessarily reduce the risk of heart attacks, strokes and death.

Previous studies have found little evidence to support those low recommended sodium targets. Now a large study by researchers at McMaster University in Ontario, Canada, which tracked more than 100,000 people from 17 countries on five continents, has found that the safest levels of sodium consumption are between 3,000 and 6,000 milligrams.
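As a sanity check on that sodium-to-salt conversion, here's my own back-of-the-envelope sketch. The assumptions are mine, not the article's: the common approximations that a teaspoon of table salt weighs about 6 grams and that sodium makes up roughly 39% of salt (NaCl) by mass.

```python
# Back-of-the-envelope check on the sodium-to-salt conversion.
# Assumptions (mine, not the article's): 1 tsp of table salt ~ 6 g,
# and sodium is ~39% of salt (NaCl) by mass (22.99 / 58.44).

GRAMS_PER_TSP_SALT = 6.0
SODIUM_FRACTION = 22.99 / (22.99 + 35.45)  # ~0.393

def sodium_mg_to_tsp_salt(sodium_mg):
    """How many teaspoons of table salt contain this much sodium?"""
    salt_grams = (sodium_mg / 1000) / SODIUM_FRACTION
    return salt_grams / GRAMS_PER_TSP_SALT

for mg in (1500, 2300, 3400, 6000):
    print(f"{mg} mg sodium ~ {sodium_mg_to_tsp_salt(mg):.2f} tsp salt")
```

Sure enough, 3,400 mg of sodium comes out to roughly a teaspoon and a half of salt, and the recommended 2,300 mg is the familiar "about one teaspoon" figure.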

My dad is on a 1,500-milligram-a-day sodium limit. Should I be worried that it’s too low? Maybe.

Other studies have found that very low levels of sodium can disrupt biochemical systems that are essential to human health or trigger hormones that raise cardiovascular risks.

To be fair, as the article states, the science is not settled here. But given that the track record of the health nannies is becoming more and more dubious, I think an extra slice of pizza is justifiable.

Happy hospitals?

Many, many times on this blog I’ve commented on my belief that pain has a significant emotional component. And, as I make my way in the world, I often see little clues supporting this thesis. For instance, today’s NY Times has an article on an effort to redesign hospital rooms to be more pleasant. One hospital first set up a test room to try out some happier designs.

After months of testing, patients in the model room rated food and nursing care higher than patients in the old rooms did, although the meals and care were the same.

But the real eye-opener was this: Patients also asked for 30 percent less pain medication.

Reduced pain has a cascade effect, hastening recovery and rehabilitation, leading to shorter stays and diminishing not just costs but also the chances for accidents and infections. When the new $523 million, 636,000-square-foot hospital, on a leafy campus, opened here in 2012, the model room became real.

So far, ratings of patient satisfaction are in the 99th percentile, up from the 61st percentile before the move. Infection rates and the number of accidents have never been lower.

This proves I am right about everything and all who oppose me should be punished.

Depression hurts (literally)

An idea I’m often talking about on this blog is the notion that emotions are felt as physical sensations. They are not merely ailments of the soul (which, of course, I don’t believe in) but are ailments of the body. This statement seems benign, but I think it’s really quite revolutionary, turning on end many of our assumptions about emotional states. For one thing, if emotions are physical feelings, perhaps negative emotions can be removed by removing their corresponding physical sensations (which is what pretty much anyone does when they calm their nerves by having a drink, or use sex to, as the rapper Peaches once advised, “fuck the pain away.”)

In a thread about depression, a reader of Andrew Sullivan’s blog connects the emotional to the physical.

On a different note, another thing people don’t understand about severe depression is that it’s a physical experience. Aside from the lack of energy, which seems to be universal, the physical aspect is different for different people. For some people I’ve known, depression physically hurts. For me, it takes the form of a hollowness in the stomach. At my worst, in the bout that eventually led to my diagnosis, I could not eat at all. The very idea of food made me sick. I ended up in the hospital with an IV, having all sorts of tests done, and losing 20% of my body weight. It was months before I could eat any but the blandest of foods.

My mention of that Peaches tune got me thinking about her and I dug up this old video for the song. Never really got into her (I never liked her beats) but she had a certain kind of genius, I suppose.

The latest head transplant news

Wrap your head around this!

Italian doc: I’ve found the key to head transplants

An Italian scientist has claimed that head transplants could be possible, after what he says is a major breakthrough in the technique. But another expert told The Local that the whole idea was potentially unethical.

Er, gee, you think?

This stuff isn’t as crazy as it sounds. A Russian scientist active in the 20th century did achieve some success with dog head transplants.

Vladimir Petrovich Demikhov was a real Soviet scientist. And he did the weirdest experiments with dog heads, keeping them alive separated from their bodies and transplanting them to other dog bodies.

He also attached heads and other parts to different dogs, resulting in weird hybrids that only survived for a few months. This research inspired the American doctor Robert White, another WW2 surgeon who followed the Soviet lead, performing the same experiments with rhesus monkeys.

Truth is, the science still seems to be in its infancy. Don’t expect people with transplanted heads walking among us anytime soon.

However, the fact that it’s even considered as feasible is pretty astounding. And it raises all sorts of potential dramas. Will dying, mega-wealthy tycoons attempt to transplant their heads onto the bodies of teenage runaways? Will people be able to extend their life a few decades by living as mechanically supported human heads in a jar?

One can only hope.

Programming hunger

Last summer I had an experience that got me thinking about how much food we need to eat. I was in Paris with my Mom, and found that even though we were walking around most of the day, we only ate a couple meals per day. We had breakfast, mostly of bread (you know the frogs and their bread), and then a regular meal in the afternoon. It was less than I would normally eat at home, yet I was never hungry.

The New Yorker blog has a post that connects to this, noting that why we get hungry is often unconnected to our need for energy.

More often than not, we eat because we want to eat—not because we need to. Recent studies show that our physical level of hunger, in fact, does not correlate strongly with how much hunger we say that we feel or how much food we go on to consume.

Even if you’ve had an unusually late or large breakfast, your body is used to its lunch slot and will begin to release certain chemicals, such as insulin in your blood and ghrelin in your stomach, in anticipation of your typical habits, whether or not you’re actually calorie-depleted.

This probably doesn’t surprise anyone; indeed, I think we all observe this. You’re not hungry at all but a plate of fried chicken passes before you and whammo—pig-out city!

The article states that we start to see parts of our environment as cues to eat. We have a great snack on our favorite couch and we become conditioned—like Pavlov’s dogs—to associate that couch with snacking. We drive to the dentist and are reminded that there’s a great donut shop nearby and we start to crave donuts. I think this is partly why I experienced so little food craving in Paris—it is a city unfamiliar to me and I had not programmed in the environmental cues that stimulate hunger.

On a side note, I recall reading about a very ineffective campaign against drug use that was set up by some city. (I read about this a while back; I can’t recall the location.) The government placed billboards in ghetto neighborhoods saying things like “Cocaine: It’s Evil” and showing a big pile of cocaine. Of course the result was that rehabbing drug users saw these signs and thought, “Oh man, I would love to snort a pile of coke like that right now!”

Breatharians

Today in my readings I came across mention of something I’d never heard of: breatharians. These are people who believe they can live without food by subsisting on air and sunlight. It sounds insane, of course, but a Google search reveals plenty of conversation about the topic. How do they do it? Well, for the most part they don’t.

In 1983, most of the leadership of the movement in California resigned when Wiley Brooks, a notable proponent of breatharianism, was caught sneaking into a hotel and ordering a chicken pie.

Mmmm... chicken pie.

Also note:

Under controlled conditions where people are actually watched to see if they can do it without sneaking some real food, they fail. The name most commonly associated with breatharianism is Australia's Jasmuheen (born Ellen Greve), who claims to live only on a cup of tea and a biscuit every few days. However, the only supervised test of this claim (by the Australian edition of 60 Minutes in 1999) left her suffering from severe dehydration and the trial was halted after four days, despite Greve’s insistence she was happy to continue. She claimed the failure was due to being near a city road, leading to her being forced to breathe “bad air”. She continued this excuse even after she was moved to the middle of nowhere.

The various forms of human insanity seem to have no limits.

The downsides of mindfulness

I’ve read a bit about the practice called mindfulness which, for lack of a better description, is a kind of focused attention on your surroundings. By paying close attention to your sensory experiences of the moment you can, the argument goes, transcend a lot of your worries and break the limiting tether to your ego or self. I’ve made passable stabs at mindfulness, often at a park or in nature, and it can be quite refreshing—a sort of mental reset button.

Part of the idea of mindfulness is that you focus on a specific thing, say your breathing. If a disruptive thought comes in, say, “I have to do my taxes” (Shit! I DO have to do my taxes!!!), you recognize it and let it dissipate, then return your focus to the now. As you train your mind in this practice, you experience fewer disruptive thoughts.

I’ve wondered if there’s a potential downside to this. Much of creative thought is of the sort that pops into your head while you are thinking about something else. Wouldn’t mindfulness, with its focused approach (albeit a rather gentle focus), eliminate these moments of inspiration? The answer, according to this NY Times article, appears to be yes.

But one of the most surprising findings of recent mindfulness studies is that it could have unwanted side effects. Raising roadblocks to the mind’s peregrinations could, after all, prevent the very sort of mental vacations that lead to epiphanies. In 2012, Jonathan Schooler, who runs a lab investigating mindfulness and creativity at the University of California, Santa Barbara, published a study titled “Inspired by Distraction: Mind Wandering Facilitates Creative Incubation.” In it, he found that having participants spend a brief period of time on an undemanding task that maximizes mind wandering improved their subsequent performance on a test of creativity. In a follow-up study, he reported that physicists and writers alike came up with their most insightful ideas while spacing out.

“A third of the creative ideas they had during a two-week period came when their minds were wandering,” Schooler said. “And those ideas were more likely to be characterized as ‘aha’ insights that overcame an impasse.”

And that’s not all…

Another potential drawback to mindfulness has been identified by researchers at Georgetown University. In a study presented at the Society for Neuroscience annual meeting in November, they found that the higher adults scored on a measurement of mindfulness, the worse they performed on tests of implicit learning — the kind that underlies all sorts of acquired skills and habits but that occurs without conscious awareness. In the study, participants were shown a long sequence of items and repeatedly challenged to guess which one would come next. Although supposedly random, it contained a hidden pattern that made some items more likely to appear than others. The more mindful participants were worse at intuiting the correct answers.

“There’s so much our brain is doing when we’re not aware of it,” said the study’s leader, Chelsea Stillman, a doctoral candidate. “We know that being mindful is really good for a lot of explicit cognitive functions. But it might not be so useful when you want to form new habits.” Learning to ride a bicycle, speak grammatically or interpret the meaning of people’s facial expressions are three examples of knowledge we acquire through implicit learning — as if by osmosis, without our being able to describe how we did it. (Few of us can recite the rules of grammar, though most of us follow them when we speak.)
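To make that experimental setup concrete, here’s a toy sketch of a hidden-pattern guessing task of the kind the quote describes. The items, probabilities, and “learner” are my own inventions for illustration, not the Georgetown team’s actual protocol: the stream looks random, but one transition is secretly more likely, and a simple tally-keeping learner ends up guessing well above chance.

```python
import random
from collections import defaultdict

random.seed(0)
ITEMS = ["A", "B", "C"]

def next_item(prev):
    # Hidden pattern: after "A", "B" follows 70% of the time;
    # otherwise the next item is drawn uniformly at random.
    if prev == "A" and random.random() < 0.7:
        return "B"
    return random.choice(ITEMS)

# Build the "supposedly random" sequence.
seq = ["A"]
for _ in range(5000):
    seq.append(next_item(seq[-1]))

# A crude stand-in for implicit learning: tally what follows what,
# and always guess the most frequently seen successor so far.
counts = defaultdict(lambda: defaultdict(int))
correct = 0
for prev, nxt in zip(seq, seq[1:]):
    guess = max(ITEMS, key=lambda item: counts[prev][item])
    if guess == nxt:
        correct += 1
    counts[prev][nxt] += 1

print(f"learner accuracy: {correct / (len(seq) - 1):.2%} (chance = 33%)")
```

Run it and the tally-keeper lands somewhere around 45% correct against a 33% chance baseline—the sort of above-chance intuiting that, per the study, the more mindful participants were worse at.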

The solution is probably moderation in all things, including mindfulness.

Definitely don’t count your chickens

I’m a big fan of highlighting sentences that are so waffly and filled with caveats that they become meaningless. This one, from a recent Scientific American article on autism, is a current favorite.

If such futuristic scenarios ever materialize, we may one day be able to say that we are nearing a cure for children such as Adrianna and Jermaine’s young Jayden.

So IF these “futuristic” scenarios materialize, can we say we’ve defeated autism? Well, no, but at that point we MAY be able to say we’re NEARING a cure.

It’s practically a done deal.

How to innovate! (Don’t be too innovative.)

As a society, or species (or whatever we are), we tend to laud forward-thinking creative geniuses. When we find one, we hoist them onto a pedestal and treat them as (to quote this article) an “Übermensch [who] stands apart from the masses, pulling them forcibly into a future that they can neither understand nor appreciate.” This is true across all disciplines. Think of Einstein, Beethoven, Picasso, on and on.

So how does one become a genius? Clearly you have to innovate, to do something no one else has done. But there’s a catch here. You can’t be too innovative. You can’t be so ahead of the curve that nobody can really grasp what you’re saying or doing.

Let me propose a thought experiment. Jimi Hendrix travels to Vienna around 1750 and plays his music. Would he be lauded as a genius? Would his guitar playing be heard as the obvious evolution of current trends in music? No, he’d probably be regarded as an idiot making hideous noise and he might be burned at the stake.

But, let music evolve for around 220 years and yes, Jimi is rightfully regarded as a genius. His sound was made palatable by those who came before him, mainly the electric blues guitar players of the 50s and 60s. (Obviously there are a lot of other factors (like race and class and sex) relevant to who gets crowned a genius, but I’m painting in broad strokes here.)

So the trick to being a genius is to be ahead of your time but not too ahead. The world of science and medicine is filled with examples. Gregor Mendel famously discovered that physical traits could be passed from one generation of life to another. In what was a major breakthrough in our understanding of biology, he theorized what we came to call genes. He published his results and was met with pretty much total indifference. It wasn’t until his work was rediscovered decades later that his ideas were applied. Mendel was too ahead of his time.

The book “The Mind’s I” notes the mathematician Giovanni Girolamo Saccheri, who contributed to the discovery of non-Euclidean geometry. His ideas were so controversial that even Saccheri himself rejected them! (At least he did according to the book; there seems to be some debate on this. See the last paragraph on the Saccheri wiki page.) Talk about being too ahead of your time.

But perhaps the best example of this sort of thing is Ignaz Semmelweis. The Hungarian physician…

…discovered that the incidence of puerperal fever could be drastically cut by the use of hand disinfection in obstetrical clinics.

That’s right, he basically came up with the crazy idea that doctors should wash their hands after touching sick people. Unfortunately…

Despite various publications of results where hand-washing reduced mortality to below 1%, Semmelweis’s observations conflicted with the established scientific and medical opinions of the time and his ideas were rejected by the medical community. Some doctors were offended at the suggestion that they should wash their hands and Semmelweis could offer no acceptable scientific explanation for his findings. Semmelweis’s practice earned widespread acceptance only years after his death, when Louis Pasteur confirmed the germ theory and Joseph Lister, acting on the French microbiologist’s research, practiced and operated, using hygienic methods, with great success.

Oh well. Semmelweis probably still had a great career and life, right?

Umm, no.

In 1865, Semmelweis was committed to an asylum, where he died at age 47 after being beaten by the guards, only 14 days after he was committed.

Don’t be too ahead of the curve, folks.

Cortisol: friend or enemy?!!

A while back I mentioned an event that occurred when I was trying to go to sleep one night. Why don’t I let my past self describe it?

…a couple nights ago I was lying half asleep and heard a clicking sound of the sort that houses often make — the wood of the frame settling a bit or something. My mind said, “Oh, an annoying sound… whatever,” but my body had a noticeable reaction. I felt that feeling of “tinglies” traveling from my neck down to my body. I presume these “tinglies” to be adrenaline released to the body from the brain and I presume the individual “pricks of tingliness” to be this adrenaline stimulating the synapses of various muscle nerves.

I doubt this is uncommon and I suspect most of us experience it. I, personally, experience it more in the morning. I’ll be lying there, pleasantly adrift, and a thought or sound will occur that jostles me from sleep. As much as I would like to go back to my reverie, I can feel my body waking up, against my wishes.

I’ve long suspected that the reason for this must be something to do with hormone production. Basically, we go to sleep and our body spends the night replenishing spent chemicals (like adrenaline and glucocorticoids, which, while I don’t fully understand them, are similar to adrenaline in function). In the morning we’ve got a vast reserve of these chemicals and it doesn’t take much to get them fired off.

Recently I was reading an article on coffee and it hinted at the same idea. We shouldn’t drink coffee in the morning, it said, because our body is already rich with excitatory cortisol. (Cortisol is a glucocorticoid, which is a type of hormone.)

…if we are drinking caffeine at a time when your cortisol concentration in the blood is at its peak, you probably should not be drinking it. This is because cortisol production is strongly related to your level of alertness and it just so happens that cortisol peaks for your 24-hour rhythm between 8 a.m. and 9 a.m. on average (Debono et al., 2009). Therefore, you are drinking caffeine at a time when you are already approaching your maximal level of alertness naturally.

This jibes with my sense that some excitatory hormone is at its peak during the morning. So I was pretty much right about everything, as usual.

I’m not going to stop drinking coffee in the morning though.