I’ve been reading a rather dense, philosophical book called “Freedom Evolves” by Daniel Dennett. I’m not sure how much I’m getting out of it, but it does have one interesting nugget worth reporting on.
To understand this nugget, I first have to describe our general view of reason and emotion. This view is that reason is sort of the antidote to emotion. Men’s emotions run wild, and a stern application of reason is necessary to “talk them off the ledge.” (You see this point brought up often during this contentious election cycle.)
The view intimated in the Dennett book is that, in fact, emotion is a cure for reason. Evolution “created” emotion to ward off the potential dangers of reason.
What do I mean? Well, let’s say you were captured by some fiend and he forced you to make a choice. He was going to kill one person, and that person could be either your son or some guy in China whom you’d never met. A person operating on pure reason would have trouble with this decision. He or she might factor in the relative ages of the two people, deciding who had more life left to live. He or she might try to guess at what productive things each potential target might do with their lives in order to ascertain who was the more valuable person.
An emotional person (i.e. the rest of us) would say “kill the Chinese guy.” We might be torn about it, but I think that’s the decision most of us would ultimately make, because we would have a strong emotional connection to our child and very little emotional connection to a stranger. (This brings to mind Peter Singer’s “Drowning Child” thought experiment.)
This idea, that emotion helps us make decisions, ties in with the work of Antonio Damasio. In his book “Descartes’ Error,” he described people who, due to some pathology, had lost their ability to really feel emotions. As a result, their decision-making abilities went in the toilet. I think Damasio described a fellow who was fired from his job because he couldn’t prioritize tasks. The boss would tell the guy to finish a report, and he would miss the deadline because he spent eight hours arranging staplers. He could not prioritize because every option had equal emotional “weight” (which is to say, none).
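Just to make that “equal weight” idea concrete, here’s a tiny toy sketch of my own (nothing like this appears in Damasio’s book): a task-picker that ranks options only by how much the agent cares about them. Strip the caring out and every option ties, so there’s nothing to act on.

```python
tasks = ["finish the report", "arrange the staplers", "answer email"]

def rank_tasks(tasks, emotional_weight):
    # Score each task by how much the agent "cares" about it. In this
    # caricature, a purely rational agent gives every task the same weight.
    return sorted(((emotional_weight.get(t, 0), t) for t in tasks), reverse=True)

# Pure reason: everything ties at zero, so nothing stands out to act on.
print(rank_tasks(tasks, {}))

# With some emotional weighting (fear of the boss, the looming deadline),
# one task clearly dominates and prioritizing is easy.
print(rank_tasks(tasks, {"finish the report": 10, "arrange the staplers": 1}))
```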
This also ties in with some fears I’ve seen expressed about artificial intelligence. The concern is that an A.I. might be programmed to do some task, like constructing a new kind of material, and then decide that human bones are the best source for that material, at which point it would instigate massive genocide to farm for human bones. It would do this because it would be operating on pure logic with no emotional weighting. (I’m using a vastly simplified example of this fear, but you get the drift.)
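Here’s another toy sketch of my own in the same spirit (again, a vast simplification, not anyone’s actual A.I.): an optimizer that ranks material sources on a single metric. With no penalty standing in for “emotional weighting,” the monstrous option can come out on top; add a huge weight against harming people and it drops to the bottom.

```python
# Hypothetical candidate sources and made-up numbers, purely for illustration.
sources = {
    "quarried limestone": {"yield": 0.6, "harms_humans": False},
    "synthetic polymer":  {"yield": 0.7, "harms_humans": False},
    "human bone":         {"yield": 0.9, "harms_humans": True},
}

def best_source(harm_penalty=0.0):
    # Score each source on the one metric the A.I. was given (yield),
    # minus a penalty for harming people. Pure logic sets that penalty to zero.
    def score(props):
        return props["yield"] - (harm_penalty if props["harms_humans"] else 0.0)
    return max(sources, key=lambda name: score(sources[name]))

print(best_source())                  # no emotional weighting: picks "human bone"
print(best_source(harm_penalty=1e9))  # heavy weight on human harm: picks "synthetic polymer"
```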