Moral dilemmas and neural dissent
Humans are crap at moral dilemmas, but don't worry: it's your brain's fault.
The crying baby dilemma and mental load do not play well together. The switch and footbridge dilemmas are variations that can form the basis of a dark discussion over dinner. Essentially, people fall into two camps in these dilemmas: act, or do not act. "Not acting will produce a worse result, so I must act" is the consequentialist standpoint. "I can't perform the action, because the action is immoral regardless of the result" is the deontological school of thought. People are not consistent between the two.
The crying baby dilemma is a more brutal extension of the switch/footbridge dilemmas. All of them involve consequences whether you act or not, but an objectively worse outcome when inaction is chosen. Vsauce did a good deep dive and physical experiment showing that humans, under 'real' circumstances, are inconsistent in their responses.
The author here links this inconsistency to the brain having two competing neural systems vying for dominance. His research provides insight.
By scanning (fMRI) participants as they considered these dilemmas, he found that when impersonal dilemmas (like the switch dilemma) were posed, the logic engines of the brain dominated activity. When personal dilemmas (like the footbridge, as they involve emotion and direct action) were discussed, the emotional engines of the brain dominated. When the crying baby dilemma is discussed, because it is so abhorrent (suffocate the baby, or your family, you and the baby all die), it causes huge stress on the participants, illustrated by both the logic and emotional engines struggling to resolve it.
So which parts of the brain do what?
Here, the logic engine seems to be the dorsolateral prefrontal cortex. Dorso means towards the back and top; lateral means side; pre means before; frontal means front; cortex means the brain's outer layer. I hope that is clear. Essentially, go two inches above your eye and you aren't too far away. This region helps with numbers, goals, intention and attention.
The mediator here, the one screaming out about the conflict between the two sides, is the anterior cingulate cortex: anterior also means front, and the cingulate is the name for the long strip in the middle of the brain.
Why does this matter? Well, it provides insight into why brain conditions such as frontotemporal dementia can cause changes that affect people's emotional abilities, for example. It can also give insight into how to dupe our own brains.
The author devised a challenge: clog the brain up with mundane logic tasks while giving participants moral dilemmas at the same time, to see if there is an effect on their decisions. (The experiment made participants follow a stream of numbers and press a button when they saw a trigger value.) Participants showed more emotional bias in their moral judgements when under this 'logic load', demonstrating that circumstance can change human behaviour.
Another interesting point
Our brains have evolved to recognise personal, direct force and physical involvement as risk, hence we have a predisposition not to engage in it. In a mechanical world we do not have the same connection, because our brains have not evolved to recognise it. This is apparent in the difference between flipping a switch to kill one person and pushing someone to their death to save others, even when the result is the same.
This then extends to the tools that may be used to cause others harm, be it directly with weapons (think guns or nuclear weapons) or with constructs such as poverty. Suffering inflicted this way is more abstract to our brains: think stabbing versus shooting, and which feels more abhorrent despite the result being the same. "What is the difference between refusing to save a child drowning who's right in front of you and refusing to save a child who's drowning in poverty on the other side of the world? Your rationalising mind…"
We are not built to be compassionate or violent from a distance, but it may be something we need “for creatures whose survival depends on cooperation”.
Inspired by Joshua D. Greene’s “Fruit Flies of the Moral Mind” from “What’s next? Dispatches on the future of science” 2009
Recently I began reading this book of science essays, which were meant to assess the future. Seeing as it has been 10 years since the book was published, I thought it might be interesting to revisit them.