Natural morality describes a form of morality based on how humans evolved, rather than a morality acquired from societal norms or religious teachings. Charles Darwin’s theory of evolution is central to the acceptance of a natural morality. Ref

We humans tend to think of emotions as dangerous forces that need to be strictly controlled by reason and logic. But that is not how the brain works. In the brain, logic and reason are never separate from emotion. Even nonsense syllables have an emotional charge, either positive or negative; nothing is neutral. The brain contains seven networks of emotion: SEEKING, RAGE, FEAR, LUST, CARE, PANIC/GRIEF, and PLAY. In his book Affective Neuroscience, Jaak Panksepp explains that there “is good biological evidence for at least seven innate emotional systems….” Panksepp explains that some of these “universally recognized emotions correspond to the ‘infantile’ feelings that young children exhibit.” These emotional systems are genetically encoded into the subcortical neurocircuitry of the mammalian brain. Stimulating different subcortical areas via electrodes produces emotional reactions in animals. In contrast, “We cannot precipitate emotional feelings by artificially activating the neocortex either electrically or neurochemically,” writes Panksepp. He points out, however, that “emotionality is modified by cortical injury.” He also emphasizes: “Emotive circuits have reciprocal interactions with the brain mechanisms that elaborate higher decision-making processes and consciousness.” Panksepp points out that each major emotional system “has intrinsic response patterning mechanisms” in place. Real-world experience can, however, affect the natural expression of primal emotional systems. For example, Panksepp writes: “Most cats that have been reared only with other cats will hunt and kill mice and rats, but those that have been reared with rats from the time of birth show no such inclination.” Ref

According to ScienceDaily, “a number of recent studies support the role of emotions in moral judgment, and in particular a dual-process model of moral judgment in which both automatic emotional processes and controlled cognitive processes drive moral judgment,” explained Young.
“For example, when people must choose whether to harm one person to save many, emotional processes typically support one type of non-utilitarian response, such as ‘don’t harm the individual,’ while controlled processes support the utilitarian response, such as ‘save the greatest number of lives.’ Our study showed that utilitarian judgment may arise not simply from enhanced cognitive control but also from diminished emotional processing and reduced empathy.” The researchers’ findings show a key relationship between moral judgment and empathic concern, in particular feelings of warmth and compassion in response to someone in distress. In a series of experiments, utilitarian moral judgment was revealed to be specifically associated with reduced empathic concern, and not with any of the demographic or cultural variables tested, nor with other aspects of empathic responding, including personal distress and perspective taking. The study of 2,748 people consisted of three experiments involving moral dilemmas. In two of the experiments, the scenario was presented to participants in both “personal” and “impersonal” versions. In the first experiment’s “personal” version, participants were told they could push a large man to his death in front of an oncoming trolley to stop the trolley from killing five others in its path. In the “impersonal” version, participants were told they could flip a switch to divert the trolley. In the second experiment’s “impersonal” scenario, participants were given the option of diverting toxic fumes from a room containing three people to a room containing only one person. In the “personal” scenario, participants were asked whether it was morally acceptable to smother a crying baby to death to save a number of civilians during wartime. The final experiment included both a moral dilemma and a measure of selfishness.
The moral dilemma asked participants if it was permissible to transplant the organs of one patient, against his will, to save the lives of five patients. In the selfishness measure, participants were asked if it was morally permissible to report personal expenses as business expenses on a tax return to save money. This experiment gave the researchers a sense of whether utilitarian responders and selfish responders are alike in having reduced empathic concern. In other words, do utilitarians endorse harming one to save many simply because they endorse harmful, selfish acts more generally? The results suggest that the answer is no; utilitarians appear to endorse harming one to save many due to their reduced empathic concern, not due to a generally deficient moral sense. In each experiment, those who reported lower levels of compassion and concern for other people — a key aspect of empathy — picked the utilitarian over the non-utilitarian response. However, other aspects of empathy, such as being able to see the perspective of others and feeling distress at seeing someone else in pain, did not appear to play a significant role in these moral decisions. Similarly, demographic and cultural differences, including age, gender, education, and religiosity, failed to predict moral judgments. “Diminished emotional responses, specifically, reduced empathic concern, appear to be critical in facilitating utilitarian responses to moral dilemmas of high emotional salience,” the researchers concluded. Ref

According to , surprisingly, our emotions do not appear to have much effect on our judgments about right and wrong in these moral dilemmas. A study of individuals with damage to an area of the brain that links decision-making and emotion found that when faced with a series of moral dilemmas, these patients generally made the same moral judgments as most people. This suggests that emotions are not necessary for such judgments.
Our emotions do, however, have a great impact on our actions. How we judge what is right or wrong may well be different from what we choose to do in a situation. For example, we may all agree that it is morally permissible to kill one person in order to save the lives of many. When it comes to actually taking someone’s life, however, most of us would balk. Another example of the role that emotions play in our actions comes from recent studies of psychopaths. Studies suggest that clinically diagnosed psychopaths do recognize right from wrong, as evidenced by their responses to moral dilemmas. What is different is their behavior. While all of us can become angry and have violent thoughts, our emotions typically restrain our violent tendencies. In contrast, psychopaths are free of such emotional restraints. They act violently even though they know it is wrong because they are without remorse, guilt, or shame. These studies suggest that nature handed us a moral grammar that fuels our intuitive judgments of right and wrong. Emotions play their strongest role in influencing our actions — reinforcing acts of virtue and punishing acts of vice. We generally do not commit wrong acts because we recognize that they are wrong and because we do not want to pay the emotional price of doing something we perceive as wrong. Ref

According to John Shook, those who claim that only religion can supply moral objectivity are either ignorant of what the term ‘objective’ means, or they are using ‘objective’ in a peculiar way to actually mean ‘absolute’. ‘Objective’ is the contrary of ‘subjective’ — where ‘subjective’ means dependence on a subject (an individual person), ‘objective’ means independence from any individual person. Science knows about objectivity. And science can study subjectivity, too. However, if morality is reduced to just what psychology can grasp, then there is a risk of justifying moral subjectivity.
In a previous post I pointed out how a naturalistic understanding of morality should support moral objectivity. And then I read Paul Bloom’s essay “How do morals change?”, which also worries that psychology can’t have the whole story about morality. Morality is a paradigm example of something that can be, and usually is, independent from any individual person. Whether a deed is moral or immoral does not depend on the judgment or feeling or whim of any single person. Unless that person is God, a religious person might say. However, the simplistic religious view that morality depends on the will of God is just subjectivism on a cosmic scale. Naturalists of course do not regard ethical ideals as absolute moral truths, either. However, people do appeal to ethical ideals when they compare, criticize, and modify the moralities of cultures. From the standpoint of naturalism, it is perfectly natural to expect people to try to change a morality using ethical thinking when they see problems with that morality. And it is also quite natural to expect that ethical ideals are the sorts of things that people do not agree about, and that ethical ideals also change or disappear over time. Culture is not the opposite of nature; we are naturally moral. It is a misunderstanding of naturalism if you suppose that a naturalistic understanding of humans must entirely strip away culture and ignore how humans are encultured. If you want to study humans unaffected by culture, study early-term fetuses or study isolated genes, but you won’t find morality there. It is also a misunderstanding of naturalism if you expect that a naturalistic understanding of morality must derive warm moral ‘oughts’ from cold scientific facts. 19th-century naturalists once talked that way — they perpetuated the root religious notion that morality could be discerned in the natural design of things — again obscuring how people are naturally encultured.
A contemporary naturalist should not repeat outdated religious notions. Humans naturally use the cultural wisdom bestowed by earlier generations. This natural fact explains how religions teach morals and pass down ethical ideals, by the way. There is nothing in objective morality that cannot fit into the naturalistic worldview. Wrongly supposing that morality can’t be natural is akin to supposing that agriculture can’t be natural. Basic morality, higher ethics, and even religious ethics can all fit into a naturalistic worldview. Ref

Ethical Thinking or Moral Reasoning Should be Rational AND Emotional

Doubt god(s)? No, I stopped believing Fairytales.

Axiology, Naturalism, Realism and Moral Theory Ideas

Axiological Atheism Morality Critique: of the bible god

Bible Morality and a Genocidal god of Watery Death?

How Do I Gain a Morality Persuasion or Make a Change to it?

Atheist Morality = Scientific Morality?

I have questions for someone believing all morality is subjective

Axiological Morality Critique of Pseudo-Morality/Pseudomorality?

MORALITY: values, morals, and ethics

Quickly Grasping Naturalistic Morality

Babies & Morality?

Think there is no objective morality?

Religions Promote Pseudo-Morality

Morality: all subjective or all objective?

True Morality Not the Golden Rule…

Real Morality vs. Pseudo Morality

Believe in Good, Humanist Morality?

Axiological/axiology (value theory/value science) Atheism?