Braintrust: What Neuroscience Tells Us about Morality

“Morality seems to me to be a natural phenomenon – constrained by the forces of natural selection, rooted in neurobiology, shaped by the local ecology, and modified by cultural developments.” (Churchland 191)

Braintrust: What Neuroscience Tells Us about Morality, by Patricia Churchland, is the perfect book for those keenly interested in psychology and philosophy. Of course, if one were to weigh the two subjects as they are presented in these pages, the balance would undoubtedly fall heavier on psychology’s side. Deeper still, the topic Braintrust constantly comes back to is biology, because, as is the case with any other conversation about humans (or animals), the subject has its roots in the science of life. What Churchland is trying to prove, in a very eloquent manner, is that morality exists because of the activity of neurons in the brain. This may, indeed, seem obvious, particularly to those who accept evolution; however, the road to proving this point is certainly not a straight or simple one.

As mentioned before, the idea holding the pages of the book together is that our brains are predisposed to morality because humans evolved to be social animals (living in large groups provides more safety and security than living in small ones). Once one accepts this idea, it is by no means too big a leap to assume that those who were inclined to respect others did much better (i.e. survived and reproduced at higher rates) than those who didn’t. With time, this respect for others turned into what we now think of as morality.

The book is relatively easy to read, but I wouldn’t necessarily recommend it as bedtime reading, because it does require an alert mind to pick up on all the rich information found within it. Some knowledge of biology (and psychology) helps quite a lot, but it is not required, because the notes at the end aid the comprehension of the more complicated concepts. The last two chapters have much more to do with philosophy than the rest, so one might find them easier to read.

Here are the lessons one could learn from reading Braintrust, but keep in mind that some of them depend on the theory presented in the work being true:

1) Morality is a result of our social nature.

Indeed, the large majority of individuals like to be surrounded by people who will, hopefully, admire and approve of their behaviour. Well, when it comes down to it, nobody truly approves of others’ actions when those actions may harm them, out of fear for their own wellbeing. Thus, in time, humans learnt to conduct themselves in a manner that isn’t dangerous to others (or, from an altruistic perspective, in a manner that is beneficial to others). If it is true that morality is a by-product of our desire to be part of a group, then we need to see it as dependent on other individuals.

2) Morality depends on the amount of resources available.

Part of the argument in this book is that morality developed because humans had an abundance of supplies, which reduced competition for survival. When this occurred, we were able to start caring for more individuals than ourselves and our immediate relatives. There is, however, a dark side to this lesson: if morality does depend on resources, then when those resources are taken away, so is morality. This idea can be seen in The Road, by Cormac McCarthy, and in many other apocalypse-themed stories.

3) Morality depends on both emotions and rational thinking.

Many have treated morality as if it were based only on our reasoning capacities and had nothing to do with our feelings. This seems to be false, and a little observation of our own behaviour will show that this is the case. Nobody denies that there are instances when we have to think things through in order to come up with a sound moral solution, but more often than not, our answers come instinctively, without us having taken the time to decide whether the matter at hand is “good” or “bad”. This means we have to realize that much of what we consider morally correct comes from habit, mimicry and other processes that are imprinted in our minds without us being aware of it.

4) There are no known moral rules that are true without exception.

Churchland proves this point by concentrating particularly on the Golden Rule, since this is the one people most often think of as always being true. I would personally argue that in an abstract sense it would stand, but that would mean taking this blog post in a different direction than it is meant to go. Nonetheless, even if the Golden Rule is always true, there are still many, many other moral rules that are only true sometimes. What this means is that we must be a little more careful when we judge situations: we might fall into the trap of believing that an event violates the rules of morality because of a similar event that occurred in the past, when in fact the two situations might differ enough that the rule used for one cannot be applied to the other.

Morality is a tricky thing. Although on basic matters it seems to be the same for nearly all individuals belonging to a culture, there are differences even within the same group on some issues (such as euthanasia and abortion). However, it seems that the human brain, when developed normally, is predisposed to adhering to some moral code, one that is shaped after birth by circumstances, culture and society.