Wednesday 23 September 2009

On the Social Nature of Evil

On 9 September I wrote a post commenting on an academic article by S. Ghoshal. In his article, Ghoshal criticised some of the basic assumptions of classic management theories and claimed that events such as the Enron collapse and similar scandals were happening because of the "ideology-based, gloomy vision" underlying such theories. In a few words, the "wrong" mental programming managers received was creating evil actors (you can read the full post here: Questioning the Basic Assumptions of Management Theories). According to this argument, the bad behaviour of some managers can be traced back to features of the system/context in which they operate. We are not intrinsically evil or good. Obviously some people are more "good" than others, but we should pay attention to the (social) context in which we act. Let's understand why.

One of the first tendencies I want to talk about is the famous Fundamental Attribution Error. Frequently described as one of the founding principles of Social Psychology, it is the tendency of human beings to overestimate dispositional, personality-based explanations when trying to understand or explain someone's actions, and therefore to undervalue the influence of context. Let's take a quick example to make things clearer: I meet Peter, an old friend of mine, on the street. I'd like to have a chat with him, since it's been a long time since we last spoke to each other. He stops for less than 30 seconds and then runs away. I think to myself: "what a jerk he has become!". I don't take the context into consideration: Peter had to go to a funeral and was in a hurry. Moreover, he was depressed and emotional because of the funeral and didn't feel like speaking to anyone. This is a very silly example, but you see what I mean. I bet you can think of a lot of similar attribution errors.

Because of this bias, we sometimes tend to overestimate the probability of people being evil (or rude, or shy, or inadequate, etc.). We tend not to take the power of context into consideration.

Let's dig deeper. In 1961 a Yale researcher, Stanley Milgram, performed an experiment to understand how people react to authority under certain conditions, such as obeying orders that conflict with their personal conscience. The results of the experiment were debated for years, and today it is frequently cited when trying to explain certain extreme events and/or behaviours (e.g. the Holocaust).

Let's see very briefly how the Milgram experiment worked (you can read a more detailed explanation of the experiment here: Milgram Experiment). Participants were recruited through newspaper ads and told that they would take part in an experiment on learning (and that they would be paid for it). The experiment involved three persons/roles called "teacher", "experimenter" and "learner". Actual participants were assigned to the "teacher" role, whereas researchers filled the other two positions. Participants did not know that the "learner", or "victim", was actually an actor.

After that, the "learner" had to complete some kind of learning task and was asked a series of questions, with the "teacher" checking the answers. When the "learner" gave a wrong answer, the "teacher" had to administer an electric shock to the victim; when the answer was correct, the "teacher" could move on to the next question. With every wrong answer the shock had to be increased by 15 volts, ranging from 15 volts (hardly perceptible) to 450 volts (dangerous)! Obviously the "victims" weren't receiving any actual shock; they were just pretending, groaning and screaming whenever a shock was administered. The role of the "experimenter" was to urge the "teacher" to keep administering shocks whenever they wanted to halt the experiment.

The results were rather unexpected. Even though many subjects showed signs of tension and unease, a large percentage of them (65% in the first set of experiments) went all the way up to the final, massive 450-volt shock, thereby obeying the experimenter's orders!

Commenting on the results of the experiment, the British philosopher and sociologist Z. Bauman wrote: "cruelty correlates with certain patterns of social interaction much more closely than it does with personality features or other individual idiosyncrasies of the perpetrators. Cruelty is social in its origins much more than it is characterological."

We can therefore see how context plays a crucial role in determining who we are. Incentive systems strongly shape how we act. We cannot say that the people working at Enron (I'm using Enron as an example, but you can pick whichever example you like) were evil, perverted humans. On this subject, the American psychologist Philip Zimbardo (author of the controversial study known as the Stanford Prison Experiment) wrote the book The Lucifer Effect: "The concept describes the point in time when an ordinary, normal person first crosses the boundary between good and evil to engage in an evil action. [...] Such transformations are more likely to occur in novel settings, in 'total situations,' where social situational forces are sufficiently powerful to overwhelm, or set aside temporarily, personal attributes of morality, compassion, or sense of justice and fair play."

If you want an overview of the topic and/or want to hear more about the above-mentioned experiments, check out the following video:
