https://www.ted.com/talks/dan_ariely_on_our_buggy_moral_code/transcript?language=en
Now, this is important, because remember, the moment the student stood up, it made it clear to everybody that they could get away with cheating, because the experimenter said, "You've finished everything. Go home," and they went with the money. So it wasn't so much about the probability of being caught again. It was about the norms for cheating. If somebody from our in-group cheats and we see them cheating, we feel it's more appropriate, as a group, to behave this way. But if it's somebody from another group, these terrible people -- I mean, not terrible in this -- but somebody we don't want to associate ourselves with, from another university, another group, all of a sudden people's awareness of honesty goes up -- a little bit like The Ten Commandments experiment -- and people cheat even less.
So, what have we learned from this about cheating? We've learned that a lot of people can cheat. They cheat just by a little bit. When we remind people about their morality, they cheat less. When we get bigger distance from cheating, from the object of money, for example, people cheat more. And when we see cheating around us, particularly if it's a part of our in-group, cheating goes up. Now, if we think about this in terms of the stock market, think about what happens. What happens in a situation when you create something where you pay people a lot of money to see reality in a slightly distorted way? Would they not be able to see it this way? Of course they would. What happens when you do other things, like you remove things from money? You call them stock, or stock options, derivatives, mortgage-backed securities. Could it be that with those more distant things, it's not a token for one second, it's something that is many steps removed from money for a much longer time -- could it be that people will cheat even more? And what happens to the social environment when people see other people behave around them? I think all of those forces worked in a very bad way in the stock market.
More generally, I want to tell you something about behavioral economics. We have many intuitions in our life, and the point is that many of these intuitions are wrong. The question is, are we going to test those intuitions? We can think about how we're going to test this intuition in our private life, in our business life, and most particularly when it goes to policy, when we think about things like No Child Left Behind, when you create new stock markets, when you create other policies -- taxation, health care and so on. And the difficulty of testing our intuition was the big lesson I learned when I went back to the nurses to talk to them.
So I went back to talk to them and tell them what I found out about removing bandages. And I learned two interesting things. One was that my favorite nurse, Ettie, told me that I did not take her pain into consideration. She said, "Of course, you know, it was very painful for you. But think about me as a nurse, taking, removing the bandages of somebody I liked, and had to do it repeatedly over a long period of time. Creating so much torture was not something that was good for me, too." And she said maybe part of the reason was it was difficult for her. But it was actually more interesting than that, because she said, "I did not think that your intuition was right. I felt my intuition was correct." So, if you think about all of your intuitions, it's very hard to believe that your intuition is wrong. And she said, "Given the fact that I thought my intuition was right ..." -- she thought her intuition was right -- it was very difficult for her to accept doing a difficult experiment to try and check whether she was wrong.
But in fact, this is the situation we're all in all the time. We have very strong intuitions about all kinds of things -- our own ability, how the economy works, how we should pay school teachers. But unless we start testing those intuitions, we're not going to do better. And just think about how better my life would have been if these nurses would have been willing to check their intuition, and how everything would have been better if we just start doing more systematic experimentation of our intuitions. - Dan Ariely
This was another thoroughly enlightening and refreshing presentation by Dan Ariely. I highly recommend watching or reading the entire Talk.
What the discussion boils down to is that we function very much according to how we see ourselves, which is why, even when given the opportunity to cheat and get away with it, most people will only cheat a little bit - cheating more than that does not fit how we see ourselves (as an 'honourable' person or whatever). Obviously the fact that most people are willing to cheat even a little bit and still regard themselves as 'honourable' or as 'not a cheater' leads one to question the integrity of humanity as a whole.
The fact that most people are willing to cheat a little bit also makes it easier to see why the world works the way it does. Our moral code is skewed - only by a little, but it's enough to have a compounding effect across the whole of our lives.
It is even more interesting how our self-image changes in the presence of certain factors (like a moral code, or a member of our in- or out-group) - BUT it changes only for that moment. So when you're alone, your self-image will allow for a little bit of cheating (you're still a good person); when you are primed with a moral code, you are much less likely to cheat at all (you're definitely a VERY good person); when someone from an out-group (you do not identify with them, like an opposing team) cheats big time, you are more likely not to cheat at all (the guys from that team are such douchebags, and I am certainly not a douchebag like that guy); but when someone from your in-group (you identify with them personally, like team-mates) cheats big time, you are also more likely to cheat more (that person is a good person because I like them, and if they can cheat and still be a good person, then I can relax my own moral code a little and still be a good person without thinking differently of myself). While all of this is happening you still see yourself as a good person and perceive that your personal moral code is being upheld. This goes to show that, generally speaking, personal morals are not fixed - they adapt to outside stimuli.
Dan's closing statement is absolutely accurate. Our personal opinions seem infallible to us - but the problem is that they are often unfounded and untested, and yet we still treat them like the word of god. We need to develop the ability to consider things objectively and to question everything we so readily accept as fact or truth. The recurring nature of our fallacies in thought, word and deed is a testament to the fact that the way we are living is not practical, nor will it produce the best possible life for everyone.