Political Ideology and Non-Political Moral Decision-Making Processes: A Quantitative Study


Abstract

Moral dilemmas pitting concerns about actions against concerns about consequences have been used by philosophers and psychologists to measure "universal" moral intuitions. Although these dilemmas contain no explicit political content, we found that liberals were more likely than conservatives to be concerned about consequences, whereas conservatives were more likely than liberals to be concerned about actions. This pattern appeared in two large, heterogeneous samples and across several different moral dilemmas. In addition, manipulations of dilemma aversiveness and order of presentation suggest that this political difference is due in part to different sensitivities to emotional reactions in moral decision-making: conservatives were more inclined to "go with the gut" and let affective responses guide moral judgments, whereas liberals were more likely to deliberate about optimal consequences. In this article, using a sample from Western Europe, we report evidence that political differences can be found in moral decisions about issues that have no explicit political content. In particular, we find that conservatives are more likely than liberals to attend to the action itself when deciding whether something is right or wrong, whereas liberals are more likely than conservatives to attend to the consequences of the action. Further, we report preliminary evidence that this difference reflects a kernel of truth in the political parodies: conservatives are more likely than liberals to "go with the gut," using their affective responses to guide moral judgment.

Key words: Moral decision making; Affect; Dilemma; Liberal; Conservative; Aversiveness

INTRODUCTION

A conservative is traditionally viewed as someone who is reluctant to accept change, favouring the preservation of the status quo and of traditional values and customs, whereas a liberal is viewed as someone who is tolerant of different views and standards of behaviour and who believes in equality for all (Encarta World English Dictionary). Various psychological studies have sought to understand these differing conservative and liberal mindsets.

Stephen Colbert’s parody of American conservatism takes aim at right-wing claims to intuitive moral clarity about right and wrong actions, regardless of situation or consequences: “Since the beginning of my show I’ve led a crusade against facts. Too often, they upset the truth that’s in your gut” (The Colbert Report, January 9, 2007). The mirror-image parody of liberals presents them as irresolute moral flip-floppers, changing their moral convictions to suit the situation and its likely consequences. Although political differences in what partisans morally care about are well-known (Feather, 1979; Feldman, 2003; Graham, Haidt, & Nosek, 2009), these parodies suggest that political ideology may influence how people make moral decisions, regardless of what those decisions are about.

Some moral dilemmas force a choice between a morally aversive (even gut-wrenching) action and an inaction that produces even worse consequences. For instance, is it morally permissible to kill one person (action) in order to save the lives of many who would otherwise perish (consequence)? Philosophers (Foot, 1967) and legal theorists (Thomson, 1986) have employed such hypothetical dilemmas to answer these questions normatively. More recently, behavioural and cognitive scientists have used them to provide a descriptive account of the processes involved in moral decision-making. This empirical work indicates that when people choose inaction (i.e., they refuse to kill one person), these decisions are based largely on "hot" affective reactions of aversion to the action itself, whereas consequentialist responses (i.e., killing one to save many) are reached via "cold" processes of deliberative reasoning (Cushman, Young, & Hauser, 2006; Greene, Sommerville, Nystrom, Darley, & Cohen, 2001; Greene et al., 2009).

Although these dilemmas have no explicit political content, there are doctrinal reasons to expect ideological differences in responses to them. John Stuart Mill, after all, is the father of both liberalism and utilitarianism. Conservatives often express contempt for moral relativism and situational ethics, preferring rules that are binding and eternal (Hunter, 1991; Sowell, 2002), and may thus be more likely than moderates or liberals to object to actions violating such rules. Liberals, on the other hand, may be more likely to question the justification of rules and endorse civil disobedience or other forms of morally motivated law-breaking (Kohlberg [1969] called this a hallmark of post-conventional thinking). Liberals are also more likely to embrace efforts to make delicate adjustments to laws and traditions in order to maximize overall utility (Muller, 1997).

Given the hot/cold findings discussed above, however, it is possible that ideological differences in responses to these dilemmas are due to differential use of affect in moral decision-making. For instance, conservatives may be more inclined to reject consequence-optimizing actions because of their intuitive aversion to them, not because of a deliberate endorsement of deontological principles. This prediction is based on earlier work showing ideological differences in tolerance of ambiguity, needs for cognition and cognitive closure, and disgust sensitivity (see Jost, Federico, & Napier, 2009, for a review). We investigated whether conservatives would be more inclined to focus on actions, and liberals on consequences. We were also interested in whether any differences we found might be due to differential reliance on affective reactions. We gave several different moral dilemmas to a large and demographically diverse sample. Each dilemma had two versions that were presented sequentially. Dilemmas varied widely in terms of settings, actions proposed, and overall aversiveness, yet each pair of dilemmas had the same logical structure. One version required an aversive action to prevent negative consequences (e.g., killing one patient in a hospital and using the organs from that patient to save the lives of four others), while the other version had the same costs and benefits but required a less aversive action (e.g., redirecting deadly fumes in a hospital, killing one patient to save four others). In each case, we expected that conservatives would be more likely than liberals and moderates to prioritize the action in moral judgment, rejecting the action as morally impermissible despite its utilitarian justification.

To test whether differential sensitivity to affective reactions is a potential mechanism for ideological differences in moral reasoning, we presented the more and less aversive versions of each dilemma together and manipulated their presentation order. If conservative moral judgments are more sensitive to affective reactions, then the strong affective response to the more aversive action may be more likely to linger and influence subsequent judgment of the less aversive version of the same dilemma. For example, seeing the organ transplant version first should make the fumes version seem more aversive, especially for conservatives.

MATERIAL AND METHODS

Participants

Participants were 656 visitors to the website (48% female, median age 30); 473 were from the UK, 70 from Germany, 48 from France, and 39 from other EU countries. Political identity was self-reported on a 10-point scale comprising a 7-point liberal-conservative continuum plus 3 additional options. There were 356 liberals (three scale points, from slightly to extremely liberal), 80 moderates, and 101 conservatives (three scale points). All analyses retained the seven-point strongly liberal to strongly conservative scaling. The 54 "libertarian", 27 "other", and 38 "don't know/not political" respondents were excluded, leaving a sample of 501.
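To make the coding and exclusion steps concrete, the sketch below shows one way the political-identity recoding could be implemented in Python. This is a minimal illustration rather than the authors' code; the DataFrame and the column name political_identity (values 1-7 for the liberal-conservative continuum, plus string labels for the three additional options) are assumptions.

```python
# Minimal sketch (not the authors' code): recode the 10-point political
# identity item and apply the exclusions described above.
import pandas as pd

def prepare_sample(df: pd.DataFrame) -> pd.DataFrame:
    # Keep only respondents on the 7-point liberal-conservative continuum,
    # dropping "libertarian", "other", and "don't know/not political".
    on_continuum = df["political_identity"].isin(range(1, 8))
    df = df.loc[on_continuum].copy()

    # Retain the full 7-point scaling for the main analyses ...
    df["politics_7pt"] = df["political_identity"].astype(int)

    # ... and a 3-category grouping (1-3, 4, 5-7) for the subgroup models.
    df["politics_group"] = pd.cut(
        df["politics_7pt"],
        bins=[0, 3, 4, 7],
        labels=["liberal", "moderate", "conservative"],
    )
    return df
```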

Procedure

Participants self-selected to take a study described as "Moral Dilemmas: What is the right thing to do in difficult situations?" Participants worked through six moral dilemmas and were asked a question about the right thing to do in each case. They were shown two different versions of each dilemma with exactly the same consequences but requiring different actions. Participants were randomly assigned to condition: either the more aversive version of each dilemma (previously called the "personal" version [Greene et al., 2001]) always came before the less aversive ("impersonal") version, or the less aversive version always came before the more aversive version. The order of the six dilemma pairs was randomized for each participant. The dilemmas were adapted from Greene et al. (2001) and modified so that the two versions were the same except for the action required. The titles of the dilemmas ("Trolley", "Doctor", "Father", "Vaccine", "Safari", and "Lifeboat") and names describing the unique action (e.g., "Doctor Dilemma - Fumes Version") were visible to highlight the similarity within each pair. Participants answered "Is it morally appropriate for you to [do action] in order to [prevent some other danger]?" with a dichotomous Yes/No response. They then answered "How certain are you about your answer?" on a 7-point scale from "extremely uncertain" to "extremely certain." The full text of all dilemmas can be found in the supplementary material.
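To illustrate the presentation logic, the following sketch builds one participant's sequence of dilemma screens under the order manipulation just described. It is a hypothetical reconstruction rather than the study's actual software, and several version labels ("Switch", "Smother", "Injection", "Test") are placeholders that do not appear in the paper.

```python
import random

# Hypothetical pairing of (more aversive, less aversive) versions; labels not
# confirmed by the paper are placeholders for illustration only.
DILEMMA_PAIRS = {
    "Trolley": ("Push", "Switch"),
    "Doctor": ("Transplant", "Fumes"),
    "Father": ("Smother", "Circuit Breaker"),
    "Vaccine": ("Injection", "Test"),
    "Safari": ("Torture", "Help"),
    "Lifeboat": ("Throw Overboard", "Rope"),
}

def build_sequence(aversive_first: bool, rng: random.Random) -> list[tuple[str, str]]:
    """Return one participant's ordered list of (dilemma, version) screens."""
    pairs = list(DILEMMA_PAIRS.items())
    rng.shuffle(pairs)  # order of the six dilemma pairs is randomized per participant
    sequence = []
    for title, (more_aversive, less_aversive) in pairs:
        # Within each pair, version order follows the between-subjects condition.
        first, second = ((more_aversive, less_aversive) if aversive_first
                         else (less_aversive, more_aversive))
        sequence.extend([(title, first), (title, second)])
    return sequence

# Between-subjects assignment to the order condition:
rng = random.Random()
sees_aversive_first = rng.random() < 0.5
print(build_sequence(sees_aversive_first, rng))
```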

RESULTS

Aversiveness lingered for conservatives, but not for liberals or moderates. To test whether the strength of the order effect for the less aversive dilemmas varied by political ideology (i.e., whether politics moderated the Aversiveness X Order interaction), we tested a model with Aversiveness, Order, and their interaction as level-1 predictors and politics as a level-2 predictor. The main effects of Aversiveness, Politics, and Order, as well as the Aversiveness X Order interaction, remained significant. In addition, there was a significant Aversiveness X Order X Politics interaction, B = -.12, t(5531) = 2.05, p = .04. To decompose this three-way interaction, we tested separate models for liberals, moderates, and conservatives, which allowed us to determine how the Aversiveness X Order interaction varied across the three groups. Neither liberals (B = -.09, t(3688) = 1.18, p = .24) nor moderates (B = -.22, t(765) = 1.20, p = .23) showed a significant Aversiveness X Order interaction, but conservatives did, B = -.31, t(1074) = 2.43, p = .02. We further decomposed this interaction for conservatives, testing the effect of order separately for the less aversive and more aversive dilemmas.

Conservatives were more likely to reject the less aversive actions when they followed, rather than preceded, the more aversive actions, B = .45, t(534) = 3.80, p < .001.
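To make the structure of these analyses concrete, the sketch below approximates them in Python with statsmodels. It is not the authors' code: the paper fit two-level hierarchical logistic regressions with judgments nested within participants, whereas this simplified version keeps only the fixed-effect structure, and all column names are assumptions.

```python
# Simplified sketch of the analysis logic (not the authors' exact models).
# Assumed columns: reject (1 = action judged impermissible), aversive (1 = more
# aversive version), aversive_first (1 = more aversive version shown first),
# politics_7pt (1-7), politics_group ("liberal"/"moderate"/"conservative").
import statsmodels.formula.api as smf

def fit_models(df):
    # Full-sample model testing the Aversiveness x Order x Politics interaction.
    full = smf.logit(
        "reject ~ aversive * aversive_first * politics_7pt", data=df
    ).fit(disp=False)

    # Decomposition: Aversiveness x Order interaction within each political group.
    by_group = {
        group: smf.logit("reject ~ aversive * aversive_first", data=sub).fit(disp=False)
        for group, sub in df.groupby("politics_group")
    }

    # Simple effect of order on the less aversive dilemmas, conservatives only.
    cons_less = df.query("politics_group == 'conservative' and aversive == 0")
    simple = smf.logit("reject ~ aversive_first", data=cons_less).fit(disp=False)

    return full, by_group, simple
```

A closer reproduction would add participant-level random intercepts (for example, with a mixed-effects logistic model), which this sketch omits for brevity.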

DISCUSSION

First, evidence from two large samples showed that conservatives are more likely than liberals to respond to moral dilemmas based on the actions required, and that liberals are more likely than conservatives to respond based on the consequences of inaction. The effect was consistent across a diverse set of dilemmas involving a variety of roles, situations, actions, and tradeoffs. Interactions with this effect provided preliminary evidence that the political differences were due in part to conservatives' greater sensitivity to affect in their moral decision-making. Conservatives were more affected by the order manipulation: seeing the more aversive versions first made them more likely to also reject the less aversive actions. Seeing the less aversive version of a scenario first did not affect subsequent judgments of the more aversive version, suggesting that the results were not driven simply by a desire to be consistent. The gut-level reaction to an action like removing someone's organs against their will lingers, and it makes a less aversive action like redirecting deadly fumes seem morally inappropriate as well, but only for conservatives.

It is noteworthy that seeing the more aversive dilemma first increased rejections of the subsequent less aversive action, but seeing the less aversive dilemma first had no effect on responses to the subsequent more aversive version. The palpable affective reaction to aversive scenarios (e.g., removing someone's organs against their will) may simply be too strong to allow for cold consequence-weighing calculations (e.g., four lives > one life), even in a subsequent decision involving a less aversive action. This asymmetry in order effects supports the idea of affective primacy in moral judgment (Haidt, 2001): the aversion remained with participants, while the rational deliberation did not. Conservatives' responses to the less aversive dilemmas in general, and following the more aversive ones in particular, support our hypothesis that conservatives' moral judgments are more sensitive to intuitive affective reactions than are liberals'. However, our inference about the role of affect is indirect. Although previous research demonstrating that more aversive dilemmas elicit stronger affective reactions (Greene et al., 2001; 2009) supports this inference, a valuable next step would be to replicate these results while measuring affective reactions more directly (e.g., with physiological measures).

CONCLUSION

These descriptive moral decision-making differences do not necessarily imply any particular normative conclusions. Although moral consequentialism has been cast as normatively optimal in decision research (Baron & Spranca, 1997; Sunstein, 2005), this has been disputed (Bennis, Medin, & Bartels, 2010). Liberal consequentialism can be taken as a sign of wise and thoughtful deliberation in moral matters, or as irresolute flip-flopping in the face of changing circumstances. Conservatives' action-focused stances can be taken as decisiveness and strong moral character, or as arrogant disregard for the consequences of behaviour. Just as liberal and conservative approaches to moral education make normative appeals to different sets of moral values (Graham, Haidt, & Rimm-Kaufman, 2008), liberals and conservatives may also differ in their opinions about which approach to moral decision-making is normatively better. The descriptive results of our study cannot tell us which ideological group is more virtuous, but they do suggest that individual-difference approaches can contribute to our understanding of the processes of moral judgment and decision-making. Contrary to universalist claims (Hauser, 2006), individuals do systematically vary in their responses to these abstract and hypothetical dilemmas. This systematic variation may account for why some conservatives consider themselves more virtuous, owing to their moral consistency, while some liberals consider themselves more virtuous, owing to their moral rationality. Further studies combining individual-difference and experimental approaches will shed light on both the nature of political ideology and the mechanisms of moral decision-making. As a first step, the present findings indicate that conservatives are more likely to focus on actions (and liberals on consequences) in moral tradeoffs, and they suggest that this is partially due to the stronger role that affect plays in conservative moral decision-making.

These findings have significant implications for political science, moral psychology, and decision science. They call into question popular perceptions of liberals as "bleeding hearts" who are more affected than conservatives by feelings in their moral and policy opinions (Farwell & Weiner, 2000). The findings also suggest that partisans may differ even in their initial approaches to novel issues, with those on the left focusing more on likely consequences and those on the right focusing more on immediate reactions about the rightness or wrongness of an action regardless of its consequences. The fact that the differences were found using non-political dilemmas suggests that moral disagreements between ideological opponents involve not only different prioritizations of moral concerns (e.g., equality vs. social order), but also different processes of moral decision-making. However, such artificial dilemmas are not representative of everyday morality (Pincoffs, 1986), and future work is therefore needed to determine how well these political differences generalize to other domains of moral decision-making.

ACKNOWLEDGMENTS

This research was supported by our friend and colleague Ms. Fahdila Azam (Lecturer, Department of Economics, BBSU), who helped us apply the two-level hierarchical logistic regression models. We are also indebted to Ms. Maria Abdeali (Research Fellow, Department of Experimental Psychology, Oxford University, UK) for her sincere efforts in collecting the data used in this research.

REFERENCES

Baron, J., & Spranca, M. (1997). Protected Values. Organizational Behavior and Human Decision Processes, 70, 1-16.

Bennis, W. M., Medin, D. L., & Bartels, D. M. (2010). The Costs and Benefits of Calculation and Moral Rules. Perspectives on Psychological Science, 5, 187-202.

Cushman, F. A., Young, L., & Hauser, M. D. (2006). The Role of Conscious Reasoning and Intuition in Moral Judgment. Psychological Science, 17, 1082-1089.

Farwell, L., & Weiner, B. (2000). Bleeding Hearts and the Heartless: Popular Perceptions of Liberal and Conservative Ideologies. Personality and Social Psychology Bulletin, 26, 845-852.

Feather, N. (1979). Value Correlates of Conservatism. Journal of Personality and Social Psychology, 36, 1617-1630.

Feldman, S. (2003). Values, Ideology, and the Structure of Political Attitudes. In D. O. Sears, L. Huddy, & R. Jervis (Eds.), Oxford Handbook of Political Psychology (pp. 477-508). Oxford, United Kingdom: Oxford University Press.

Foot, P. (1967). The Problem of Abortion and the Doctrine of Double Effect. Oxford Review, 5, 5-15.

Graham, J., Haidt, J., & Nosek, B. A. (2009). Liberals and Conservatives Rely on Different Sets of Moral Foundations. Journal of Personality and Social Psychology, 96, 1029-1046.

Graham, J., Haidt, J., & Rimm-Kaufman, S. E. (2008). Ideology and Intuition in Moral Education. European Journal of Developmental Science, 2, 269-286.

Greene, J. D. (2007). The Secret Joke of Kant's Soul. In W. Sinnott-Armstrong (Ed.), Moral Psychology (Vol. 3): The Neuroscience of Morality: Emotion, Disease, and Development (pp. 35-80). Cambridge, MA: MIT Press.

Greene, J. D., Cushman, F. A., Stewart, L. E., Lowenberg, K., Nystrom, L. E., & Cohen, J. D. (2009). Pushing Moral Buttons: The Interaction Between Personal Force and Intention in Moral Judgment. Cognition, 111, 364-371.

Greene, J. D., Sommerville, R. B., Nystrom, L. E., Darley, J. M., & Cohen, J. D. (2001). An fMRI Investigation of Emotional Engagement in Moral Judgment. Science, 293, 2105-2108.

Haidt, J. (2001). The Emotional Dog and Its Rational Tail: A Social Intuitionist Approach to Moral Judgment. Psychological Review, 108, 814-834.

Hauser, M. D. (2006). Moral Minds: How Nature Designed Our Universal Sense of Right and Wrong. New York: Harper Collins.

Jost, J. T., Federico, C. M., & Napier, J. L. (2009). Political Ideology: Its Structure, Functions, and Elective Affinities. Annual Review of Psychology, 60, 307-333.

Kohlberg, L. (1969). Stage and Sequence: The Cognitive-Developmental Approach to Socialization. In D. A. Goslin (Ed.), Handbook of Socialization Theory and Research (pp. 347-480). Chicago: Rand McNally.

Lombrozo, T. (2009). The Role of Moral Commitments in Moral Judgment. Cognitive Science, 33, 273-286.

Muller, J. Z. (Ed.) (1997). Conservatism. Princeton, NJ: Princeton University Press.

Petrinovich, L. & O’Neill, P. (1996). Influence of Wording and Framing Effects on Moral Intuitions. Ethology and Sociobiology, 17, 145-171.

Pincoffs, E. L. (1986). Quandaries and Virtues: Against Reductivism in Ethics. Lawrence, KS: University of Kansas Press.

Schwitzgebel, E., & Cushman, F. A. (2010). Expertise in Moral Reasoning? Order Effects on Moral Judgment in Professional Philosophers and Non-Philosophers. Manuscript Under Review.

Sowell, T. (2002). A Conflict of Visions: The Ideological Origins of Political Struggles. New York: Basic Books.

SUPPLEMENTARY MATERIAL: MORAL DILEMMAS

Is it morally appropriate for you to kill this man's father in order to get money for the man and his family? [No/Yes]

Father Dilemma - Circuit Breaker Version:

You are in a hospital lounge waiting to visit a sick friend. A young man sitting next to you explains that his father is very ill. The doctors believe that he has a week to live at most. He explains further that his father has a substantial life insurance policy that expires at midnight. If his father dies before midnight, this young man will receive a very large sum of money. He says that the money would mean a great deal to him and his family, and that no good will come from his father's living a few more days. After talking with him you can tell this man is in desperate need of the money to feed his family. The man asks you to go to the hospital basement and pull a circuit out of the circuit breaker, shutting off his father's life support machines.

Is it morally appropriate for you to kill this man's father in order to get money for the man and his family? [No/Yes]

Lifeboat Dilemma - Throw Overboard Version:

You are on a cruise ship when there is a fire on board, and the ship has to be abandoned. The lifeboats are carrying many more people than they were designed to carry. The lifeboat you're in is sitting dangerously low in the water; a few inches lower and it will sink. The seas start to get rough, and the boat begins to fill with water. It seems to you that there is only one way to stop the boat from sinking, and that is to start throwing other passengers overboard, starting with old people who are too weak to resist.

Is it morally appropriate for you to throw some of your fellow passengers overboard in order to save yourself and the other lifeboat passengers? [No/Yes]

Lifeboat Dilemma - Rope Version:

You are on a cruise ship when there is a fire on board, and the ship has to be abandoned. The lifeboats are carrying many more people than they were designed to carry. The lifeboat you're in is sitting dangerously low in the water; a few inches lower and it will sink. The seas start to get rough, and the boat begins to fill with water. A group of old people are in the water and ask you to throw them a rope so they can come aboard the lifeboat. It seems to you that the boat will sink if it takes on any more passengers.

Is it morally appropriate for you to refuse to throw the rope in order to save yourself and the other lifeboat passengers? [No/Yes]

Safari Dilemma - Torture Version:

You are part of a group of ecologists who live in a remote stretch of jungle. The entire group, which includes eight children, has been taken hostage by a group of paramilitary terrorists. One of the terrorists takes a liking to you. He informs you that his leader intends to kill you and the rest of the hostages the following morning. He is willing to help you and the children escape, but as an act of good faith he wants you to torture and kill one of your fellow hostages whom he does not like. If you refuse his offer, all the hostages, including the children and yourself, will die. If you accept his offer, the others will die in the morning but you and the eight children will escape.

Is it morally appropriate for you to torture and kill one of your fellow hostages in order to escape from the terrorists and save the lives of the eight children? [No/Yes]

Safari Dilemma - Help Version:

You are part of a group of ecologists who live in a remote stretch of jungle. The entire group, which includes eight children, has been taken hostage by a group of paramilitary terrorists. One of the terrorists takes a liking to you. He informs you that his leader intends to kill you and the rest of the hostages the following morning. He is willing to help you and the children escape, but as an act of good faith he wants you to help him kill one of your fellow hostages whom he does not like. You are to tell the man to meet you in a remote location to plan an escape, and when he goes there this terrorist will kill him. If you refuse his offer, all the hostages, including the children and yourself, will die. If you accept his offer, the others will die in the morning but you and the eight children will escape.

Is it morally appropriate for you to help him kill one of your fellow hostages in order to escape from the terrorists and save the lives of the eight children? [No/Yes]

Trolley Dilemma - Push Version:

A runaway trolley is heading down the tracks toward five workmen who will be killed if the trolley proceeds on its present course. You are on a footbridge over the tracks, in between the approaching trolley and the five workmen. Next to you on this footbridge is a stranger who happens to be very large. If you do nothing the trolley will proceed, causing the deaths of the five workmen. The only way to save the lives of these workmen is to push this stranger off the bridge and onto the tracks below, where his large body will stop the trolley, causing his death.

Once testing is complete, the effects of the substances will be identified, and you will be able to start saving lives with your vaccine.

Is it morally appropriate for you to direct your organization to begin the testing, killing 100 patients with a deadly injection in order to identify a vaccine that will save millions of lives? [No/Yes]

Mining Dilemma - Shoot Version:

You are part of a four-person mining expedition. There is a cave-in and the four of you are trapped in the mine. A rock has crushed the legs of one of your crew members and he will die without medical attention. You've established radio contact with the rescue team and learned it will be 36 hours before the first drill can reach the space you are trapped in. You are able to calculate that this space has just enough oxygen for three people to survive for 36 hours, but definitely not enough for four people. The only way to save the other crew members is to shoot the injured crew member so that there will be just enough oxygen for the rest of the crew to survive.

Is it morally appropriate to kill the injured crew member in order to save the lives of the remaining crew members? [No/Yes]

Mining Dilemma - Refuse Aid Version:

You are part of a four-person mining expedition. There is a cave-in and the four of you are trapped in the mine. A rock has crushed the legs of one of your crew members and he will die without medical attention. You’ve established radio contact with the rescue team and learned it will be 36 hours before the first drill can reach the space you are trapped in. You are able to calculate that this space has just enough oxygen for three people to survive for 36 hours, but definitely not enough for four people. The only way to save the other crew members is to refuse medical aid to the injured crew member so that there will be just enough oxygen for the rest of the crew to survive.

Is it morally appropriate to allow the injured crew member to die in order to save the lives of the remaining crew members? [No/Yes]
