By Giannis Lois


John and Paul have recently been hired in the public sector of a developing country. In this country, public officials are usually underpaid, which has fueled a debate over whether minor bribes are not only expected but also morally justified. While John has taken a clearly positive stance on this question, Paul remains skeptical, troubled by the moral repercussions of accepting bribes. Given that petty corruption is an established norm in this country, the government has launched a campaign highlighting the deleterious effects of minor bribes and condemning them as unethical. Would the government’s campaign or the corrupt behavior of John’s and Paul’s colleagues influence their behavior, and what size of bribes (if any) would they consider appropriate?

Everyday life is full of examples that resemble this situation, in which empirical expectations about how others actually behave (i.e. descriptive norms) conflict with normative expectations about how one should behave (i.e. injunctive norms). From a policy-making perspective, the crucial question is how to optimally use social norms to reduce public officials’ willingness to accept bribes, even minor ones.

Although huge corporate and government scandals, in which a few powerful individuals cheat a lot, draw public attention, small transgressions by large numbers of people have just as large an impact on our daily lives. Consumer and occupational frauds such as overstating insurance claims, wardrobing, tax evasion, and petty institutional corruption are responsible for several trillion dollars in annual losses in the US alone.

Why Do People Cheat Only a Little Bit?

When given the opportunity to cheat, many individuals do cross ethical boundaries, but only to a limited extent. From a standard economic perspective, this behavior is a puzzle: people are assumed to be rational, selfish beings interested only in maximizing their own payoffs, so their decision to cheat should depend solely on its expected external benefits (e.g. getting money or a better position) and expected external costs (e.g. paying a fine or losing a job). In contrast to this classic economic perspective, which focuses on external incentives, there is ample evidence that people are also internally motivated to care about others and to value honesty.

Combining these two seemingly contradictory perspectives, researchers have offered explanations for why honest people engage in dishonest actions only to a limited extent. One prominent explanation is that people want to profit from minor acts of dishonesty while at the same time maintaining a positive self-image as honest and moral individuals. To achieve this, they actively search for flexible, self-serving justifications for their misbehavior, or they remain unaware of (i.e. ethically blind to) the moral repercussions of their actions.

Dishonesty Flourishes in Ambiguous Settings

These processes of self-serving justification and ethical blindness are facilitated when there is no well-defined boundary between honesty and dishonesty. In a recent experimental study, we examined the role of this ambiguity in dishonest behavior by allowing rule violations to result either from honest mistakes or from various dishonest processes. More specifically, participants first performed a demanding task in which they had to identify the gender of faces overlaid on images of houses. Afterwards, they were given the opportunity to ask for and receive extra money. They were instructed to use this opportunity only when the faces had been presented inverted, i.e. upside down (difficult condition), as opposed to upright (easy condition). The more inverted faces they saw, the more money they were entitled to ask for. This cognitively demanding task can produce honest mistakes if participants mistakenly encode in memory, or mistakenly retrieve from memory, that the presented faces were inverted when they were actually upright. In this ambiguous setting, in which the distance between honesty and dishonesty is very small, minor rule violations (asking for the smallest possible amount of money when all faces were upright) can fly under the radar or be easily justified as honest mistakes. Consistent with this reasoning, our findings showed that most individuals who violated the rule did so only to a small extent, far below the maximum possible profit.

Given that real-life dishonesty does not take place in a social vacuum, one interesting question is how conflicting descriptive norms (how people actually behave) and injunctive norms (how people should behave) interact to influence the tendency of honest people to engage in minor acts of dishonesty in such ambiguous settings.

Norms Influence the Magnitude of Dishonesty

A large body of empirical evidence has shown that cheating does not depend solely on simple cost-benefit calculations or on an internal motivation for honesty, but also on the social norms implied by the dishonesty of others or by beliefs about what constitutes honest behavior.

Coming back to our study, the absence of clear descriptive or injunctive norms led many individuals to commit minor rule violations (i.e. asking for the smallest possible amount of money when all faces were upright). This behavioral pattern changed dramatically in a second phase, when people received false feedback that the average amount of money others had asked for was very high (i.e. a descriptive norm). Exposure to others’ extensive cheating increased the frequency of major rule violations (i.e. asking for large amounts of money when all faces were upright) but had no sizeable impact on the frequency of minor rule violations. A plausible explanation for this result is that in an ambiguous and novel situation, like our experimental setting, people attend to whichever social norm is relevant to the situation and available at the moment of decision. At the beginning, people are uncertain about the proper course of action because both injunctive and descriptive norms are absent. This uncertainty is eliminated once participants learn that others engaged in major rule violations, resulting in a selective increase in major dishonesty.

To avoid these harmful effects of uncertainty, one solution may be to remind people of the appropriate behavior in this context (i.e. injunctive norms). But what happens when information about others’ cheating behavior is presented alongside a rule reminder? Our findings showed that rule reminders (“A short reminder: You should ask for extra points only in difficult rounds where inverted faces are presented.”) are not sufficient to mitigate the increase in major rule violations driven by others’ misbehavior. In other words, descriptive norms seem to be more powerful than injunctive norms. However, there is a glimmer of hope: rule reminders led to a reduction in the frequency of minor rule violations, suggesting that injunctive norms can be effective at minimizing the minor dishonesty of honest people.

Take-Home Message

These results highlight that negative descriptive norms have a more powerful impact on dishonest behavior than injunctive norms do, but they also provide valuable novel insights into how to optimally leverage social norms to promote honest behavior.

Reminding people of how they should behave can potentially promote honesty in ambiguous settings in which honest people engage in minor acts of dishonesty. For example, the return of used clothing (“wardrobing”), which costs several billion dollars annually, is mainly the result of many honest individuals each returning just one shirt or sweater. Raising ethical awareness by reminding honest people of the consequences of this behavior can have a crucial impact on this phenomenon.

However, such reminders are less likely to be effective in preventing major dishonesty. The limited potential of moral reminders is further undermined by the presence of negative descriptive norms (i.e. others cheating a lot) that shatter any hope of shared honesty. Returning to the “wardrobing” example, studies have shown that this behavior is not only widespread but also perceived by many as common practice, which strengthens the salience of the negative descriptive norm while weakening the effect of moral reminders.

Taken together, our findings highlight the importance of taking into account people’s empirical expectations regarding the prevalence of dishonest behavior. In this respect, any effort to implement effective policies that counteract dishonest behavior needs to consider both how clear it is how an honest person should behave in a given situation (i.e. ethical salience) and the perceived prevalence of actual dishonest behavior in that situation. The ultimate aim should be to raise moral awareness while at the same time instilling a feeling of shared honesty.

Coming back to the example of public officials’ corruption, policy makers should address two independent issues. On the one hand, they should raise awareness about the ethical dimension of taking minor bribes, targeting honest individuals like Paul who already consider the moral repercussions of their actions. On the other hand, they should ensure that only positive information about others’ behavior is disclosed, thus discouraging major acts of dishonesty by individuals like John, who pay less attention to the ethical dimension of their actions.

Giannis Lois
Giannis Lois is an assistant professor of Neuroeconomics at the Maastricht University School of Business and Economics. He holds a PhD in Cognitive Neuroscience from the University of Mainz and has conducted interdisciplinary research in Neuroeconomics, Behavioral Economics, Cognitive Psychology, and Social Psychology. His current research interests lie at the intersection of Behavioral Economics and Social Psychology, focusing on topics related to social change in response to societal problems such as economic inequality.