* This introduction was originally published in the Behavioral Economics Guide 2014. To learn more about the subject and the latest ideas, please download our free annual Behavioral Economics Guides.

By Alain Samson, Ph.D.

Introduction

Think about the last time you purchased a customizable product. Perhaps it was a laptop computer. You may have simplified your decision making by opting for a popular brand or one you had owned before. You may then have visited the manufacturer’s website to place your order. But the decision-making process did not stop there: you now had to customize your model by choosing from different product attributes (processing speed, hard drive capacity, screen size, etc.), and you were still uncertain which features you really needed. At this stage, most technology manufacturers show a base model with options that can be changed according to the buyer’s preferences. The way in which these product choices are presented to buyers influences the final purchase and illustrates a number of concepts from behavioral economics (BE).

First, the base model shown in the customization engine represents a default choice. The more uncertain customers are about their decision, the more likely it is that they will go with the default, especially if it is explicitly presented as a recommended configuration. Second, the manufacturer can frame options differently by employing either an ‘add’ or ‘delete’ customization mode (or something in between). In an add mode, customers start with a base model and then add more or better options. In a delete frame, the opposite process occurs, whereby customers have to deselect options or downgrade from a fully-loaded model. Past research suggests that consumers end up choosing a greater number of features when they are in a delete rather than an add frame (Biswas, 2009). Finally, the option framing strategy will be associated with different price anchors prior to customization, which may influence the perceived value of the product. If the final configured product ends up with a £1500 price tag, its cost is likely to be perceived as more attractive if the initial default configuration was £2000 (fully loaded) rather than £1000 (base). Sellers will engage in a process of careful experimentation to find a sweet spot—an option framing strategy that maximizes sales, set at a default price that deters as few potential buyers as possible from considering a purchase in the first place.

Rational Choice

In an ideal world, defaults, frames, and price anchors would not have any bearing on consumer choices. Our decisions would be the result of a careful weighing of costs and benefits and informed by existing preferences. We would always make optimal decisions. In the 1976 book The Economic Approach to Human Behavior, the economist Gary S. Becker famously outlined a number of ideas known as the pillars of so-called ‘rational choice’ theory. The theory assumes that human actors have stable preferences and engage in maximizing behavior.  Becker, who applied rational choice theory to domains ranging from crime to marriage, believed that academic disciplines such as sociology could learn from the ‘rational man’ assumption advocated by neoclassical economists since the late 19th century. The decade of the 1970s, however, also witnessed the beginnings of the opposite flow of thinking, as discussed in the next section.

Prospect Theory

While economic rationality influenced other fields in the social sciences from the inside out, through Becker and the Chicago School, psychologists offered an outside-in reality check to prevailing economic thinking. Most notably, Amos Tversky and Daniel Kahneman published a number of papers that appeared to undermine ideas about human nature held by mainstream economics. They are perhaps best known for the development of prospect theory (Kahneman & Tversky, 1979), which shows that decisions are not always optimal. Our willingness to take risks is influenced by the way in which choices are framed, i.e. it is context-dependent. Have a look at the following classic decision problem:

Which of the following would you prefer:

Problem 1:
A) A certain win of $250, versus
B) A 25% chance to win $1000 and a 75% chance to win nothing?

Problem 2:
C) A certain loss of $750, versus
D) A 75% chance to lose $1000 and a 25% chance to lose nothing?

Tversky and Kahneman’s work shows that responses are different if choices are framed as a gain (1) or a loss (2). When faced with the first type of decision, a greater proportion of people will opt for the riskless alternative A), while for the second problem people are more likely to choose the riskier D). This happens because we dislike losses more than we like an equivalent gain: Giving something up is more painful than the pleasure we derive from receiving it.
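
A quick expected-value calculation makes the puzzle clear: in both problems the sure option and the gamble are worth the same on average, so expected value alone cannot explain the typical pattern of choices. The sketch below (in Python, using only the amounts from the problem above) simply works out those averages:

```python
# Expected value of a gamble: the probability-weighted sum of its outcomes.
def expected_value(gamble):
    return sum(p * x for p, x in gamble)

options = {
    "A (sure gain)":  [(1.00, 250)],
    "B (risky gain)": [(0.25, 1000), (0.75, 0)],
    "C (sure loss)":  [(1.00, -750)],
    "D (risky loss)": [(0.75, -1000), (0.25, 0)],
}

for name, gamble in options.items():
    print(f"{name}: expected value = {expected_value(gamble):+.0f}")

# A and B both come out at +250, and C and D both at -750, yet most people
# choose the sure gain A and the risky loss D: the frame, not the expected
# value, drives the choice.
```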

Bounded Rationality

Long before Tversky and Kahneman’s work, 18th- and 19th-century thinkers were already interested in the psychological underpinnings of economic life. Scholars during the neoclassical revolution at the turn of the 20th century, however, increasingly tried to emulate the natural sciences, as they wanted to differentiate themselves from the then “unscientific” field of psychology (see summary in Camerer, Loewenstein and Rabin, 2011). The importance of psychologically informed economics was later reflected in the concept of ‘bounded rationality’, a term associated with Herbert Simon’s work of the 1950s. According to this view, our minds must be understood relative to the environment in which they evolved. Decisions are not always optimal. There are restrictions to human information processing, due to limits in knowledge (or information) and computational capacities (Simon, 1982; Kahneman, 2003).

Gerd Gigerenzer’s work on “fast and frugal” heuristics later built on Simon’s ideas and proposed that the rationality of a decision depends on structures found in the environment. People are “ecologically rational” when they make the best possible use of limited information-processing abilities, by applying simple and intelligent algorithms that can lead to near-optimal inferences (Gigerenzer & Goldstein, 1996).

While the idea of human limits to rationality was not a radically new thought in economics, Tversky and Kahneman’s ‘heuristics and biases’ research program made important methodological contributions, in that they advocated a rigorous experimental approach to understanding economic decisions based on measuring actual choices made under different conditions. About 30 years later, their thinking entered the mainstream, resulting in a growing appreciation in scholarly, public, and commercial spheres.

Mental Accounting

The economist Richard Thaler, a keen observer of human behavior and a founder of behavioral economics, was inspired by Kahneman & Tversky’s work (see Thaler, 2015, for a summary). Thaler coined the concept of mental accounting. According to Thaler, people think of value in relative rather than absolute terms. They derive pleasure not just from an object’s value, but also from the quality of the deal – its transaction utility (Thaler, 1985). In addition, humans often fail to fully consider opportunity costs (tradeoffs) and are susceptible to the sunk cost fallacy.

Why are people willing to spend more when they pay with a credit card than cash (Prelec & Simester, 2001)? Why would more individuals spend $10 on a theater ticket if they had just lost a $10 bill than if they had to replace a lost ticket worth $10 (Kahneman & Tversky, 1984)? Why are people more likely to spend a small inheritance and invest a large one (Thaler, 1985)?

According to the theory of mental accounting, people treat money differently, depending on factors such as the money’s origin and intended use, rather than thinking of it in terms of the “bottom line” as in formal accounting (Thaler, 1999). An important term underlying the theory is fungibility, the fact that all money is interchangeable and has no labels. In mental accounting, people treat assets as less fungible than they really are. Even seasoned investors are susceptible to this bias when they view recent gains as disposable “house money” (Thaler & Johnson, 1990) that can be used in high-risk investments. In doing so, they make decisions about each mental account separately, losing sight of the bigger picture of the portfolio.

Consumers’ tendency to work with mental accounts is reflected in various domains of applied behavioral science, especially in the financial services industry. Examples include banks offering multiple accounts with savings goal labels, which make mental accounting more explicit, as well as third-party services that provide consumers with aggregate financial information across different financial institutions (Zhang & Sussman, 2018).

Another concept related to mental accounting captures the fact that people don’t like to spend money. We experience pain of paying (Zellermayer, 1996), because we are loss averse. The pain of paying plays an important role in consumer self-regulation to keep spending in check (Prelec & Loewenstein, 1998). This pain is thought to be reduced in credit card purchases, because plastic is less tangible than cash, the depletion of resources (money) is less visible and payment is deferred. Different types of people experience different levels of pain of paying, which can affect spending decisions. Tightwads, for instance, experience more of this pain than spendthrifts. As a result, tightwads are particularly sensitive to marketing contexts that make spending less painful (Rick, 2018).

Too Much Information: Choice Overload

Humans’ bounded rationality is particularly well illustrated by the concept of choice overload. Also referred to as ‘overchoice’, this phenomenon occurs when too many choices are available to consumers. Overchoice has been associated with unhappiness (Schwartz, 2004), decision fatigue, going with the default option, as well as choice deferral—avoiding making a decision altogether, such as not buying a product (Iyengar & Lepper, 2000). Factors that may contribute to perceived choice overload include the number of options and attributes, time constraints, decision accountability, the alignability and complementarity of options, and consumers’ preference uncertainty, among others (Chernev et al., 2015).

Choice overload can be counteracted by simplifying choice attributes or the number of available options (Johnson et al., 2012).

Limited Information: The Importance of Feedback

Bounded rationality’s principle of limited knowledge or information is one of the topics discussed in the 2008 book Nudge. In the book, Thaler and Sunstein point to experience, good information, and prompt feedback as key factors that enable people to make good decisions. Consider climate change, for example, which has been cited as a particularly challenging problem in relation to experience and feedback. Climate change is invisible, diffuse, and a long-term process. Pro-environmental behavior by an individual, such as reducing carbon emissions, does not lead to a noticeable change. The same is true in the domain of health. Feedback in this area is often poor, and we are more likely to get feedback on previously chosen options than rejected ones.

The impact of smoking, for example, is at best noticeable over the course of years, while its effect on cells and internal organs is usually not evident to the individual. Traditionally, generic feedback aimed at inducing behavioral change has been limited to information ranging from the economic costs of the unhealthy behavior to its potential health consequences (Diclemente et al., 2001). More recent behavior change programs, such as those employing smartphone apps to stop smoking, now usually provide positive and personalized behavioral feedback, which may include the number of cigarettes not smoked and money saved, along with information about health improvement and disease avoidance.

Information Avoidance

Behavioral economics assumes that people are boundedly rational actors with a limited ability to process information. While a great deal of research has been devoted to exploring how available information affects the quality and outcomes of decisions, a newer strand of research has also explored situations where people avoid information altogether.

Information avoidance in behavioral economics (Golman et al., 2017) refers to situations in which people choose not to obtain knowledge that is freely available. Active information avoidance includes physical avoidance, inattention, the biased interpretation of information (see also confirmation bias) and even some forms of forgetting. In behavioral finance, for example, research has shown that investors are less likely to check their portfolio online when the stock market is down than when it is up, which has been termed the ostrich effect (Karlsson et al., 2009). More serious cases of avoidance happen when people fail to return to clinics to get medical test results, for instance (Sullivan et al., 2004).

Information avoidance is sometimes strategic and can have immediate hedonic benefits if it prevents the negative (usually psychological) consequences of knowing. In the long term, however, it usually carries negative utility, because it deprives people of potentially useful information for decision making and of feedback for future behavior. Furthermore, information avoidance can contribute to a polarization of political opinions and media bias.

“Irrational” Decision Making: The Example of the Psychology of Price

Boundedly rational choices, which result from limits in our thinking processes and are especially evident in the decisions we make as consumers, are well illustrated in Dan Ariely’s popular science book Predictably Irrational. A good portion of the research he discusses involves prices and value perception. One study asked participants whether they would buy a product (e.g. a cordless keyboard) for a dollar amount equal to the last two digits of their US social security number. They were then asked about the maximum they would be willing to pay. In the case of cordless keyboards, people whose social security numbers fell in the top 20% were willing to pay three times as much as those in the bottom 20%. The experiment demonstrates anchoring, a process whereby a numeric value provides a non-conscious reference point that influences subsequent value perceptions (Ariely, Loewenstein, & Prelec, 2003).

Ariely also introduces the concept of the zero price effect: when a product is advertised as free, consumers perceive it as intrinsically more valuable. A free chocolate is disproportionately more attractive relative to a $0.14 chocolate than a $0.01 chocolate is compared to one priced at $0.15. To a ‘rational’ economic decision maker, a price difference of 14 cents should always provide the same magnitude of change in incentive to choose the product (Shampanier, Mazar, & Ariely, 2007). Finally, price is often taken as an indicator of quality, and it can even serve as a cue with physical consequences, just like a placebo in medical studies. One experiment, for instance, gave participants a drink that purportedly helped mental acuity. When people received a discounted drink, their performance in solving puzzles was significantly lower than in the regular-priced and control conditions (Shiv, Carmon, & Ariely, 2005).

Price can also be an ingredient in a decoy effect. Choices often occur relative to what is on offer rather than being based on absolute preferences. The decoy effect is technically known as an ‘asymmetrically dominated choice’ and occurs when people’s preference for one option over another changes as a result of adding a third (similar but less attractive) option. Ariely (2008) illustrates this with subscription options advertised by The Economist: web-only content for $59, print-only for $125, or print and web combined, also for $125. When Ariely asked his students which option they would choose, none of them (as you would expect) chose the print-only subscription; 84% chose the print-and-web combination, and 16% chose the web-only subscription. When he repeated the poll without the print-only option, only 32% opted for the print-and-web combination, while 68% preferred to go web-only. The presence of the inferior option (print-only for $125) had made the combined web and print subscription seem like a better deal.

Predictably Irrational and Nudge alerted the public to a new breed of economists influenced by the study of behavioral decision making that was pioneered by Kahneman and Tversky’s work (sometimes referred to as ‘choice under uncertainty’). The psychology of homo economicus—a rational and selfish individual with relatively stable preferences—has been challenged, and the traditional view that behavior change should be achieved by informing, convincing, incentivizing or penalizing people has been questioned (Thaler & Sunstein, 2008). The field associated with this stream of research and theory is behavioral economics (BE), which suggests that human decisions are strongly influenced by context, including the way in which choices are presented to us. Behavior varies across time and space, and it is subject to cognitive biases, emotions, and social influences. Decisions are the result of less deliberative, linear, and controlled processes than we would like to believe.

Dual-System Theory

Daniel Kahneman uses a dual-system theoretical framework (which established a foothold in cognitive and social psychology of the 1990s) to explain why our judgments and decisions often do not conform to formal notions of rationality. System 1 consists of thinking processes that are intuitive, automatic, experience-based, and relatively unconscious. System 2 is more reflective, controlled, deliberative, and analytical. Judgments influenced by System 1 are rooted in impressions arising from mental content that is easily accessible. System 2, on the other hand, monitors or provides a check on mental operations and overt behavior—often unsuccessfully.

Example 1: Availability and Affect

System 1 is the ‘home’ of the heuristics (cognitive shortcuts) we apply and is responsible for the biases (systematic errors) that can result when we make decisions (Kahneman, 2011). System 1 processes influence us when prior exposure to a number affects subsequent judgments, as evident in the anchoring effects discussed previously (Tversky & Kahneman, 1974). One of the most universal heuristics is the availability heuristic, whereby an event is judged more likely simply because an example comes to mind easily (Tversky & Kahneman, 1974); for instance, a person may deem pension investments too risky as a result of remembering a family member who lost most of her retirement savings in the recent recession. Readily available information in memory is also used when we make similarity-based judgments, as evident in the representativeness heuristic.

Finally, another ‘general purpose’ heuristic is that of affect, namely good or bad feelings that surface automatically when we think about an object. Applying the affect heuristic can lead to black-and-white thinking, which is particularly evident when people think about an object under conditions that hamper System 2 reflection, such as time pressure. For example, consumers may judge food preservatives’ benefits as low and risks as high, reflecting the significant negative risk-benefit correlation observed in such judgments (Finucane, Alhakami, Slovic, & Johnson, 2000).

The role of affect in risky or uncertain situations is also evident in the risk-as-feelings model (Loewenstein, Weber, Hsee, & Welch, 2001). ‘Consequentialist’ accounts of decision making tend to focus on expectations along with the likelihood and desirability of possible outcomes. The risk-as-feelings perspective explains behavior in situations where emotional reactions to risk differ from cognitive evaluations. In these situations, behavior tends to be influenced by anticipatory feelings, emotions experienced in the moment of decision making.

Example 2: Salience

Availability and affect are processes internal to the individual that may lead to bias. The external equivalent of these processes is salience, whereby information that stands out, is novel, or seems relevant is more likely to affect our thinking and actions (Dolan et al., 2010). For example, a technological device can be framed as being 99% reliable or having only a 1% failure rate, thereby emphasizing either positive or negative information. Salience also underlies heuristic judgments that rely on external cues. Some psychologists have derived effort-reducing heuristics that simplify consumer decision making. The brand name heuristic, for example, suggests that salient cues in the form of brand names can be used to infer quality (Maheswaran, Mackie, & Chaiken, 1992). In terms of degrees of visual salience, one study found a congruence effect between price and font size, where showing a lower sale price in a small print size relative to the regular price resulted in greater purchase likelihood than presenting the sale price in a relatively large font (Coulter & Coulter, 2005). Finally, the salience of options can also be manipulated by rearranging the physical environment; for instance, a change as simple as moving water bottles closer to the cashier in a cafeteria has been shown to increase the salience and convenience of this healthier drink choice and thereby significantly boost water sales (Thorndike, Sonnenberg, Riis, Barraclough, & Levy, 2012).

Example 3: Status Quo Bias and Inertia

While many heuristics and biases are the result of quick impressions, the automatic character of System 1 is also reflected in a human aversion to change. One aspect of this is evident in the formation of habits, automatic behavioral patterns that are the result of repetition and associative learning (Duhigg, 2012). The preference for things to remain the same, such as a tendency not to change behavior unless the incentive to do so is strong, has been termed the “status quo bias” (Samuelson & Zeckhauser, 1988). Inertia is one form of people’s propensity to remain at the status quo (Madrian & Shea, 2001), a well-known manifestation of which is the low rate of pension plan enrolment when people have to make the effort to sign up (‘opt-in’). In this case, an effective way to increase enrolment rates is to change the default—what happens when people do not make an active choice. Inertia, procrastination, and a lack of self-control are problems that make changing default options from opt-in to opt-out an effective strategy: instead of having to take action to enroll (opt-in), people now have to make an effort to dis-enroll (opt-out) (Thaler & Sunstein, 2008). Nudging with defaults is one of the primary tools of the ‘choice architect’ (Goldstein, Johnson, Herrman, & Heitmann, 2008).

Example 4: Optimism Bias and the Overconfidence Effect

System 1 is dominated by gut feelings and limits to information processing, which may lead people to be overly optimistic. People often overestimate the probability of positive events and underestimate the probability of negative events happening to them in the future (Sharot, 2011). For example, we may underestimate our risk of getting cancer and overestimate our future success on the job market. A number of factors can explain unrealistic optimism, including perceived control and being in a good mood (Helweg-Larsen & Shepperd, 2001).

The overconfidence effect is observed when people’s subjective confidence in their own ability is greater than their objective (actual) performance (Pallier et al., 2002). It is frequently measured by having experimental participants answer general knowledge test questions and rate how confident they are in each answer on a scale. Overconfidence is then calculated as the gap between a person’s average confidence rating and the actual proportion of questions answered correctly.
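
A rough sketch of that calculation is shown below; the confidence ratings and answers are made-up illustrative data, not results from the cited studies:

```python
# Overconfidence as the gap between average confidence and actual accuracy.
confidences = [0.9, 0.8, 0.95, 0.7, 0.85]  # self-rated probability of being correct
correct = [1, 0, 1, 0, 0]                  # 1 = question answered correctly, 0 = not

mean_confidence = sum(confidences) / len(confidences)  # 0.84
accuracy = sum(correct) / len(correct)                 # 0.40

overconfidence = mean_confidence - accuracy
print(f"mean confidence {mean_confidence:.2f}, accuracy {accuracy:.2f}, "
      f"overconfidence {overconfidence:+.2f}")
# A positive score means subjective confidence exceeds objective performance.
```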

A wide range of problems has been attributed to overconfidence. Among investors, for example, overconfidence has been associated with excessive risk-taking (e.g. Hirshleifer & Luo, 2001), concentrated portfolios (e.g. Odean, 1998), and overtrading (e.g. Grinblatt & Keloharju, 2009).

Overconfidence is one of the manifestations of optimistic bias, which, according to Daniel Kahneman, “may well be the most significant of the cognitive biases.”

Temporal Dimensions

Another important domain of BE introduces a time dimension to human evaluations and preferences. This area acknowledges that people are biased towards the present and poor predictors of future experiences, value perceptions, and behavior.

Time Discounting and Present Bias

According to time-discounting theories, present events are weighted more heavily than future ones (Frederick, Loewenstein & O’Donoghue, 2002); for example, many people prefer to receive £100 now over £110 in a month’s time. Discounting is non-linear, and its rate is not constant over time. People’s preference for receiving £100 a week from now versus £110 a month and one week from now will not be the same as their preference for receiving £100 a year from now versus £110 a year and one month from now. Although the gap is one month in both cases, the value of events that are farther in the future falls more slowly than those closer to the present (Laibson, 1997).
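
One common way to capture this pattern is a hyperbolic discount function of the form V = A / (1 + kD), where A is the amount, D the delay, and k a discount parameter. The sketch below is purely illustrative; the value k = 0.15 per month is an assumption chosen to reproduce the reversal, not an estimate from the literature:

```python
# Hyperbolic discounting: V = A / (1 + k * D). The implied discount rate
# falls as the delay D grows, which is what produces preference reversals.
def discounted_value(amount, delay_months, k=0.15):
    return amount / (1 + k * delay_months)

# Near future: £100 now versus £110 in one month.
print(discounted_value(100, 0), discounted_value(110, 1))    # 100.0 vs ~95.7 -> take the £100 now

# Far future: £100 in 12 months versus £110 in 13 months.
print(discounted_value(100, 12), discounted_value(110, 13))  # ~35.7 vs ~37.3 -> wait for the £110
```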

In addition to inertia, future discounting is another key problem that explains low retirement savings rates. One piece of research suggests that behavioral change could be achieved by helping people connect with their future selves. In the study, people who saw an age-progressed avatar of themselves were more likely to accept future financial rewards over immediate ones (Hershfield et al., 2011).

Diversification Bias and the Empathy Gap

Time inconsistency also occurs when our present self fails to accurately predict the preferences of our future self, a point illustrated well by diversification bias (Read & Loewenstein, 1995). When shopping for multiple future consumption episodes, I may choose the variety pack of cereal, only to realize two weeks later that I would have enjoyed my breakfasts more if I had just stuck to my favorite kind. In the case of food, diversification bias should be particularly strong if you make your purchasing decision when you’re satiated (e.g. right after a meal). This inability to fully appreciate the effect of emotional and physiological states on decision making is known as the (hot-cold) empathy gap, a term coined by George Loewenstein, one of the founders of the field of behavioral economics. Hot states include a number of visceral factors, ranging from negative emotions associated with high levels of arousal (e.g. anger or fear) to feeling states (e.g. pain) and drive states (e.g. thirst, cravings related to addiction, or sexual arousal) (Loewenstein, 2000). The best-known illustration occurs in sexual decision making, whereby men in a ‘cold’, unaroused state often predict that they will use a condom during their next sexual encounter, but when they are in an aroused ‘hot’ state they may fail to do so (Ariely & Loewenstein, 2006).

Forecasting and Memory

When we make plans for the future, we are often too optimistic. For example, we are prone to the planning fallacy, underestimating how long it will take us to complete a task and ignoring past experience (Kahneman, 2011). Similarly, when we try to predict how we will feel in the future, we may overestimate the intensity of our emotions (Wilson & Gilbert, 2003). The level of happiness that I expect to feel during my next vacation, for example, is likely to be higher than how I will rate it during the actual experience. There are different explanations for this error, including how we remember past events. My memory of a past holiday is likely to be non-representative of the holiday overall (Morewedge, Gilbert, & Wilson, 2005), and I may evaluate my last vacation based on its most pleasurable points and its end, for example, rather than the average of every moment of the experience (the peak-end rule; Kahneman & Tversky, 1999). Finally, as my vacation days go by, I will simply get used to the experience, and my happiness will level out. According to the concept of hedonic adaptation, changes in experiences tend only to induce happiness temporarily as we get used to new circumstances (Frederick & Loewenstein, 1999).
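
As a rough illustration, the peak-end evaluation is often approximated as the average of the most intense moment and the final moment, ignoring how long each moment lasted. A minimal sketch with made-up momentary ratings:

```python
# Momentary enjoyment ratings over a holiday (0-10 scale), illustrative only.
moments = [6, 7, 9, 5, 4, 3]

average_of_moments = sum(moments) / len(moments)       # ~5.7
peak_end_estimate = (max(moments) + moments[-1]) / 2   # (9 + 3) / 2 = 6.0

# The remembered (peak-end) evaluation can diverge from the experienced
# average and ignores duration, one reason retrospective reports and
# forecasts of enjoyment can be unreliable.
print(average_of_moments, peak_end_estimate)
```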

Social Dimensions

Contrary to the homo economicus view of human motivation and decision making, BE does not assume that humans make choices in isolation, or that they act solely in their own interest. Aside from cognitive and affective (emotional) dimensions, an important area of BE also considers social forces, in that decisions are made by individuals who are shaped by—and embedded in—social environments.

Trust

Trust, which is one of the explanations for discrepancies between actual behavior and that predicted by a model of self-interested actors, makes social life possible and permeates economic relationships.

Although neoclassical economic theory suggests that trust in strangers is irrational, trust and trustworthiness can be widely observed across societies. In fact, reciprocity (discussed later) is a basic element of human relationships and behavior, and it helps account for the trust extended to an anonymous counterpart (Berg et al., 1995).

Both trust and trustworthiness increase when individuals are closer socially, but the latter declines when partners come from different social groups, such as different nationalities or races. Furthermore, high-status individuals have been found to elicit more trustworthiness in others (Glaeser et al., 2000).

Trust has been investigated in experimental games. In trust games, participants are asked to split money between themselves and someone else. Player A is asked to determine an initial endowment of zero or a higher value (e.g. $5). The money is then multiplied (e.g. tripled to $15) by the experimenter and given to Player B, who is then asked to return an amount of zero or a higher value back to Player A. The game is about reciprocity and trust, because Player A must decide how much of the endowment to give to Player B in the hope of receiving at least the same amount in return. In the original experiment (Berg et al., 1995), 30 out of 32 first players sent money, and 11 of these 30 decisions resulted in a payback that was greater than the initial amount sent. This finding confounds the prediction offered by standard economic assumptions.
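
A minimal sketch of the payoff structure described above (the $5 transfer and the tripling multiplier follow the example in the text; the amount returned by Player B is an illustrative assumption):

```python
# Trust game payoffs: Player A sends an amount, the experimenter multiplies it,
# and Player B decides how much of the multiplied sum to return.
def trust_game(sent, returned, multiplier=3):
    pot = sent * multiplier        # what Player B receives
    payoff_a = returned - sent     # A's net result from the exchange
    payoff_b = pot - returned      # what B keeps
    return payoff_a, payoff_b

# A sends $5, it is tripled to $15, and B returns $7: both gain from the exchange.
print(trust_game(5, 7))   # (2, 8)

# If B returns nothing, A loses the $5 sent -- the risk that makes the initial
# transfer an act of trust rather than an expression of pure self-interest.
print(trust_game(5, 0))   # (-5, 15)
```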

Trust has been linked to the concept of “betrayal aversion” (Bohnet, Greig, Herrmann, & Zeckhauser, 2008): People take greater risks when they are faced with a given probability of bad luck than the same probability of being cheated by another person.

Dishonesty

In human relationships, deception is often considered a violation of trust, while in standard economics dishonesty can be seen as a natural by-product of actors with self-interested motives. The BE perspective, however, does not simply assume that humans are more honest; rather, it takes a more social-psychological view by showing that dishonesty is not just about tradeoffs between external incentives (such as material gain) and costs (such as punishments). Dishonesty is the product of situations as well as internal and external reward mechanisms, and it often involves self-deception—the reframing of dishonest acts (e.g. not declaring all of your income to the tax authorities) in a way that makes them appear less dishonest (Mazar & Ariely, 2006).

People typically value honesty, tend to have strong beliefs in their morality and want to maintain this aspect of their self-concept (Mazar et al., 2008). Self-interest may conflict with people’s honesty as an internalized social norm (a concept discussed later), but the resulting cognitive dissonance can be overcome by engaging in self-deception, creating moral “wiggle room” that enables people to act in a self-serving manner. When moral reminders are used, however, this self-deception can be reduced, as demonstrated in laboratory experiments conducted by Mazar and colleagues. It is not surprising, then, that a lack of social norms is a general driver of dishonest behavior, along with high benefits and low costs of external deception, a lack of self-awareness, as well as self-deception (Mazar & Ariely, 2006).

Fairness and Reciprocity

Behavioral research on individual decision making in social contexts often relies on experimental games. Along with behavioral decision theory, behavioral game theory is the second major theoretical area found in behavioral economics. Typically, these games endow participants with rewards (e.g. tokens), which then change hands based on choices made by individuals within the rules of the game. This occurs over the course of one or more rounds of playing. The outcome of the game is evident in the way rewards are split between players, and the results often show that people have inequity aversion, i.e. they prefer fairness over inequality in many contexts (Fehr & Schmidt, 1999).

Fairness is related to a human desire for reciprocity, our tendency to return another’s action with an equivalent action of our own. Reciprocity, however, has both positive and negative sides. As Ernst Fehr’s work in this area has shown, people’s responses to positive actions are often kinder than a self-interest model would predict, but on the flip side reciprocity can also produce punitive responses to negative actions (Fehr & Gaechter, 2000). In the real world, charities sometimes use reciprocity to their advantage. For example, one field experiment investigating donation behavior showed that people who received a large gift with a donation solicitation letter had a 75 percent higher donation frequency compared to a ‘no gift’ baseline condition (Falk, 2004).

Social Norms

The sociologist Alvin Gouldner referred to reciprocity as a “generalized moral norm” (Gouldner, 1960). Social norms are implicit or explicit behavioral expectations or rules within a society or group of people (Dolan et al., 2010), and they are an important component of identity economics, which considers economic actions to be the result of both monetary incentives and people’s self-concepts (Akerlof & Kranton, 2010).  Our preferences are not simply a matter of basic tastes; they are also influenced by norms, as manifested in gender roles, for example.

Norms vary across cultures and contexts. For example, while market norms would dictate that payment is required for a good or service, social norms are quite different—would you offer to pay a family member for the meal that he has prepared for you (Ariely, 2008)? Sometimes social norms of exchange such as reciprocity and market norms co-exist in the same sphere. For instance, while market exchange norms dictate that I will charge a client for a consulting job, I may also give that client free advice, on some occasions, in the hope that the favor will be reciprocated in the future.

Social norms signal appropriate behavior or actions taken by the majority of people (although what is deemed ‘appropriate’ is itself subject to continual change). Along with informational feedback (e.g. the amount of money saved by not drinking alcohol), descriptive normative feedback (e.g. how one’s drinking level compares to the national average) is often used in health behavior change programs (Diclemente et al., 2001), while non-profit organizations sometimes use normative information to affect donation levels. One study compared contribution levels for a public radio fundraiser in the US. When potential donors were provided with social information signaling norms (e.g. “We had another member, they contributed $300”), average contribution amounts increased by up to 12% (Shang & Croson, 2009).

Consistency and Commitment

Human susceptibility to feedback about social norms is related to our desire to maintain a positive view of who we are as a person. When the outcome of an action threatens this desire, we may change our behavior, though we often simply change our attitudes or beliefs instead. When this happens, we usually resort to ‘rationalization’, a form of cognitive dissonance reduction (Festinger, 1957). Unlike the rational choice view of human decision making, where preferences guide choices, rationalization implies the opposite: Sometimes preferences justify actions after the fact (March, 1978). Cognitive dissonance theory is an illustration of the human need for a continuous and consistent self-image (Cialdini, 2008). When it comes to future behavior, consistency is best achieved by making a commitment, especially if it is made publicly. Thus, pre-committing to a goal is one of the most frequently applied behavioral devices to achieve positive change.

The ‘Save More Tomorrow’ program, aimed at helping employees save more money, illustrates a number of behavioral biases and remedies, including commitment (Thaler & Benartzi, 2004). The program gives employees the option of pre-committing to a gradual increase in their savings rate in the future, each time they get a raise.  The program avoids the perception of loss that would be felt with a reduction in disposable income, because consumers commit to saving future increases in income. People’s inertia makes it more likely that they stick with the program, because they have to opt out to leave.

Herd Behavior and Market Bubbles

People’s susceptibility to social forces is also evident in herd behavior, which occurs when people do what others are doing instead of using their own information or making independent decisions. The idea of herding has a long history in philosophy and crowd psychology. It is particularly relevant in the domain of finance, where it has been discussed in relation to the collective irrationality of investors, including stock market bubbles (Banerjee, 1992).

Economic (or asset) bubbles form when prices are driven much higher than their intrinsic value. Well-known examples of bubbles include the US Dot-com stock market bubble of the late 1990s and housing bubble of the mid-2000s. According to Robert Shiller (2015), who warned of both of these events, speculative bubbles are fueled by contagious investor enthusiasm (see also herd behavior) and stories that justify price increases. Doubts about the real value of investment are overpowered by strong emotions, such as envy and excitement.

Other biases that promote bubbles include overconfidence, anchoring, and representativeness, which lead investors to interpret increasing prices as a trend that will continue, causing them to chase the market (Fisher, 2014). Economic bubbles are usually followed by a sudden and sharp decrease in prices, also known as a crash.

Summary and Implications

Behavioral economics (BE) uses psychological experimentation to develop theories about human decision making and has identified a range of biases that result from the way people think and feel. BE is trying to change the way economists think about people’s perceptions of value and expressed preferences. According to BE, people are not always self-interested, benefit-maximizing, and cost-minimizing individuals with stable preferences—our thinking is subject to insufficient knowledge, feedback, and processing capability, often involves uncertainty, and is affected by the context in which we make decisions. Most of our choices are not the result of careful deliberation. We are influenced by readily available information in memory, automatically generated affect, and salient information in the environment. We also live in the moment, in that we tend to resist change, are poor predictors of future behavior, are subject to distorted memory, and are affected by physiological and emotional states. Finally, we are social animals with social preferences, such as those expressed in trust, reciprocity, and fairness; we are susceptible to social norms and a need for self-consistency.

If you’d like to learn more about behavioral economics, including recent developments, practical applications, and challenges, please download our free Behavioral Economics Guide.

References and Further Reading

Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50, 179-211.

Akerlof, G., & Kranton, R. (2010). Identity Economics. Princeton, NJ: Princeton University Press.

Allcott, H. (2011). Social norms and energy conservation. Journal of Public Economics, 95(5), 1982-2095.

An, S. (2008). Antidepressant direct-to-consumer advertising and social perception of the prevalence of depression: Application of the availability heuristic. Health Communication, 23(6), 499-505.

Ariely, D. (2008). Predictably Irrational. New York: Harper Collins.

Ariely, D., & Loewenstein, G. (2006). The heat of the moment: The effect of sexual arousal on sexual decision making. Journal of Behavioral Decision Making, 19, 87-98.

Ariely, D., Loewenstein, G., & Prelec, D. (2003). “Coherent arbitrariness”: stable demand curves without stable preferences. Quarterly Journal of Economics, 118, 73-105.

Arkes, H. R., & Blumer, C. (1985). The psychology of sunk costs. Organizational Behavior and Human Decision Processes, 35, 124-140.

Aronson, E., Wilson, T., & Akert, A. (2005). Social Psychology (5th ed.). Upper Saddle River, NJ: Prentice Hall.

Arrow, K. (1958). Utilities, attitudes, choices: A review note. Econometrica, 26(1), 1-23.

Banerjee, A. (1992). A simple model of herd behavior. Quarterly Journal of Economics, 107, 797-817.

Barone, M. J., & Tirthankar, R. (2010). Does exclusivity always pay off? Exclusive price promotions and consumer response. Journal of Marketing, 74(2), 121-132.

Bateman, I. J., Munro, A., & Poe, G. L. (2008). Decoy effects in choice experiments and contingent valuation: Asymmetric dominance. Land Economics, 84(1), 115-127.

Becker, G. S. (1976). The economic approach to human behavior. Chicago: The University of Chicago Press.

Berg, J., Dickhaut, J. & McCabe, K. (1995). Trust, reciprocity, and social history. Games and Economic Behavior, 10(1), 122-142.

Bickel, W., Odum, A., & Madden, G. (1999). Impulsivity and cigarette smoking: Delay discounting in current, never, and ex-smokers. Psychopharmacology, 146(4), 447-454.

Bikhchandani, S., Hirshleifer, D., & Welch, I. (1992). A theory of fads, fashion, custom and cultural change as informational cascades. Journal of Political Economy, 100, 992-1026.

Biswas, D. (2009). The effects of option framing on consumer choices: Making decisions in rational vs. experiential processing modes. Journal of Consumer Behaviour, 8, 284-299.

Bohnet, I., Greig, F., Herrmann, B., & Zeckhauser, R. (2008). Betrayal aversion: Evidence from Brazil, China, Oman, Switzerland, Turkey, and the United States. American Economic Review, 98, 294-310.

Branson, C., Duffy, B., Perry, C., & Wellings, D. (2012). Acceptable behaviour: Public opinion on behaviour change policy. London: Ipsos MORI. Retrieved from http://www.ipsos-mori.com/researchpublications/publications/1454/AcceptableBehaviour.aspx

Buehler, R., Griffin, D., & Ross, M. (1994). Exploring the “planning fallacy”: Why people underestimate their task completion times. Journal of Personality and Social Psychology, 67(3), 366-381.

Camerer, C. (2003). Behavioral game theory. Princeton, NJ: Princeton University Press.

Camerer, C. F. (1997). Progress in behavioral game theory. Journal of Economic Perspectives, 11, 167-188.

Camerer, C., Loewenstein, G., & Prelec, D. (2005) Neuroeconomics: How neuroscience can inform economics. Journal of Economic Literature, 43, 9-64.

Camerer, C., Loewenstein, G., & Rabin, M. (Eds.) (2011). Advances in behavioral economics. Princeton: Princeton University Press.

Chandon, P., & Wansink, B. (2007). The biasing health halos of fast-food restaurant health claims: Lower calorie estimates and higher side-dish consumption intentions. Journal of Consumer Research, 34(3), 301-314.

Chartrand, T. L., Huber, J., Shiv, B., & Tanner, R. (2008). Nonconscious goals and consumer choice. Journal of Consumer Research, 35, 189-201.

Chartrand, T. L., & Bargh, J. A. (1999). The chameleon effect: The perception-behavior link and social interaction. Journal of Personality and Social Psychology, 76(6), 893-910.

Cialdini, R.B. (2008). Influence: Science and Practice, 5th ed. Boston: Pearson.

Cialdini, R. B., Wosinska, W., Barrett, D. W., Butner, J., & Gornik-Durose, M. (1999). Compliance with a request in two cultures: The differential influence of social proof and commitment/consistency on collectivists and individualists. Personality and Social Psychology Bulletin, 25, 1242-1253.

Cialdini, R. B., Vincent, J. E., Lewis, S. K., Catalan, J., Wheeler, D., & Darby, B. L. (1975). Reciprocal concessions procedure for inducing compliance: The door-in-the-face technique. Journal of Personality and Social Psychology, 31, 206-215.

COI. (2009). Communications and behavior change. London, UK: COI Publications.

Coulter, K. S., & Coulter, R. A. (2005). Size does matter: The effects of magnitude representation congruency on price perceptions and purchase likelihood. Journal of Consumer Psychology, 15(1), 64–76.

Davenport, T. H.  (2009). How to design smart business experiments. Harvard Business Review, 87(2), 68-76.

Diamond, A. (2013). Executive functions. Annual Review of Psychology, 64, 135-168.

Diclemente, C. C., Marinilli, A. S., Singh, M., & Bellino, L. E., (2001). The role of feedback in the process of health behavior change. American Journal of Health Behavior, 25, 217-227.

Dolan, P., Hallsworth, M., Halpern, D., King, D., & Vlaev, I. (2010). MINDSPACE: Influencing behaviour through public policy. London, UK: Cabinet Office.

Duhigg, C. (2012). The power of habit: Why we do what we do in life and business. New York: Random House.

Dunt, I. (February 5, 2014). Nudge nudge, say no more. Brits’ minds will be controlled without us knowing it. The Guardian. www.theguardian.com/commentisfree/2014/feb/05/nudge-say-no-more-behavioural-insights-team.

Etzioni, A. (2011). Behavioral economics: Next steps. Journal of Consumer Policy, 34(3), 277-287.

Falk, A., Becker, A., Dohmen, T., Huffman, D., & Sunde, U. (2012). An experimentally validated preference module. Retrieved from http://www.eea-esem.com/files/papers/eea-esem/2012/2688/FalkEtAl2012.pdf

Falk, A., & Kosfeld, M. (2006). The hidden costs of control. American Economic Review, 96, 1611–1630.

Fehr, E. (2009). On the economics and biology of trust. Journal of the European Economic Association, 7, 235-266.

Fehr, E., & Gächter, S. (2000). Fairness and retaliation: The economics of reciprocity. Journal of Economic Perspectives, 14, 159-181.

Fehr, E., & Schmidt, K. M. (1999). A theory of fairness, competition, and cooperation. The Quarterly Journal of Economics, 114, 817-868.

Festinger, L. (1957). A theory of cognitive dissonance. Stanford: Stanford University Press.

Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1-17.

Fisher, G. S. (2014). Advising the behavioral investor: Lessons from the real world. In H. K. Barker & V. Ricciardi (Eds.), Investor behavior: The psychology of financial planning and investing (pp. 265-283). New York: John Wiley & Sons.

Fiske, S. T., & Taylor, S. E. (1991). Social Cognition (2nd ed.). New York: McGraw-Hill.

Frederick, S., Loewenstein, G., & O’Donoghue, T. (2002). Time discounting and time preference: A critical review. Journal of Economic Literature, 40, 351-401.

Frederick, S., & Loewenstein, G. (1999). Hedonic adaptation. In D. Kahneman, E. Diener, & N. Schwarz (Eds.), Well-being: The foundations of hedonic psychology (pp. 302-329). New York: Russell Sage Foundation.

Fredrickson, B. L., & Kahneman, D. (1993). Duration neglect in retrospective evaluations of affective episodes. Journal of Personality and Social Psychology, 65(1), 45-55.

Frey, B., Benz, M., & Stutzer, A. (2004). Introducing procedural utility: Not only what, but also how matters. Journal of Institutional and Theoretical Economics, 160, 377-401.

Gächter, S., Orzen, H., Renner, E., & Starmer, C. (2009). Are experimental economists prone to framing effects? A natural field experiment. Journal of Economic Behavior & Organization, 70, 443-446.

Gigerenzer, G., & Goldstein, D. G. (1996). Reasoning the fast and frugal way: Models of bounded rationality. Psychological Review, 103, 650-669.

Gintis, H. (2009). The bounds of reason: Game theory and the unification of the behavioral sciences. Princeton: Princeton University Press.

Glaeser, E., Laibson, D., Scheinkman, J. & Soutter, C. (2000). Measuring trust. The Quarterly Journal of Economics, 115(3), 811-846.

Goldstein, D. G., Johnson, E. J., Herrman, A., & Heitmann, M. (2008). Nudge your customers toward better choices. Harvard Business Review, 86, 99-105.

Goldstein, D. G., & Gigerenzer, G. (2002). Models of ecological rationality: the recognition heuristic. Psychological Review, 109(1), 75-90.

Golman, R., Hagmann, D., & Loewenstein, G. (2017). Information avoidance. Journal of Economic Literature, 55(1), 96-135.

Goodman, J. K., Cryder, C. E., & Cheema, A. (2013). Data collection in a flat world: Strengths and weaknesses of Mechanical Turk samples. Journal of Behavioral Decision Making, 26(3), 213-224.

Gouldner, A. W. (1960). The norm of reciprocity: A preliminary statement. American Sociological Review, 25(2), 161-178.

Grinblatt, M., & Keloharju, M. (2009). Sensation seeking, overconfidence, and trading activity. Journal of Finance, 64(2), 549-578.

Guth, W., Schmittberger, R., & Schwarz, B. (1982). An experimental analysis of ultimatum bargaining. Journal of Economic Behavior and Organization, 3, 367-388.

Harley, E.M. (2007). Hindsight bias in legal decision making. Social Cognition, 25(1), 48-63.

Harford, T. (2014, March 21).  Behavioral economics and public policy. The Financial Times. Retrieved from http://www.ft.com/cms/s/2/9d7d31a4-aea8-11e3-aaa6-00144feab7de.html#axzz30po3p6lE.

Haynes, L., Service, O., Goldacre, B. and Torgerson, D. (2012). Test, learn, adapt: Developing public policy with randomised controlled trials. London: Cabinet Office.

Helweg-Larsen, M., & Shepperd, J. A. (2001). Do moderators of the optimistic bias affect personal or target risk estimates? A review of the literature. Personality and Social Psychology Review, 5(1), 74-95.

Hershfield, H. E., Goldstein, D. G., Sharpe, W. F., Fox, J., Yeykelis, L., Carstensen, L. L., & Bailenson, J. N. (2011). Increasing saving behavior through age-progressed renderings of the future self. Journal of Marketing Research, 48, S23-S37.

Hirshleifer, D., & Luo, G. Y. (2001). On the survival of overconfident traders in a competitive securities market. Journal of Financial Markets, 4(1), 73-84.

Iyengar, S., & Lepper, M. (2000). When choice is demotivating: Can one desire too much of a good thing? Journal of Personality and Social Psychology, 79, 995-1006.

Jenner, E. A., Jones, F., Fletcher, B., Miller, L. & Scott, G.M. (2005). Hand hygiene posters: Motivators or mixed messages? Journal of Hospital Infection, 60, 218-225.

Johnson, E. J., & Goldstein, D. G. (2003). Do defaults save lives? Science, 302, 1338-1339.

Kahneman, D. (2011). Thinking, fast and slow. London: Allen Lane.

Kahneman, D. (2003). Maps of bounded rationality: Psychology for behavioral economics. The American Economic Review, 93, 1449-1475.

Kahneman, D., & Frederick, S. (2002). Representativeness revisited: Attribute substitution in intuitive judgment. In T. Gilovich, D. Griffin, & D. Kahneman (Eds.), Heuristics of intuitive judgment: Extensions and applications (pp. 49–81). New York: Cambridge University Press.

Kahneman, D., & Tversky, A. (1999). Evaluation by moments: Past and future. In D. Kahneman & A. Tversky (Eds.), Choices, values and frames (pp. 2-23). New York: Cambridge University Press.

Kahneman, D., Knetsch, J., & Thaler, R. (1991). Anomalies: The endowment effect, loss aversion, and status quo bias. Journal of Economic Perspectives, 5(1), 193-206.

Kahneman, D., & Tversky, A. (1982). The psychology of preference. Scientific American, 246, 160-173.

Kahneman, D., & Tversky, A. (1979). Prospect theory: An analysis of decision under risk. Econometrica, 47, 263-291.

Kahneman, D., & Tversky, A. (1972). Subjective probability: A judgment of representativeness. Cognitive Psychology, 3, 430-454.

Kardes, F. R., Posavac, S. S., & Cronley, M. L. (2004). Consumer inference: a review of processes, bases, and judgment contexts. Journal of Consumer Psychology, 14(3), 230-256.

Karlsson, N., Loewenstein, G., & Seppi, D. (2009). The ostrich effect: Selective attention to information. Journal of Risk and Uncertainty, 38, 95–115.

Kruger, J., Wirtz, D., Van Boven, L., & Altermatt, T. W. (2004). The effort heuristic. Journal of Experimental Social Psychology, 40(1), 91-98.

Laibson, D. (1997). Golden eggs and hyperbolic discounting. Quarterly Journal of Economics, 112, 443-477.

Lakshminarayanan, V., Chen, M. K., & Santos, L. R. (2011). The evolution of decision-making under risk: Framing effects in monkey risk preferences. Journal of Experimental Social Psychology, 47, 689-693.

Levin, I. P., Schneider, S. L., & Gaeth, G. J. (1998). All frames are not created equal: A typology and critical analysis of framing effects. Organizational Behavior and Human Decision Processes, 76, 149-188.

Loewenstein, G. (2005). Hot-cold empathy gaps and medical decision-making. Health Psychology, 24(Suppl. 4), S49-S56.

Loewenstein, G. (2000). Emotions in economic theory and economic behavior. The American Economic Review, 90(2), 426-432.

Loewenstein, G., & Ubel, P. (2010, July 14). Economics behaving badly. The New York Times. Retrieved from http://www.nytimes.com/2010/07/15/opinion/15loewenstein.html.

Loewenstein, G., O’Donoghue, T., & Rabin, M. (2003). Projection bias in predicting future utility. Quarterly Journal of Economics, 118(4), 1209-1248.

Loewenstein, G., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127(2), 267-286.

Madrian, B., & Shea, D. (2001). The power of suggestion: Inertia in 401(k) participation and savings behavior. Quarterly Journal of Economics, 116, 1149-1187.

Maheswaran, D., Mackie, D. M., & Chaiken, S. (1992). Brand name as a heuristic cue: The effects of task importance and expectancy confirmation on consumer judgments. Journal of Consumer Psychology, 1, 317-336.

March, J. G. (1978). Bounded rationality, ambiguity, and the engineering of choice. The Bell Journal of Economics, 9(2), 587-608.

Markus, H. R., & Kitayama, S. (1991). Culture and the self: Implications for cognition, emotion and motivation. Psychological Review, 98, 224-253.

Mazar, N., Amir, O., & Ariely, D. (2008). The dishonesty of honest people: A theory of self-concept maintenance. Journal of Marketing Research, 45(6), 633-644.

Mazar, N., & Ariely, D. (2006). Dishonesty in everyday life and its policy implications. Journal of Public Policy & Marketing, 25, 1-21.

Mazar, N., & Zhong, C. (2010). Do green products make up better people? Psychological Science, 21, 494-498.

Mazzoni, G., & Vannucci, M. (2007). Hindsight bias, the misinformation effect, and false autobiographical memories. Social Cognition, 25(1), 203-220.

Merritt, A., Effron, D. A., Monin, B. (2010). Moral self-licensing: When being good frees us to be bad. Social and Personality Psychology Compass, 4/5, 344-357.

Mitchell, G. (2012). Revisiting truth or triviality: The external validity of research in the psychological laboratory. Perspectives on Psychological Science, 7(2), 109-117.

Moore, D. A., & Healy, P. J. (2008). The trouble with overconfidence. Psychological Review, 115(2), 502-517.

Morewedge, C. K., Gilbert, D. T., & Wilson, T. D. (2005). The least likely of times: How remembering the past biases forecasts of the future. Psychological Science 16(8), 626-630.

Murphy, S. T., & Zajonc, R. B. (1993). Affect, cognition, and awareness: Affective priming with optimal and suboptimal stimulus exposures. Journal of Personality and Social Psychology, 64, 723-729.

Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2, 175-220.

Nisbett, R. E., Peng, K., Choi, I., & Norenzayan, A. (2001). Culture and systems of thought: Holistic versus analytic cognition. Psychological Review, 108, 291-310.

Nisbett, R. E., & Wilson, T. D. (1977). The halo effect: Evidence for unconscious alteration of judgments. Journal of Personality and Social Psychology, 35, 250-256.

Norton, M. I., Mochon, D., & Ariely, D. (2012). The IKEA effect: When labor leads to love. Journal of Consumer Psychology, 22, 453-460.

O’Donoghue, T., & Rabin, M. (1999). Doing it now or later. American Economic Review, 89(1), 103-124.

Odean, T. (1998). Volume, volatility, price, and profit when all traders are above average. Journal of Finance, 53(6), 1887-1934.

Ofir, C., Raghubir, P., Brosh, G., Monroe, K. B., & Heiman, A. (2008). Memory-based store price judgments: The role of knowledge and shopping experience. Journal of Retailing, 84(4), 414-423.

Pallier, G., Wilkinson, R., Danthiir, V., Kleitman, S., Knezevic, G., Stankov, L., & Roberts, R. D. (2002). The role of individual differences in the accuracy of confidence judgments. Journal of General Psychology, 129(3), 257-299.

Prelec, D., & Loewenstein, G. (1998). The red and the black: Mental accounting of savings and debt. Marketing Science, 17(1), 4-28.

Prelec, D., & Simester, D. (2001). Always leave home without it: A further investigation of the credit-card effect on willingness to pay. Marketing Letters, 12(1), 5-12.

Read, D., & Loewenstein, G. (1995). Diversification bias: Explaining the discrepancy in variety seeking between combined and separated choices. Journal of Experimental Psychology: Applied, 1, 34-49.

Rick, S. I. (2018). Tightwads and spendthrifts: An interdisciplinary review. Financial Planning Review, 1(1-2), e1010. Retrieved from https://doi.org/10.1002/cfp2.1010.

Rode, C., & Wang, X. (2000). Risk-sensitive decision making examined within an evolutionary framework. American Behavioral Scientist, 43(6), 926-939.

Samson, A. (2014, February 25). A simple change that could help everyone drink less. Psychology Today. Retrieved from http://www.psychologytoday.com/blog/consumed/201402/simple-change-could-help-everyone-drink-less.

Samson, A., & Voyer, B. (2014). Emergency purchasing situations: Implications for consumer decision-making. Journal of Economic Psychology, 44, 21-33.

Samson, A., & Voyer, B. (2012). Two minds, three ways: Dual system and process models in consumer psychology. Academy of Marketing Science Review, 2, 48-71.

Samuelson, W., & Zeckhauser, R. J. (1988). Status quo bias in decision making. Journal of Risk and Uncertainty, 1, 7-59.

Schwartz, B. (2004). The paradox of choice: Why more is less. New York: Ecco.

Shah, A. K., & Oppenheimer, D. M. (2008). Heuristics made easy: An effort-reduction framework. Psychological Bulletin, 134(2), 207-222.

Shampanier, K., Mazar, N., & Ariely, D. (2007). Zero as a special price: The true value of free products. Marketing Science, 26, 742-757.

Shang, J., & Croson, R. (2009). Field experiments in charitable contribution: The impact of social influence on the voluntary provision of public goods. The Economic Journal, 119, 1422-1439.

Sharot, T. (2011). The optimism bias. Current Biology, 21(23), R941-R945.

Shepperd, J. A., Carroll, P., Grace, J., & Terry, M. (2002). Exploring the causes of comparative optimism. Psychologica Belgica, 42, 65-98.

Shiller, R. J. (2015). Irrational exuberance. Princeton, NJ: Princeton University Press.

Shiv, B., Carmon, Z., & Ariely, D. (2005). Placebo effects of marketing actions: Consumers may get what they pay for. Journal of Marketing Research, 42(4), 383-393.

Simon, H. A. (1982). Models of bounded rationality. Cambridge, MA: MIT Press.

Slovic, P., Finucane, M. L., Peters, E., & MacGregor, D. G. (2002). The affect heuristic. In T. Gilovich, D. Griffin, & D. Kahneman (Eds.), Heuristics and biases: The psychology of intuitive judgment (pp. 397-420). New York: Cambridge University Press.

Slovic, P., Monahan, J., & MacGregor, D. G. (2000). Violence risk assessment and risk communication: The effects of using actual cases, providing instructions, and employing probability vs. frequency formats. Law and Human Behavior, 24(3), 271-296.

Strecher, V. J., Seijts, G. H., Kok, G. J., Latham, G. P., Glasgow, R., DeVellis, B., Meertens, R. M., & Bulger, D. W. (1995). Goal setting as a strategy for health behavior change. Health Education Quarterly, 22, 190-200.

Sullivan, P. S., Lansky, A., & Drake, A. (2004). Failure to return for HIV test results among persons at high risk for HIV infection: Results from a multistate interview project. JAIDS Journal of Acquired Immune Deficiency Syndromes, 35(5), 511-518.

Thaler, R. H. (2015). Misbehaving: The making of behavioral economics. London: Allen Lane.

Thaler, R. H. (2008). Mental accounting and consumer choice. Marketing Science, 27, 15-25.

Thaler, R. H. (1999). Mental accounting matters. Journal of Behavioral Decision Making, 12, 183-206.

Thaler, R. H. (1990). Anomalies: Saving, fungibility, and mental accounts. The Journal of Economic Perspectives, 4, 193-205.

Thaler, R. H., & Sunstein, C. (2008). Nudge: Improving decisions about health, wealth, and happiness. New Haven, CT: Yale University Press.

Thaler, R. H., & Benartzi, S. (2004). Save More Tomorrow: Using behavioral economics to increase employee saving. Journal of Political Economy, 112, S164-S187.

Thaler, R. H., & Johnson, E. J. (1990). Gambling with the house money and trying to break even: The effects of prior outcomes on risky choice. Management Science, 36(6), 643-660.

Thorndike, A. N., Sonnenberg, L., Riis, J., Barraclough, S., & Levy, D. E. (2012). A 2-phase labeling and choice architecture intervention to improve healthy food and beverage choices. American Journal of Public Health, 102(3), 527-533.

Triandis, H. (1977). Interpersonal behavior. Monterey, CA: Brooks/Cole.

Tulving, E., Schacter, D. L., & Stark, H. A. (1982). Priming effects in word fragment completion are independent of recognition memory. Journal of Experimental Psychology: Learning, Memory and Cognition, 8(4), 336-342.

Tversky, A., & Kahneman, D. (1981). The framing of decisions and the psychology of choice. Science, 211(4481), 453-458.

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 1124-1131.

Vohs, K. D., Baumeister, R. F., Schmeichel, B. J., Twenge, J. M., Nelson, N. M., & Tice, D. M. (2008). Making choices impairs subsequent self-control: A limited-resource account of decision making, self-regulation, and active initiative. Journal of Personality and Social Psychology, 94, 883-898.

Wansink, B., Kent, R. J., & Hoch, S. J. (1998). An anchoring and adjustment model of purchase quantity decisions. Journal of Marketing Research, 35(1), 71-81.

Wilson, T. D., & Gilbert, D. T. (2003). Affective forecasting. Advances in Experimental Social Psychology, 35, 345-411.

Wood, W., & Neal, D. T. (2009). The habitual consumer. Journal of Consumer Psychology, 19, 579-592.

Zak, P. J., & Knack, S. (2001). Trust and growth. Economic Journal, 111, 295-321.

Zellermayer, O. (1996). The pain of paying. (Doctoral dissertation). Department of Social and Decision Sciences, Carnegie Mellon University, Pittsburgh, PA.

Zhang, C. Y., & Sussman, A. B. (2018). Perspectives on mental accounting: An exploration of budgeting and investing. Financial Planning Review, 1(1-2), e1011.