Crafting Choices For The Irrational Mind
Why Smart Products Speak to Our Instincts, Not Just Logic
How many of your daily decisions are truly rational and deliberate? The answer: very few. We navigate life on autopilot, making snap judgments because thinking through every choice would be impossible. Yet, these quick decisions shape our daily experiences—what we buy, how we interact, and which products we trust.
For product managers and UX designers, this carries a critical realization: our users make decisions the same way we do: fast, instinctive, and shaped by biases. The difference? In this scenario, you're not the user; you're the choice architect. A choice architect is responsible for organizing the context in which people make decisions.
In this article, we will explain the psychology behind these biases, explore the most common ones, and look at real examples of how local Saudi apps apply them. Finally, we'll leave you with practical resources to navigate your way through biases and heuristics.
Psychology Behind These Biases
Daniel Kahneman, a Nobel Prize-winning psychologist and author of Thinking, Fast and Slow, revolutionized our understanding of human decision-making. His research in behavioral economics explains why people often make irrational choices, particularly due to cognitive biases.
He describes the mind as operating in two distinct systems:
System 1: Fast, automatic, and instinctive—requiring little to no effort.
System 2: Slow, deliberate, and analytical—requiring focus and mental energy.
Why did the human brain evolve to rely so heavily on System 1, leading to involuntary biases? The answer is survival. We live in a world overloaded with information, and processing every detail rationally would be impossible. To function efficiently, the mind developed shortcuts—heuristics—that help us make quick decisions. These mental shortcuts often serve us well, but they also create predictable errors in judgment, shaping the way users interact with products and experiences.
These basic terms (e.g., System 1) and biases (e.g., Anchoring Bias) were developed by scientists over decades through countless social experiments. To give a glimpse of how such experiments work, consider one of the many studies behind a famous bias called the Anchoring Bias. In this experiment, visitors at the San Francisco Exploratorium were divided into two groups. The first group received this question:
Is the height of the tallest redwood more or less than 1,200 feet? What is your best guess about the height of the tallest redwood?
The second group, on the other hand, received the following question:
Is the height of the tallest redwood more or less than 180 feet? What is your best guess about the height of the tallest redwood?
The first group's average guess was 844 feet; the second group's was 282 feet. These results show the impact of anchoring on human decision-making: the arbitrary number in the question was taken as a baseline on which participants built their best guesses. This is essentially how anchoring works and, more importantly, how psychologists identify new biases over time.
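As a side note, Kahneman quantifies the strength of this effect with an anchoring index: the difference between the groups' average guesses divided by the difference between the anchors. A quick sketch using the redwood numbers above:

```python
def anchoring_index(anchor_high, anchor_low, mean_high, mean_low):
    """Ratio of the shift in estimates to the shift in anchors.
    0% means the anchors were ignored; 100% means the estimates
    followed the anchors exactly."""
    return (mean_high - mean_low) / (anchor_high - anchor_low)


# Redwood experiment figures quoted above: anchors of 1,200 ft and
# 180 ft produced group averages of 844 ft and 282 ft.
index = anchoring_index(1200, 180, 844, 282)
print(f"Anchoring index: {index:.0%}")  # → Anchoring index: 55%
```

A 55% index means the arbitrary anchor pulled the average guess more than halfway toward itself, which is why anchors are such a powerful lever for choice architects.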
Going forward, we'll spend less time on the psychology and more time on how the individual biases work in practice.
Anchoring Bias
You might recall times when you were asked to guess something and anchored on a related number to get closer to the answer. This deliberate trick of leveraging what you know to project what you don't also happens involuntarily. Taxi drivers saw it firsthand when credit cards came into play: the Point of Sale (POS) terminals they installed gave reluctant tippers three options to choose from, besides manually entering an amount:
15%
20%
25%
Total tips increased because riders started from higher anchors than what they usually gave. In similar experiments, raising the suggested amounts increased tips further. However, there is an ill-defined limit: overly aggressive defaults can trigger reactance, where people feel ordered around and push back against the nudge.
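Mechanically, the anchored presets amount to little more than a list of multipliers applied to the fare. A minimal sketch (names are illustrative, not from any real POS system):

```python
def tip_options(fare, rates=(0.15, 0.20, 0.25)):
    """Return preset tip amounts for a checkout screen.

    The presets act as anchors: most riders pick one of them
    rather than typing a (typically lower) custom amount, so the
    choice of rates directly shapes average tips.
    """
    return [round(fare * rate, 2) for rate in rates]


print(tip_options(40.00))  # → [6.0, 8.0, 10.0]
```

The design decision that matters is not the arithmetic but the default rates themselves: shift them up and average tips rise, until the anchors feel pushy enough to trigger reactance.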
Definition:
"People rely too heavily on the first piece of information (the "anchor") they receive when making decisions or judgments, even if it’s irrelevant."
—KeepSimple
Example: ~HungerStation
HungerStation: An online food delivery app that connects users with restaurants and grocery stores for convenient ordering and doorstep delivery.
Context: When the user receives their order in a timely manner, the app shows the following screen as the order is about to be delivered:
The HungerStation app leverages anchoring by offering preset tip options (3%, 5%, 10%), which anchor the user toward 10%. Although tipping may not resonate with HungerStation's users given the local culture around tips, this is a great use of the Anchoring Bias. HungerStation also played it well by applying the Reciprocity principle, introducing the tip request at a positive touchpoint.
“Reciprocity means that in response to friendly actions, people are generally nicer and more cooperative.” (Wikipedia, 2024)
Loss Aversion
Humans are loss averse by design. In other words, the prospect of losing something makes you roughly twice as miserable as the prospect of gaining the same thing makes you happy (Thaler & Sunstein, 2008, p. 35). A classic basketball experiment illustrates this: some participants won tickets to a high-profile game. When asked what price they would sell their ticket for, they quoted far more than any ticketless participant was willing to pay; presumably the winners themselves would never have paid half that price for a ticket (Growth Mindset Psychology, Cognitive Rigidity, 2024).
Definition:
"People feel the pain of losing something much more intensely than the joy of gaining something of equal value. Losing $10 hurts way more than finding $10 feels good."
—KeepSimple
Example: ~Abyan Capital
Abyan Capital: A robo-advisory app where users can get investment portfolios tailored for their financial needs.
Context: When an investor tries to withdraw their money at a time of loss, the following screen appears.
The Abyan app leverages loss aversion to persuade the investor not to lock in the loss by withdrawing, but instead to wait and regain their capital.
Framing
The way information is presented can shape decisions, even when the facts remain unchanged. People react very differently depending on whether an outcome is framed as a gain or a loss. For example, consider two ways of presenting energy conservation:
"If you use energy conservation methods, you will save $350 per year" (Positive frame)
"If you do not use energy conservation methods, you will lose $350 per year" (Negative frame)
Even though both statements describe the same reality, the second frames inaction as a loss, and because losses loom larger than gains, people are more likely to react to the second statement.
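In product terms, framing often comes down to A/B testing two renderings of the same fact. A minimal sketch (a hypothetical helper, not taken from any of the apps discussed here):

```python
def frame_energy_message(amount, gain_frame=True):
    """Render the identical dollars-per-year fact as either a
    gain frame ("you will save") or a loss frame ("you will lose")."""
    if gain_frame:
        return (f"If you use energy conservation methods, "
                f"you will save ${amount} per year")
    return (f"If you do not use energy conservation methods, "
            f"you will lose ${amount} per year")


# Same underlying fact, two very different emotional pulls.
print(frame_energy_message(350, gain_frame=True))
print(frame_energy_message(350, gain_frame=False))
```

The point of isolating the frame as a parameter is that the underlying number never changes; only the copy does, which makes the framing effect cleanly measurable in an experiment.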
Definition:
"The way information is presented (framed) can significantly influence decisions and judgments, even if the underlying facts remain the same."
—KeepSimple
Example: ~HungerStation
HungerStation: An online food delivery app that connects users with restaurants and grocery stores for convenient ordering and doorstep delivery.
Context: When a subscribed user goes to the subscriptions page, the app shows the following screen:
Again, HungerStation smartly applies another heuristic: Framing. It frames the subscription's value with the statement "Your savings 7,280" instead of something like "Keep enjoying free delivery on selected restaurants." It's hard for a user to choose to give up such an amount of savings over six months.
Availability
Availability bias occurs when people judge the frequency of an event based on how easily similar instances come to mind.
For example, causes of death that receive high media attention—such as plane crashes or shark attacks—often lead people to overestimate their actual likelihood. In one experiment, participants were asked to compare the frequency of different causes of death (Kahneman, 2011, p. 138). 80% of respondents believed accidental deaths were more common than strokes, when in reality, strokes cause nearly twice as many deaths as all accidents combined. This illustrates how our perception is shaped more by what we easily recall than by actual statistics.
Definition:
"People tend to overestimate the likelihood or importance of events based on how easily examples come to mind. Recent or emotionally charged events are often given disproportionate weight in decision-making."
—KeepSimple
Example: ~Wealthfront
Wealthfront: A robo-advisory app offering automated investment and cash-management services.
Context: When an investor creates a new portfolio:
Wealthfront leverages a portfolio's historical performance to help the user decide whether to create it. Showing past performance in this context increases the chance that the user expects the same performance to repeat in the future. The historical data is the information most readily available to the user at the point of decision-making, and it can bias them toward continuing.
Social Proof
Humans tend to follow the behavior of others, especially in situations where they are uncertain. This psychological tendency, known as social proof, means that if many people are doing something, others assume it must be the correct or preferable choice.
A classic experiment demonstrated this effect when participants were placed in a room filled with smoke. When alone, most left the room quickly. However, when others in the room ignored the smoke, participants hesitated, assuming no immediate danger. This shows how people often look to others for cues on how to act.
Definition:
"When people trust something because other people are attracted to it"
—KeepSimple
Example: ~Abyan Capital
Abyan Capital: A robo-advisory app where users can get investment portfolios tailored for their financial needs.
Context: When a new user downloads the app, after onboarding, Abyan shows the following screen.
Abyan again applies another heuristic, Social Proof, by highlighting the number of users it serves. Knowing that hundreds of thousands of users rely on the app adds credibility and might nudge a new user to follow the crowd and find out why others are drawn to it.
In conclusion, as PMs and UX designers, we are all choice architects crafting the context in which our users make decisions. Those users are human, after all, and decide quickly under the influence of countless biases (Kahneman, 2011, p. 21). The spirit of this article is that learning these biases can help you create more effective, intuitive, and empathetic customer experiences.
Resources:
Kahneman, D. (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.
Thaler, R., & Sunstein, C. (2008). Nudge: Improving Decisions About Health, Wealth, and Happiness. New Haven, CT: Yale University Press.
Growth Mindset Psychology. (2024). Cognitive Rigidity.
KeepSimple. UX Core. https://keepsimple.io/uxcore
Wikipedia. (2024). Reciprocity (social psychology). https://en.wikipedia.org/wiki/Reciprocity_(social_psychology)
Nielsen Norman Group. The Reciprocity Principle. https://www.nngroup.com/articles/reciprocity-principle/