I don’t need to tell you that addiction is bad, as is addictive design. That much is obvious to everybody, including the European Parliament, which has assessed the nature and impact of addictive apps, websites, and platforms and determined that new rules must be put in place to combat digital addiction.
In this article, I’ll dive straight into digital addiction — what it actually looks like and what causes it. I’ll also explain what gamification is and why it might not be a good thing before we get into the nitty-gritty details of designing ethical digital products and avoiding addictive design (and dark patterns overall).
Digital addiction is bad for your mental health and your physical health, but you probably knew that already and frankly are sick of hearing about it, so let’s take a look at how digital addiction actually affects people on a day-to-day basis.
The most notable (and sometimes only) sign that somebody is suffering from digital addiction (or any addiction) is a change in their behavior. They might become quiet and withdrawn, lose interest in things or people they used to care about, or appear anxious or irritable, and this affects the people around them as much as it affects them. Out of fear of embarrassment or intervention, they might lie about how much time or money they’re spending on digital products, time and money they might not have to spare.
Additionally, they might neglect important parts of their life such as their job or personal responsibilities, despite feeling frustrated or anxious about falling behind.
What’s worse is that they escape these negative feelings by retreating even further into their addiction. It’s a vicious cycle that people often feel powerless to break, and the issue isn’t just that businesses exploit this behavior; they actively cultivate it.
This applies to social media, games, gambling, gambling disguised as games, and more.
Putting dark UX patterns aside for now, let’s take a more holistic look at what causes digital addiction.
The most obvious quality of addictive digital products is that they’re fun or interesting; however, the range of things that constitute fun or interesting is quite wide. Think educational, popular, competitive, weird, shocking, even morbid: anything that provokes a strong reaction.
Digital products that are endless are also quite addictive. For apps and websites, a good example is an endless supply of content to consume (or just doom scroll). For games, endless missions.
Addictive digital products also tend to have some kind of reward mechanism to keep users engaged.
The worst quality, however, is that addictive digital products are always within arm’s reach. There are absolutely no obstacles or limits standing in the way, which often causes people to engage with them more out of habit (need) than interest (want).
Gamification means making something look or feel like a game in order to boost engagement, but it’s difficult to define gamification exactly, because many actual games today are also addictive and exploitative.
On one hand, gamification can make mundane tasks a little bit more interesting by breaking them down into smaller quests and implementing a reward mechanism.
On the other hand, gamified products with social aspects can make people irrationally competitive, to the point of paying to win or paying to get ahead.
So it’s clear that gamification isn’t necessarily a good thing (at least not anymore), and might be a positive framing of what actually boils down to dark patterns a lot of the time. It might be best to stop referring to gamification holistically and instead refer to the individual qualities of gamified products and the context in which they’re used.
On that note, let’s take a look at the actual UX patterns (the good, the bad, and the ugly), when they’re ethically okay to use, and why our assumptions about their profitability might be wrong in today’s society.
To motivate users to complete mundane tasks, break them down into smaller sub-tasks (or objectives/quests if you want to make them sound a bit more interesting). This works because people are more likely to attempt smaller tasks.
Plus, completing each sub-task gives users a sense of progress, which motivates them to keep going. It’s unethical to use this UX pattern on tasks that people are already motivated to complete because it fosters addiction. Good example: long forms (as sketched below). Bad example: games.
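Here’s a minimal sketch in TypeScript of what this looks like in practice; the step names, fields, and progress label are hypothetical, not from any particular product:

```typescript
// Splitting one long form into smaller steps so each step feels quick to
// complete. Step names and field lists are hypothetical examples.
interface FormStep {
  title: string;
  fields: string[];
}

const steps: FormStep[] = [
  { title: "Account", fields: ["email", "password"] },
  { title: "Profile", fields: ["name", "avatar"] },
  { title: "Preferences", fields: ["newsletter", "theme"] },
];

// Surfacing progress ("Step 2 of 3") reinforces the sense of momentum
// that makes smaller sub-tasks easier to attempt.
function progressLabel(currentIndex: number): string {
  const percent = Math.round(((currentIndex + 1) / steps.length) * 100);
  return `Step ${currentIndex + 1} of ${steps.length} (${percent}%)`;
}

steps.forEach((step, i) => {
  console.log(`${progressLabel(i)} | ${step.title}: ${step.fields.join(", ")}`);
});
```

The point is the framing: three quick steps feel more approachable than one intimidating wall of inputs, even though the total work is identical.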
Another trick is to implement a reward mechanism. I call it a “mechanism” because it doesn’t have to be an actual reward; for example, to-do lists often use satisfying micro-interactions (e.g., animations or sounds) to motivate users to complete tasks. Again though, this UX pattern shouldn’t be used to cultivate addiction by offering rewards in exchange for completing vanity engagement tasks (like Xbox’s Rewards Hub does).
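Here’s a rough sketch of the idea; the to-do model and the feedback hook are assumptions for illustration, not a real library’s API:

```typescript
// A reward "mechanism" rather than an actual reward: completing a task
// triggers a satisfying micro-interaction and nothing more.
interface Todo {
  id: number;
  title: string;
  done: boolean;
}

// Hypothetical hook where a real app would trigger an animation or sound.
function playCompletionFeedback(todo: Todo): void {
  console.log(`✨ "${todo.title}" checked off with a satisfying flourish`);
}

function completeTodo(todo: Todo): Todo {
  const completed = { ...todo, done: true };
  playCompletionFeedback(completed); // reward the task itself...
  // ...but deliberately avoid chaining into "complete 5 more for a badge,"
  // which would turn a useful nudge into an engagement treadmill.
  return completed;
}

completeTodo({ id: 1, title: "Write report", done: false });
```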
A dark UX pattern that you should avoid at all costs is endlessness, even if it does increase engagement. This includes content (e.g., feeds, which invite doom scrolling) and tasks (e.g., to-dos and missions). The reason this is so bad is that people tend to continue with what they’re doing even as conditions change (plan continuation bias), which in this case means pushing on despite the mounting effects of addiction, such as fatigue.
This is actually one of the hotter topics of the European Parliament debate on combating addictive design, along with autoplay and notifications (both of which should be turned off by default).
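By way of contrast, here’s a sketch of a finite, user-driven feed; the data shapes and page size are assumptions, not any specific platform’s API:

```typescript
// Instead of an endless scroll, the user explicitly requests the next page
// and is told when they've reached the end: a natural stopping point.
interface Post {
  id: number;
  title: string;
}

const PAGE_SIZE = 10; // hypothetical page size

function loadPage(allPosts: Post[], page: number): { posts: Post[]; hasMore: boolean } {
  const start = page * PAGE_SIZE;
  return {
    posts: allPosts.slice(start, start + PAGE_SIZE),
    hasMore: start + PAGE_SIZE < allPosts.length,
  };
}

const feed: Post[] = Array.from({ length: 25 }, (_, i) => ({ id: i, title: `Post ${i}` }));
const { posts, hasMore } = loadPage(feed, 2);
console.log(`Loaded ${posts.length} posts.`);
console.log(hasMore ? "Load more?" : "You're all caught up!");
```

A “you’re all caught up” message interrupts plan continuation bias in a way that an infinitely replenishing feed never will.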
Making things available for a limited amount of time fosters anxiety in the form of FOMO (fear of missing out). However, it’s a fair trade-off if the reward reinforces a healthy habit such as drinking water, as opposed to a relatively pointless habit such as completing vanity engagement tasks on an app or website. Rewarding positive but not necessarily healthy habits, such as using a learning app, might be ethical or unethical depending on how frequent the rewards are.
The effects of FOMO, positive or negative, can be boosted by enabling users to maintain streaks (like, once again, Xbox’s Rewards Hub does). My advice? Implement forgiving streaks and time-limited rewards in moderation (oh, and make it feel rewarding!).
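To make “forgiving” concrete, here’s a sketch of a streak that pauses instead of resetting when a user misses a single day; the one-day grace period is a hypothetical tuning knob:

```typescript
// A forgiving streak: one missed day pauses the streak rather than
// wiping it out, so users aren't punished into anxious daily check-ins.
const GRACE_DAYS = 1; // hypothetical grace period

function updateStreak(streak: number, daysSinceLastActivity: number): number {
  if (daysSinceLastActivity <= 1) return streak + 1;          // kept the streak
  if (daysSinceLastActivity <= 1 + GRACE_DAYS) return streak; // forgiven: streak paused
  return 0;                                                   // reset only after a real lapse
}

console.log(updateStreak(6, 1)); // 7: active yesterday, streak grows
console.log(updateStreak(6, 2)); // 6: missed one day, forgiven
console.log(updateStreak(6, 4)); // 0: genuinely lapsed
```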
To make something more fun, you could introduce a competitive social aspect, but not if people can pay to get ahead. In gaming, we sarcastically call these “pay-to-win” games, but we’re also seeing this ugly practice in everyday apps and websites now.
For example, people who think of X (formerly Twitter) as a game now have to pay for visibility. This is particularly damaging to people who calculate their self-worth based on the number of likes and reposts they get, which, unfortunately, isn’t a small number of people.
Ultimately, everybody should have an equal chance to win, and while this doesn’t necessarily prevent digital addiction, it doesn’t encourage it either. Certainly avoid leveraging competitiveness in damaging scenarios such as social media, though. See X’s own statement regarding “Prioritized ranking” for a case in point.
It’s not possible to create a perfect experience for everybody because we all behave differently, and our habits change anyway. This is why it’s never a bad idea to implement engagement limits: soft limits, hard limits, and perhaps the option for what gambling websites call “self-exclusion,” which is essentially a request to have your account temporarily or permanently banned. Self-exclusion is mandated by law (at least in the UK), by the way. Banks can also impose gambling blocks, and Apple breaks down screen time by app and enables users to set usage limits. However, all of these actions are voluntary.
Is that enough to combat digital addiction? I’ll leave that to you to decide. Personally, I think that automatic soft limits (i.e., friendly warnings) and hard limits (i.e., temporary time-outs) aren’t a bad idea for apps, websites, and platforms that can potentially become addictive, in addition to self-exclusion.
That could be what the European Parliament decides upon anyway.
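As a rough sketch of what such automatic limits could look like (all thresholds here are invented for illustration, not drawn from any regulation or product):

```typescript
// Automatic engagement limits: a friendly warning first, a temporary
// time-out after that, and a user-requested self-exclusion that overrides both.
type LimitAction = "none" | "soft-warning" | "hard-timeout" | "self-excluded";

const SOFT_LIMIT_MINUTES = 60;  // hypothetical: show a friendly warning
const HARD_LIMIT_MINUTES = 120; // hypothetical: enforce a temporary time-out

function checkLimits(minutesToday: number, selfExcluded: boolean): LimitAction {
  if (selfExcluded) return "self-excluded"; // user asked to be locked out
  if (minutesToday >= HARD_LIMIT_MINUTES) return "hard-timeout";
  if (minutesToday >= SOFT_LIMIT_MINUTES) return "soft-warning";
  return "none";
}

console.log(checkLimits(45, false));  // "none"
console.log(checkLimits(75, false));  // "soft-warning"
console.log(checkLimits(130, false)); // "hard-timeout"
console.log(checkLimits(10, true));   // "self-excluded"
```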
Ultimately, squeezing customers for every little bit of money and time they have has obviously been a successful approach, but times are changing. More and more people are adopting healthier lifestyles, including social media detoxes and spending less time in the digital world overall.
Besides, there will be rules in place eventually to prevent businesses from manipulating and exploiting customers using addictive design patterns. Just like we now have privacy and accessibility laws, we’ll have addictive design laws too. It’s a matter of when, not if.
However, this isn’t a bad thing. People are more willing than ever before to spend money on things that don’t come with these negative effects. For example, addictive design patterns are the reason that I don’t have social media on my phone or play certain games, but there’s still money in my wallet and time in my schedule for products that don’t demand so much from me or leave me feeling burned out, and this trend is becoming more popular every day.
Hence, accepting that addictive design, and unethical design overall, isn’t welcome anymore might be a better mindset moving forward, and at the very least a more profitable approach in the long run.
I suppose we all have a responsibility to make digital products healthier to use, don’t we? UX designers shouldn’t implement dark patterns, product designers shouldn’t entertain exploitative features, executives shouldn’t push for product success in ways that require said features, and even users (addicted or not) shouldn’t help unethical products thrive by spending time and money on them.
Realistically though, people often feel powerless to push back like this in a work environment, which is exactly why these new rules that are being discussed are so important.
There was a lot of conversation about data protection and accessibility before rules were put in place that pressured businesses to take privacy and accessibility seriously, and what we’re seeing here with addictive design is no different. Addictive design will almost certainly become regulated, and businesses that are slower to adapt will ultimately face more legal and technical challenges. They must begin shifting away from addictive design practices now.
Jumping off a sinking ship at the last minute is such a risky move, and considering that unethical businesses already suffer from poor brand reputation and social media scrutiny, perhaps it’s best for them to accept that the jig is up at this point.
Instead of focusing on addictive design and dark patterns, perhaps creating more value, attracting more customers, and everything else that we’ve discussed in this article can be a better approach to generating revenue. I’m not convinced that addictive design is a bigger moneymaker than ethical design anyway, at least not in today’s society where people don’t hesitate to boycott businesses that treat people badly.
This is quite a big topic so please drop your thoughts into the comment section below, and as always, thanks for reading!