The Ethical Implications of UX Dark Patterns

Stefanie Lauria
15 min read · Dec 21, 2021

Introduction

The practice of user experience sits at the intersection of user needs and business needs. In good design, business needs meet user needs. However, there are instances where user needs are ignored for the sake of profitability and fast returns. In these cases, design decisions are made without a sound ethical framework, and the outcomes are deceptive and manipulative to users. This type of UX design is known as “dark patterns.” Dark patterns can be pernicious, and with no real ethical framework in the field of design, designers are leaning on their own morality and applied ethics to make these decisions. The reasons for this individual approach to ethics are complex, creating a need for a strong and effective UX design body to give ethical guidance to the field.


Defining Dark Patterns

The term “dark patterns” was coined by Harry Brignull, PhD in Cognitive Science, in 2010 (Jaiswal, 2018, para. 7). It refers to “instances where designers use their knowledge of human behavior…and the desires of end users to implement deceptive functionality that is not in the user’s best interest” (Gray et al., 2018, p. 1). Observing many websites, Brignull documented 11 types of dark patterns (Jaiswal, 2018):

1. Bait and switch: an intended user action ends in an undesirable outcome. For example, closing a window in Windows 10 would force a user into an upgrade instead of canceling it (Jaiswal, 2018, paras. 16–17).
2. Disguised ads: ads are designed to look like elements of the page, so users mistakenly click on them expecting a feature of the site and are instead redirected somewhere else (Jaiswal, 2018, para. 19).
3. Forced continuity: a subscription service grants a free trial only if the user enters a credit card, then quietly starts charging once the trial expires, without any warning; it is made worse when there is no clear path to cancel the subscription online (Gray et al., 2018, p. 4).
4. Friend spam: a social media platform asks for permission to email your address book under false pretenses and instead spams all the contacts with a message as if it came from the user (Gray et al., 2018, p. 4). LinkedIn practiced this pattern, which resulted in a $13 million class action settlement in 2015 (Jaiswal, 2018, para. 26).
5. Hidden costs: a user encounters surprise charges at the end of the checkout process (Gray et al., 2018, p. 4). For example, “Curology, an acne treatment subscription, advertises a monthly prescription-based bottle to the subscriber at $19.95/month. The cost however is not that, it is $19.95 + $4.95 (shipping) which is revealed much later in the process of registering” (Jaiswal, 2018, para. 28).
6. Misdirection: the user is distracted from a questionable element on the page by being made to focus on something else (Gray et al., 2018, p. 4).
7. Price comparison prevention: the design makes it hard for users to make an informed decision about the purchase they are making (Jaiswal, 2018, para. 33).
8. Privacy “Zuckering”: named after Facebook’s CEO, Mark Zuckerberg, this pattern tricks users into sharing more information than they originally intended (Jaiswal, 2018, para. 35).
9. Roach motel: used especially by subscription services, the design lets users get into a situation easily but makes it almost impossible to get out of it later (Jaiswal, 2018, para. 37).
10. Sneak into basket: the user “attempts to purchase something, but somewhere in the purchasing journey the site sneaks an additional item into your basket, often through the use of an opt-out radio button or checkbox on a prior page” (Gray et al., 2018, p. 4).
11. Trick questions: users are asked a question with double negatives that at first glance seems to ask one thing but, when read carefully, has a completely different meaning and intent (Gray et al., 2018, p. 4).
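
Several of these patterns come down to a few lines of interface logic. As a concrete illustration of the last sneak-into-basket example, here is a minimal TypeScript sketch of how such a flow is typically wired; the shop, item names, and prices are hypothetical, not taken from any of the cited sources:

```typescript
// Hypothetical "sneak into basket" wiring: an opt-out checkbox on a prior
// page defaults to checked, so an extra item rides along unless the user
// notices and unchecks it.

interface CartItem {
  name: string;
  priceCents: number;
}

function buildCart(protectionPlanCheckbox: { checked: boolean }): CartItem[] {
  const cart: CartItem[] = [{ name: "Headphones", priceCents: 4999 }];

  // Dark pattern: the box is pre-checked, so user *inaction* adds the item.
  if (protectionPlanCheckbox.checked) {
    cart.push({ name: "Protection plan", priceCents: 899 });
  }
  return cart;
}

// Default state: the user never touched the checkbox, yet pays for two items.
console.log(buildCart({ checked: true }));

// The honest alternative simply defaults the checkbox to unchecked (opt-in).
console.log(buildCart({ checked: false }));
```

Note that the fix costs nothing technically: the default flips from opt-out to opt-in, which is exactly why the pattern is a design choice rather than an engineering constraint.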

In addition to these eleven dark patterns, Gray et al. (2018) expand on this framework to define “five primary categories of designer strategies that we label as dark patterns” (p. 4):

1. Nagging: repeated intrusions that interrupt the task at hand and pull the user’s focus to something unrelated to the original task (Gray et al., 2018, p. 5). Gray et al. (2018) cite Instagram as an example: a modal pops up prompting the user to turn on notifications, with “Not Now” or “OK” as the only options and no way to dismiss the message altogether (p. 5).
2. Obstruction: impeding a “task flow, making an interaction more difficult than it inherently needs to be with the intent to dissuade an action” (Gray et al., 2018, p. 5). The authors give the example of theladders.com. The job search site requires users to create an account to view job postings and charges a premium to apply to them, yet many of the postings are available through other sources for free and without an account (Gray et al., 2018, p. 5).
3. Sneaking: an “attempt to hide, disguise, or delay the divulging of information that has relevance to the user” (Gray et al., 2018, p. 6). An example is Salesforce.com requiring users to consent to a privacy statement before unsubscribing from its emails; users who do not read the fine print will not see that their information is being sold to another country (Gray et al., 2018, p. 6).
4. Interface interference: the interface is manipulated in ways that privilege certain actions over others, causing confusion and limiting the discoverability of important actions (Gray et al., 2018, p. 7). The authors elaborate on three subtypes. The first is hidden information, where “critical information requires additional discovery effort” and “the primary motivator behind hidden information is the disguising of relevant information as irrelevant” (Gray et al., 2018, p. 7). The second is preselection, where an option is selected by default even though that selection benefits the owners of the product rather than the user (Gray et al., 2018, p. 7). The third is aesthetic manipulation, where design choices distract the user from an action or convince them of something else (Gray et al., 2018, p. 7). This subtype includes toying with emotions, in which language, tone, style, and color evoke emotion to persuade the user to act in the way most beneficial to the product owner, for example by giving a shopper a limited time to complete an order or lose the product, and false hierarchy, in which the design gives users the false sense that certain options are more important than others when, in reality, they are equal (Gray et al., 2018, p. 7). A sketch of false hierarchy follows this list.
5. Forced action: the user is required to “perform a specific action to access (or continue to access) specific functionality” (Gray et al., 2018, p. 8). A notable example is being forced to upgrade an operating system before being allowed to restart the system (Gray et al., 2018, p. 8). Two subtypes are worth noting. One is what the authors classify as a social pyramid, which requires users to recruit other users to the service, a tactic employed frequently by social media sites to grow their user base (Gray et al., 2018, p. 8). The other is gamification, where aspects of the service must be earned through repetitive or otherwise undesirable actions in order to continue the user journey; many gaming apps employ this pattern by all but forcing the user either to pay for power-ups or to play for a significantly longer time to advance a level (Gray et al., 2018, p. 8).
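
To make interface interference concrete, the snippet below sketches a false hierarchy in a consent dialog: the two choices are functionally equal, but the styling privileges one of them. The dialog, button labels, and styles are invented for illustration and do not come from Gray et al. (2018):

```typescript
// Hypothetical "false hierarchy" in a consent dialog: accepting and
// declining are functionally equivalent choices, but the visual design
// makes one look like the only real option.

function renderConsentDialog(onChoice: (accepted: boolean) => void): HTMLElement {
  const dialog = document.createElement("div");

  const accept = document.createElement("button");
  accept.textContent = "Accept all";
  // Visually privileged: large, filled, high-contrast.
  accept.style.cssText =
    "background:#2563eb;color:#fff;padding:12px 32px;font-size:16px;border:none;";
  accept.addEventListener("click", () => onChoice(true));

  const decline = document.createElement("button");
  decline.textContent = "Manage options";
  // Equal in importance, but styled as faint fine print users overlook.
  decline.style.cssText =
    "background:none;border:none;color:#9ca3af;font-size:11px;";
  decline.addEventListener("click", () => onChoice(false));

  dialog.append(accept, decline);
  return dialog;
}

// Usage in a page; an honest version would give both buttons equal visual weight.
document.body.append(renderConsentDialog((accepted) => console.log({ accepted })));
```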


Ethical Framing in UX Design

These dark patterns violate many ethical boundaries, and most of these design decisions are framed through individual designers’ personal morals and individual ethical practices. This is supported by the research of Gray and Chivukula (2019), who sought to examine the decision-making process of designers without “guiding methods or frameworks” and through the lens of “mediation, identifying how organizational, personal, and ethical frameworks impact design practices” (Gray & Chivukula, 2019, p. 2). Gray and Chivukula (2019) also note that designers engage with their work from a “predominantly pragmatic ethical perspective” (p. 2).

The decision-making process in the field of design is shaped by organizational and personal factors (Gray & Chivukula, 2019, p. 3). This complexity is influenced by the organizations where design is practiced and by individual and applied ethics (Chivukula et al., 2020, p. 2). The research exemplifies this: practitioners were interviewed and observed in their environments to identify how they made ethical decisions in their everyday work.

Gray and Chivukula observed three designers at various levels of expertise and functions within the discipline. The first, a designer at an agency, deliberately removed himself from making ethical decisions since the agency had consultants who reviewed the work from an ethical perspective (Gray & Chivukula, 2019, p. 5). This designer focuses instead on prioritizing the scope of work and therefore feels oriented not to concern himself too deeply with ethical considerations, especially if they fall outside his responsibilities as defined by any agreement made with the client (Gray & Chivukula, 2019, p. 5). The second designer observed by Gray and Chivukula (2019) was a design manager at another agency. This designer sees his role as a client partner and consultant hired to give professional opinions on the proposed work. He reads across disciplines to inform his understanding of how design interactions affect human behavior, and he even instructs the designers he oversees not to practice “‘fear-based persuasion’” (Gray & Chivukula, 2019, p. 6). Instead, he relies on user research to identify users’ aspirations, not just their needs. However, when pressed by a client to create a design pattern that inherently uses sneaking and price comparison prevention, this designer sought to reconcile the business request with user needs, demonstrating how he was “comfortable in ‘taking something away from [the user]’ in terms of functionality” to appease client requests (Gray & Chivukula, 2019, p. 7). The third designer in this study works in-house for a business-to-business company. She bases her decisions on the outcomes of user research and user testing, and she is focused on delivering performance by simplifying user flows. Furthermore, “organizational practices and goals constrains her values to usability, with no felt ability to address potential social impacts of her design work” (Gray & Chivukula, 2019, p. 7). She reads scholarly articles and white papers to stay abreast of usability issues and to guide design decisions, but she is focused on the organization’s goal of simple usability and does not engage her personal values in creating her designs (Gray & Chivukula, 2019, p. 7). In this example, the organization prioritizes features, and she and her team, as UX designers, are concerned with simplifying the task flow, especially since their measure of success is having the user complete the entire flow (Gray & Chivukula, 2019, pp. 7–8). There was no identified preoccupation on the designer’s part with interjecting any ethical evaluation into the work being done (Gray & Chivukula, 2019, p. 8).


Rationale on Personal Ethics

Chivukula et al. (2020) identify that, in practice, designers face design complexities that have ethical implications and a level of responsibility that links personal beliefs to the outcomes of their designs and their implications for the world at large (pp. 2–3). In this study, semi-structured interviews were conducted with 11 designers in the field representing a “range of industry types, years of experience, differing educational backgrounds and degree levels, current role(s) in their organization, and related experiences as a professional practitioner” (Chivukula et al., 2020, p. 3). The findings identify the positionality of UX in the enterprise, and conflicts and balancing in decision-making, as dimensions that constrain designers’ ethical decision-making (Chivukula et al., 2020, p. 9). Each of these dimensions explains how designers are forced to compromise ethics and must struggle with personal morals to make human-centered design decisions.

Under the positionality of UX in the enterprise, Chivukula et al. (2020) examine the discipline’s power to effectively advocate for users and exercise its ethical commitment to better user outcomes. Ethical integrity can be compromised because other, more powerful disciplines can outrank design in the decision-making process. As one respondent put it, “how do we then negotiate and basically come to a place where we do the best for the user, while considering where the practical constraints from engineering and or business sides?” (Chivukula et al., 2020, p. 5).

Under conflicts and balancing in decision-making, Chivukula et al. (2020) explain that designers face pushback on what is most appropriate for the user from an ethical standpoint. Even a minimal repercussion to business goals forces them to make nuanced decisions that balance business requirements against user goals (Chivukula et al., 2020, p. 9). One respondent stated that they faced constant pushback on some design decisions because the marketing team saw them as a potential threat to sales, and any decrease is treated as a crisis (Chivukula et al., 2020, p. 6).

There is also no real accountability for organizations that practice dark patterns. There is no consequence, for instance, when a website sneaks an additional product into the cart. In the face of this indifference, designers resort to shaming companies on social media to call attention to these dark patterns: “#darkpatterns tweets are being used…to publicly denounce companies for implementing dark patterns in design practices” (Fansher et al., 2018, p. 5).

In addition, “currently, there is little guidance regarding how students and practitioners should recognize, articulate, and act upon their values in appropriate ways” (Chivukula, Brier, & Gray, 2018, p. 4). A study conducted with graduate and undergraduate UX students found that participants acknowledged user needs and concerns at the onset of the problem statement; however, they later adjusted “design decisions to become more aligned with stakeholder needs, often at odds with known and defined user goals” (Chivukula, Brier, & Gray, 2018, p. 4).

Additionally, collective ethical frameworks are nascent in the field of human-computer interaction. The Human Use of Human Beings by Norbert Wiener, which explores several ethical entanglements computers can cause, was not published until 1950 (Bynum, 2015). It then took roughly another 25 years for the subject to be broached again with serious academic formality, when Walter Maner coined the term “computer ethics” in the mid-1970s (Bynum, 2015). And even though the field has been evolving, it has not kept pace with the information revolution, leaving a gap in the body of work.


The Need for Institutional and Professional Approach to Ethics

Ergo, the focus should shift from the individual to the collective. Dourish lays out the problem of relying on individual action to effect collective change when he describes similar ethical problems in HCI and environmental sustainability. There is a “broad focus upon individual rather than collective action, on information technology as a persuasive force in behavior change, and the adoption of existing HCI methods, tools, and rationales as means to a solution” (Dourish, 2010, p. 1). He argues, however, that problems at scale need to be addressed by the community at large for real change to be possible.

Design as a discipline has relied too earnestly on what Chivukula et al. (2020) call “identified design activities/practices,” such as user research, to increase and support ethical decisions (p. 9). Results from user research and usability studies increase designers’ trust that the decisions they make are the right ones for users.

Reliance on both of these methods revealed a sense of trust in user research as bringing about more ethical outcomes, with an unstated assumption that if research was conducted and acted upon, it would result in a more desirable end product (Chivukula et al., 2020, p. 7).

Data, however, can be used by organizations to reinforce dark patterns instead. This is especially true with A/B testing. Narayanan et al. (2020) argue that this type of research misses the nuances of user choices and preferences. If the key metric is to increase time spent on site, the design will be shaped to reinforce that business goal, making it easy to ignore the best ethical practices for users (p. 76). There is a false sense of assurance that research itself can solve ethical dilemmas. If a strong ethical HCI organization enforced ethical guidelines, however, this overreliance on research would be moot.
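
The mechanism is easy to demonstrate with a toy decision rule. The sketch below is entirely hypothetical (the variants, metric, and numbers are invented, and nothing in it comes from Narayanan et al.): an A/B test that ships whichever variant maximizes time on site will select a roach-motel cancellation flow precisely because the metric rewards friction.

```typescript
// Hypothetical A/B "ship the winner" rule that optimizes one engagement
// metric and is blind to user harm. All data below is invented.

interface VariantResult {
  name: string;
  meanTimeOnSiteSec: number;   // the only metric the test optimizes
  failedCancellations: number; // harm signal the test never considers
}

const results: VariantResult[] = [
  { name: "clear-cancel-link", meanTimeOnSiteSec: 210, failedCancellations: 2 },
  { name: "buried-cancel-link", meanTimeOnSiteSec: 340, failedCancellations: 57 },
];

// Naive decision rule: pick the variant with the highest time on site.
const winner = results.reduce((best, v) =>
  v.meanTimeOnSiteSec > best.meanTimeOnSiteSec ? v : best
);

console.log(`Shipping: ${winner.name}`);
// Shipping: buried-cancel-link — the dark pattern "wins" because trapped
// users inflate the metric, while the 57 failed cancellations stay invisible.
```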

Dark patterns are still pervasive. In 2020, Geronimo et al. studied dark patterns in mobile apps, examining 240 free Android apps and finding that 95% of them employed at least one dark pattern (p. 5). These patterns have become so ubiquitous that users do not even recognize them: the same study found that 55% of users did not notice “malicious designs in the app containing Dark Patterns” (Geronimo et al., 2020, p. 8) and 20% were unsure. Mathur et al. (2021) argue that “the users may understand and accept the trade-off, but there is a significant cost to society as a whole from tolerating such practice” (p. 11). As Mathur et al. (2021) explain, users’ lack of awareness, or even their acceptance, affects large communities. One example is the Facebook scandal with Cambridge Analytica (Mathur et al., 2021, p. 11), which “Zuckered” users into giving away their data so it could be used to manipulate them with political disinformation.

The argument here is that, collectively, as Taylor and Dempsey (2017) have suggested, design would greatly benefit from its own version of the Hippocratic Oath. Just as graduating medical students must uphold a do-no-harm approach toward their patients, so should designers uphold a collective set of values that does no harm to users.

A unified body of ethics is indeed needed: for the past two decades there has been an ethical misalignment between society and companies, and corporate advisory boards have failed to independently hold these companies to account (Narayanan et al., 2020, p. 82).

Conclusion

The proliferation of dark patterns has a complex set of causes. Designers are empowered to apply ethical judgment to their work only through their own pragmatic approach to ethics. For UX to have a greater positive social impact, the profession needs a shared set of ethical values that we can all follow and be held accountable to.

In closing:

Design is power. In the past decade, software engineers have had to confront the fact that the power they hold comes with responsibilities to users and to society. In this decade, it is time for designers to learn this lesson as well (Narayanan et al., 2020, p. 85).

References

Bynum, T. (2015). Computer and Information Ethics. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy. https://plato.stanford.edu/entries/ethics-computer/

Chivukula, S. S., Brier, J., & Gray, C. M. (2018). Dark Intentions or Persuasion? Proceedings of the 2018 ACM Conference Companion Publication on Designing Interactive Systems. https://doi.org/10.1145/3197391.3205417

Chivukula, S. S., Watkins, C. R., Manocha, R., Chen, J., & Gray, C. M. (2020). Dimensions of UX Practice that Shape Ethical Awareness. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. https://doi.org/10.1145/3313831.3376459

Dourish, P. (2010). HCI and environmental sustainability: The politics of design and the design of politics. Proceedings of the 8th ACM Conference on Designing Interactive Systems (DIS ’10). ACM.

Fansher, M., Chivukula, S. S., & Gray, C. M. (2018). #darkpatterns. Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems. https://doi.org/10.1145/3170427.3188553

Geronimo, L. D., Braz, L., Fregnan, E., Palomba, F., & Bacchelli, A. (2020). UI Dark Patterns and Where to Find Them: A Study on Mobile Applications and User Perception. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. https://dl.acm.org/doi/10.1145/3313831.3376600

Gray, C. M., & Chivukula, S. S. (2019). Ethical Mediation in UX Practice. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. https://doi.org/10.1145/3290605.3300408

Gray, C. M., Kou, Y., Battles, B., Hoggatt, J., & Toombs, A. L. (2018). The Dark (Patterns) Side of UX Design. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. https://doi.org/10.1145/3173574.3174108

Jaiswal, A. (2018, August 15). Dark patterns in UX: how designers should be responsible for their actions. Medium. https://uxdesign.cc/dark-patterns-in-ux-design-7009a83b233c

Mathur, A., Kshirsagar, M., & Mayer, J. (2021). What Makes a Dark Pattern… Dark?: Design Attributes, Normative Considerations, and Measurement Methods. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. https://dl.acm.org/doi/10.1145/3411764.3445610

Narayanan, A., Mathur, A., Chetty, M., & Kshirsagar, M. (2020, March). Dark Patterns: Past, Present, and Future: The evolution of tricky user interfaces. Queue. https://dl.acm.org/doi/10.1145/3400899.3400901

Taylor, C., & Dempsey, S. (2017, November 28). Designing Ethics: Shifting Ethical Understanding In Design. Smashing Magazine. https://www.smashingmagazine.com/2017/11/designing-ethics/
