When UX crosses the line: The new face of dark patterns
In my experience, the conversation around dark patterns hasn’t gone away; it’s just evolved. What I’m seeing more frequently now is this: the most manipulative UX tactics no longer look obviously deceptive. Instead, they hide behind the language of personalisation, nudging, or even growth experimentation.
We’re not just talking about hard-to-find unsubscribe buttons or endless opt-out loops anymore. The modern dark pattern is far more subtle. It might appear as friction deliberately built into cancellation flows, default opt-ins tucked away in modals, or AI-generated content nudging users toward choices that serve the business—not the user.
So what’s the real danger if these patterns are introduced with the best intentions? A/B testing might show they ‘work’. Conversion rates might spike. But what’s the long-term cost to user trust, brand perception, and ethical design culture?
🕵️‍♂️ What exactly is a dark pattern?
A dark pattern is a design choice that deliberately tricks, coerces, or manipulates users into doing something they might not otherwise have chosen. These are often subtle, embedded in everyday interfaces, and can appear as:
Confirmshaming – using guilt to discourage users from opting out ("No thanks, I like being confused").
Roach motels – easy to get into (subscriptions, services), hard to get out of.
Forced continuity – making cancellation difficult once a free trial ends.
Hidden costs – unexpected fees revealed only at checkout.
Pre-ticked boxes – opt-ins disguised as standard behaviour.
The problem isn’t just poor design—it’s a breach of trust, one that users are increasingly aware of and less tolerant of.
🚩 Spotting the line before you cross it
So how do we keep UX persuasive, not manipulative? Here are a few questions I always come back to:
Who really benefits? If the experience makes things easier for the business but harder for the user, that’s a red flag.
Would you explain it face-to-face? If you’d feel uncomfortable justifying a particular journey to a user in person, it probably needs rethinking.
What do customer support and legal teams say? They often see the consequences of misleading or confusing flows before design ever does.
Are you solving a user problem or introducing friction to protect a metric? There's a big difference between support and control.
🛠️ Actionable Tips: Designing with ethics in mind
Whether you're reviewing a cancellation flow or building a new onboarding journey, here are some practical steps to help ensure your UX puts people before metrics:
🔹 Run ethical design reviews. Make time to assess whether your designs respect user autonomy. Consider using a checklist that includes fairness, transparency, and ease of exit.
🔹 Use plain language and predictable patterns. Hiding intent behind vague CTAs (“Continue” instead of “Subscribe”) erodes trust.
🔹 Make consent real. Avoid pre-selected checkboxes and make opting in (or out) a clear, intentional action—not something that can be missed in fine print.
🔹 Let users leave with dignity. Make cancellation or account deletion as straightforward as sign-up. It's not just good UX—it signals respect.
🔹 Treat metrics as a signal, not the story. A high conversion rate doesn’t mean the experience is good if users feel tricked into clicking. Look at retention, sentiment, and support logs too.
🔹 Design for the long game. Trust builds slowly and breaks quickly. Ethical, user-first design is an investment in sustainable growth.
Good UX is persuasive, not manipulative. There’s a meaningful difference—and your users can feel it. The strongest products aren’t just optimised for clicks, they’re built for credibility.
Designed for Humans is here to make your UX resonate and work for real humans.
Curious about UX and design?
Take a look at our other blogs.