
Some apps use design to trick you into sharing data. A new bill would make that illegal.

The bipartisan legislation also proposes extra protections for kids online, including the end of autoplaying video and many types of habit-forming design.


When using an app, how many times have you seen a cute, unobtrusively designed pop-up that asks you to share your location? Or an app onboarding process that makes access to your contacts just another button to click to get started?

This type of design makes it easier to consent to giving up personal information — whether or not you actually understand what you’re doing — than it does to seriously consider your options. It’s called a “dark pattern,” a term coined by user experience designer Harry Brignull in 2010 and popularized over the past few years. Technologists, regulators, and consumers have all started asking more pointed questions about how apps can be designed to look friendly and harmless even while they’re exploiting us, and dark patterns are one of the biggest topics in that conversation.

Sens. Mark Warner (D-VA) and Deb Fischer (R-NE) introduced a bipartisan bill on Tuesday that would make this type of design illegal, affecting any social media platform with more than 100 million monthly active users. Under the bill, user interfaces designed to hide or gloss over the personal data you’re consenting to share would no longer be allowed.

The bill demands that disclosures of personal data collection be “clear, conspicuous, context-appropriate, and easily accessible” and not “deceptively obscured.”

“Any privacy policy involving consent is weakened by the presence of dark patterns,” Fischer said in a statement. “These manipulative user interfaces intentionally limit understanding and undermine consumer choice.”

If passed, the bill would enact the Deceptive Experiences To Online Users Reduction (DETOUR) Act and would also prohibit social media platforms from A/B testing most types of design changes without explicit consent from users. From the text of the bill: It would be illegal “to subdivide or segment consumers of online services into groups for the purposes of behavioral or psychological experiments or studies.”

For example, Facebook wouldn’t be able to serve different versions of the News Feed with different design elements or layouts to different people and study how they behave in response to it. At least, not without explicitly informing them that they’re doing so. (A controversial 2014 study the company conducted with Cornell University, in which some users were given a super-negative News Feed to see if it would depress them — it did! — would be considered not just unethical but actively against the law if it were to take place now.)
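For readers curious what the segmentation the bill targets looks like in practice: platforms typically assign users to experiment groups with deterministic hashing, so each user consistently sees the same variant. A minimal illustrative sketch (the function and names here are hypothetical, not drawn from any platform’s actual code):

```python
import hashlib

def assign_bucket(user_id: str, experiment: str,
                  buckets=("control", "variant")) -> str:
    # Hash the user ID together with the experiment name so the same
    # user always lands in the same group for a given experiment,
    # while different experiments split users independently.
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return buckets[int(digest, 16) % len(buckets)]

# The same user gets the same feed layout on every visit.
group = assign_bucket("user-42", "feed-layout-test")
```

Because the assignment is silent and automatic, users have no way of knowing they are in an experiment — which is exactly the practice the bill would require platforms to disclose.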

Arguably the most interesting part of the bill is its expansion of protections for children who use social media. Currently, children are protected from advertisers collecting and using their personal data under the Children’s Online Privacy Protection Act (COPPA) of 1998. But this would be the first legislation to attempt broad legal protection from social media addiction or other psychological harm. It would make it illegal “to design, modify, or manipulate a user interface” directed at children under the age of 13 “with the purpose or substantial effect of cultivating compulsive usage.” (Autoplaying videos are included as an example.)

The lawmakers define compulsive usage as “any response stimulated by external factors that causes an individual to engage in repetitive, purposeful, and intentional behavior causing psychological distress, loss of control, anxiety, depression, or harmful stress responses.” Almost every social media platform is set up to gamify posting new content and provide dopamine hits for receiving likes and other notifications, as well as encourage routines of scrolling through a never-ending algorithmically generated feed and opening apps during every moment of downtime.

If the Federal Trade Commission were to decide that Instagram is a platform marketed to people under 13, or — though it has so far declined to do so — that YouTube is, you can imagine there would be a lot of questions about the way those platforms create habits among their users. TikTok, the wildly popular music-based Vine replacement, is already on the FTC’s bad side, and was hit with a record $5.7 million fine for COPPA violations this February.

The idea for DETOUR came out of a memo that Warner circulated last summer, just after Facebook’s Mark Zuckerberg testified before Congress, and is part of a wave of increased government interest in the consequences of allowing big tech companies to run rampant. This announcement comes one month after presidential candidate Sen. Elizabeth Warren (D-MA) proposed breaking up Google, Amazon, Facebook, and Apple under antitrust law, and a little over a week after the Department of Housing and Urban Development filed charges against Facebook for enabling housing discrimination with its targeted advertising options.

If the bill passes, the new law would be enforced by a small, new regulatory organization that reports to the FTC.
