WASHINGTON — Tech companies sometimes lure users to sign up for a service or share information they might not have agreed to otherwise by using subtle tactics and marketing on their websites and apps, like surveys that mine for personal information or designs that hide privacy settings.
But these practices — commonly called “dark patterns” — are coming under increased scrutiny from the federal government.
Virginia Sen. Mark Warner (D) has teamed up with a group of bipartisan lawmakers from the House and Senate on legislation known as the DETOUR Act that seeks to ban these practices.
Meanwhile, the Federal Trade Commission, the government’s consumer protection agency, has said it will ramp up enforcement against dark patterns that are already illegal and can trick consumers into subscriptions. The FTC also announced this month that it will undertake a wide-ranging review of its guidance on digital advertising.
“For years dark patterns have allowed social media companies to use deceptive tactics to convince users to hand over personal data without understanding what they are consenting to,” Warner said in an email. “The DETOUR Act will end this practice while working to instill some level of transparency and oversight that the tech world currently lacks.”
“Consumers should be able to make their own informed choices on when to share personal information without having to navigate intentionally misleading interfaces and design features deployed by social media companies,” Warner said.
While the name “dark patterns” sounds daunting, internet users have likely encountered them on otherwise legitimate websites.
Website designs that manipulate users can include intentionally obscure unsubscribe buttons, pop-ups that pressure users not to leave the platform and countdown checkout timers that create a false sense of urgency to purchase items.
Other practices include trial signups that enroll users, often without their knowledge, in subscriptions that are then difficult to cancel; surveys or quizzes that collect data without telling people; concealed privacy settings on social media platforms; and websites that pose a single question and then trigger a barrage of marketing emails.
A 2019 Princeton University study scanned 11,000 shopping websites and found instances of dark patterns on 11 percent of the sites. The more popular shopping sites were more likely to have dark patterns, according to the study.
“Social media companies often trick users into giving up their personal data – everything from their thoughts and fears to their likes and dislikes – which they then sell to advertisers. These practices are designed to exploit people, not to serve them better,” Imran Ahmed, CEO of the Center for Countering Digital Hate, said in a statement. The group supports the DETOUR Act as a way to stop the practice.
Last fall, the Federal Trade Commission released a new enforcement policy to ramp up its response to illegal dark patterns that “trick or trap” consumers into subscription services. It came in response to a rising number of complaints about financial harm from deceptive signups and automatic renewals that trap consumers into ongoing subscriptions that are nearly impossible to cancel.
“Over the years, unfair or deceptive negative option practices have remained a persistent source of consumer harm, often saddling shoppers with recurring payments for products and services they did not intend to purchase or did not want to continue to purchase,” the FTC wrote in its 15-page statement.
The FTC guidance states that companies should “clearly and conspicuously” disclose subscriptions and honor cancellation requests. An FTC spokesperson said the agency is still working on the enforcement and has no new updates.
Outlawing dark patterns
The Deceptive Experiences To Online Users Reduction Act, or DETOUR Act, would go further to give the FTC more power to crack down on dark patterns by setting new limits on how the largest online companies can market and ask for information.
Specifically, the legislation would place limits on how internet firms with over 100 million monthly users could ask for information in an effort to stop them from tricking users into handing over their personal data.
The bill would bar companies from using interfaces that have the “substantial effect” of preventing users from making informed decisions. Companies also could not enroll users in behavioral experiments without consent or design compulsive-use features, such as automatically played videos, targeted at children.
“This legislation provides a foothold for regulators to better guard against deceptive and exploitative practices that have become rampant in many large technology companies, and which have had outsized impacts on children and underserved communities,” said Colin M. Gray, an associate professor at Purdue University who studies human-computer interaction.
A bipartisan group of lawmakers from both the House and Senate is pushing the issue. In the Senate, Warner is joined by Sens. Deb Fischer (R-Neb.), Amy Klobuchar (D-Minn.) and John Thune (R-S.D.). On the House side, Delaware Democrat Lisa Blunt Rochester and Ohio Republican Anthony Gonzalez are the cosponsors. The House Energy and Commerce Committee and House Judiciary Committee have held several hearings on the issue.
But even with bipartisan support, the bill would face a difficult path in the Senate, where Warner would need to get every Democrat and at least nine more Republicans to support it.
The legislation has the backing of the American Psychological Association and two major groups that advocate for internet safety for children, Fairplay and Common Sense.
“The DETOUR Act is an important step towards curbing Big Tech’s unfair design choices that manipulate users into acting against their own interests. We are particularly excited by the provision that prohibits designs that cultivate compulsive use in children,” Josh Golin, executive director of Fairplay, said in a statement.