House of Delegates members walk past the south portico at the end of the veto session at the Virginia State Capitol in Richmond, Va., Wednesday, April 22, 2020. The House members were meeting outside in a tent instead of the House Chamber in order to practice social distancing due to the COVID-19 virus. (AP Photo/Pool, Bob Brown)

By Irene Leech and Susan Grant

We support Virginia legislators’ desire to protect their constituents’ privacy. Unfortunately, the Consumer Data Protection Act falls short in many ways. Not coincidentally, this bill is nearly identical to one in Washington state that is supported by major tech companies such as Microsoft, Amazon and Google.

It is based on the outdated “notice and opt out” framework, which underpins the current system of commercial surveillance and fails to provide consumers with meaningful control over their personal information.

Instead of requiring companies to get people’s permission before using their data, the Consumer Data Protection Act places the burden on consumers to navigate today’s incredibly complex data ecosystem. Under this weak bill, consumers must take steps to opt out of unwanted uses of their information (to the limited extent they are allowed to do so). Making “opt out” the default disempowers consumers and poses equity concerns; consumers with less time and fewer resources to figure out how their data is being used and how to opt out will inevitably be subject to more privacy violations. Where the default lies matters, as marketers well know. It’s time to change the default to “opt in.”

That’s not the only concern we have about this bill.

• It gives consumers no rights concerning the personal data that may be gleaned from social media and other “channels of mass media” if they didn’t adequately restrict access to that information.

• It gives consumers no control over businesses selling their personal information to affiliated companies.

• It requires opt-in for processing consumers’ “sensitive data” but not for uses of their personal information that may be sensitive.

• It allows consumers to opt out of seeing targeted advertising based on tracking their activities over time on multiple websites and apps and profiling them, but that opt-out does not stop the tracking and profiling from occurring.

• It does not apply to advertising based on tracking consumers’ activities over time on the company’s own website or app and profiling them – the business model of Google and Facebook, which profit from profiling and targeting consumers on behalf of other businesses.

• It only gives consumers the right to opt out of profiling when it is used “in furtherance of decisions that produce legal effects concerning a consumer or similarly significant effects concerning a consumer.” There is no overall right to stop being tracked and profiled.

• It does not apply to consumers’ personal information when it is in the hands of financial services companies or other businesses that are covered by other laws, even if the privacy protections of those laws are much weaker.

• It allows parents and legal guardians to exercise consumers’ rights but does not enable consumers to designate others to act on their behalf, as an aging parent who doesn’t understand technology might want.

• It allows businesses to charge consumers more or provide them with lower-quality products or services if they exercise the limited rights they have to opt out of targeted advertisements, their personal data being sold, or being profiled. In other words, if consumers want privacy, they have to pay more — a blatantly discriminatory policy.

• It lets the companies that hold and process consumers’ personal data avoid any responsibility when third parties to which they disclose the data violate the law unless they knew those parties intended to violate the law. (So Facebook would have no liability for what Cambridge Analytica did with users’ personal information.) 

• It prevents consumers from taking legal action to enforce their rights.

• It creates a “right to cure” that hampers the ability of the attorney general to take action to stop bad practices and obtain remedies for consumers. 

The legislature should hit the pause button and work with consumer and privacy groups to craft a bill that truly protects Virginians’ privacy. It is not enough to say that the law doesn’t go into effect for two years and changes can be made next year. Our experience is that it is difficult to strengthen consumer protections after legislation has already been enacted. If addressing these issues means that passing the bill has to wait until the next session, that’s fine. It’s better to get it right than to count on going back and changing mistakes.

Susan Grant is director of consumer protection and privacy at Consumer Federation of America, a nonprofit association of consumer organizations across the United States. Irene Leech is a consumer advocate and Virginia Tech professor from Buckingham.