You’re sitting across the desk from a bank loan officer to refinance your home mortgage and take advantage of extraordinarily low interest rates when he reveals that your credit is hopelessly overextended. You’ve financed two new cars, secured a massive personal loan and opened (and become delinquent on) dozens of credit cards in the past year.
Not only that, your digital double has pocketed a six-figure loan secured by the equity in your home and vanished without a trace.
The blood drains from your face and panic makes breathing difficult as you realize that a cybercriminal found enough private information about you to become you, to take over your finances, to steal nearly everything you’ve worked a lifetime for, to leave you with debts it would take several lifetimes to repay. You face a terrifying, complicated, years-long struggle to reclaim your good name and claw back what’s been lost.
Our most intimate data now reside in digital form, and much of it is frighteningly accessible to cyberthieves ranging in skill and resources from military units of hostile foreign powers to basement dwellers who haven’t seen the sun in months. Data breaches were a global pandemic decades before the coronavirus arrived, but only in recent years have governments begun mandating safeguards be taken by those who gather, analyze, store, buy and sell information about us.
According to research from Richmond-based Risk Based Security, the 2,037 data breaches reported in the first six months of 2020 were down 52 percent from the first half of 2019, but the estimated 27 billion records exposed in those breaches were more than double the 12 billion compromised during the same period a year earlier. Among the stolen data were 90 million payment card records, and even more Social Security and financial account numbers, the report said. Prime targets were in the information, health care, finance and insurance and public administration sectors of the economy.
Twenty years into the 21st century and 40 years since email became a thing, Virginia is considering a raft of new data privacy and security legislation that would impose upon businesses tighter minimum requirements on gathering, securing and retaining data, mandate its safe destruction and limit how fast-emerging, Orwellian technologies such as facial recognition paired with artificial intelligence can be deployed and used.
The first to act in a meaningful way was the European Union, which adopted the General Data Protection Regulation, the world’s broadest and strictest set of data privacy rules. The GDPR, which took effect in May 2018, defines what data may be legally collected about any person in the EU, how and where it may be gathered and kept, and the processes for informing people about their data and for permanently destroying that data upon their request. It also establishes prohibitive penalties, enforceable worldwide, for violators.
The United States has nothing remotely like it, remaining an untamed badlands that beckons hackers with the world’s richest trove of digital booty. Security protocols vary widely by industry and among businesses, depending on their resources, size and levels of information technology sophistication. Absent federal leadership, states are acting on their own, assembling a crazy quilt of sometimes mismatched laws that present legal, regulatory and compliance nightmares.
Two years ago, California became the first state to pass significant restrictions on how personally identifying information (PII in the jargon of cybersecurity pros) can be collected, used, stored and shared. The law took effect this year. In November, California voters will decide a statewide ballot initiative to further enhance those consumer privacy protections.
Other states are considering similarly comprehensive data privacy and security legislation, including New York, New Jersey, Maine, Massachusetts and Nevada. Virginia’s approach to date is more piecemeal — a handful of bills introduced for last winter’s legislative session but carried forward to January’s regular session, allowing time for vetting by committees of the General Assembly’s Joint Commission on Technology and Science.
“Data breaches are going to be the next (Hurricane) Katrina for cyberspace,” said Del. Hala Ayala, a longtime digital security expert for the U.S. Department of Homeland Security who heads the JCOTS advisory panel on data protection and privacy.
Ayala, a Democrat from Prince William County who is running for lieutenant governor next year, sponsors one of the bills her panel is evaluating. It would prescribe standards for safe retention of customer records and their destruction when they’re no longer needed. Another bill governs collecting, storing and safely destroying biometric data (retinal scans, voiceprints, hand or facial recognition) that employers use to grant employees access to secure places and systems. Such data, Ayala said, are goldmines for cybercriminals, and it is important for state government to create incentives for businesses to adopt and practice effective and up-to-date online hygiene.
That can take the form of shredding obsolete customer information on physical media, from paper to floppy disks to tapes from a time when computers displayed iridescent green type on black cathode ray tube screens and Atari was the brave new world of gaming. Or it can mean securely wiping desktops, laptops and mobile devices, and establishing processes to find and eliminate data held in cloud environments.
“We are just scratching the surface of these conversations,” Ayala said. “My goal is to provide critical thinking in an environment where not many people have brought this kind of backgrounding into the General Assembly.”
Del. Cliff Hayes, D-Chesapeake, who has 25 years’ experience as an IT professional in Hampton Roads – including with law-enforcement agencies – heads a JCOTS panel examining the double-edged sword of facial recognition systems. On one hand, they promise quantum advances in crime fighting and national security. On the other, they are the fondest dream of totalitarian regimes, as witnessed by their wholesale deployment throughout China.
Hayes is skeptical about the technology’s widespread use.
“One of the things I know is that there is always this push to use technology to make ourselves more efficient. The question is how you define it. When you talk about effectiveness, what lens do you look at it through? Is it through a tactical or quantitative lens, or is it through an ethical and moral lens?” he said.
For example, he recalled, facial-scan technology was secretly tested in and around Tampa’s Raymond James Stadium at the 2001 Super Bowl. It generated outrage after its use was disclosed and was shelved for further development. Twenty years later, he said, the technology is better, but is it good enough?
“Sure, there have been improvements but the truth is it’s not something we need to lean on to determine specifically that a person committed a crime and to put that person in a situation to be incarcerated,” he said. “It’s not something that law enforcement should depend on.”
Part of the problem, he added, is that studies have found these systems can return higher rates of false positives for people of color and women. They are also skewed against people with previous run-ins with the law, because a significant share of their facial metrics data are drawn from mug shots.
What we don’t need, he said, is to train machines to perpetuate the most troubling racial prejudices of human beings.