Tech industry titans suddenly love internet privacy rules. Wanna know why? We’ll tell you
Analysis After years of fighting to prevent any form of legislation that would safeguard Americans’ online privacy, this week Congress will have two hearings on the topic during which the tech industry will outline its newfound love for laws covering its business.
But, experts warn, there is one big goal behind the sudden willingness to engage: a set of nationwide federal laws, strongly influenced by the industry itself, that would override individual state laws, and in particular a California law that was passed last year in an extraordinary last-minute compromise.
“Here’s a quiet fight that’s brewing in Washington that you should pay attention to,” the newly appointed FTC Commissioner Rohit Chopra tweeted on Monday. “It’s called preemption – that’s the ability of Congress to hit delete on all state data protection laws.”
The issue of federal versus states’ rights is one of the United States’ most enduring battles, and in the past year, the rules surrounding telecoms and the internet have been pulled firmly into its orbit, not least thanks to the FCC’s controversial decision to tear up its own rules on net neutrality.
While Congress has failed miserably to deal with key issues in the digital era, and federal regulators have adopted a hands-off (or should that be hands-free?) approach to regulation, state legislators have stepped in and started making laws to protect their constituents from harm. But now that Big Tech has realized federal laws are all but inevitable, it has decided to see if it can use the process to get rid of the current laws it doesn’t like.
As Chopra put it, “if Congress includes preemption in federal law, this might erase the biometric privacy law in Illinois, the data broker law in Vermont, and the new consumer privacy law in California. Members of Congress are starting to push back.”
In fact, there is an entire raft of state laws that have been developed to deal with internet privacy, all of which could be effectively struck from the law books if an overreaching federal law is introduced – you can view a list on the National Conference of State Legislatures’ dedicated internet privacy page.
All eyes on the West Coast
But the foremost target is California’s law that appeared out of nowhere, and was passed in record time last July.
The California Consumer Privacy Act of 2018 was the first such data privacy law passed in the US, despite years of legislative efforts in Washington DC, and while it didn’t completely extend European-style GDPR protections, it did give the state’s 40 million inhabitants the ability to view the data that companies hold on them and, critically, request that it be deleted and not sold to third parties.
Tech giants absolutely loathe the law, which threatens to undermine their fundamental business model of gathering, packaging, and selling user data while keeping people as uninformed as possible about what information they actually have on them. Under the California law, any company with data on more than 50,000 people is covered, and each intentional violation carries a hefty fine of up to $7,500.
How did online giants like Google and Facebook, which are based in Cali, ever allow such a law to pass? Why didn’t they use their full lobbying might in Sacramento to kill it? Well, the fascinating answer behind that one is that they feared a worse alternative: a ballot measure. A chance for voters to directly give a thumbs up to new safeguards for their information.
In early 2016, a number of dedicated individuals with the funds and legislative know-how to make data privacy a reality worked together on a ballot initiative in order to give Californians the opportunity to give themselves their own privacy rights, after every other effort in Sacramento and Washington DC had been shot down by lobbyists for Big Tech and Big Cable.
Such a law is enormously popular with voters, and after real estate developer Alastair Mactaggart put about $2m of his own money into the initiative, it made its way through the somewhat complex qualification procedure and was just about to be placed on the official ballot.
It was almost certainly going to pass, and that meant that not only would Big Tech be forced to deal with a data privacy law but it would be far harder for it to change the legislation after the fact through Sacramento lobbying.
It came down to the wire: Mactaggart said that if California’s governor signed a new privacy act into law before the ballot deadline, he would pull the initiative. And so California’s legislature scrambled, Governor Brown signed it, and literally on the evening of the deadline, the ballot measure was pulled.
Some changes I think
But, this being a legal sausage-making machine, legislators included an 18-month window to make “technical, clean-up amendments” before the law comes into effect in January 2020. And Big Tech has been furiously trying to undermine the rules ever since, with limited impact since there is now a large spotlight on the process.
Not so in Washington DC where a data privacy law is still in its early stages, and so subject to all manner of behind-the-scenes lobbying. Which is how we get to this week’s Congressional hearings.
As we mentioned last week, industry has already sewn up the Senate hearing, to be held on Wednesday, with most of the witnesses representing the industry view. Just for good measure, the tech giants’ representatives are rubbing privacy advocates’ faces in it by holding a reelection fundraiser literally the night before the hearing for the committee chairman, Senator Roger Wicker (R-MS).
Meanwhile, the House hearing’s witness list was released on Monday, and strikes more of a balance, listing a senior director of the US’ “largest online civil rights organization” Color of Change as well as the CEO of the Center for Democracy & Technology (CDT), both of which are strongly in favor of giving netizens the right to decide what is done with their data.
On the other side sit the VP of public policy for the Interactive Advertising Bureau (IAB) who will promote the idea of industry self-regulation, as well as Roslyn Layton from “non-partisan” think-tank the American Enterprise Institute (AEI), which pushes Big Cable’s lines on everything from net neutrality to 5G to data privacy.
In a nutshell
Most crucial, though, is the vice president for technology and innovation at Business Roundtable, whose testimony closely reflects the careful line that business will take when it comes to a data privacy law: saying everything that people want to hear while focusing obsessively on the fine detail to ensure that the US does not get GDPR-style protections.
“At the heart of the Business Roundtable proposal is a set of core individual rights that we believe consumers should have over their data,” reads its testimony [PDF], before listing what sounds like a privacy advocate’s dream:
The right to transparency regarding a company’s data practices, including the types of personal data that a company collects, the purposes for which these data are used, and whether and for what purposes personal data are disclosed to third parties
The right to exert control over their data based upon the sensitivity of the information, including the ability to control whether their data are sold to third parties
The right to access and correct inaccuracies in personal data about them, and
The right to delete personal data
Except, of course, all the data remains in the hands and control of the companies, and they are the ones that get to decide how it is disclosed, how it is amended, what constitutes inaccuracies, and what kind of “rights” individual users will actually get.
And as for fines for failing to adhere to the new law – something which pretty much everyone agrees is the only thing that has given Europe’s GDPR law any teeth – well, there’s no need for that. In fact, there is not even a mention of that stick in any of the pro-industry remarks.
What the testimony does mention repeatedly however is the need for a “consistent, uniform framework.” One that works across the United States. One that provides a “stable policy environment”; that prevents “inconsistent approaches to privacy both domestically and abroad” and stops “inconsistent protections for consumers.”