The landmark antitrust hearing earlier this week had a little bit of everything: tantrums over mandated masks, questions about cancel culture, and, of course, diapers dot com. And while a lot of us here might’ve predicted some of the turns that unfolded over the six and a half ungodly hours of CEO-grilling, what struck me most were the real, tangible questions raised about all of our privacy.
Granted, when it came to questions about surveillance, the playing field was far from equal. Questions on the topic that were lobbed at Apple CEO Tim Cook mostly concerned long-running accusations of the company using its App store dominance to spy on—and squash—competing apps. For Jeff Bezos, it was how Amazon employees used data siphoned from third-party sellers to create their own competing private labels.
Mark Zuckerberg was asked a question or two about his company’s piss-poor reputation in the privacy department, but for the most part, questions aimed at him were about his platform’s (supposed) anti-conservative bias that, in fact, doesn’t really exist. The only one really pressed on the privacy issues hitting the average consumer, rather than the average competitor, was—shockingly—Sundar Pichai. And let me tell you, he was not prepared.
To spare you the misery of sitting through a parade of tech companies failing to be held to account, I decided to pull the three biggest waffles, wavers, and half-truths that Zuck and Pichai told Congress about our privacy. If you want to follow along at home, here’s the Reuters video I’m referencing, and I’ll timestamp each of these quotes.
“We’ve long been working to comply with GDPR, and we’re in full compliance.” — Sundar Pichai, Google [2:53:32]
For some reason, Congressman Kelly Armstrong (R-ND) mentions that Google had, supposedly, “restricted advertising analytics” and cut down on the “portability of data” to comply with GDPR—the prevailing privacy law in the EU—despite the fact that we actually have our own data privacy statute here in the States, the CCPA, whose enforcement kicked in earlier this month. He then goes on to ask how Google’s working to comply with those laws in the EU, and Pichai responds that his company’s “long been working to comply with GDPR, we’re in full compliance, to the extent of my knowledge.”
Ah. So, let’s talk about GDPR for a second—while Google might technically be in compliance with these laws, the company’s become something of an expert at twisting some of the loopholes in the legislation’s wording to its advantage, and it’s gotten the company in hot water more than once across the pond.
Since GDPR became the law of the land back in mid-2018, the company’s faced two serious legal challenges that are still up and kicking. Less than a year after the EU’s laws went into effect, French authorities fined the company a hefty $57 million on the grounds that Google deliberately makes the onboarding process for new Android devices less than consensual, saying:
“Essential information, such as the data processing purposes, the data storage periods or the categories of personal data used for the ads personalization, are excessively disseminated across several documents, with buttons and links on which it is required to click to access complementary information.”
So while that might technically comply with GDPR—which requires user consent for all things tracking- and targeting-related—it sure sounds like Google is hiding some of the critical intel needed for that consent in walls of obscure text buried behind link after link, which, if I’m gonna be honest, sounds like a pain in the ass no Android user is going to bother with. Not only that, but because Google goads new Android owners into signing up for—or signing into—services with their Google account, the French authorities accused Google of trying to “bundle” the consent needed for those separate products—a tactic that’s illegal under GDPR.
Google, for its part, tried to appeal those charges back in June, but the French courts—thankfully—dismissed the company’s attempt to sweep this under the rug. Then, a month later, Belgian authorities fined Google another $670,000 on the grounds that the company had violated the “right to be forgotten” rules under the EU’s statutes. Both of these cases are still ongoing.
“We don’t use Gmail data for ads.” — Sundar Pichai, Google [4:54:12]
Further down the line, Armstrong asks whether Google’s choice to retire the third-party cookie actually did jack shit to protect users’ privacy, considering the company can just, well, move on to other, more insidious forms of tracking and targeting—something a few reporters (including this one) have pointed out. He specifically asks whether data from our Gmail accounts is used for any ad targeting, since in the past the company’s been caught red-handed scanning our emails for new ways to target its massive user base. Pichai responded, saying “we don’t use data from Gmail for ads […] on the services where we do provide ads, and if users have consented to ads personalization, yes, we do have data.”
Okay, so just because Google isn’t scanning your emails to your nana for ad targeting doesn’t mean there’s no data being siphoned from your inbox. Just like ad products from every other cog in the vast Google ad machine, the company tracks the ads you see, the ads you click on, and the ads you don’t. That goes for Google search and Gmail. So even if Google isn’t sharing the juicy details of your emails with advertisers, it’s damn well sharing exactly what you’re doing when an ad lands in your inbox and slaps you in the face. That means every click (or non-click) on a Gmail ad, along with every forward or favorite (if you’re one of the freaks who uses that function) is delivered right back to Google.
Oh, and by the way: turning off ads personalization doesn’t stop any data from being collected by the company—it just changes the ads you see.
“We don’t use cookies.” — Mark Zuckerberg, Facebook [6:30:04]
Towards the six-and-a-half-hour mark (urgh), Congresswoman Lucy McBath (D-GA) pointed out that in Facebook’s days of yore, its privacy policy stated that the company “would not” use cookies to track and target its user base. She noted that language “is a statement about the future, and that was written in 2004.” She goes on to ask him point-blank whether the company uses cookies to compile data on users, and Zuckerberg responded, saying “[his] understanding is no. We’re not using cookies to collect private information about people who use our services, and I believe we’ve upheld that commitment.”
So, once again, this is technically correct, but functionally it’s utter bullshit. Explaining why takes a minute, so bear with me: the little bits of code we’ve all come to know as cookies are small identifiers a website plants in your browser so that it (or some third party) can recognize you when you come back, and sometimes track what you’ve done on the site. Those details get beamed up to some third-party company that can use them for its own nefarious purposes.
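If you want to see the mechanics, here’s a minimal Python sketch of what a tracking cookie actually does. Every name here is made up for illustration—this isn’t any real ad-tech code—but the pattern is the same: on a first visit, the tracker hands your browser an ID, and every visit after that gets filed under the same ID.

```python
import uuid

class Tracker:
    """Toy stand-in for a third-party tracking server."""

    def __init__(self):
        self.visits = {}  # cookie ID -> list of pages this browser was seen on

    def handle_request(self, page, cookie=None):
        """Simulate one request to the tracker from some webpage.

        Returns (cookie_id, set_cookie_header). On a first visit the
        tracker mints a fresh ID and asks the browser to store it."""
        if cookie is None:
            cookie = uuid.uuid4().hex
            header = f"Set-Cookie: uid={cookie}"
        else:
            header = None  # the browser already carries our ID
        self.visits.setdefault(cookie, []).append(page)
        return cookie, header

tracker = Tracker()
# First visit: the tracker doesn't know this browser yet, so it sets a cookie.
uid, header = tracker.handle_request("shoes.example/boots")
# Later visit to a *different* site that embeds the same tracker:
tracker.handle_request("news.example/article", cookie=uid)
print(tracker.visits[uid])  # both pages, tied to one browser
```

That last line is the whole business model: two unrelated websites, one profile.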
In lieu of cookies, Facebook uses a handy tool called the Facebook pixel, which can track things like whether you add something to your online shopping cart, any applications you submit, or any donations you make, along with much, much more—kind of the same way your average cookie would. Only in this case, that data’s going straight back to Facebook so you can be retargeted on its own products, like Facebook and Instagram, rather than around the web.
Naturally, Facebook also gives ad folks the handy option to bake their cookie data into their pixel data and beam both of those details back up to Facebook as well. And while that responsibility technically falls squarely on the shoulders of the publisher using these cookies or pixels or what have you, Zuckerberg’s pledge that his company doesn’t use cookies is, quite frankly, a load of shit—which is probably why he backpedaled off his initial statement into a “Yes, we do use cookies” when McBath pressed him a little bit more.
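For the curious, here’s a rough Python sketch of what a pixel “firing” looks like on the wire: the event data simply rides along as query-string parameters on a request for an invisible one-pixel image. The endpoint and parameter names below are invented for illustration—they’re not Facebook’s actual ones.

```python
from urllib.parse import urlencode

def pixel_url(pixel_id, event, **params):
    """Build the GET request a 1x1 tracking 'pixel' fires.

    The browser thinks it's loading a tiny image; the tracker reads the
    event data out of the URL's query string. (Hypothetical endpoint and
    parameter names, for illustration only.)"""
    query = urlencode({"id": pixel_id, "ev": event, **params})
    return f"https://tracker.example/tr?{query}"

# An "add to cart" event, as it might leave your browser:
print(pixel_url("123456", "AddToCart", value="49.99", currency="USD"))
```

No JavaScript trickery required on the receiving end—the “image” request itself is the data delivery.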
Source: gizmodo.com