Google paid $7 million to 38 states earlier this week to settle its Street View privacy scandal.
This was a serious privacy violation, and even though the story has been unfolding for three years, much of the coverage of the settlement was surprisingly poor.
So it’s worth backtracking to remember what happened here.
In Europe a few years ago, Google was under scrutiny for its Street View cars, which trawl streets taking pictures for Google’s maps. Protesters didn’t want the Web giant taking pictures of their homes and putting them online. They didn’t realize that Google would also be collecting information from their Internet routers as its cars passed by.
In April 2010, privacy-sensitive Germany discovered that Google was collecting MAC addresses and WiFi network names in order to improve its location services technology. German regulators went bananas.
A few days later, on April 27, 2010, Google responded in a blog post that said, “Google does not collect or store payload data,” which is the data you transmit when using the Internet.
Eight days after that, on May 5, German privacy regulators told Google they wanted to audit one of its Street View cars themselves to verify that it wasn’t collecting sensitive personal information.
By May 14, Google was forced to issue a correction of its earlier statement, admitting that it had collected and stored payload data:
But it’s now clear that we have been mistakenly collecting samples of payload data from open (i.e. non-password-protected) WiFi networks, even though we never used that data in any Google products.
However, we will typically have collected only fragments of payload data because: our cars are on the move; someone would need to be using the network as a car passed by; and our in-car WiFi equipment automatically changes channels roughly five times a second…
So how did this happen? Quite simply, it was a mistake. In 2006 an engineer working on an experimental WiFi project wrote a piece of code that sampled all categories of publicly broadcast WiFi data. A year later, when our mobile team started a project to collect basic WiFi network data like SSID information and MAC addresses using Google’s Street View cars, they included that code in their software—although the project leaders did not want, and had no intention of using, payload data.
But nearly everything Google said there was also incorrect.
We now know that a Google engineer, Marius Milner, “made a deliberate software-design decision” to collect the data, that Google collected more than fragments of information, and that Milner told his supervisors and colleagues about the tracking in the design document for the project. The question naturally arises: What else are middle-tier coders collecting at Google that their bosses don’t know about?
What made this worse was Google’s response when the scandal came to light. Milner invoked his Fifth Amendment rights and, in the words of the FCC, “Google deliberately impeded and delayed the Bureau’s investigation” by “willfully and repeatedly violat[ing] Commission orders to produce certain information and documents that the Commission required for its investigation.”
Although a world leader in digital search capability, Google took the position that searching its employees’ e-mail “would be a time-consuming and burdensome task.”
Google’s lawyers disputed the FCC’s assertions about the company’s cooperation, but how often do you see the FCC scream at a giant company like that?
Adding to the screwball comedy element, Google told regulators that it would delete the data. Then it told them two years later that it hadn’t quite done that.
That’s the background. Now on to this week’s coverage of the settlement.
Happily, AllThingsD’s Liz Gannes, who got the settlement scoop last week, got the story right. Many of those who followed her did not.
Time just flat-out gets the story wrong, buying Google’s assertion that the data collection was an accident (emphasis mine):
But it turned out that Google went much further than that, vacuuming up snippets of browser history and email data. The company explained that when the Street View program launched, the team inadvertently included code in their software that “sampled all categories of publicly broadcast WiFi data,” even though the project leaders did not want the more comprehensive data. As soon as Google discovered the practice, it grounded the Street View cars and separated and secured the data on its network.
Again, Milner designed the software to do what Time says Google did inadvertently.