Google StreetView, it transpires, did not scoop up private data from wireless networks by accident. The system included a bit of code called "gstumbler" which was expressly designed to rake in what it could. The people at Google's core apparently had no idea this was going on, and indeed it seems Google never used the data for anything or even had plans to do so - although given the frankly slipshod investigation of the issue, in the UK at least, it's a bit optimistic to suggest we know that for certain. But Google says it didn't, and David Cameron's government, which knows the company quite well, evidently feels it would be impolite to pry.
That might be because Mr Cameron is very eager to get his hands on exactly this sort of data himself. The Communications Capabilities Development Programme the Coalition is advancing is a rehash of Gordon Brown's Intercept Modernisation Programme, the failed Snooper's Charter against which the Conservatives spoke passionately while in opposition. It would allow the government to gather data from everyone who uses digital communications, all the time: who contacted whom, from where, and at what time. It wouldn't provide access to the content of those communications - that would still need a warrant - but in a strange way content is actually less important than metadata, at least at a macro level. Never mind that you can, in practice, put together a great deal of personal information about an individual from fragments of data about their online movements and calling habits; that is intrusive and can be very spooky - as when your online supermarket deduces from your recent web history that you're probably pregnant or trying to be, and starts quietly recommending appropriate products - but it's not the whole story.
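To make that concrete - and purely as a hypothetical sketch, with every name, record and number below invented for illustration - this is roughly what such inference looks like when all you hold is connection metadata:

```python
from collections import Counter
from datetime import datetime

# Invented metadata records: no content at all, just who contacted whom,
# from where, and when - the kind of data the CCDP would have retained.
records = [
    {"caller": "alice", "callee": "oncology-clinic",  "time": "2012-03-05 09:10", "cell": "Islington"},
    {"caller": "alice", "callee": "oncology-clinic",  "time": "2012-03-12 09:05", "cell": "Islington"},
    {"caller": "alice", "callee": "insurance-broker", "time": "2012-03-12 11:40", "cell": "Islington"},
    {"caller": "alice", "callee": "bob",              "time": "2012-03-13 01:30", "cell": "Camden"},
]

# Simple aggregation: who does Alice contact, how often, and at what hours?
contacts = Counter(r["callee"] for r in records if r["caller"] == "alice")
late_night = [r for r in records
              if r["caller"] == "alice"
              and datetime.strptime(r["time"], "%Y-%m-%d %H:%M").hour < 5]

print(contacts.most_common())  # repeated calls to a clinic, then an insurer
print(late_night)              # a 1.30am call placed from a different part of town
```

Not one message has been read in that toy example; the pattern alone does the telling.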
At the heart of both democracy and capitalism is a simple assumption that across the board people make free and relatively rational decisions; that we are, to borrow a term from medical law, Gillick Competent. Google's key tenet is that more available information makes us better at these decisions, and indeed makes us better people. The company's mission is to make the world's information accessible and useful - to make money thereby, sure, but ultimately to improve the world. It's about empowerment through access. But that throws up a dilemma for Google, a genuine fracture line in its core identity. Google's revenue comes from targeted ads. On the one hand, that could be as simple as connecting a buyer and a seller in an equal and even exchange. But there is a thing called 'choice architecture', and a field of expertise called 'behavioural economics', and these disciplines make the situation much less simple, because they deal in the business of influencing decision-making, and they are made possible essentially by access to large amounts of data about how and when and where people do things; all the data, in fact, that Google - and various other sites, including Facebook - collect as a matter of course.
Choice architecture has recently become popular in some political circles as a way of getting the general population to accept things they ought to want, but which for some reason they don't. Organ donation is the obvious example; if donation were made the default option, very few people would bother to carry a card asserting that their organs could not be used in the event of their death, and the perpetual shortage of transplant donors would become a lot less acute. But that wouldn't be a decision; it would be the path of least resistance. And there are a lot of issues like that. At what point does this friendly nudging towards what we ought to want become control? At what point, actually, have we abdicated democracy in favour of a species of blurred technocracy? (In fairness: to what extent do we have genuine democratic choice now?)
I dislike this model intensely; it seems to create an electorate which is never required to think seriously and never has to balance one priority with another - an electorate which is infantilised and malleable.
So Google's dilemma is that on the one hand its ambition is to improve mankind by empowerment, while on the other it survives - flourishes - by providing a service which potentially becomes more disempowering as it gets more accurate.
David Cameron is to a certain extent in a similar position; he is bound to a dream of personal responsibility, but as the head of government he desperately wants to get things done. A vast trove of behavioural data would be breathtakingly valuable in this respect, telling him when to launch a given sort of initiative - a study of Twitter posts has shown a mood cycle through the week; that information in concert with habits of buying and reading news could yield hard insights into when we are at our most persuadable and how to approach us before ever we come to a well-framed decision point. The applications are clever and subtle, and the more information you have, the more fine-grained it is, the more you know and can achieve.
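Again, purely as an illustrative sketch - the posts, scores and dates below are invented, and a real analysis would need a sentiment model and millions of posts rather than a handful of numbers - this is the shape of a weekly mood cycle computed from nothing more than timestamps and a per-post score:

```python
from collections import defaultdict
from datetime import datetime
from statistics import mean

# Invented input: (timestamp, sentiment score) pairs for public posts.
posts = [
    ("2012-04-02 08:00", 0.40), ("2012-04-03 09:00", 0.35),
    ("2012-04-06 18:00", 0.70), ("2012-04-07 20:00", 0.75),
    ("2012-04-09 08:30", 0.30), ("2012-04-13 19:00", 0.65),
]

by_weekday = defaultdict(list)
for stamp, score in posts:
    day = datetime.strptime(stamp, "%Y-%m-%d %H:%M").strftime("%A")
    by_weekday[day].append(score)

# Average mood per weekday: the kind of cycle an initiative could be timed to.
for day, scores in by_weekday.items():
    print(day, round(mean(scores), 2))
```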
This is where the rubber meets the road on privacy; it's not simply the legitimate concern about a Peeping Tom state or nosy multinationals, or the right of the individual to conduct business, pleasure, and discourse without being monitored - although I personally regard those as serious and valid objections to the frenzy of data-gathering presently under way. There is a serious complementary issue: the creation of a society which is predicated on loaded choices offered to a placidly compliant populace.
There is an argument on that score which I find compelling: that if such information is to be gathered, it should be considered the property of the members of the public it describes, and made accessible to them so that they can also make use of it to understand their own behaviour and move - in a genuinely democratic and self-willed way - in a direction of their own choosing. The data sets would obviously have to be stripped of their overt identifying features - IP addresses and so on - to prevent abuse. The extraordinary potential for a better understanding of our collective choices and therefore an improvement of our ability to make good decisions is as appealing as its opposite is repellent.
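A minimal sketch of that stripping step - with the field names invented for illustration, and a salted one-way hash standing in for the IP address - might look something like this:

```python
import hashlib

# Hypothetical raw log entry; every field name here is invented.
raw = {
    "ip": "81.2.69.160",
    "account_name": "a.smith@example.com",
    "page": "/news/economy",
    "duration_seconds": 214,
    "hour_of_day": 22,
}

def strip_overt_identifiers(entry, salt="public-data-set"):
    """Drop names, replace the IP with a salted one-way hash,
    and keep only the behavioural fields the public might study."""
    token = hashlib.sha256((salt + entry["ip"]).encode()).hexdigest()[:12]
    return {
        "visitor": token,
        "page": entry["page"],
        "duration_seconds": entry["duration_seconds"],
        "hour_of_day": entry["hour_of_day"],
    }

print(strip_overt_identifiers(raw))
```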
The alternative is the creation of a broader Verpixelungsrecht - the German right to be pixelated out of online images to protect one's private life - together with proper, enforceable safeguards requiring minimum standards of data control for Internet users. Such regulation would shatter the business models of sites like Facebook, which are premised on a data-for-usage arrangement. There are also, as Jeff Jarvis points out, implications for the health of public space.
Whatever we choose, the discussion should be loud and wide. It should happen in public, not as an aside while David Cameron struggles with the Greek debt crisis and the unacknowledged reality that a collapse of the Euro, so delightful to some in his party, would be ruinously expensive for Britain. The economy is vital, but it's not the only show in town - and this issue is too important to be decided backstage.