The Power Of An Informed Public

5th June, 2015 — Laura Kalbag

Protest sign reading ‘Thank you Mr. Snowden, you are a fucking hero’
Photo by PM Cheung (CC BY 2.0)

It’s been two years since Edward Snowden revealed that the NSA (US National Security Agency) had been making records of all US citizens’ phone calls, regardless of whether those citizens had done anything wrong, and that other governments around the world were complicit. The US government argued (and still argues) that mass surveillance prevents terrorist attacks, but there’s no evidence to support those claims. However, as Snowden himself puts it, the difference two years on “is profound.”

“In a single month, the N.S.A.’s invasive call-tracking program was declared unlawful by the courts and disowned by Congress. After a White House-appointed oversight board investigation found that this program had not stopped a single terrorist attack, even the president who once defended its propriety and criticized its disclosure has now ordered it terminated.

This is the power of an informed public.”

But as Snowden also says, we’ve still got a long way to go. In a joint report from Privacy International and Amnesty International, entitled “Two Years After Snowden: Protecting human rights in an age of mass surveillance,” we can see that courts in a number of countries have ruled that mass surveillance contravenes human rights law. Still more experts and international bodies have declared government surveillance a violation of human rights. Despite this, governments are still seeking greater surveillance powers. And the US government still wants to put Snowden in jail.

This report is a very valuable overview of the types of surveillance, hacking, and lobbying performed by governments in order to collect citizens’ data, as well as of public opinion on such surveillance: 59% of people across the world strongly oppose government surveillance, and only 29% approve of it. I suspect the 29%, and the remaining 12%, will come around.

Two Years After Snowden also shows the amount of surveillance that was enabled by corporations such as Facebook, Google and Microsoft being forced to hand over their customers’ data. The dynamic between government and corporate surveillance is also examined this week in a post by Rick Searle on algorithms and Frank Pasquale’s book, ‘The Black Box Society’. This is the very reason Ind.ie was founded. Aral questioned why corporations need access to people’s data in the first place: if the corporations didn’t collect it themselves, they wouldn’t have anything to share with governments. As the Two Years After Snowden report says:

“Companies have a responsibility to respect the right to privacy online. To live up to this responsibility they should take far bolder steps to increase security on their platforms and services, so that private user data is not made freely available for harvesting by governments.”

Privacy and Apple

Few corporations have taken their customers’ privacy seriously. As we’ve mentioned before, Apple seems to be taking it more seriously than most. This week, at a Champions of Freedom event in Washington, CEO Tim Cook gave another speech on privacy, saying that “privacy is a fundamental, moral right.” He also made it clear why business models play a key role in the ability to protect people’s data:

“I’m speaking to you from Silicon Valley, where some of the most prominent and successful companies have built their businesses by lulling their customers into complacency about their personal information. They’re gobbling up everything they can learn about you and trying to monetize it. We think that’s wrong. And it’s not the kind of company that Apple wants to be.”

He goes on to suggest that Google’s new Photos will result in “your family photos data mined and sold off for god knows what advertising purpose,” and to point out the flaws in governments’ attempts to undermine encryption: “If you put a key under the mat for the cops, a burglar can find it, too.” The TechCrunch coverage is well worth a read. We at Ind.ie acknowledge that Apple is not perfect, but they’re certainly a good stopgap for protecting our data and human rights until alternatives become available.

Advertising: Spyware 2.0’s business model

Quinn Norton has written a fantastic piece on Medium, ‘The Hypocrisy of the Internet Journalist.’ Drawing on long experience first in advertising and analytics, and then in internet journalism, Quinn explains the huge amount of power and information held by web trackers, and how, once fed our data, they can shape our lives:

“What I’d do next is: create a world for you to inhabit that doesn’t reflect your taste, but over time, creates it. I could slowly massage the ad messages you see, and in many cases, even the content, and predictably and reliably remake your worldview. I could nudge you, by the thousands or the millions, into being just a little bit different, again and again and again. I could automate testing systems of tastemaking against each other, A/B test tastemaking over time, and iterate, building an ever-more perfect machine of opinion shaping.”

Quinn goes on to lament that she doesn’t know how to get us, or herself, out of a situation where we’re tracked and controlled. She uses the NoScript and Ghostery browser extensions both to block trackers and to make them more visible, but feels they make little difference, mostly serving to remind us that we have no control.

Zeynep Tufekci explains how advertising doesn’t just sell out the people using these services, but also distorts our online interactions, as companies have a financial interest in manipulating our attention on behalf of advertisers. However, Zeynep is more optimistic than Quinn, and has a simple suggestion: people should be willing to pay for web services. She rejects the idea that no-one would pay for services we usually get for free, saying: “with growing awareness of the privacy cost of ads, this may well change.”

Aral Balkan has a different take on Zeynep’s post, which he voiced on Twitter and expanded for the roundup:

“Of course, Facebook may very well implement Zeynep’s suggestion as it would be a great public relations stunt for them to implement paid accounts, knowing that only a tiny percentage of their users would opt for one. It would be like Google implementing end-to-end encryption for Gmail as a browser plugin, safe in the knowledge that only a single-figure percentage of its users will use it. What Facebook cannot change is its core business model or the fundamental architecture of its service. It would be like turning the Titanic en pointe, and would surely lead to a shareholder revolt unless Facebook’s very existence was being threatened in some way.”

On the same theme of how free is a lie, TechCrunch have published an article reminding us to Smile, It’s Free — You’re The Product! when it comes to Google Photos. Natasha Lomas explains that the “lure of free unlimited storage masks another motivating impetus… the big one: data” by encouraging people to upload their photos to Google’s free, locked-in storage repository. Natasha examines Google’s business model and its obvious data grab, and also asks us to consider some more subtle points about Google’s new centralised “privacy dashboard”:

“It’s interesting the company feels the need to spin so hard on privacy. The language used on this dashboard is couched to suggest Google’s data gathering activities are performed principally for the user’s benefit. Which is a pitch-perfectly disingenuous response to growing consumer concerns about data privacy.”

Natasha also shows how Google’s privacy policy isn’t really about privacy at all:

“The company unified the privacy policies for more than 60 of its products back in 2012 — a move which got it into trouble with European data protection regulators because such drastic amalgamation results in a lack of transparency about what specific services are doing with your data, and makes it harder for people to control how their personal data is generally used by Google. So the exact opposite of user in control then.”

What have those cheeky corporations been up to this week?

Google

In further moves against the privacy of its users, Google seems to have conveniently forgotten to update its Keep My Opt-Outs privacy extension for Google Chrome. The extension has been downloaded by more than 400,000 users in order to block ad-tracking cookies, but it relies on a list of blocked cookies which hasn’t been updated since October 2011. If you’re looking for a more reliable blocker, we still (despite Quinn Norton’s reservations!) recommend Ghostery.

Facebook

In a very odd move, Facebook is encouraging people to share their OpenPGP public keys with Facebook so it can encrypt the notification emails it sends them. As Natasha Lomas said about Google’s privacy dashboard, it’s interesting that Facebook feel they need to focus so much on the privacy of emails when they’d never encrypt their users’ data on their own servers; that would lock Facebook out of the data too.
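For a rough picture of what the feature does, here is a minimal sketch, not Facebook’s actual code, using the python-gnupg wrapper. The user hands over only their public key, and a notification email is encrypted to that key so only the holder of the matching private key can read it. The generated key pair, email address and message below are made-up stand-ins.

```python
# A minimal sketch of the idea behind the feature, not Facebook's actual code.
# Assumes GnuPG is installed locally and uses the python-gnupg wrapper
# (pip install python-gnupg). The key, address and message are invented.
import gnupg

gpg = gnupg.GPG()

# Stand-in for the user's key pair. In reality the user keeps the private key
# to themselves and only pastes the *public* key into their profile.
key_settings = gpg.gen_key_input(
    name_email="user@example.com",
    passphrase="correct horse battery staple",
    key_type="RSA",
    key_length=2048,
)
user_key = gpg.gen_key(key_settings)

# The notification is encrypted to the user's public key before sending, so
# only the holder of the matching private key can read it in their inbox.
notification = "A login to your account was attempted from a new device."
encrypted = gpg.encrypt(notification, user_key.fingerprint, always_trust=True)

print(str(encrypted))  # ASCII-armoured ciphertext that would form the email body
```

The point, of course, is that this only protects the email in transit and in the inbox; the plaintext still originates on Facebook’s servers, which is exactly why the gesture feels odd next to the rest of their data practices.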

How the data these corporations collect is used

Last week we looked at who owns your data and how your digital ‘reputation’ could affect you. This week, Astra Taylor and Jathan Sadowski looked in depth at ‘How Companies Turn Your Facebook Activity Into a Credit Score’. Credit scores and consumer reports can be created from the data you share on Facebook, Google and other social networks. As one of these data broker companies, Zest Finance, declares: “All data is credit data.”

In Astra and Jathan’s article on The Nation, they question the ethics of unregulated parts of the financial industry potentially targeting their adverts algorithmically at people based on their financial status and vulnerability. In the US (and likely other countries) it’s illegal to discriminate on the basis of race, religion, sex, or marital status when considering an application for financial services. However, as the United States Public Interest Research Group’s Ed Mierzwinski says, “In addition to whether they’re covered by the laws, there is also the question of whether some of their algorithms are trying to evade the laws by creating illegal proxies.”
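To make the “illegal proxies” worry concrete, here is a small, purely hypothetical Python sketch with invented numbers: a scorer that is never given the protected attribute, only a feature that happens to correlate with it, still produces sharply different scores for the two groups.

```python
# Purely hypothetical illustration of proxy discrimination, with invented data.
# The scorer never sees the protected attribute, only a correlated proxy
# (think of a "which pages you Like" signal), yet its output still splits
# cleanly along the protected line.
import random

random.seed(1)

people = []
for _ in range(10_000):
    protected = random.random() < 0.5                   # attribute the law protects
    matches = random.random() < 0.9                     # proxy agrees 90% of the time
    proxy = protected if matches else not protected
    people.append((protected, proxy))

def credit_score(proxy_signal: bool) -> int:
    """The model only ever sees the proxy signal."""
    return 700 if proxy_signal else 550

in_group = [credit_score(p) for prot, p in people if prot]
out_group = [credit_score(p) for prot, p in people if not prot]

print("average score, protected group:", round(sum(in_group) / len(in_group)))
print("average score, everyone else:  ", round(sum(out_group) / len(out_group)))
# Prints roughly 685 vs 565: a large gap, despite never "using" the protected attribute.
```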

How can this be prevented from happening? The US Federal Trade Commission’s Julie Brill believes that current US law can protect consumers from this kind of algorithmic discrimination, but further regulation is needed to protect privacy in this new area. As Astra and Jathan suggest:

“In other words, we need to move from an opt-out model, where the default setting makes our private information freely available to thousands of invisible and unaccountable actors, to one that’s opt-in—a move that would inevitably constrict the flow of private data.”

An article by our friend, Katarzyna Szymielewicz, on Internet Policy Review follows a similar train of thought; she suggests that in order to be able to trust corporations, we need to retain control over our data. Katarzyna makes the point that “information shared as a result of a conscious choice and not a forceful act or deception, is very likely to be more accurate and, therefore, more valuable” to the corporations.

However, in looking at European policy, Katarzyna points out that the European Commission seem to have forgotten the rights of the individual in favour of the rights of corporations in their Digital Single Market strategy; governments have “agreed that consent for processing data does not have to be explicit.” This puts more power in the hands of the corporations at the expense of the people using their services, especially when these corporations are also allowed to change the original purpose of data processing without even consulting the customer. The problem of changing the purpose of data processing is clear from Christian Sandvig’s article on ‘Corrupt Personalization’, where he uses the example of Facebook “recycling Likes” to promote advertisers’ products.

Looking further at the power dynamic created by corporate surveillance, Jason Calacanis, the entrepreneur and angel investor, has written a post explaining how freedom of speech cannot exist on Twitter, Facebook or in any other privately-owned space:

“Ideas matter, words matter, and freedom of speech does not exist in a corporate setting by definition — and that’s OK. Twitter, Facebook, and Instagram can run their services how they like, and their interests are largely driven by revenue from sponsors based on growth.”

Jason makes it clear that we lose out if our everyday lives are mediated by Twitter and Facebook, because they’ll become “perfectly sanitized with an underlying tension that something isn’t right.” He suggests that we need to fight harder for open standards; standards that allow us to hold our conversations freely, without mediation, and without any kind of corporate overseer.

The Investigatory Powers Bill (Snooper’s Charter)

The UK Investigatory Powers Bill, also known as the Snooper’s Charter, has not yet been published in draft form, so specific details are unclear. But from the speeches and declarations of intent by the recently-formed Conservative government, it’s highly likely that tech companies will be made to compromise the privacy and security of their customers. It’s one of the many challenges to our democracy mentioned by Edward Snowden. In an article on TechCrunch, former Liberal Democrat MP Julian Huppert voiced many concerns about the Snooper’s Charter and other UK regulation, saying “I worry that the Home Secretary will largely try to simply procure more powers for the state without justifying it or consider the count of balancing issues that there are.”

A week ago, Tim Berners-Lee urged Britain to fight the Snooper’s Charter. Tim went on to say that the UK has lost its moral high ground when it comes to privacy, after it falsely denied that GCHQ (Government Communications Headquarters) was acting in the same way as the NSA. He said instead that the UK needs to “have solid rules of privacy, which you as an individual can be assured of, and that you as a company can be assured of.”

The lack of assurances for tech companies has resulted in press coverage about Ind.ie, and other organisations, leaving the UK because of the Conservative government’s ill intentions towards the privacy of its citizens. Preston Byrne of Eris Industries has explained how the Snooper’s Charter would prevent their business from staying competitive in industrial cryptography. Alison Powell has also observed that the UK government’s moves could result in many more digital innovators saying “bye bye Britain”, creating “threats of its own to Britain’s fragile but dynamic technology startup scene.”

Updates on the Ind.ie pre-alpha

We’re reaching an exciting stage with progress on the Ind.ie pre-alpha. If you want to be one of the first to try Heartbeat, it’s not too late to fund us on the Independence or Democracy tiers to get access when the alpha is released.