
Engage

Aral:

… invite Glyn Moody to the stage…

(applause)

Aral:

I totally forgot. Sorry. Here we go. And Glyn is a journalist, an independent journalist and a blogger, who I've been following for quite a while, and finally got the chance to meet recently; I think it was in Berlin again; Berlin seems to be a hub. And I definitely wanted him on this panel because I think, again, we need those diverse, diverse conversations to be happening, just like you were saying, Marietje, about how we need to be outside of our bubbles; that's what we're trying to do here today. In my opening I mentioned that the problems we face are mainstream problems with diverse audiences, and we need as diverse a group of people working on the solutions we're going to create. It's not just a technological problem that we're solving.

So Doug, let's start with you. I, of course, loved what you were talking about in terms of education and our role and what could happen in the future. Where do you see us going right now? What's the…are we heading…what's the trajectory we're on right now? What's the thing that worries you the most?

Doug:

So the thing I say…I used to be a teacher, I used to teach history, and when I left education I was a senior leader in an academy, Director of e-learning. And so I'm culpable; I'm the person who gave three thousand students Google Apps, for example, putting all of their data into Google's silo, because I was young and naïve, and that's what you do because you don't want to have to deal with systems admin within the institution.

But what I see happening all over the world, because I'm still in touch with educators, that kind of thing, is the rush to one-to-one iPads and the rush to one-to-one Chromebooks: literally twelve, thirteen year old kids being forced to sign or click a user agreement button, because otherwise they don't get to take this machine home, they don't get to use all the cool tools, and you wouldn't believe how quickly this is happening; we're talking millions of units every single year. Growth numbers in the hundreds of per cent across the world. Australia having a roll-out across entire huge states. And we can meet here, we can talk about this as much as we want, but each of you can go back to wherever you're from, and you can make sure that there's a voice pushing against all this.

If you're a parent you can go into schools; you can become a governor of your local school, and this happens in every education institution all over the world. Because just as corporate companies are outsourcing their email to the cloud, which is a problematic notion that we haven't really talked about thus far anyway, just as that's happening, it's happening even more in education, because budgets are tight and the expertise isn't there. So I see that as the biggest issue right now: the combination of hardware and software on an educational level.

Aral:

And Marietje, do you see this working in the EU? Because these companies are also lobbying governments and inter-governmental bodies, and trying to portray themselves, from what I've seen, as the protectors of privacy and the protectors of innovation and jobs, so how are you seeing this? Have you encountered this?

Marietje:

Lobbying. Never.

Aral:

No. No, it doesn't happen in your world. OK.

Marietje:

No. Every day, but let me say two things…

Aral:

How do we fight back?

Marietje:

Well, I don't have to fight back. I make my own decisions, and I think that that is something that we should all be much more focused on. There's a big tendency to focus on the lobby, but you're all the lobby as well; Bits of Freedom is a lobby, and I think a very successful one in the Netherlands, you know, where Ancilla used to work. The EFF is a lobby, very successful and important. I mean, looking at lobbying is a bit like saying 'it's the economy'; there are different aspects to it, and I think as long as what companies are doing is not a crime, it's fine to try to make your point. I think there should be limits to how that can be done, how much money can be used…

Aral:

Right, because if you have a war chest of a few billion euros versus…I don't even know if the EFF has a few billion; I'm assuming not. It's not really a fair…

Marietje:

Well, what we should really keep in mind, and I agree with you, is that the United States and the European Union are still two different spaces in that sense; I think money corrupts American politics systematically, and in the EU this is different. I mean, we don't fundraise for campaigns, for example. I ran a virtually no-budget campaign twice and got elected, but that's a different discussion. What I want to say is that the people who vote, the people who are lobbied, should be held to account with equal ambition, and it's easy to say, oh, the lobby is bad: yeah, maybe, and we can challenge that whole system, but let's begin by holding those who make the decisions to account. And technology actually helps foster a lot of transparency; you know, every vote is registered. It was uncovered that somebody had copy-pasted something like a hundred and fifty amendments from a lobby group directly into a resolution that this person was not even working on actively, and the media scrutinised it, so that's one side of the story too.

But when it comes to the lobby, I think as a politician you have to draw very strong, strict lines in terms of what you agree with or not. But what has got me worried, and this is what you were talking about in terms of companies portraying themselves as the protectors of privacy, of basically the well-being of everyone oftentimes, is that I was on a panel recently with a representative from Google and a representative from Microsoft, and both of them were saying how important Big Data was for advancing healthcare. Now, once Google and Microsoft agree, you should really scratch your head and wonder how that could be, but I think that the use of healthcare, by so many companies, as the example of why we need Big Data should really be a wake-up call to many, if it wasn't already. So yes, we see this kind of framing in terms of the public good all the time: it's free, it's good for small and medium-sized enterprises, it helps efficiency, and of course we're in a crisis, so it cuts costs and this is great. And that's where, in the broadest sense, knowledge about how technology works is really important. I mean, you said we all live in bubbles; this is true, but the disproportionate impact that technology has on almost every aspect of our lives, from education to health, politics, the economy, is not reflected in the average knowledge of decision-makers, and this should change. And I want to invite you all to be a part of that change, because it is really important that people make informed decisions.

Danny:

So I just want to…

(applause)

Danny:

… I actually just want to reiterate this, because there are some great digital rights organisations working in Brussels; if you go to edri.org, there are all of these small groups, some of which you might have heard of, like Bits of Freedom and the Open Rights Group, and some of which you won't have, and they've all been doing fantastic work in Brussels without people really noticing. That job has become much harder recently, because I think the lobbying from the large, data-driven organisations has increased a great deal, and that started, I think, when the Data Protection Regulation began to take shape and a lot of companies like Facebook and Google became very concerned that it would affect their business. And it's increased more and more now that the NSA revelations have made people in Europe, actually around the world, much more sensitive to what happens with their data, and they naturally turn to their politicians to try and do something about that. So I hear this now from activists in Brussels; they say, you know, it used to be that the big companies weren't really paying attention in the EU, and now they are and we're having to face off against them, not only doing lobbying and giving information but also setting up fake grass-roots organisations and this kind of stuff. So that's why I was suddenly excited. It's bad in America, but there's lots of potential for it to get much worse in Europe, and you have an opportunity to stop that.

Aral:

I think Glyn had something he wanted to add to that.

Glyn:

I'd like to echo very much what Marietje said. You do not know how powerful you are. The power you have as people, communicating with people like Marietje, with your MEPs in this country: they do listen; it's unbelievable. And I'm sure that many of you here have never written to your MPs or MEPs. Do it!

Aral:

Who has?

Glyn:

They will…who has done that? OK, half perhaps. That's great, fantastic. The other half, you really should; you have tremendous power. What's really struck me, sitting in the audience today, is that this is not a problem of technology; this is a problem of power and politics, and you have that power; you have that power through politics. And here's an example of how you can make a big difference in the next twenty-four hours. The European Commission is running a consultation on something called Investor-State Dispute Settlement, which is part of the thing that Marietje was talking about, the Transatlantic Trade and Investment Partnership. ISDS, this Investor-State Dispute Settlement, basically places corporations on the same level as nations; it lets a company sue a nation if it doesn't like the laws.

Some examples: Australia and Uruguay decided they didn't want their populations dying from lung cancer; they're going to bring in plain packaging. Philip Morris doesn't like that. It's suing Uruguay for two and a half billion dollars for loss of future profits. This is what ISDS does. If we're talking about a Declaration of Independence, this is a Declaration of Dependence. If you don't like that, I suggest you go to a site called no2isds.eu, or go to my blog, Open Enterprise, where I've got thousands of words on this. Find out about it in the next twenty-four hours and please write to the European Commission saying so, because you can change things; you have the power. But this is what's happening in the background without many of you knowing about it, and this is really what politics is about. You have the power to find out about this stuff; you have the power to change things. Please do so.

Aral:

OK, and that's awesome.

(applause)

Aral:

But I want to pull this back a little bit, because we do say yes, we have the power and we can make informed decisions, but how are certain things being presented? I'm going to take this back to…you mentioned the Open Rights Group, for example, which I believe you were one of the founders of; you're not involved day to day any more?

Danny:

I'm on various mailing lists.

Aral:

Various mailing lists, OK. So we have groups like the Open Rights Group; we have groups like Mozilla, etc, and when we're having these conversations…the event that I'm going to talk about is one that happened recently, where the Open Rights Group was one of the organisers of a day of action, I believe it happened in London, and basically it was about fighting back against surveillance, state surveillance, and one of the sessions was basically being led by the Head of Policy for Facebook, Lord Richard Allan, and he was leading a bill…sorry?

Marietje:

A former member of the European Parliament.

Aral:

There you go. And leading a session on the drafting of a bill on privacy and civil liberties. Leading this panel; so how does it happen that even in these organisations that we trust, where we go, wow, OK, ORG is fighting for us, we get a panel led by a Lord who works at Facebook, who is a Facebook employee? It doesn't matter that he's a Lord. He's a Facebook employee. How do we get it led by a Facebook employee? What kind of institutional corruption exists that this can happen?

Danny:

Shall I answer that?

Aral:

Anyone who wants to take it!

Danny:

So, I think it would be great if there was somebody from Facebook here right now.

Aral:

Oh, I totally agree. But not leading a panel to create a bill for privacy and digital…and civil liberties, because what they're doing is actually entirely opposed to our privacy and our civil liberties.

Audience member:

(inaudible)

Aral:

No, I know, because I made a huge stink and they initially said OK, he's not going to lead it, and I talked to a few…this is where you can get involved, right? So I talked to the Labour MP who was helping to organise it and I said, well, you know, this doesn't make sense to me at all; what are you doing? We're not going to take that, and we're going to actually make a stink if you do this.

Glyn:

(inaudible)

Aral:

Exactly. And then apparently it didn't go forward.

Marietje:

Well, I mean, of course there are a lot of questions to be asked, but I think there's a broader question of: do you want to only say, no, what you want doesn't work, what you want doesn't work, or do you want to try to work with even these big corporations that are initially at the complete opposite end of where you stand, to try to have an impact? I think that's a difficult thing, because if, for example, in the case of ACTA I had just said from the beginning, no, no, no, we don't want this, we don't want this, it's bad, it's bad for digital freedoms, none of my colleagues would have understood it and they would have really unplugged after, like, three seconds. If they heard me say no for the fifth time, they would just say, oh…pffff…you know. So what you have to do, and I'm not talking about the Facebook example, but just looking at how you can have influence in a broader sense, what I try to do is to package the arguments in all kinds of different ways: competition, access to medicine, so it's not just about a niche group of people that thinks digital freedoms are important; you try to make the stakes as high as possible for as many people as possible, right? And so with these corporations, because they do lobby us, the question is whether we can also influence them; we can also say, if you don't change this, we may have to bring forward legislation, for example.

Aral:

Right, and they're very, very afraid of that: last year when I spoke to Eric Schmidt he said…

Marietje:

Yeah, of course.

Aral:

…regulation can kill Google. I don't think he meant that…

Danny:

That's a bit of a goof to …

Aral:

Well no, they're very honest about it. If you talk to Eric, he's very honest about what they do. I mean, for one thing, I don't think they have much to fear.

Marietje:

Well, but there's an interesting question there too, because a lot of people in the tech community are constantly saying, don't over-regulate the internet, and I think over-regulation is always quite disastrous, but we should also look at where there are vacuums now that lead to a status quo that is unacceptable.

Aral:

And I think it's naïve to expect a corporation to ever act contrary to its business model, and that's the key thing. So when we expect…

Danny:

I actually…sorry…

Aral:

Well, mostly if they're…especially if they're publicly traded, because they will get sued by their shareholders if they don't maximise profit.

Danny:

So, I don't disagree with this, and I think that the arc of corporations, and particular corporations, and particularly, as you said, the business model, is what drives them to behave the way they do. But my model of how to…and I talk about this in terms of how to change corporations at the edges, so let's spell this out more concretely. As International Director, one of the things I do at EFF is deal with cases where, for instance, someone is in danger or is being censored somewhere like Ethiopia or Iran or many countries; Thailand, for instance, is a perfect example, right? The Thai Government two weeks ago wrote a Facebook app, and their censorship page actually linked to it, so if you go to a blocked site in Thailand, it goes, oh, this has been blocked by the Thai Government; click here to log on via Facebook. Actually, it also had a Close button. And those buttons went to a Facebook app that the Thai Government had written that would collect your data. Now, I think that's a perfect indication of the dangers of placing trust in organisations like Facebook and the growing default that we hand over this data, but at the same time, this is something that is collecting data on people and putting them at risk, so I go and contact Facebook and I say, look, this is happening and I think technically this is a violation of your Terms of Service, and the Facebook app goes down.

So you have to interact with these organisations in that way, and in my interactions with these organisations they're not a sort of monolithic being; there are constituencies within those companies, some of which care about these issues more than others, and so what you can do is feed them…not your personal data, although they love that. You can feed them arguments; you can feed them reasons to do things the way they should do them, and it's the same in politics as well: you pick a group of people who are growing increasingly concerned about this sort of thing and you go to them and you say, look, here are the alternatives.

I have a colleague in Brussels who has been setting up Linux for MEPs, and what's interesting about that is that, of course, they'd never heard of open source and free software, but now they have, and it also means that in order to get that working, you have to introduce open protocols and open systems within the European Parliament. And so they get this lovely computer working and they really like it, and then something doesn't work, and so they're furious and they go and say, why isn't this working? And you can explain: well, you paid this money to Microsoft and you got a proprietary protocol. So these are the interactions.

Aral:

No, I totally agree, and please don't get me wrong; throughout this whole day I think that's what we've been saying as well, that we do need to engage with every other actor in this. But there's a big difference between engagement and letting them lead, because from what I see, there's a lot of misdirection happening as well. It's a policy of misdirection, especially by the corporate surveillers, to say: government surveillance is horrible, government surveillance is terrible, look at what they're doing, and we're leading the debate on this; we hate what they're doing and we're trying to change this, and we're on the same panel with Mozilla and with ORG and we're trying to change this; Google is there, Facebook's there; we hate government surveillance. Just don't look at the fact that we're doing the same thing, because it's our business model.

Marietje:

And the National Security people are saying, look at corporate surveillance; that's what we should be worried about. I was recently on a panel where this exact same thing happened; I was like…oh yeah, OK.

Doug:

Sorry, just very quickly. I'll tell you what people don't like: people don't like you going up to them and saying, you know that thing that you're doing? Yeah, you're doing it wrong. People don't like you saying that to them. It's about giving people a choice and then educating them: the thing that you're doing now, you can still do that, or let me tell you about this other thing, so you have a choice, like the Linux example. People can't make a choice if they don't know a choice exists.

Aral:

Or if they don't have alternatives, because in the consumer space, we don't have alternatives. I mean, I'm sorry, but Diaspora is not an alternative to Facebook. Nobody's on there, right? I'm not going to switch. We need to wean people off of these things, but we can't wean them off without gorgeous, gorgeous alternatives; that's what we're trying to do. Otherwise, we can scare them, we can depress them, and we can say, hey, it's fucked, ain't that great?

Marietje:

No, no, but you have the market that can do something but you also have laws that can do something and if the laws don't exist or if the laws are outdated, then the laws must change because they are the basis upon which you can sue the Government, which EFF has done successfully multiple times. You need a basis.

Aral:

Exactly, and because if we create those alternatives and the laws don't exist, those alternatives might be illegal as well.

Marietje:

Or they will never get an opportunity.

Aral:

Exactly.

Marietje:

Like with net neutrality; if you don't have net neutrality, what opportunities do start-ups get?

Aral:

Exactly. Do we have any questions from the audience? Now we're going to really cut down our break. Are you guys OK with that? Yeah, because I want to have more of a debate and some of our panellists haven't even had a chance to speak yet.

Marietje:

Sorry.

Aral:

Oh no, no; I mean, it's been great, and they will. Jeremy. Why don't you ask a question?

Jeremy:

First of all, I thought it was really interesting that Gry and Doug both talked about words, as in trying to name the thing, define the thing, redefining the definition of privacy; I think that is really important. But Danny, I also appreciated how you kind of turned that round and talked about when words aren't enough, when good intentions aren't enough, or when good intentions can lead you astray: the original sin, the road to Hell. Because it crystallised a lot of nervousness I've been feeling through the day from people with the best of intentions talking about how we're building a better product or a seamless product for people, and we'll be fine. And it strikes me that if you touch users' data in any way, even if you're not storing it, if it passes through your hands, then you kind of have to ask yourself, what's the worst that I could do with this, and act on that, whether you are Philips, or Momentum, or Indie Phone: to actually accept yourself as a threat vector. Even if that means that the result is a less seamless, beautiful experience.

Aral:

Well I think that's a false dichotomy. I think we can actually create seamless experiences without having that, and that's what we're doing by distributing it; making the canonical place for your data the place that you own.

Jeremy:

But do you ask yourself what's the worst I could do?

Aral:

Yeah, exactly. Of course, every day. But I do. And then I do it. No, but exactly.

Jeremy:

The other side of good intentions: we assume the worst intentions from people like Facebook and people like Google, and yet when it came to something like lobbying against SOPA, we were more than happy to accept their power and their influence just because their intentions happened to align with ours. If we, the people, believe that companies like that, whether it's technology or big pharma or whatever, should not be wielding that much power and influence, that means having to object to that power and influence even when it aligns with what we want.

Danny:

So, can I just point out…

Aral:

Yeah, I totally agree with that.

Doug:

So, treating Google and Facebook as two-dimensional entities I don't think really helps; hence your point about having people here, having people on panels to represent that, rather than, you know, representing them as monsters. I know people at Twitter and Facebook.

Jeremy:

Believe me, I know…multi-faceted.

Doug:

Extremely ethical, awesome people, who would speak out internally, who would do that thing. But the point, as Aral makes, is that we've effectively got software with shareholders; we've got software which is responsible through business models to shareholders, who can sue because there's shareholder value in that kind of thing, and that's the difference. As long as we respect the fact that these companies are three-dimensional, and that these are real people who make real decisions, who aren't trying to be evil on a day-to-day basis, they should be represented at things like this.

Aral:

Exactly. We have to separate the people from the company to a degree, but I think it's also important to understand that while the people can change, and the people have jobs and they'll have different jobs and they can do other things, the company itself cannot go contrary to its business model. When we separate those, it becomes easier to see those two things as well; we usually mistake one for the other.

Marietje:

Yeah, but sometimes you can also create a market incentive, so, like with sustainability or with healthy foods, etc, consumers as a kind of constituency have also created the demand which tilts the business case towards more protection. I think that kind of combination is what may, and hopefully will, happen in Europe now, where stronger data protection laws, for example, will create a competitive advantage, because there is an increasing number of people seeking these kinds of protections, and if the laws facilitate that development, then maybe it can pull business away from companies that don't have these kinds of ethical considerations.

Aral:

Indeed, but I guess true blood never does taste just as good as real blood. Nobody watches True Blood here? Is that too old a reference? Yeah, somebody got it. I wasn't calling them vampires. All right, do we have any questions from the audience? Let's take a few and then we'll take a very short break, just a five-minute break, and then we'll have our closing with Jackson as well, who hasn't spoken much today. No one here actually has, so let's just pick someone; why don't we start with you up there with the green t-shirt, and then…go from there.

Q:

I'm curious if you saw what Russia's doing, forcing Google and Facebook to store all the data on Russian people in Russia, and also forcing them to ask permission to send it back to the US or to Europe, so I'm curious about your reaction to that.

Aral:

I don't know who wants to take it. It's not surprising.

Stefan:

Does this work? I hope so. I just hear too much about Google and Facebook. I just think life can be pretty good without those two companies, and there are enough intelligent people in this room to actually create an alternative.

Aral:

That's exactly what needs to be done. But also, I guess the biggest problem we have is the monopoly of their business model, so you know, yes, life can be great without Google and Facebook, but also Yahoo!, also LinkedIn, also whatever, Snapchat; OK, that's a bad example because who needs that. But you know what I mean; if we then count…

Stefan:

Amazon?

Aral:

…every means we have for keeping in touch, for being updated about things, and Google has a huge plethora of apps as well that are very useful, then at some point we're going to cut ourselves off from modern life, and you can do that. You can have a little Linux notebook that you download your email onto once a day, but is that the choice that we should be faced with? Either disconnect or accept being spied on? I think that's unacceptable. The monopoly of this business model is unacceptable, and that's where I think regulation could really come into play.

Marietje:

Well, but it's also when you see this over-stretch of power, either by the NSA or by Silicon Valley, you know, I'm going to stop mentioning specific companies. It's the extra-territorial impact, as we call it: because these are global services used by people all over the world, by signing off on the terms of use you basically go into that realm of jurisdiction for using that service. And we've seen this very concretely, when Twitter got subpoenaed to hand over private messages, too, of people in the WikiLeaks investigation, and the Dutch Government couldn't really do anything to protect a Dutch suspect because the person had signed off on the terms of use. And so what you see, and I think this is where it actually becomes interesting and an opportunity, is that the US Government, in terms of foreign policy, does not want this, does not want Russia to nationalise storage of data, because effectively, if every country starts doing that, we don't really have a World Wide Web any more; it's termed balkanisation of the internet, basically nationalising the internet, and a lot of countries are already doing that one way or another, or on one layer or another, or for one kind of service or another. And I think that it may be a good argument against this excessive, intrusive, extra-territorial surveillance and other kinds of malpractices, whether for commercial purposes or for national security purposes, to actually re-think this notion, because it really invites a race to the bottom and it makes the counter-arguments from democracies very weak if they don't practise what they preach when it comes to preserving freedom, etc, etc. And I think the moment we are at now is that the democracies of this world have really lost credibility in fighting for digital freedom across the world; we saw this when Russia started leaking tapped phone calls with Victoria Nuland saying something like f-u-c-k Europe; it was difficult…

Aral:

We can say fuck at this summit!

Marietje:

Yeah, OK, but you know… I don't really want to be in a video where someone repeats me saying that, the F-word… the F-word all the time, but basically…

Aral:

I said it…

Marietje:

It was like 'the F-word Europe', and it was released, and of course the United States was angry about that, but what can they say? Oh, don't tap people's phones? Everybody would be like, yeah…you know. So this is maybe a funny example, but when you're an activist in a very difficult country and you had contact with people that were US-funded, you may think twice before reaching out to them again, and I think it also hurts the United States, not just those people in the field, and maybe this can be a reason to re-think the whole notion of security over everything, because of course we know a hundred per cent security does not exist.

Q:

That's a very sharp observation about what's happening. I think what you said is very relevant: Russia is now doing its own thing, but you also see governments saying, we don't want Huawei routers any more, or they don't want Cisco routers from US companies, and basically they're silo-ing everything, and I think that's a very bad thing, but all these PRISM programmes make this happen, so they put it in motion.

Aral:

Excellent, yeah. Thank you… sorry, you were going to…we just have time for maybe one response to that and then we need to…

Glyn:

Well, can I just say, it's obviously completely disingenuous of the US authorities to complain about the fate of the internet, as if they cared. They care about access to that data. So I wonder, being provocative, whether we could live with a federated internet, if we designed it in the right way, such that it would allow countries to retain control of their people's data.

Aral:

You know what? I think that is the sort of controversial question we need to end this panel on…