Archived content:

This page is in the Ind.ie archive.


Raising the next generation

Transcript

(applause)

So I think I can just see you all. Can I ask for a quick show of hands? I'm going to ask you to keep your hands down if you're not going to be here tomorrow, put it up a little bit if you're not sure, and right to the top if you're going to be here tomorrow, just so I can get a gauge. Oh, excellent, gonna have a full house tomorrow. Thank you very much for that. You can dim the lights again. OK.

So, my name is Doug Belshaw, I'm the Web Literacy Lead for the Mozilla Foundation. You probably know Mozilla from Firefox. I hope you use the browser, but we're all about user choice. We're actually a global non-profit, focused on keeping the web open and free, and I count Rob Hawkes as one of my fellow Mozillians.

The TL;DR of this presentation is basically that the battleground for everything we're talking about at this Summit, I believe, is education. The blogpost that Aral just mentioned was this one; I had a wonderful couple of hours talking to Aral about everything from Firefox OS to the next generation, and in part that inspired this particular blogpost. I'll put it back on the screen with the URL at the end if you're interested.

Now, in that blogpost I talked about four things: things to get from where we are now to where we need to be to change the world. So, first we need to identify the problem. Next we need to name the problem, then we need to visualise it, and then we need to find somewhere we can actually have a lever to change the world. So I think we've identified the problem. That's why we're all here today; that's what we're doing. We've identified the problem that we need to solve.

But I want to run quickly through the other three things. So: naming the problem. Why is that so difficult? Well, I would suggest, from where I stand working across the world as part of a distributed team, that this is the case. In the US, people are mainly focused on government surveillance, and that's for historical reasons: they don't like the fact that the government can surveil them. Whereas, and this is grossly over-simplifying, over here in Europe we don't tend to like corporate surveillance, and I guess that's the reason most of us are here today. Now you might think that both of those things would come together, that people would get very angry and want to change all the things. What tends to happen instead is that people just don't care, and they tend to be happy to give away their data and their privacy in exchange for free, shiny things; that's what tends to happen, mainly, in my experience.

So I believe we need a better way of talking about this. If you'd like to talk about naming this problem, and trying to get people more interested in it just from a naming point of view, then let's talk about that tomorrow at the Unconference.

The next thing is visualising this problem that we've got. Now, when we visualise things, we can do broadly two things: we can come up with metaphors, ways in which people can latch onto the idea we're talking about, or we can point to examples of things which already exist, homely things which people can understand.

Let me just talk about metaphors first of all. In the eighteenth century, a philosopher called Jeremy Bentham came up with the idea of the panopticon. If you've never come across this idea before, it was a design for a prison with the guards in the centre, a light shining out, and all the prisoners round the edge in a circle, facing inwards. Michel Foucault (I had to bring in some kind of post-modernism somewhere) took this idea further in the 1970s. He said that the reason the panopticon would be so successful is that although it's discontinuous in its action (the guards aren't looking at the prisoners all the time), the prisoners feel like they're being watched all the time, and therefore they change their actions on a day-to-day basis. That's why it would be so effective as a jailing system. Now, if we apply this as a metaphor to today, to mobile devices for example, it's an obvious one; we carry these devices round everywhere. A recent Wired article described an Italian company, I think it's called Hacking Team, who sell their wares, their software, to any kind of government around the world, no matter how repressive that regime is, and they can back-door any kind of mobile operating system at all. Well, you might think that's OK, because we have to stop the bad guys, after all, and so people might be happy to give up some of their civil liberties to be able to stop the bad guys bombing Amsterdam, I don't know; there's some kind of justification there.

But let's just apply that to education for a moment. There was a case in 2010 involving a one-to-one laptop initiative, young kids being given laptops to take home. The administrators were turning on the webcams, without the little light showing, while kids were in their bedrooms. Now let that sink in if you're a parent: how egregious an invasion of privacy that is. So, as a metaphor, I think the panopticon is quite a powerful one.

Let's just go onto some examples. I'm going to quickly skip through the Facebook one because it's been mentioned several times today, but this was an experiment, we think funded partly by the US Department of Defense, where people had their Facebook timeline changed, and by the end of the experiment they were posting either more positive or more negative things. This kind of thing has happened before, but it's only ever been monitored by researchers, people doing their PhD research into social or emotional contagion. The difference here is that they were actually setting out to achieve that, and that's really worrying.

Now, people did actually get really angry about this. People I know who usually don't care got really angry, and some of them closed their Facebook accounts over it. Why is that? I would suggest it's because you don't mess with people's emotions. You don't mess with people's emotions because they're what set us apart from machines; our emotions are what make us uniquely human.

So, let me just give some more examples, skipping on from the Facebook one. I want to give three: one has already been mentioned, TOSDR, and I also want to talk about Lightbeam and, quickly, the Web Literacy Map. Terms of Service; Didn't Read, if you haven't come across it before, is at TOSDR.org. It takes those big, long user agreements and classifies them so you can quickly understand, before you sign up to a service, whether it's going to fit with what you want out of it. Are you signing away your copyright? Can you actually close your account if you don't like it any more? Things like that. So rather than people just clicking accept, it visually shows them what it is that they're getting into.

The other one started as a project from my colleague Atul Varma, and it was called Collusion. We got some funding from the Ford Foundation, and it's now Mozilla Lightbeam. You install it as a Firefox add-on, and it allows you to track the trackers, the people who are tracking you across the web, and do something about it. So instead of an abstract notion that people out there are tracking you, you can see who is tracking you and you can shut them down.

And the third one I want to talk about is the thing that I work on: Mozilla Webmaker. This is about taking people from just elegantly consuming the web to actually making it, and helping create the next generation of apps and things. The thing I particularly work on is the Web Literacy Map, a visual way of seeing how to become more web literate: everything from privacy and security through to open practices and sharing.

The final thing I want to talk about is finding a focus point. There's been a lot of talk about data being the new oil, which is a bit of a scary concept if you think about it, especially when you think about fracking. And people just don't care enough about this; they're happy to trade their privacy and their data for free, shiny services. But when you say that it's your kids' data we're talking about, and that it's in compulsory education, where kids are being given Chromebooks and iPads and having to click that user agreement, people start to care; people start to realise that this is my kid, this is the next generation. People get angry, and that's good. We want good anger.

There's a guy, David McCandless, who's behind Information is Beautiful if you've ever seen it; it's a wonderful, wonderful book and website, and he says data's not the new oil, data's the new soil: it's got massive opportunities. So, in this article, which you can go away and have a look at if you're interested, I've got three main questions which we can dive into tomorrow at the Unconference.

Firstly, what are we doing, in terms of all the people represented here, with not just adults' data but kids' data: the next generation of web users, the next generation of people who are going to carry that flag? What are we doing with their data?

What are they doing with our data? Not just Google, but what is the norm in Silicon Valley? What are they doing with our kids' data?

But most importantly, what are kids doing with their data? How are they using it to make the world a better place? Not just giving it over to people, but using that data to try to make the world better than when they came into it.

So we need to identify the problem, we need to name it, we need to visualise it and we need to find the point at which we can change things, and I think the battleground is education.

Thank you.

(applause)

Thank you Doug, thank you.