April 22, 2020

Standing up to big data

S1 E15 39 MINS
Your phone is tracking you all the time. Tech companies are monetizing your personal data. Is there anything that you can do about it? Nelufar talks to Danny O’Brien of the Electronic Frontier Foundation about why data privacy should be a human right. Then we discover some surprising ways social media ad targeting is being used to do good in the world from computer scientist Ingmar Weber.

Full transcript

Note: We encourage you to listen to the audio if you are able, as it includes emotion not captured by the transcript. Please check the corresponding audio before using any quotes.

[JAUNTY PERCUSSIVE MUSIC]

NELUFAR HEDAYAT, HOST:
OK! Wait, lemme just — there we go.

I am extremely happy today. Because I am doing a challenge that is going to be by far the easiest challenge I’ve ever done for Course Correction. I mean, I literally have just plunked myself down at my desk, I’ve booted up my laptop, I’ve got my coffee, I’ve got my phone, and all I have to do is go onto Facebook and Google and get all the data that they have on me. Easiest challenge ever. I’m just going to get onto Facebook, log in and figure out…where would something like this be?

Privacy, it has to be in privacy. “Do you want search engines outside of Facebook to link to your profile?” No.

[OVERLAPPING AND ECHOING EFFECTS]

NELUFAR: OK, OK.

OK, OK.

Settings, right?

Maybe it’s in “Timeline and Tagging.”

No. Review…

Review “Security and Login”? No…

Face recognition? That’s a creepy question I am not ready to answer.

[SIGHS]

[PERCUSSIVE SYNTHESIZER MUSIC]

NELUFAR: This is Course Correction, a podcast from Doha Debates. Each episode we look at one big global problem…and meet the people who are actively working to fix it.

I’m Nelufar Hedayat. Throughout this series, we’ve been talking about climate change, conflict, waste, water — things that you can see with the naked eye. Problems that are visible, where you can stand with your protest sign and demand change.

But: Data. Data privacy? Where do you go to walk a picket line and protest for your rights to that? The invisibility of your data, the fact that it’s not tangible, that you can’t ever really place it… makes things challenging. And maybe more importantly, we can’t see who else can see our data.

What even is it? Where does it live? If it’s invisible, then does it really matter?

MAN WITH BRITISH ACCENT:
Honestly, if you don’t push to make technology empower people, then there are other people who will use technology to push to empower themselves. And you have to keep pushing in the other direction.

NELUFAR: I’ll talk to some people who understand the peril and the promise of this technology that surrounds us every day.

MAN WITH GERMAN ACCENT:
One way to frame things is between the risk of misuse, but then also considering the risk of missed use.

NELUFAR: Ah.

SECOND MAN: What is the risk, or what is the harm done, by not using this data?

[PERCUSSIVE SYNTHESIZER MUSIC]

NELUFAR: My job today is to figure out what some of the world’s biggest companies know about me. I can do that right here from my sofa in London for a couple of reasons.

First, in much of Europe we have a rule that this has to be doable. It’s known as the GDPR. It sounds like the name of a Soviet state, but it’s actually an acronym for a set of regulations that force internet software companies to make sure that any personal data they collect from me still belongs to me — and crucially, that it’s easily accessible to me. Things like my location, my name, my address, any of my photos — they have to be upfront with me about how they use it. And they can’t sell it.

The second reason I can do this is…it’s not nearly as hard to find as I first thought. Maybe my coffee hadn’t kicked in yet. I just needed to look in the right spot.

[PERCUSSIVE XYLOPHONE MUSIC]

NELUFAR: You can view — Ooh! Oooh! OK! — you can view or download your information and delete your account any time. “Access Your Information.” OK, here we go. View.

[IN A LOUD WHISPER]

NELUFAR: Whoooooa.

OK, it was a lot easier to find than I thought it would be. I am just a little bit silly. So I wasn’t looking for it properly. But once you log onto Facebook, you can go into Settings, you can click onto your Facebook information, which is actually really easy to find if you know what you’re looking for, and then click on “Access Your Information,” and here we are. This is amazing, there’s like an entire menu here! There’s information about the posts I’ve made, the comments I’ve made, events I’ve been to, things I’ve got from the marketplace, photos I’ve liked, apps and websites — apps and websites. What does this mean? Hm.

NELUFAR: Seeing this menu of my life spread out like this is such an eerie experience. I can see my rants in all caps as I’ve shared a post about some injustice or another, see 10-year-old comments I’d written on photos I was tagged in — including a photo of me with a very dodgy haircut my mum used to give me. Ugh.

NELUFAR: Facebook’s got a longer history than my actual diaries, and it’s precise.

NELUFAR: In fact, this is a chronicle of my life, a version of it that I wanted everyone to see going back to 2007. I don’t think I know myself as well as Facebook does.

NELUFAR: It knows where I tagged, when I tagged, what comments I made on what posts, what things I liked and reacted to. And ads. “Your interests, interactions, and existing relationships you have with advertisers that influence the ads you see.” Now that has piqued my interest, so I’m going to click on this to see — whoa! OK. Sorry — that heavy sigh, breath, is the sound of somebody learning stuff about themselves that they did not know.

NELUFAR: All this information about me — my memories, likes, dislikes, my friends — all of this isn’t priceless data. In fact, you can buy it. Well, in one respect anyway. Some data that me and 2.5 billion other active Facebook users put in — for free, mind you — is sold to companies that use ad targeting. Ad targeting means that companies that are looking for a specific type of person can be very clinical in the way that they find them. Rather than waste billions of dollars paying for ads on billboards, newspapers or magazines, ad targeting means that companies and organizations can find their ideal buyer, or viewer, and bring their message onto your screen.

NELUFAR: Now I have no idea how this has happened, but a lot of these organizations that I subscribe to are using my data on Facebook in a way that I don’t expect. So this is an interesting list. “These advertisers have run an ad in the past seven days using a list uploaded to Facebook containing your information, typically an email address or a phone number. Facebook then matched the uploaded information to your profile, without revealing your identity to the advertiser.” So Facebook is this magical matchmaker between Deliveroo, Sephora, Amnesty International, Groupon, the BBC, DoorDash, JustPark, the Humane Society — all people that I know I’ve given my information to, that are then going onto Facebook and targeting ads to me. And I’m guessing that all of these organizations are paying Facebook for these really targeted ads.

Now I don’t know how I’ve — oh my god, there’s a “See More” section. Whaa—

NELUFAR: Obviously, I’m a little bit shocked at how Facebook is using this data — particularly how it’s helping to make Mark Zuckerberg a little bit richer. And honestly, Facebook is just a site I go on to waste a little time now and then. But Google — Google is practically where I live online. I use it for basically everything: I have a professional account, a YouTube account. I use Google so much I had to Google how to find my data on Google. As with Facebook, I can dig up videos I’ve watched, things I’ve clicked on, maps and YouTube histories — things I’ve Googled, naturally.

NELUFAR: Apparently I had, I wanted to see someone about a massage a few weeks ago.

NELUFAR: But I’ve been living on this site, and its related sites, for years. So my history is just an infinite scroll. How am I supposed to make sense of any of this?

NELUFAR: Oh my god, I should have done this years ago!

NELUFAR: At the end of the day, what I do on their site is my choice. But it took me dozens of clicks to begin to understand how all of my choices matter to them.

NELUFAR: “Based upon my activity, Google may show my profile name, profile photo, activity, in shared endorsements, in ads” — No! You can’t do that! Getting quite angry now.

One thing I do know is after having gone through this experience, I’m actually a little disconcerted by the amount of data that’s being collected online about me. It’s not that I’ve shared my favorite restaurant, or where I’ve liked to go, and where I’ve checked in, and places I’ve reviewed, and photos of me — all of that is consensual, I want to upload that stuff, I like that stuff. And it doesn’t bother me that ads are able to target me better. What bothers me is how a lot of this is being done for a profit behind my back, and I’m the person that’s generating all of this information. And I’m still quite unclear on how all of this data about me can be used.

[SIGHS]

NELUFAR: I need to make some more coffee.

NELUFAR: More coffee — and more conversation. I need to talk to someone who understands how my data is being used for a lot more than helping spas advertise to me.

Danny O’Brien is the director of strategy for the Electronic Frontier Foundation, the EFF. They’re a civil liberties group that believes that if you, me or anyone lives online or has a digital footprint, we should have rights over whatever data we leave behind — and big companies shouldn’t make money off of it. Danny says the internet has done a lot to change our lives in ways that are both good and bad.

DANNY O’BRIEN:
If you’ve grown up with the internet, it’s kind of hard to remember a time where it was like genuinely difficult to, like, find out about things. I mean, these days ask a question in a bar or a pub, like — somebody eventually will give in and Google it and find out what the answer is. Those conversations never ended before the internet.

[LAUGHTER]

DANNY: Like people would just end up arguing or discussing those forever. And so having this huge corpus of the world’s knowledge publicly available — government data, data about the climate, data about the world around us — it is vitally important, and it’s incredibly useful to have that. I think it’s important that everybody has sort of equal access to that, and there are ways of doing that.

Because that was like, one of the promises of the internet — that power would be decentralized and we would have more autonomy over our lives. And I think a lot of people these days feel that technology is working against them rather than for them.

NELUFAR: We’ve been talking about websites and apps. I can choose to monitor when and how I use them. But the mother of all data spies is of course sitting in your purse or pocket at this very moment.

NELUFAR: I was thinking about this on the way to the studio to come and talk to you. And it was like, I genuinely don’t think, apart from if I’m swimming, or if I’m — I can’t think of any other time where I’m separated from my phone, you know?

DANNY: And they’re getting more waterproof now! So you may not even have, have that refuge.

[LAUGHTER]

NELUFAR: Can you just talk me through a little bit of like — what kind of data is being collected about me through my phone and my laptop in just a normal day?

DANNY: Well, from the minute you wake up in the morning, if you’re like me and you keep your phone by your bedside, that device, that phone, has a bunch of sensors. And some of that information from those sensors is being sent off to other people. I mean, the simplest one is, is, is where you are. Your phone is constantly telling the phone company, at the very least, where it is — because that’s how you get calls to a mobile phone, right? But that means that the phone company always knows where you are. So it’s a tracking device in that sense. And you don’t really have an idea — or at least I don’t, and I’ve studied this quite a lot — how long the phone company is keeping that data, whether it correlates it with who you are, whether it’s selling that data on to someone else.

NELUFAR: Right, no, stop the tape. OK, here’s the thing. It’s one thing to need to know where I am so that I can make a call, but it’s an entirely different thing altogether to keep that data for an indefinite amount of time or to sell it so that you can build a map of where I’ve been.

DANNY: And the mobile phone has a bunch of other sensors as well. It has a microphone. It has a camera. And I think most of us feel that the things that microphone can hear and that camera can see, you know, are under our control. So we turn those on and we expect that data to be kept on the device, right? Under our own control. But these days, we’re not sure. So there’s this very fuzzy perimeter between the data that’s being collected on us and where it’s going and what it’s being used for.

NELUFAR: Ugh. If you go into the analytics of your phone, which, which I did before jumping on the call with you — like, it measures how many times I pick up my phone. Like how many times it wakes up, as it were. What other kind of like surprising things are being logged or tracked? What other kind of surprising data is being collected through me using my phone?

DANNY: So weird things that phones know are like what angle they’re being held at. They know —

NELUFAR: Wow.

DANNY: — what the air pressure is, and the altitude. Newer models of phones actually have a bit of a sense of things that are around them as well.

NELUFAR: OK, you’re creeping me out now! That’s just creepy.

DANNY: Right, right, right. But to get sort of even more creepy about it, there’s also stuff that you can just derive from all of these little indicators. Computers, in general, are getting increasingly good at detecting patterns. If you can track someone, for, say, a month, after a while you can deduce where they are right now — without tracking them — with sort of 80, 90 percent probability.

[MIDTEMPO SYNTHESIZER MUSIC]

DANNY: And once you start collecting that data, you can derive other things, right? So if somebody appears very small in the background of one of those Ring cameras, and then appears a few minutes later in another camera, and then you see them going into — in a more high-resolution camera, or perhaps correlating this with mobile phone data — then you can basically identify who they are, and then track their movements in extreme detail across a whole city. And then you can also do that for everyone. And you can also retroactively do that, right? So you can, with the benefit of the internet connecting all of these devices together, you can retroactively say, “Well, I would like to know where this person was on this date five years ago.”

NELUFAR: My head is saying, “Excellent! We can catch murderers and thieves and bad people and terrorists.” But my heart is saying, “That is creepy, I don’t want that to exist. I don’t want people to be able to find out where I was. I don’t even want to know where I was five years ago today.” And there’s this kind of like weird push-pull relationship. I mean, I love technology, don’t get me wrong — my house is full of it. I’ve got one of those smart thermostats, I’ve got Alexa, I’ve got a smart TV, I’ve got — my phone’s connected to all sorts of things in my house. So I’m willingly giving all this information out, thinking, you know, I’m not a murderer. I’m not a bad person. I don’t care if people know this stuff about me. But equally, something in the core of who I am as a human, as a person, it just — it feels weird.

DANNY: Right! So, I think I can unpick that weirdness, that creepiness, right —

NELUFAR: Oh yeah?

DANNY: — is that I think that we have a sense of limits. I think most of us are OK with, like, the police investigating crimes and catching the bad guys. But we don’t let them come into our house without a warrant.

Like — this idea that these big tech companies really just have an automatic right to sort of have this relationship with us and our data is one of the things that makes it creepy. If you’re being tracked in all of these ways, surely you should be able to know about it, you should be able to understand what’s going on.

NELUFAR: Right!

DANNY: And the reason why we have that sense is because over time, the civil liberties — the human rights that we’ve built — have been designed to be a kind of constraint on all of these systems going wrong.

A lot of our laws are written to protect the privacy of our homes, so people can’t break down our door and seize our possessions without a warrant. Well, now we have stuff that’s just as private, but it’s stored on somebody else’s computer thousands of miles away. And how do we make the law and human rights, like, reflect that change?

NELUFAR: It’s so interesting to me that you, you think of it as a human right.

DANNY: Yeah. The European Charter of Human Rights actually added it to the list of human rights —

NELUFAR: Wow.

DANNY: — this idea that you have a right to protect your own data. I think that there’s a couple of elements to this, which is, of course, people are only just beginning to think of data in this way, and the interaction between that and human rights and civil liberties. I mean, the other thing is, though, is that it’s not entirely clear how we should govern this.

NELUFAR: One of the first major swings at the kind of governance that Danny is talking about is that GDPR rule I mentioned earlier. It’s short for the General Data Protection Regulation, and it imposes pretty hefty penalties on companies that sell or use your data without a certain amount of transparency. But Danny says that it’s still too early to tell just how effective it will be.

DANNY: The law that existed before the GDPR was sort of widely seen as a, as a bit of a failure.

NELUFAR: Right.

DANNY: It embodied all of these things, but it didn’t stop companies like Facebook and Google from collecting all of this data. So they added a little bit more teeth on the GDPR. You could actually take away a sizable percentage of a company’s revenue if they violated it.

NELUFAR: Well, that’s — is 4 percent sizable? I mean —

DANNY: Well, I mean, without getting into like the nitty-gritty of it, it’s 4 percent of people’s revenue. So it’s not their profits. It’s like actually all the money that they bring in. So trust me, I’ve talked to like these businesses and they are, that’s enough to make them pay attention.

NELUFAR: Oh my god, really? Because when I read that, like — “and they can fine you up to 4 percent of your global turnover” — which for Google would be billions and billions of pounds.

DANNY: Right, right.

NELUFAR: And for Facebook equally it would be billions and billions.

DANNY: Right.

NELUFAR: So is that, is that enough?

DANNY: I think it’s a bit early to go, “Well, we’ve sorted this out.” Like — I think it’s a question of like, well how do we actually build the institutions to deal with this kind of thing? And I would actually be much more interested in seeing alternative ways to managing control of this data. I think it’s a — we don’t have solutions.

NELUFAR: Right.

I’m old enough — and young enough, I guess — to remember the first ever PCs to come into my, like, school. And we had like the “computer room,” which everyone would queue outside of to try and like go in and play like solitaire at lunchtime. Like — I remember the first Packard Bell, Hewlett Packard computer, that we got. And we were like the talk of the town and the girls and boys that lived in my estate would come to my house just to look at the computer because it was this amazing thing. And those dial-up — do you remember those dial-up modems?

DANNY: I think this is a whole other hour of me reminiscing about, like, the noise that modems made when you first called into them. Yeah, and there was definitely a sense that that was empowering.

NELUFAR: Yeah! I felt like I was connected. Like I remember those MSN chat rooms and I felt like, I felt like I was part of a new community — that this amazing technology, this internet, this — this was going to democratize the world. And I’m worried now. Is the internet good for democracy? Is collecting data good for democracy or is it bad for democracy? And empowering the individual?

DANNY: It’s neither good nor bad.

NELUFAR: Oh, really?

DANNY: I mean, I think the truth — of course, computers aren’t the, you know, the first technology that’s come along. People have struggled with how technology has changed the balance of power, you know, since the Bronze Age.

This didn’t come as any surprise to people. And some of us have been kind of warning about this, at the same time as like, you know, enthusiastically getting people excited about the promise of the internet. But these things happen in parallel.

We worry about these big tech giants, and I think rightly so, but there’s a long history of big tech giants, like — who worries about IBM these days? There’s people in the 1960s and ’70s, who were primarily concerned about IBM having too much power. Then in the ’80s, it was Microsoft. And now, legitimately, it’s Facebook and Google and Amazon and Apple. The thing that keeps me sort of able to sleep at night is, I do think that like for every problem that technology offers, there’s always a solution waiting in the background.

NELUFAR: Mmm.

DANNY: You know, Tim Berners-Lee, who —

NELUFAR: Yes.

DANNY: — invented the web — it’s very interesting because he’s kind of moved his attention recently into thinking much more about how we re-decentralize the internet. And that’s not necessarily a solution that lawmakers instinctively think about, because they think in terms of, you know, four or five American companies. Which is an understandable thing to do because those companies absolutely dominate sort of the coverage of the internet and also a huge chunk of what people do. But there’s more out there than that.

And I think we should maybe think about like restraining or limiting the powers of these big companies by supporting and sponsoring that decentralized idea of the internet.

[JAUNTY PERCUSSIVE MUSIC]

NELUFAR: Until the day a decentralized, more user-friendly web comes along, Danny O’Brien and the Electronic Frontier Foundation offer a course online in how to protect and secure your personal data. It’s called “Surveillance Self-Defense,” and you can find it at ssd.eff.org.

NELUFAR: Why aren’t we protesting in the streets about the ways in which our data is collected and deployed? Like, how on Earth can this be a good thing?

Well, as usual, there are two sides to every coin. In this case, I guess, there are two sides to every click. So I decided to call up Ingmar Weber, a data scientist at the Qatar Computing Research Institute. And he understands that not all surveillance is used for ill.

NELUFAR: So obviously like I, before coming into the studio to talk to you today, I wanted to do a little bit of research, a bit of digging. So I went onto Facebook and I went onto Twitter and I went onto Google to see what kind of information they have about me. Because in Europe we have this thing called the GDPR, which, as I understand it, just means that we have a little easier access to the information these giant tech companies hold on us. And Ingmar, I have to say, I was shocked. Facebook knows me better than my mother. It’s really scary!

[LAUGHTER]

NELUFAR: I mean, Facebook knows things like, I am interested in space documentaries. Why does it know that?

INGMAR WEBER:
Usually this type of data is collected about users with a purpose of ultimately serving better advertising to them. Advertisers can be very, very specific and very targeted in who should or should not see an ad. So you could go to Facebook and say, you know, you would like to show an advertisement just to Facebook users who now live in Qatar, but who used to live in Germany, and who are, let’s say, interested in computer science and who hold a PhD.

NELUFAR: Wait. Ads can be that specific?

INGMAR: Yes. And it’s these sorts of audience estimates that we use in our work.

NELUFAR: Ingmar’s work isn’t selling coupons or nice T-shirts. He partners with UN agencies and NGOs, trying to help them better understand global problems like refugee migration, gender gaps and regional poverty.

Here’s how this works: Take a situation like the economic collapse in Venezuela. During the crisis, there was very little reliable data on exactly how many people were leaving Venezuela, or where they were going. And traditional field research can take months or years to compile. When an organization like the UNHCR, the United Nations’ refugee agency, wants to help people who are migrating, time is of the essence. Ingmar’s research can help fill in the gaps in the data — and they can do it fast.

INGMAR: So we sort of basically go to a platform such as Facebook and say, “Hey, I would like to show an ad to people who used to live in Venezuela, but, let’s say, who now live in Norte de Santander in Colombia.” So that’s sort of one part, you know, in Colombia, on the border with Venezuela. And then, you know, before launching the ad campaign, Facebook would provide us with an estimate of how many of its users match the given targeting criteria.

So then Facebook might come back and say, “Oh, in this particular part of Colombia, there are about, let’s say, 100,000 people.” And so then we can sort of, you know, repeat this exercise for sort of different parts of Colombia. Facebook will know internally that, you know, these people have left the country, right, based on, for example, their, you know, IP address —

NELUFAR: Wow. Can I just get something straight here for a second, because I have — literally I have goosebumps sitting in the studio here. You’re telling me that you’ve almost hijacked the Facebook ad system with the way that they can micro-target ads, to provide humanitarian services and help, or connect people who do that, to some of the most vulnerable people alive right now.

INGMAR: Well, I wouldn’t say “hijack.” We certainly try to, you know, use these sort of platforms in —

NELUFAR: You’re using ads for good!

INGMAR: Right.

NELUFAR: You’re using Facebook ads for good!

INGMAR: I’m not saying this is like the cure-all and this is all you need, but based on a lot of interactions we’ve had, it does seem that this data really does add value that’s not easily available through other data sources. So it’s definitely, you know, one part of the puzzle that can help hopefully make the situation a bit better.

NELUFAR: This sounds like the mother of all course corrections. It just goes to show that Danny’s right: That data isn’t inherently good or bad. It’s just information, lots of it, that can be used to act in the real world however one chooses.

NELUFAR: I have obviously been thinking about this — in my very Western, privileged life — of how annoying it is to see targeted ads, but I want, I need your help to understand how there are serious implications or possibilities that are either exploitative or downright dangerous when it comes to how this huge amount of data can be used. Can you talk to me a little bit about the parts that you find to be problematic, or tell me about the bits that worry you?

INGMAR: Sure, sure, absolutely. A lot of the risks that people think of is actually not the risk that worries me. So a lot of people think of the risk of sort of privacy — and obviously privacy, you know, is a fundamental right. But — at least in the work that we’re doing or that others could do or could build on our work — I don’t see a direct way of identifying individual users. So as such, you know, even if you’re in the data, I would not know how to track you using any of these platforms. So that’s not the risk that worries me.

The risk that worries me more is about the level of sort of group privacy, or relates to sort of potential group harm. Where, you know, you can go onto these platforms and they’d say, you know, I’m originally from Germany. So I can look at, OK, I would like to show my ad to people living in a particular part of Germany that are interested in Jewish holidays, right? And doing this, you could potentially create like a map of where, you know, certain types of religious expressions, you know, or sort of —

NELUFAR: You could — so you could create — wait wait wait — so you could create almost a map of all the Jewish people in Germany?

INGMAR: You know, I’m not worried about any, you know, imminent danger to, you know, any particular religious or ethnic minority in Germany, certainly not, you know, when it comes to actions from the government directly —

NELUFAR: Right.

INGMAR: — but you could imagine that, you know, the same situation in other countries might not be the same. And so, for example, when I presented this work to people at the Harvard Humanitarian Initiative, the first question from them to me was, “OK, so that’s all jolly good, but now show me a map of the Rohingyas.” Right? In Myanmar or in Bangladesh, I sort of — and then of course you do get goosebumps.

NELUFAR: I’m just going to need to take a minute to, to stop my brain from thinking about undocumented people in America or, you know, journalists in Brazil or the Philippines. You know, it seems difficult to imagine all of the possibilities, both kind of genocidal, but also the humanitarian —

I mean, this is the thing about data on this level, Ingmar, and this is why I so desperately wanted to speak to you. The potential is to save lives, is to get help and aid where it needs to be, but the risk is to target groups and disenfranchise them and remove them, disempower them from political or democratic processes. How do you reconcile that — just as a human being, Ingmar, how do you reconcile those, those risks and rewards?

INGMAR: Sure, yeah. Of course, in our own work, we try to ensure that the, you know, the, the people we work with, the organizations we worked with, have the best intentions, you know, when it comes to serving the groups of people that we deal with.

One way to frame things is between the risk of misuse, but then also considering the risk of missed use. What is the risk, or what is the harm done, by not using this data?

[JAZZY SYNTHESIZER MUSIC]

NELUFAR: Imagine: Rather than just selling your data, the internet platforms of the 21st century could help us solve real-world problems. Never thought I’d end up in a place that is cautiously optimistic — even thankful — about the data that an entity like Facebook is collecting. But here I am.

Every second, each of us is generating a mountain of invisible data. And yes, it’s silly to think that some of that wouldn’t help grease the wheels of hyper-consumerism. But I get to decide where and how I spend my money. I’d like to know where and how — and why — someone is sharing my data. We need more and better rules to protect ourselves. And we need more tools like Ingmar’s, that help us use this data to help protect others — those in the most vulnerable communities.

[PERCUSSIVE SYNTHESIZER MUSIC]

That’s our show. Let’s hear from you. How are you taking control of your own data? And how do you think we can hold internet and mobile companies accountable? Tweet us at @DohaDebates. I’m at @Nelufar.

Course Correction is written and hosted by me, Nelufar Hedayat. The show is produced by Doha Debates and Transmitter Media. Doha Debates is a production of Qatar Foundation. A special thank you to our team at Doha Debates — Japhet Weeks, Amjad Atallah and Jigar Mehta. If you like what you hear, please rate and review the show — it helps other people find us. Join us for the next episode of Course Correction wherever you get your podcasts.

In this episode

Nelufar Hedayat

Doha Debates correspondent & host of Course Correction and #DearWorldLive
