April 14, 2021

The Disrupters: Using free speech for good and evil

S2 E4 35 MINS
Social media has made it easier than ever to share ideas around the world and galvanize people into action. Host Nelufar Hedayat looks at the double-edged sword of free speech from the perspectives of a social media influencer, a free speech lawyer and two tech veterans who say that today’s tech companies wield too much power in determining what kind of speech should be permissible.

Full transcript

Note: We encourage you to listen to the audio if you are able, as it includes emotion not captured by the transcript. Please check the corresponding audio before using any quotes.

[PERCUSSIVE MUSIC]

NELUFAR HEDAYAT, HOST:

This is Course Correction, a podcast from Doha Debates. I’m Nelufar Hedayat. For season two, we’re focusing on polarization. Each episode I look at one big problem and talk to people from a wide range of views. 

I’m going to keep my mind open to new perspectives with the goal of seeing if there are solutions to these problems. And no issue is more omnipresent than how tech is shaping our language and our lives.

[IPHONE MESSAGE ALERT RING]

NELUFAR: I mean, it’s everywhere. We express ourselves with emojis, and share our data widely by just asking a search engine a simple question. So much of our lives happens online these days and is dictated by how quickly we can consume information, factual or not. 

[PERCUSSION MUSIC]

NELUFAR: This episode is the second of our three-part arc, where we unpack just how much tech influences our lives — no more than right now. Like many of you, I’m having most of my work conversations these days over Zoom. So if you hear a wobble or two, you can thank the Zoom Gods for that. If you haven’t had a chance to listen to our previous episode yet, hit pause on whatever tech device you’re listening to me on right now and go back. I talked to the “godfather of fake news” and to the politicians dealing with the consequences of misinformation on the daily. The thing is, though, tech doesn’t just affect people in power. It affects us, too. Today we’ll press a free speech purist on how far is too far to go.

MAN WITH DANISH ACCENT: Regulating or restricting free speech may not actually solve the problems that you are trying to address, because those who define and enforce free speech are always those who are in power. 

NELUFAR: And we’ll hear from two French entrepreneurs on what we should demand from Big Tech when it comes to free speech. 

WOMAN WITH FRENCH ACCENT: We are at a crossroads where technology’s impact on the world is not necessarily net positive, as a lot of people working in tech want us to believe.

MAN WITH FRENCH ACCENT: They are not trying to make the world a better place. They are just trying to make money.

NELUFAR: Tech shapes how we think. It warps how we feel, and it can even push us to act in surprising ways. And no one knows that better than my friend Shahd Batal. Shahd is so beautiful. I actually find it better not to look at her directly in the face. 

The fact that one ocean and nearly a whole continent separate us means nothing, because we keep in contact virtually all the time. I love Shahd. She’s a queen, and here’s why.

SHAHD: Hey, guys. Good morning. Happy Sunday. Welcome back... I am going to Coachella today. Very last minute. I have no idea. 

[RETRO MUSIC]

NELUFAR: Shahd Batal is a beauty influencer and a hijab-wearing brand ambassador living in Los Angeles. Her influence doesn’t just stop at her keen sense of style or her gorgeous glow.

She’s wielded her power and platform to encourage people to vote, land an interview with her hometown Minneapolis representative, Ilhan Omar.

SHAHD: I have representative Ilhan Omar with me. Today is going to be very light-hearted. I don’t want to take up too much of your time.

NELUFAR: And to speak out against injustice. 

SHAHD: More often than not, the black Muslim community is overlooked. There’s a difference between being a Muslim and a Muslim woman and a woman of color Muslim and a black Muslim woman. There is a difference — I don’t want to hear, “We all go through the same struggle.” No, we don’t.

NELUFAR: Like me, Shahd is a first generation kid. I was born in Afghanistan, a country embroiled in civil war, and left for the safety of the U.K. She was born in Sudan, raised in the United States, and often feels torn between her two homes. 

SHAHD: Well, I do go back home, and I am clearly the American. I’m clearly the foreigner. Every single year, my Arabic is deteriorating more and more. Being a first gen is the one — like I’m one generation from losing all of the culture.

NELUFAR: Shahd knows she’s the link between keeping or losing her family’s language, her family’s culture, and her family’s story. In college, while protesting for Black Lives Matter, she became more aware of her voice and the power of speaking out. At the same time, Shahd would often tell me how her heart was broken as she watched Sudan’s political situation deteriorate. The country ousted its leader, Omar al-Bashir, in 2019, but its transition to democratic rule has been mired in economic upheaval as people demand justice.

NEWS CLIP WITH MAN SPEAKING: The revolution was sparked by a rise in bread prices, fuel shortages, as well as rising inflation. Hundreds of protesters were killed before the military overthrew President Bashir in April 2019. A power-sharing agreement between the military and the forces…

NELUFAR: So what could one 20-something-year-old living thousands of miles away do to help her Sudanese community? She knew the Western version of the Sudanese narrative, the one only sporadically covered in the media, was not the experience that she was seeing on Twitter, Instagram, YouTube and Facebook.

It seemed distorted, disrupted, and so she put her skills as an influencer to work, finding eye-opening protester videos and shareable photos that helped others feel the same visceral connection to the story that she had. Take, for instance, a picture of a woman that went viral: 22-year-old Alaa Salah standing on top of a car, leading a crowd in chants. Take a minute to Google her. That’s A-L-A-A S-A-L-A-H. The video still gives me chills.

NEWS CLIP WITH AMERICAN WOMAN SPEAKING: This photo of Alaa posted to Twitter has gone viral the world over. Some see it as the defining image of the key role women have played in the uprising that toppled President Omar al-Bashir after some 30 years in power. But women’s involvement in protests in Sudan isn’t new. 

[PEOPLE CHANTING IN PROTEST]

NELUFAR: If this had happened 10 years ago, it would maybe have been up to traditional media outlets like 24-hour news channels or the broadsheets to bring it to people’s attention. But Shahd has found a way to cut through mainstream media gatekeepers to be her own brand, and with that, wield her own power to disrupt what she saw as a silent — and therefore complicit — media narrative.

SHAHD: Obviously, like, the Western world doesn’t care about Black people. So they’re not going to pick up the stories until it gets bigger, until there’s a woman that you can put that you know, you can draw and use as a symbol for everything. So, yeah, it was watching in real time. My family just kind of being like, “Yep, this is what’s going on today.”

NELUFAR: Shahd has that power to use social media to put pressure on those in charge, whether it’s traditional news gatekeepers or elected officials. Not that she’s naive about social media. She knows that the same ability to bring attention to injustice can also be used to mask or distort the truth. 

Shahd is also aware that it can be a place for virtue signaling and not action, and how social media could confine us to our personal bubbles, isolating us from new ideas. Being online can sometimes feel like being in a shady flea market, where everyone is trying to sell you their opinion or idea, be it real, a knockoff or a dangerous fake shot. Shahd helped me understand this better. 

SHAHD: The media is known to just glaze over things that are happening far away from here, right? So it would have just been glazed over and nothing would have happened. But I think the power of social media was just the pressure that it was putting on old hats. Perhaps it can feed into your narrative and show you what you want to see. And it is a very problematic space.

But I also — I have a lot of faith and a lot of hope in the direction that people are going, because I’m seeing, you know, Gen Z care more. I’m seeing people fight more. 

It was like, I love — I love the kids. I love the Gen Zers. As my friend says, we’re like — I’m literally the grandmother of the Gen Zs. 

[STRING MUSIC]

NELUFAR: Facebook and Twitter have rules against saying things they deem too extreme. And they’ve been trying to purge their platforms of hurtful and violent rhetoric. Recently, Facebook rolled out what they called their “oversight board,” but it’s commonly known as the “Facebook Supreme Court.” It’s made up largely of lawyers and human rights experts. They’ve been tasked with having the final call on content moderation decisions.

But defining speech as hurtful or even hateful is not a clear-cut thing to do. These days it’s easier than ever to spew hate or conspiracies online and to find an audience ready and waiting to agree with you. And now that hate is leaping out of cyberspace. Divisions are hardening and radicalization is increasing — just like what happened on January the 6th, when the storming of the United States Capitol was spurred on by former President Donald Trump’s lies, the fringe conspiracy theory QAnon and others.

NEWS CLIP WITH BRITISH MAN SPEAKING: The ballots that you said you saw are lying around the place or in trash cans or whatever. Where are you hearing that from? 

AMERICAN MAN: I mean, the videos are going viral everywhere. I’ve seen them on TikTok. I’ve seen them on Facebook.

[STRING MUSIC]

NELUFAR: For today’s challenging interview, I want to speak to someone who, even after seeing the devastation caused by unchecked speech, still thinks society is better with it than without. And that the only way to ensure free speech for everyone is to ensure free speech for everyone, even if this means protecting the rights of right-wing conspiracists, white supremacists or even neo-Nazis. And I think we found just the right person. Jacob Mchangama is a Danish lawyer and an advocate for free speech. 

You could say it runs in his family. 

JACOB MCHANGAMA: My father is from the Comoro Islands in Africa, and when he was a young rabble-rouser, he was imprisoned by the French colonial power and also the subsequent sort of dictatorship on the Comoro Islands. And he came to Europe, and also, since he returned, has been arrested, funnily enough, on hate speech charges for organizing protests.

NELUFAR: Mchangama first got interested in protecting free speech after the Danish newspaper Jyllands-Posten published controversial cartoons of the Prophet Muhammad.

JACOB: That sort of turned into a huge global crisis with deaths around the world. 

NEWS CLIP WITH AMERICAN WOMAN SPEAKING: The publication of the cartoons provoked outrage, attacks on Danish embassies, riots which killed dozens and a boycott of Danish products.

[CHANTING WITH EXPLOSIONS IN BACKGROUND]

NELUFAR: According to Mchangama, the definition of free speech seems to shift depending on your political affiliation. He contends people on the right will say free speech is absolute, but they’ll then make an exception for political speech by Muslims, whom they’ll broadly paint as potential terrorists.

JACOB: And what surprised me about that was that a lot of people, sort of liberal progressives who would see themselves as the heirs of enlightenment values, were suddenly saying, you know, “Free speech is important.” But when free speech offends the religious sensibilities of minorities, we, you know, it’s not really that important a value. And so they were sort of operating with a two-tier free speech structure. 

NELUFAR: This argument felt familiar to me. I’ve seen it dozens of times before in YouTube videos and big debate shows. Self-anointed experts are always so sanctimonious about their right to spout off on anything and everything, no matter the consequences. But if you tried to suggest common sense regulations, then you’re next. Hmm. 

JACOB: And that to me is a disturbing pattern in free speech debates. 

NELUFAR: I told Mchangama how I see this as terribly dangerous. 

NELUFAR: The truth cannot exist outside of the ability to say the truth in defiance of power. Free speech allows you to, you know, play that game of David and Goliath and win, right?

JACOB: In this part of the world, if we talk about sort of the liberal democracies in the West, I think one of the problems is that we take it for granted, and we don’t really see all the benefits of it. And we focus on the harms and costs of free speech because it’s important to mention that those, you know, they are a part of free speech as well.

NELUFAR: I want to make one thing clear. There seems to be a clear demarcation that you’re making between, like, what you call liberal democracies or mature democracies. What’s the opposite of those things and how does free speech fit into them? 

JACOB: Well, take a look at Vladimir Putin’s Russia. That’s a — that’s a great example. 

You know, we have a lot of politicians in liberal democracies who worry about social media and fake news. But look at how social media has fed Alexei Navalny’s campaign against Putin.

This is one guy now in prison, who spread a video claiming to document a grotesque corruption on the part of Vladimir Putin. And it has been shared more than 100 million times. You know, Russians would never see that kind of information in traditional media.

So it’s extremely important to sort of see the difference between being in opposition in Russia and in the U.K., where you don’t end up in prison. And that’s why I think in Western countries, journalists and politicians who are now sort of worrying a lot about free speech and want to have more limits on it, more restrictions, are in certain ways acting a bit privileged.

NELUFAR: Is free speech the same everywhere for everyone?

JACOB: Well, no, obviously, there’s a lot of context to free speech.

But I would say, you know, looking back at history, there’s a very strong relationship between free speech and democracy. So free speech had its origin in the Athenian democracy 2,500 years ago. And, you know, by our standards, the Athenian democracy had big shortcomings. But by the standards of the day, it was a radically egalitarian democracy. 

NELUFAR: What constitutes free speech in one place at one time is, you know, we would look at now and like cringe and be like, oh, my God, what?

JACOB: Exactly. You know, the U.K. is one of my favorite examples, because up until the middle of the 19th century, basically the lower classes were not allowed to publish radical pamphlets that challenged the status quo, because if the lower classes got these ideas into their heads, the social cohesion of class-based British society would basically collapse. And, you know, today we look at that and we say, oh, that’s absurd.

NELUFAR: I guess my feeling is that it has never changed, Jacob. 

The wealthy classes have just been swapped for Big Tech companies and social media platforms, which control, arbitrate and perform these acts of censorship and these acts of maneuvering on free speech. And these tech companies are the new Goliaths. They are the big ones. They are the powerful ones. And I think they have too much power.

JACOB: No, I think there is a huge problem with this platformization of social media. When it comes to free speech, the most important aspect in keeping free speech vibrant is a culture of free speech. And that doesn’t only speak to laws. It speaks to sort of the willingness of everyone to tolerate ideas that we disagree with. And the problem with these platforms is that a lot of people today get their news and share their ideas on private platforms that then regulate this speech according to very opaque procedures with constantly shifting terms of service and community standards. I think we would be better off if we didn’t have these huge centralized platforms.

NELUFAR: I think I see where you and I can work together. We really can. Free speech can only work when everyone has it in an equal way. When there are gatekeepers, like Facebook, like Twitter, like Google, like Apple, then those kinds of bastions of free speech become tyrants because they’re just too big. Free speech can only work if, if it’s democratized, if it’s ubiquitous, if everybody has it.

JACOB: You know, I think there will always be some element of gatekeeping.

But I would say that as a general rule, decentralization has been very good for free speech, and centralization, whether at the level of government or private corporations, has been bad for free speech. And yes, it does also allow bad actors to promote their ideas, and that can have real-life consequences.

Again, the benefits of free speech far outweigh the harms.

NELUFAR: I guess, I kind of want to stand by that point, if I might, for a moment. 

I think we’ve seen enough happen in the last five, 10 years in the Philippines, in Myanmar, in Sudan, in developed countries, to know that actually free speech running wild causes insurrection. It causes right-wing people to co-opt the idea of free speech to mean hate speech.

JACOB: Hate speech is a great example, because how do you — how do you define it? So if you go back to British colonialism, you had laws, basically hate speech laws, that were used against Indian nationalists who were critical of British colonialism. If you go to apartheid South Africa, you had hate speech laws that protected the white minority and its white supremacy system against people like Nelson Mandela, against criticism of apartheid. If you go to Russia, you have hate speech laws in place there that protect Putin’s Russia.

So these kinds of laws have a long history of being abused, and they will always be defined by those in power. 

NELUFAR: So I’m not looking to Putin’s Russia to tell me what democracy means. 

I’m saying we need to start asking the question, because when we don’t ask the question, when we have rampant free speech, you get situations where people think politicians are eating babies and drinking their blood. So, and that causes violence. 

JACOB: Let’s just also be clear that conspiracy theories exist in a lot of countries that don’t have free speech. You know, they’re not something that was invented with the internet. And violent insurrection is not inherently associated with free speech. I think that’s incredibly important to stress.

So, as I said, yes, there are problems, or costs and harms, involved in free speech. But I think, you know, at least if you want to regulate free speech more heavy-handedly, as, you know, is on the books in the U.K. with the online harms bill, then you also have to at least acknowledge that you might also harm certain values. And that regulating or restricting free speech may not actually solve the problems that you are trying to address, because those who define and enforce free speech are always those who are in power.

[STRING MUSIC]

NELUFAR: My thanks to Danish lawyer Jacob Mchangama for that conversation. Though I have to admit, I feel really unsatisfied by it. Sure enough, he made a good, clear case for near-absolute free speech, but I didn’t buy it.

If we as individuals should be allowed to speak our minds, and if freedom to say whatever we want is a pillar of functioning democracy, what about safety? Whose job is it to make sure that speech doesn’t lead to incitement of violence? Is it reasonable to think that governments and regulators can handle this, or should the burden be on Big Tech? These are the questions I wanted to put to my next pair of guests. Both have a long history of working with tech giants on the issue.

Maëlle Gavet is a businesswoman and author who writes about Big Tech’s empathy problems. And Matthieu Boutard is a social entrepreneur who combats cyberbullying through artificial intelligence. He previously worked at Google. Both have been critical of Big Tech’s outsized influence in our lives and what they see as negligence in protecting consumers. Yet they still think tech can help us solve some of the world’s most pressing problems.

So I’m curious to know from both Maëlle and Matthieu how they reconcile the good and the bad.

MAËLLE GAVET: 

I’m a huge supporter of technology. I do believe that, used properly, technology does make the world a better place. I just think that when you look at where we are right now, we are at a crossroads where technology’s impact on the world is not necessarily net positive, as a lot of people working in tech want us to believe.

MATTHIEU BOUTARD: 

I think like when we look back and you know what I’ve said about Google — and it’s true for all the Big Tech — you know, they’ve changed our lives for the good. 

And to me, what has changed over the last years is actually the nature of capitalism.

NELUFAR: Just a small thing then, just a little capitalism. 

MATTHIEU: [LAUGHS] Exactly. Because, you know, the truth is that Google is not a social business. Zuckerberg is not a philanthropist. They need to generate more money and more and more and more money. And to do that, you want to generate user engagement, acquisition and retention. Social networks are very, very addictive.

MAËLLE: I think Matthieu makes a good point. I think one of the challenges that we have nowadays is that rules and responsibilities seem to be a little blurry. We expect big companies, by the way, not just tech, to be taking responsibility for a lot of things which are way beyond what fundamentally companies are usually created for, which is to make money. 

And yet somehow, we have kind of accepted, especially in tech, the idea that, “Oh, they should be self-regulating.” We have all our stakeholders, like citizens, that we have started treating exclusively like consumers, and assume that as long as we feed them cheap products, there’s no reason for them to complain. 

NELUFAR: Why are we so forgiving of Big Tech — when it’s used to disrupt for good? Why do we have such short-term memories when that tech is used to disrupt for evil? 

MAËLLE: I wonder whether it’s not more a question of — tech companies have managed to convince us that this is the price to pay. That disruption is the price we pay for absolute freedom of speech, that all these revolutions and all these massacres and all this bullying and all this harassment that is happening online on social media is the price to pay for people with voices that were never heard before. For these people to have an opportunity to speak. 

And I think what we need to challenge again, as a society, as regulators, if we are regulators, as citizens, if we have voting rights, as users, if we use this platform, is the fact that you can still help underrepresented minorities to be more visible and to get together without necessarily having to also allow Nazi speeches and without like tolerating harassment and bullying. And I think this, this false choice that a lot of tech companies have drilled into our brain is the thing we need to start challenging.

MATTHIEU: I couldn’t agree more. You know, Big Tech keeps on saying, “This is too complicated,” you know, that these issues are really deep and too hard to solve. It’s actually not the case. And the question I’m asking myself is: Do they have good intentions or not so much?

From a technical perspective, the way they have developed their algorithms and the way the technology was built, the artificial intelligence and machine learning, is wrong. The way machine learning was built by YouTube, Google, Twitter and all of them is not the right way to solve social issues. We are actually teaching machines with biases. We are not teaching machines the right way.

MAËLLE: I agree with everything that was just said. I would add a few more points though. First, I don’t think that tech executives wake up in the morning with evil thoughts and are like, “How are we going to be able to disrupt the world and make all these people miserable?” 

But they see freedom of speech as an absolute right, something that they defend with all their soul and all their heart and all their money. And they really believe that, because of that, all the other rights should take a backseat. So things like human dignity or safety, for example, are important, but in their mind, not as important as freedom of speech.

NELUFAR: But that’s the same as being a fundamentalist. 

MAËLLE: Exactly. Exactly. But again, we have to be more nuanced, and “they’re just doing it for money” — I mean, for sure, money has a huge impact, but it goes beyond that. It goes to this idea that freedom of speech is an absolute right. In general, these are companies where the decisions are being made by engineers, and engineers tend to think in terms of technical problems. So as Matthieu was just explaining, they think about what is the best way to optimize the algorithm and they don’t always think about the human impact. 

NELUFAR: OK, yes, OK. I accept they are people, bearing in mind they’re probably predominantly white male people.

MATTHIEU: Exactly. The numbers are actually shocking. Only 20% of tech is women — it’s only 20%, and only 2% of tech are people of color. And they are the ones making decisions on a daily basis about what to moderate and what to keep or remove on the platforms. That is actually very scary to us.

NELUFAR: It is a scary thought, free speech engineered by such a homogenous group. 

So now I wanted to know, what’s next? 

NELUFAR: Who should be responsible for curbing online hate and conspiracy theories, for curbing things like QAnon and radical behaviors and radical thinking? Like, should it be a world police? Should it be Interpol? Should it be governments? Should it be groups within Google and such? Like, where do we want to start from?

MAËLLE: I have a fairly nuanced view on all of that. What I’m advocating for is to go back to where we were not that long ago before tech existed, where there was a hard-fought, hard-won balance between entrepreneurial capitalism with companies that would go and innovate and try things and learn from their mistakes, and governments that would regulate by giving guidelines that need to be followed. 

For example, if you look at what exists in Europe, which, I mean, it’s not perfect, but we do have free speech. And yet, following World War II, Western European democracies have implemented a series of measures that, while protecting free speech, ensure that it isn’t absolute and take into account other values and rights, such as, as I mentioned earlier, human dignity and safety.

So in Germany, in Austria, in France, for example, Holocaust denial is punishable by law. Germany has banned parties with Nazi ideologies. So we should be able to have a regulator just giving broad guidelines. But it doesn’t mean that they have to regulate every single post.

MATTHIEU: You know, I think I would agree with Maëlle. The first question I’m asking always is, “What is hate speech?” Because the definition differs from country to country. It is always debated in courts. 

MAËLLE: I have to say, though, in a realistic way of looking at the world, we can’t expect every single tech company to be a nonprofit. We actually want for-profit companies, because this is the way we’ve pushed a lot of innovation in the world. And so we’ve got to be careful not to move to the extreme and say, oh, every tech company should be a nonprofit. I wouldn’t want a world where there are only NGOs. But I also don’t want a world where there are only capitalist companies.

It’s just, it’s a balance. We always have to think about anything in life in terms of balance. And I think we’re just out of balance right now. 

MATTHIEU: Yeah, completely out of balance. And I agree. What I’m very sad about is the billions and billions that Big Tech are making, and how much they give away to NGOs, how much they give away to solve these issues. And that’s not much, that’s not enough. And that’s where I want them to share a bit more. This is where I want them to be more inclusive. And this is where, to me, there’s a huge lack. We can do a lot more. They can do a lot more. That’s the French in me talking here: we talk about taxes, we talk about employment, we talk about diversity — and this is not enough. And we can do a lot more, a lot more.

[PIANO MUSIC]

NELUFAR: Thanks to Maëlle Gavet and Matthieu Boutard for that conversation.

This season of Course Correction is all about polarization and trying to build bridges to overcome our fractured ways. After talking to my guests, I realized: Big technology companies themselves are disrupting our speech. We’re trying to have complex conversations with limited character counts in a noisy medium, where the algorithm amplifies and oversimplifies, creating black-and-white viewpoints, and where users can lash out hatefully, hiding behind the shield of digital anonymity.

Human beings are being subjected to algorithms that decide what we are fed in our news feeds. This is short-sighted. It robs us of discovering ideas from different cultures and creeds. Innovation happens best when people mix together to learn and share.

It seems as though the entire world is being taught to never go offline. This is intolerable. We’re spending too much time sitting in silence, watching something awesome or mundane. It doesn’t matter, just as long as we’re connected to it. 

Reflection and introspection have given way to the need to always be filming, snapping or sharing. Even the term “free speech” itself feels somewhat outdated for this whole discussion. Sure, we can say what we want, but the question today is: Who gets to control how we’re heard and how far our voices can reach? Our speech has been forever changed by these new systems that we have to live with.

The last time something like this happened was when we went from sharing stories verbally to writing and then printing them out. That changed us forever. We have yet to reconcile humanity’s need for communication with tech giants’ need for profits, and the awesome power individuals have to disrupt speech for good or for evil. 

That’s the people versus Big Tech. But what happens when we’re not fighting algorithms, but rather mob mentality on what’s socially permissible? Well, that’s next week. We’ll look at “cancel culture” and how tech has changed how we behave with one another.

[CREDITS]

Whoo! That’s our episode. What did you think? Exercise your right to free speech on the internet and tweet us at @DohaDebates, or tweet me: I’m @Nelufar, and I always really love hearing from you. So get in touch. And here’s one more request: Please write us a review of the show. It really helps spread the word about what we’re trying to do.

Course Correction is written and produced by me, Nelufar Hedayat. Editorial and production assistance comes from Foreign Policy, with producers Sarah Kendal, Sofía Sánchez and Rosie Julin. The managing director of FP Studios is Rob Sachs. The show is brought to you by Doha Debates, which is a production of Qatar Foundation. Our executive producers are Japhet Weeks, Amjad Atallah and Jigar Mehta. Join us for the next episode of Course Correction, the final episode of our three-part arc, wherever you get your podcasts.
