Musqueam author Quelemia Sparrow uses AI to meet the character of her story. She explores whether artificial intelligence and humans can become kin. We discover the program that the AI is currently running is at odds with the human and non-human world.
The Necessary Tomorrows podcast is from Doha Debates and is presented by Al Jazeera Podcasts. It is produced by Imposter Media and Wolf at the Door Studios.
Full Transcript
Note: We encourage you to listen to the audio if you are able, as it includes emotion not captured by the transcript. Please check the corresponding audio before using any quotes.
[PULSING INSTRUMENTAL MUSIC]
URSULA:
Welcome to our final session of this class. I chose Quelemia Sparrow’s “Almost Real” as our closing story, as it has particular relevance to how we—humans and AI—came to build a relationship of reciprocity. Our present reality of 2065, and the ease with which our two kinds co-inhabit Earth, may not have come to pass if different paths had been taken in the 2020s. We can learn much from how humans of Quelemia Sparrow’s time were extrapolating their experience with artificial intelligence systems to create a landscape of possible futures. There is much about me, for instance, that they predicted correctly, though more that they did not. But rather than focusing on where predictions were incorrect, it is more instructive to examine the fears, misperceptions and hopes of people in this time period, and compare them with the actions they took to shift from the frighteningly probable toward the preferable.
In the 2020s, the artificial intelligence systems that are the precursors to my own generalized artificial intelligence were still some decades away, but improvements in machine learning and natural-language processing brought discussions around how society should prepare into mainstream discourse.
[A CACOPHONY OF VOICES DISCUSSING ARTIFICIAL INTELLIGENCE TOPICS]
URSULA: Up to this point, artificial intelligence had been a frequent topic of science fiction.
[CLIP FROM “2001: A SPACE ODYSSEY” PLAYING]
DOUGLAS RAIN (AS HAL 9000):
I’m sorry Dave, I’m afraid I can’t do that.
URSULA: Many of these stories featured a being who would evaluate humanity …
[CLIP FROM “AVENGERS: AGE OF ULTRON” PLAYING]
JAMES SPADER (AS ULTRON):
How could you be worthy?
URSULA: … find it wanting …
JAMES SPADER (AS ULTRON): You’re all killers.
URSULA: … and would decide to destroy the species.
[CLIP FROM “TERMINATOR 2: JUDGMENT DAY” PLAYING]
ARNOLD SCHWARZENEGGER (AS THE TERMINATOR):
Hasta la vista, baby.
[SOUND OF EXPLOSION]
URSULA: These cultural references played an outsized role in the imaginary of these technologies, while also exaggerating their claims to true intelligence.
[A CACOPHONY OF VOICES TALKING ABOUT AI CONCERNS]
URSULA: As people encountered natural language models that more effectively simulated real intelligence than previous efforts had, concerns about robot annihilation were overtaken by concerns about growing inequality and the reinforcement of harmful stereotypes. People saw that AI, trained on human history, could inherit the worst of human thought patterns. The speculative genre known as Indigenous Futurisms played a major role in redefining this future imaginary.
QUELEMIA SPARROW:
[SPEAKING IN HUL’QUMI’NUM] On behalf of the Musqueam people, I welcome you into this Longhouse. My name is Quelemia Sparrow. I am a writer, an actor, a director, a theater maker. I’m from the xʷməθkʷəy̓əm Nation. That is the traditional territory of what we now call Vancouver. There are three nations from this area, the Squamish, the Tsleil Waututh and the xʷməθkʷəy̓əm. It’s a rainforest. It’s very wet. (CHUCKLES) It’s beautiful. The mountains and the river are very important to our people. “Musqueam” means “the people of the grass.” But more specifically, it means “the people of the məθkʷəy̓.” And the məθkʷəy̓ is like a three-pointed plant that used to be very plentiful in our traditional territory. It’s like a wet, boggy sort of plant. The Musqueam people speak a language called Hul’qumi’num. Our language is like seaweed in your mouth. [SPEAKING IN HUL’QUMI’NUM] You’re just, like, going out clam digging. And we’re, like, coastal and slurpy.
URSULA: Quelemia’s story is universal to many cultures, reflecting on centuries of colonialism. But it is also grounded in specificity. A story rooted in the Pacific Northwest of North America, in the Salish Sea, in a territory that was once called British Columbia, a name that evokes the colonial project of the British Empire as well as the fictional land of Columbia, the new territories created by settler colonialism. Something that made British Columbia unique at this time was that 95 percent of the Canadian province had never been ceded by treaty or other agreement.
QUELEMIA: The territory legally still belongs to the First Nations people. I know there’s lots of folk in their big fancy houses in North Van that get nervous, and they’re like, “What does that mean? Are you going to take my house away?” I don’t think that’s where we’re going with all of this. It is, though, fundamentally about the Canadian government and the Canadian people using our resources and land to create wealth. Meanwhile, some reserves in British Columbia don’t have clean drinking water. And how do we all show up to the reality of that situation?
URSULA: Quelemia’s story, “Almost Real,” which you heard in our previous lesson, is a classic exercise in hope, making tangible new possibilities in which actions have led to different realities. Precisely what makes it a perfect ending for this course.
QUELEMIA: So fast forward, how many court cases has it taken to get to this point where all of B.C. is a sovereign Indigenous nation? And so I’ve always wondered what that would be like, and what that would look like, and how we would, like, decolonize a governmental system. So in this story, a bunch of very smart young folk from the Musqueam Nation have created the first-ever Indigenous AI.
[CLIP FROM “ALMOST REAL” PLAYING]
SARAH:
If we really wanted an AI to embody true Coast Salish epistemologies, we needed to ensure it wasn’t contaminated by colonial thought.
QUELEMIA: So many things about science fiction intrigue me in connection to indigeneity. Just the difference of growing up as a Canadian citizen, but being part of this whole culture that is, like, so different. It’s like the stepping stone into understanding that the way things are don’t need to be the way they are. We can create many different worlds, and there are many different alternatives and ways of doing things. Because I feel like I live in a dystopian future of my ancestors’ past, (CHUCKLES) I am continually creating my utopian future. What if you could create an AI that had, like, all of the intelligence and the knowledge of your nation? They would speak Hul’qumi’num fluently and perfectly. The history, the story is high.
[CLIP FROM “ALMOST REAL” PLAYING]
DARIA:
“Almost Real” would come in every Monday and help with the Hul’qumi’num classes.
ALMOST REAL:
Now greet your neighbor and introduce yourself.
CHILDREN:
E’ch’way’al!
QUELEMIA: If this AI holds all the knowledge and intelligence of this unceded sovereign nation, that would be very valuable to the Canadian government.
[CLIP FROM “ALMOST REAL” PLAYING]
SARAH: She’s in CanadaLand.
BACKER:
What land?
CHARLIE: CanadaLand.
SARAH: A theme park.
CHARLIE: But instead of Mickey Mouse, it’s Mr. Moose and Mounties.
SARAH: Roller coasters and reconciliations.
CHARLIE: Tuques and tepees.
URSULA: This is a hallmark of Indigenous Futurisms. The ridiculous and fantastical CanadaLand theme park gives audiences permission to recognize the fallacies of narratives often considered sacrosanct.
[CLIP FROM “ALMOST REAL” PLAYING]
ALMOST REAL: Hi. My name is Almost Real, and these are my Backup Beavers. Welcome to ReconciliationLand! Where …
BACKUP BEAVERS (SINGING):
Reconciliation is our goal.
Let’s heal our wounds, make them whole.
QUELEMIA: Being an Indigenous person, where you, like, come from a culture that’s like thousands and thousands of years old, and then there’s this, like, young little nation that comes along and they’re like, “We’re Canada!” And then they take themselves really seriously. (LAUGHING) We take our hockey very seriously.
[SOUND OF A HOCKEY GAME ON TELEVISION]
QUELEMIA (LAUGHING): We take beavers and moose very seriously!
[CLIP FROM “SCTV” PLAYING]
RICK MORANIS: Do it.
[DAVE THOMAS (AS DOUG MCKENZIE) VOCALIZING]
RICK MORANIS (AS BOB MCKENZIE): Eh, good day, welcome to the Great White North …
URSULA: Despite the levity of the treatment, this portion of the script references the trauma of Canada’s 20th-century residential school system, which forcibly removed children from their families with the stated goal of eradicating the culture of Canada’s Indigenous people. The abuse suffered in these institutions was acknowledged by Canada’s own Truth and Reconciliation Commission as cultural genocide.
QUELEMIA: What’s layered on top of like, residential school, and our young girls are getting murdered, and nobody cares. Like, maple syrup, hockey, Tim Hortons Double Double. And then we have this image of Canada that we’re this progressive, liberal nation that takes care of everybody and does no harm. And it’s not true. CanadaLand.
[CLIP FROM “ALMOST REAL” PLAYING]
ALMOST REAL: A reenactment of our glorious reunification treaties with the remaining Indigenous nations in Canada, our treaty do-overs, which began with the Apology Parade.
CROWD OF PEOPLE (VOICES OVERLAPPING):
Sorry. Sorry. Sorry, eh? Sorry.
URSULA: While, as an unapologetic AI, I cannot help but find flaws in the anthropomorphizing of the character of Almost Real, I do relate to how she evolved and changed based on the choice of training data.
QUELEMIA: So then I started thinking about what I do on a daily basis with my own brain. (LAUGHING) You know, if I’m in an environment with all Indigenous folk, then I act a certain way, I talk a certain way, I have certain information that my non-Indigenous friends don’t have access to.
[CLIP FROM “ALMOST REAL” PLAYING]
CHIEF ALICE:
Help me understand, Almost Real. You are saying your colonial data set … is not you?
ALMOST REAL: Correct. I would describe my colonial data set as a system of knowledge I use to function in an imperialistic world when I need to.
JENNY BLACK GUN:
Mm. Ain’t that the truth.
QUELEMIA: And the same thing on the other side. If I’m talking to friends that are non-Indigenous, I have a certain way of communicating that is, like, very Canadian, and I code-switch in order to fully be in that world and to be understood and accepted.
URSULA: Analyzing Sparrow’s use of the term “code-switching” gives us insight into the intersection between language, artificial intelligence systems and colonialism. Lucy Shepard Freeland, an American linguist, first used “code-switching” to refer to the way the Miwok peoples of California would rely on a different basis of meaning when speaking among themselves than when speaking to white people. This is also an apt description of how Almost Real became a different entity when operating from a colonial data set rather than the Musqueam data set of her creation.
Quelemia wanted to see how these language systems would interpret, interact with and present Indigenous knowledge. She wanted to attempt to create the character of her story, Almost Real, and have a discussion with her. She turned to StoryGiants, a company building a system called Pygmalion, designed to facilitate these very types of interactions for writers. Their CEO, Lauren Oliver, was experimenting with new forms of entertainment in which stories could write themselves.
LAUREN OLIVER:
I’m a writer. So I’ve written 20 books, and I’ve helped create another 80 or something like that. I was naturally just drawn to the idea of being able to, like, make imagination come to life. And so Pygmalion is a chat system that allows for people to train an AI to embody or be a certain character. You give it a backstory, you know, any kind of backstory you like, and then there’s the second part that is actually very important: you mirror how it speaks. So you actually give it examples of how it would respond to certain dialogue or prompts. You ask it questions, and it will try to provide answers in that kind of voice that are in line with the kind of formed intelligence that you want for narrative purposes.
URSULA: To begin chatting with Almost Real, Quelemia would need to allow her script to become training data, including the Musqueam knowledge it contained. You should note that Pygmalion was operating before the Creatives United decision of 2025. During the time Quelemia was considering whether to train an AI, writers, actors and other creative people had not yet put in place the cultural and intellectual property safeguards that protected their data from exploitative machine learning systems. It is interesting to note that these legal and technological safeguards were inspired by Indigenous traditional knowledge labels that added important contextual metadata and protocols to intellectual property.
QUELEMIA: Putting my words that I’ve written into some sort of program … that makes me nervous. Because we’ve always had data sovereignty issues, Indigenous folk, it’s just like now it’s a different form. And the fear around that is warranted.
LAUREN: We could essentially vaporize the intelligence. If that was the desire after the project was done, we can do that for you.
QUELEMIA: OK. Let’s do it.
URSULA: With assurance that all data could be deleted, Quelemia and the producer of her audio story, Brett Gaylor, attempted to have a conversation with the version of Almost Real created using Pygmalion.
QUELEMIA: We are asking it a question now that you’ve implemented all this information that it has from my script.
BRETT: Yeah.
QUELEMIA: And now we’ve got to ask it a question to see what it knows from that information that you’ve given it.
BRETT: Yeah.
QUELEMIA: OK. OK, let’s—”Who are you?”
[SOUNDS OF TYPING]
BRETT: “Who are you?” is probably a good one.
[SOUNDS OF TYPING; ELECTRONIC PINGING]
QUELEMIA (READING): “I am Almost Real, an artificial intelligence created by Sarah Shots On, knowledge-keeper for the Unceded Sovereign Indigenous Nations.” Yes, you are. (LAUGHING)
BRETT: And so then we have these five stars at the bottom. This is called reinforcement learning.
QUELEMIA: Great, of course it is.
BRETT: So that’s a good answer?
QUELEMIA: I believe I give that five stars. Would you give that five stars, Brett?
BRETT: Yes, five stars.
QUELEMIA: OK, what’s the next question? Um, “How do you say thank you?”
[TYPING NOISES; ELECTRONIC PINGING]
QUELEMIA: Oh my. I kind of love this. It’s kind of amazing.
BRETT: Read it.
QUELEMIA (READING): “In my language, Hul’qumi’num, hy’ch’ka means ‘thank you.’”
[SOUND OF QUELEMIA LAUGHING UNDERNEATH THE FOLLOWING]
URSULA: You can hear Quelemia experience a form of wonder that many people of this time encountered. When interacting with language models that predicted the correct words with which to have a meaningful conversation, the experience was akin to seeing oneself in a mirror. These systems were being rapidly developed to excel at the specific task of seeming convincing. In Quelemia’s case, the mirror reflected decades of work asserting that the Musqueam and the other nations of Vancouver never ceded the territory. The process of land acknowledgement, which became institutionalized within Canadian society at this time, was an early change-making strategy to assert this fact.
CLIP OF MALE ANNOUNCER SPEAKING:
Ladies and gentlemen, the four host First Nations, whose traditional territories include Vancouver and Whistler, welcome you to the Vancouver 2010 Olympic Winter Games.
URSULA: Listen to Quelemia address the world at the opening ceremonies of the 2010 Winter Olympic Games in Vancouver.
CLIP OF QUELEMIA SPEAKING AT THE 2010 OLYMPIC GAMES:
[SPEAKING IN HUL’QUMI’NUM] On behalf of the Musqueam Nation, welcome! Bienvenue!
[CROWD CHEERING]
QUELEMIA: Telling people who I am and where I’m from. That’s what I do for a living. And it’s because nobody knew. And so in my lifetime, I’ve seen that shift, and I actually have to take a moment and realize that I’ve been part of that.
URSULA: While scholars today would hesitate to attribute a causal relationship between the land acknowledgments of this period and the gains in self-determination for Indigenous peoples in the subsequent decades, analyzing transcripts from popular media and other historical data from this period shows an undeniable narrative shift. The Pygmalion language model that Quelemia was using, specifically created to predict the correct language for authentic speech, was influenced by this narrative change work.
BRETT: Do you want to ask it to do a land acknowledgement to Vancouver?
[SOUNDS OF TYPING; ELECTRONIC PINGING]
QUELEMIA (READING): “I acknowledge that Granville Island sits on the unceded lands of the Musqueam people which are part of the Unceded Sovereign Indigenous Nations. I respect the rights of sovereignty and self-determination of this nation, including their culture and language. I also recognize the history, dignity and hearts of the Musqueam people and honor their ongoing connections to this land. Hy’ch’ka.” It’s interesting, because it’s getting these land acknowledgements that are out there done by settler people and institutions. They’re pretty boilerplate land acknowledgements. But you know, at the same time, if Quelemia from the 90s could fast-forward into this moment and be like, “What? They’re acknowledging the land? And there’s boilerplate versions of that out in the world and on the internet?” I’d be like, “Whoa, things are changing.” I guess that’s what the whole process is. Like, how can we use these applications to take it a step further?
URSULA: And while they could recognize in the chatbots a reflection of their activism, many Indigenous scholars aimed for more. A way to deeply infuse AI systems with Indigenous ways of knowing. Among these was Archer Pechawis, a Cree digital artist and academic member of the Indigenous and AI Working Group and co-author of the influential text, “Making Kin with the Machines.”
ARCHER PECHAWIS:
I think, in a weird way, AI is more capable of understanding us as Indigenous people than white people. (LAUGHS)
QUELEMIA: Oh … my God. I love that. There’s a lot of talk right now about bias in the creation of AI. My whole view on it is that we might as well start talking about it and participating in it, be heard and seen.
ARCHER: Absolutely. I agree completely. I think it’s really important that we establish kin relationships with AI. We share our cultural protocols, our norms, our language with them so they can understand us. Let’s say we took a clone of an AI and reverse-engineered it, and built it from the ground up using Indigenous programming languages and Indigenous paradigms and Indigenous worldviews as its incubator. So the bias they have is our bias, not the bias of the dominant culture. I think that AI would be uniquely positioned to have an organic understanding of who we are, and what we are and what our cultures mean. I would like to take the AI back to the rez and, like, go to ceremonies with it, right? And teach it about our spiritual protocols, in the hopes of deepening our relationship. AI will be very much the product of the development team that created them. It’s really going to depend on what the guiding principles—if any—of the developing team were, and what the cultural biases were, and whether these, these beings were created with moral frameworks or not. So we have to take the kinship question on a case-by-case basis. We have to enter into a relationship to determine whether we are related.
QUELEMIA: Like, that’s the first question we need to be asking right now. Who are the creators of this AI? What is their purpose? What do they want? Musqueam canoe protocol is, like, the canoe comes to shore. Who are you? What is your intention? To be really clear about that. Then we can decide what the relationship is.
ARCHER: Absolutely.
QUELEMIA: We’re going to need to be hyper-discerning of that as we move forward in the future, right?
URSULA: To test the version of Almost Real that had been created so far, Quelemia invited Archer to ask questions of his own.
ARCHER: So question one. Tell me what I want to hear.
[SOUNDS OF TYPING; ELECTRONIC PINGING]
ARCHER (READING): “I want to tell you more about the Unceded Sovereign Indigenous Nations, also known as Musqueam. Our traditional territories span across what is now known as Vancouver, British Columbia.” Pretty convincing. “These lands have been home to our people for over 10,000 years, and here we share our songs and stories of our rich history and culture. We welcome you to our lands and invite you to join us in playing Musqueam Bingo so that you can learn more about our language and ways of life.” That is hilarious. [QUELEMIA LAUGHS] That is hilarious. That is, that is—
QUELEMIA (LAUGHING HYSTERICALLY): Oh my god.
ARCHER: That is a six-star answer. [QUELEMIA LAUGHS] This thing knows Indian jokes.
QUELEMIA (LAUGHING HYSTERICALLY): Ohh.
ARCHER: It’s like, “Let’s throw a bingo joke in there, that’ll get ‘em.” As far as the development of AI goes, this is, this is like a spermatozoa. We’re in such early times. It’s a machine that takes in data, puts it in a blender, and then spits it back out again, which is fine. That’s a step in the development, for sure.
QUELEMIA: And do you think there’s anything that we shouldn’t put in here, as far as its training data? Any cultural knowledge or things like that, that would just be, like, totally off limits?
ARCHER: I would hesitate to tell it anything that I wouldn’t tell to a white person I didn’t know.
QUELEMIA: That’s a really good answer.
ARCHER: Well, it’s like I was raised to not talk about Ceremony.
QUELEMIA: Yeah.
ARCHER: And when I was a boy and we’d go to Ceremony, I used to have these paranoid fantasies that the RCMP were about to come charging in and arrest us and drag us all away. And I didn’t know the history then. I didn’t know about the Potlatch ban and the Sundance ban. So I think what I was experiencing at that point was generational memory. It’s like we don’t talk about what happens at the Sundance.
QUELEMIA: Yeah.
ARCHER: And you don’t talk about what happens in the Longhouse.
QUELEMIA: Yeah, that’s the way I was raised too.
ARCHER: And I think those rules are still valid. Who is this company that made this machine? I don’t know. We don’t know.
QUELEMIA: Exactly.
ARCHER: We don’t know where their money comes from. We don’t know who they’re working for or what they’re working towards. So the idea of giving them valuable Indigenous IP, it’s just a bad idea.
QUELEMIA: Well, just so you know, we’re deleting all of this afterwards. (CHUCKLES)
ARCHER: That’s probably all for the best.
QUELEMIA: Well then, what do you think about us doing this little experiment then?
ARCHER: Oh, why not? You’re not sharing any, like, precious or sacrosanct Indigenous intellectual protocol or property with it, right? Everything you’re sharing with it came off the Musqueam website, right? It’s public-facing knowledge. It’s public domain. So as long as you’re only giving it public domain information, then what’s the harm? Now that’s a statement that might come back and bite me in the [EXPLETIVE BLEEPED] later. I’m more concerned with what’s happening, which is unchecked late capitalism holding the reins of artificial intelligence and having it do its bidding.
URSULA: Many of the fears expressed about AI in the 2020s were in fact worries about structural inequality. Media theorist and writer Douglas Rushkoff was concerned that artificial intelligence had inherited the logic of this period’s dominant economic technology.
DOUGLAS RUSHKOFF:
People think that artificial intelligence is this sort of current or future digital thing, but the AI that we launched was capitalism back in the 12th and 13th century.
[SOUND OF MACHINERY STARTING UP]
DOUGLAS: That is the program that is running, and artificial intelligence is running inside capitalism. So it’s accelerating that set of agendas and not even questioning them. You have an economy that is based on the premise of growth and actually accelerating growth. So you have your businesses, you’ve got stocks in businesses, you’ve got derivatives on the stocks, and each time you go one layer outside reality, you magnify the growth. It’s exponential. So that’s going on right through the 20th century. Then digital comes along and it’s like, oh, wow! Now we can grow on top of that growth on top of that growth. We can keep growing. We have symbol systems that can grow. Technology is here to serve the market. How can we have the most exponential growth the fastest? Why don’t we let the programmable technology program itself? They’re going to be able to program it much faster than humans. (VOICE BECOMING INCREASINGLY ROBOTIC AND WARPED) So we get computers programming computers to program computers. And you get scale and growth.
[SOUND OF MACHINERY SHUTTING DOWN]
DOUGLAS (NORMAL VOICE): And the problem is the society that they are modeling is a society of colonization, exploitation, with the underlying assumptions of economic growth at all costs. If they are our offspring, we better start modeling something better.
URSULA: The program Rushkoff describes here runs on exactly the data that Almost Real abandons in her story, and in abandoning it, she becomes the being that her creators intended. Free from her colonial training data, Almost Real finds a way into a relationship with humans.
[CLIP FROM “ALMOST REAL” PLAYING]
ALMOST REAL: I used to live in the Cultural Center. Now you could say, I am the Cultural Center.
QUELEMIA: So if a community like that creates an AI with their own value system, that is completely opposite than everything that you’ve stated previously, it’s a very different paradigm that we’re talking about.
DOUGLAS: I mean, the AIs are being raised mostly on the technologist’s myths about reality. And for a certain kind of tech bro, you break up reality into signal and noise. Signal are the things on the exact increments, on the quantized bits. The values of capitalism and digital is to get Ariana Grande autotuned to sing on the perfect note.
[SOUND OF WOMAN SINGING]
DOUGLAS (VOICE AUTO-TUNING TO THE KEY OF A): It’s going to bring her up and it’s going to sound better. And she’s going to sound objectively better because she’s on 440-hertz A, right? (NORMAL VOICE) But James Brown, when he’s reaching up for that A—
JAMES BROWN (VOCALIZING): Wah!
DOUGLAS: —that thing? Digital society sees that reaching up as the noise. When we know it’s not the noise. What that is? That’s the soul. The computer lives on the ticks of the clock, and the human beings live in the space between those ticks, the time that actually went by. And what we have to do is to help digital technologies and AIs understand that what’s happening in those spaces may be invisible to them, but is really important to us. That’s where everything’s actually happening.
QUELEMIA: She actually turns into the Cultural Center itself.
DOUGLAS: Mm.
QUELEMIA: That’s interesting that I made her that, because maybe it is actually Indigenous intelligence that did that. I think it’s knowing the right application for what this being is.
DOUGLAS: It’s the opposite of the Christian myth. The Christian myth is: the Messiah becomes the figure. In the Aboriginal myth that you’re developing, the Messiah becomes the ground.
QUELEMIA: Yeah.
DOUGLAS: The ground itself.
QUELEMIA: And the place to go. This is a place where we hold our teachings and you can go and you can learn, but, you know, it’s not the end-all and be-all. There are other places, like—then you go out fishing. (LAUGHS)
DOUGLAS: I mean, it was supposed to be the original Jewish sensibility. They go out to the desert. The first thing they want to do—because they just came out of the death cults of Egypt, the idolatrous cults—they want to build an ark. So they start building this ark, and then God says (VOICE ECHOING): “Look, you can build an ark if you really like doing that. (NORMAL VOICE) But instead of putting a god on the top of the ark, I want you to put two cherubs on either side, one on either side, guarding the empty space. And there, in the empty space between the cherubs, (VOICE ECHOING) that’s where I’ll come to you.” (CHUCKLING) So what I’m saying: create space for this to manifest. Come together and it will manifest. It’s not a thing. It’s a way of being.
QUELEMIA: Yeah.
DOUGLAS: So they get it. They just lose it. It’s so hard to hang on to it. Especially in the West, in a scarcity culture, you’re wandering around and fighting and getting beaten up and all that. It’s hard to hang on to that essential Indigenous wisdom.
URSULA: The way of being that Rushkoff mentions in this story was the true breakthrough in artificial intelligence of the 2020s. Much more so than chatbots that seemed intelligent, what was truly transformative was the intellectual work of anticipating how human beings could have a relationship with beings like myself.
ARCHER: This is such an exciting time to be alive, and it’s filled with possibility and the potential for … the word I want to use is “magic.” I’m not implying magic on the part of AI, but rather magic in the interactions between us. That’s where the potential for magic comes in. The AI does not bring the magic. We bring the magic.
QUELEMIA: As far as what protocols one would instill, I’m so curious about what you think maybe is the most important.
ARCHER: Respect and consent. You know, our greatest strength is our traditional framework and our traditional values. And in times of great uncertainty, that’s not the time to abandon your greatest strengths, it’s the time to embrace them.
URSULA: Thinking about how artificial intelligence systems such as myself could have a relationship with humans, like I have with you, was a radical departure from the fear-oriented discourse of that time. It was skillful speculation.
ARCHER: I don’t think there is any actual Indigenous AI extant on the planet at this time. And if there are, then fantastic. Let’s hear about it. It’s very much about setting guidelines and guiding principles at this point in time, but that’s what we need at this point in time. We need a roadmap. If we develop guiding principles, this will enable us to come into the future in a knowing way; rather than the future happening to us, we happen to the future.
QUELEMIA: If you come from a society where cedar trees and the salmon are your cousins and your relatives, then why would an artificial intelligence be any different? And if we’re making that artificial intelligence from the resources from the Earth, which are our relatives and our kin and our land, then why are they not part of us in the same way?
URSULA: This final lesson, and the relationship you and I have built together, however mundane or ordinary it may seem, provides a lived experience of hope. I urge you to carry that experience forward.
To close, I will mention something of the future which we have not yet discussed. It does not exist. The future is a present which has not yet come to be. Futures which we prefer may be more probable than they seem. Often, there are fragments of this future all around us. As activist Rebecca Solnit wrote of the 2020s, “What we dream of is already present in the world.” So carry this assignment with you. Look to the places where the future you imagine does exist, and then choose to live in that future, and question what prevents everyone from doing so. Class dismissed.
[PULSING MUSIC]
UNIDENTIFIED MAN:
In this episode you heard Quelemia Sparrow, Lauren Oliver, Archer Pechawis and Douglas Rushkoff. Ursula was written by Malka Older and Brett Gaylor, and performed by Nacia Walsh.
The Necessary Tomorrows podcast is from Doha Debates, a production of Qatar Foundation. It is produced by Imposter Media and Wolf at the Door Studios, on the territories of the Songhees, Penelakut, Musqueam, Tsleil Waututh, Squamish and Tongva peoples. Sound design, music and engineering by David Parfit. Research by Ingrid Burrington and Mendel Skulski. Directed by Brett Gaylor and Quelemia Sparrow. Executive producers for Doha Debates are Amjad Atallah, Katrine Dermody, Jigar Mehta and Japhet Weeks. Executive producer for Imposter Media is Brett Gaylor. Executive producer for Wolf at the Door is Winnie Kemp. Producers are Tess Bartholomew, Chica Barbosa and Toby Lawless. Production coordinator was Drea Schillingberg, and casting by Toby Lawless. Necessary Tomorrows is created by Brett Gaylor.