Podcast: Download (Duration: 57:12 — 39.3MB)
Native Americans have worked hard for decades to counter the stereotypes perpetuated in old movies and television shows about the American West. Now a new generation of Native technology experts worry that artificial intelligence is eroding that work. Scores of AI-generated images and videos are flooding people’s social media For You pages. The creations are within easy reach of anyone typing a prompt into any AI generator that scrapes information from millions of sources. Often posted by anonymous creators, the products of those prompts present vaguely Native visual and audio characteristics with little to no authentic cultural connections. Along the way they generate hundreds of thousands of admirers. We’ll talk about the work to counter the looming onslaught of AI cultural appropriation.
GUESTS
Dr. Angelo Baca (Diné and Hopi), professor of history, philosophy, and social sciences at the Rhode Island School of Design
Trevor Reed (Hopi), professor of law at the University of California, Irvine School of Law and an associate justice for the Hopi Tribe Court of Appeals
Dr. Tamika Worrell (Gamilaroi), senior lecturer of critical Indigenous studies at Macquarie University
Break 1 Music: Obsidian (song) Red-209 (artist)
Break 2 Music: Digital Winter (song) Ya Tseen (artist) Stand On My Shoulders (album)

Here is the question I have as a non-Indigenous person with a keen interest in Indigenous cultures and technology: how do we blend the two? In the past few months I have been using an AI summarize function to create my own CliffsNotes. My thinking is that it’s important to share the results with the creators – in this case Native America Calling, for your recent episode on Route 66 and (see below) this episode. My question: is this useful to you and your listeners, or is it appropriation or misuse?
Gemini said
Based on the provided episode details from Native America Calling, here is a summary of the key points discussed in the February 19, 2026 episode, “The growing AI appropriation threat.”
(Note: Because this is an external podcast audio file and not a YouTube video with an accessible transcript, I cannot provide exact minute-by-minute audio timestamps. However, here is the comprehensive breakdown of the episode’s focus.)
Episode Summary & Key Points
The Core Issue:
Native Americans have spent decades fighting against harmful stereotypes perpetuated by old Hollywood movies and television shows. Now, there is a growing concern among Native technology experts, scholars, and legal professionals that Artificial Intelligence (AI) is eroding that hard-fought progress.
The Mechanics of AI Appropriation:
A Flood of Fake Media: Social media feeds (like the “For You” pages on Facebook and TikTok) are currently being flooded with AI-generated images and videos depicting Native American characters and themes.
Lack of Authenticity: These AI creations are usually generated by anonymous users typing simple prompts into AI generators. Because the AI scrapes data from millions of unvetted internet sources, the resulting images present “vaguely Native” visual and audio characteristics that have little to no actual cultural connection or authenticity.
The Threat of Viral Misinformation: Despite being fake and culturally inaccurate, these AI-generated posts often attract hundreds of thousands of admirers and supporters, spreading modern stereotypes at a massive scale.
The Counter-Effort:
The episode focuses on how Indigenous communities, academics, and the legal system are working to counter this looming onslaught of AI cultural appropriation and protect their cultural property.
I’m one Indigenous person from one region, so I’m speaking from my own perspective, not on behalf of all Indigenous peoples.

A useful litmus test is this: appropriation happens when Indigenous culture is taken out of context and used without relationship or accountability, especially for profit or aesthetics, while Indigenous people are excluded, stereotyped, or underpaid. For example, wearing regalia as fashion is use out of context. Selling facsimile Indigenous art labeled “Native-inspired” redirects profit away from Indigenous artists and competes with their work. In both cases, something meaningful is extracted while Indigenous people are minimized or cut out of the transaction.

Respectful appreciation looks different. Purchasing jewelry directly from an Indigenous artist and wearing it with proper attribution supports Indigenous creators and ensures that benefit returns to the community. Indigenous people are meaningfully included rather than replaced.

With AI tools like summarizing functions, the same principles apply. It’s worth asking whether the summary replaces reading Native voices directly, flattens important context, removes tribal specificity, or allows someone to benefit from a Native creator’s work without proper credit or compensation. Even if you know an article is about the Cherokee or Seminole, if your takeaway from the summary version becomes “Indigenous people think…,” that little shift reinforces the idea that Indigenous cultures are a monolith, and ultimately contributes to cultural erasure. And always check the veracity of your sources to ensure they are legitimate.