A new generation of Native American artists is leveraging artificial intelligence and technology to create installations that challenge Western assumptions about data extraction and consent. Led by artists like Suzanne Kite (Oglala Lakota), Raven Chacon (Diné), and Nicholas Galanin (Tlingít), this movement rejects extractive data models in favor of relationship-based systems that require reciprocal, consensual interaction rather than assumed user consent.
What makes this different: These artists are building AI systems rooted in Indigenous principles of reciprocity and consent, fundamentally challenging how technology typically harvests and uses data.
- Unlike conventional AI that assumes consent through terms of service, these installations require active physical presence and participation from viewers.
- “It’s my data. It’s my training set. I know exactly what I did to train it. It’s not a large model but a small and intimate one,” explains Kite about her approach to AI development.
- The work prioritizes relationship-building over data extraction, asking what intelligence would look like if it couldn't be gathered until a relationship had been established (see the sketch after this list).
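In code terms, a minimal sketch of that consent-first pattern might look like the following. This is purely illustrative and not the artists' software: the presence check, the consent gesture, and the Encounter object are assumptions used to show refusal as the default state rather than an opt-out buried in terms of service.

```python
# Hypothetical sketch (not the artists' actual code): an installation loop in
# which refusal is the default. Nothing is sensed or retained unless a visitor
# is physically present AND has actively opted in, and everything gathered is
# discarded when the encounter ends.

from dataclasses import dataclass, field
from typing import Optional


@dataclass
class Encounter:
    """Data held only for the duration of one consensual interaction."""
    consented: bool = False
    samples: list = field(default_factory=list)


def interaction_step(presence_detected: bool,
                     consent_gesture: bool,
                     sensor_reading: float,
                     encounter: Optional[Encounter]) -> Optional[Encounter]:
    # Default is refusal: with no present participant, nothing is kept.
    if not presence_detected:
        return None  # the visitor left; per-encounter data is dropped with the object

    if encounter is None:
        encounter = Encounter()

    # Consent is given actively (e.g. a deliberate gesture), never assumed.
    if consent_gesture:
        encounter.consented = True

    if encounter.consented:
        # Data is gathered only inside an established, consensual relationship.
        encounter.samples.append(sensor_reading)

    return encounter
```

Because the only place data can live is the per-encounter object, nothing persists once the visitor walks away, which is the inverse of the harvest-first, delete-on-request model.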
Key installations redefining AI interaction: Several groundbreaking pieces demonstrate these principles in practice across different mediums and technologies.
- Kite’s “Wičhíŋčala Šakówiŋ (Seven Little Girls)” uses a four-meter hair braid with embedded sensors to translate her dance movements into inputs for machine-learning algorithms, anchored by stones arranged in traditional Lakota star patterns (a sketch of this kind of sensor-to-model pipeline follows this list).
- Her “Ínyan Iyé (Telling Rock)” installation uses embedded AI to speak and respond to viewers, conveying what she calls “more-than-human intelligence” rooted in reciprocal exchange.
- Chacon’s Pulitzer Prize-winning “Voiceless Mass” creates electronic frequencies that exploit cathedral acoustics to generate spectral voices, but only records performances with explicit consent.
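To make the idea of a "small and intimate" model concrete, here is a hypothetical sketch of a gesture model trained only on the maker's own movement recordings. The wearable-sensor data format, the gesture names, and the nearest-centroid classifier are all assumptions for illustration; none of it is drawn from the actual installations.

```python
# Hypothetical sketch of a "small and intimate" model in the spirit Kite
# describes: the entire training set is a handful of the maker's own movement
# recordings, so the person who trained it knows exactly what is in it.

import numpy as np

rng = np.random.default_rng(0)

# Each recording: a short window of 3-axis motion readings from a wearable
# sensor. Simulated here; in practice they would come from the braid's sensors.
own_recordings = {
    "sweep":  [rng.normal(0.0, 1.0, (50, 3)) for _ in range(5)],
    "circle": [rng.normal(0.5, 1.0, (50, 3)) for _ in range(5)],
}

def featurize(window: np.ndarray) -> np.ndarray:
    """Summarize one movement window as mean and spread per axis."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

# "Training" is just averaging the maker's own examples for each gesture.
centroids = {
    name: np.mean([featurize(w) for w in windows], axis=0)
    for name, windows in own_recordings.items()
}

def classify(window: np.ndarray) -> str:
    """Match a new movement to the closest gesture in the personal training set."""
    feats = featurize(window)
    return min(centroids, key=lambda name: np.linalg.norm(feats - centroids[name]))
```

The point of the sketch is scale: a model this small can be fully known and fully owned by one person, in contrast with large models trained on scraped data of unknown provenance.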
Why this matters: The movement challenges the tech industry’s foundational assumptions about intelligence, automation, and data sovereignty at a time when AI ethics are under intense scrutiny.
- These artists aren’t seeking inclusion in existing systems but are building alternatives that prioritize consent, relationship, and reciprocity over extraction and automation.
- The work highlights how Indigenous technologies have always rejected the false binaries foundational to Western innovation, offering frameworks for more ethical AI development.
- As Galanin’s mechanical drum installation demonstrates, the tension between automation and human memory raises critical questions about what happens when culture is performed without a consenting body.
The bigger picture: Native artists are positioning Indigenous knowledge systems as essential frameworks for developing more ethical and sustainable technology relationships.
- The land, labor, and lifeways that built America’s infrastructure—including its tech—are Indigenous, challenging narratives that separate Native cultures from technological innovation.
- Rather than asking for inclusion in today’s systems, these artists are modeling what should come next in human-technology relationships.
- Their work suggests that the default should be refusal rather than extraction, fundamentally reimagining how AI systems could operate based on consent and reciprocity.