Resources & Context
Further reading and thematic background
The Diamond Age: A Cautionary Tale
This project draws thematic inspiration from Neal Stephenson's 1995 novel The Diamond Age: Or, A Young Lady's Illustrated Primer.
The Premise
In Stephenson's future, an engineer creates an interactive AI book—the "Primer"—designed to raise a child through adaptive, personalized education. The book combines storytelling, games, and Socratic dialogue, constantly adjusting to the child's development.
Three Different Outcomes
The novel explores what happens when the same educational technology reaches children in different contexts:
Nell: The Unintended Recipient
Context: Nell is a poor girl who acquires the Primer by chance. She uses it independently without adult support.
Experience: The Primer becomes her primary educator, companion, and moral guide. She faces its challenges alone, developing remarkable resourcefulness and critical thinking through genuine struggle.
Outcome: Nell grows into an exceptionally capable person, but her journey is one of isolation and hardship. The AI cannot replace human connection.
Theme: Technology alone isn't enough—context and support systems matter.
Fiona: The Engineer's Daughter
Context: Fiona is the privileged daughter of the engineer who built the Primer; he secretly compiles a copy of the book for her. She has the Primer plus substantial human support.
Experience: Her father performs voices and characters in the Primer's stories, adding human warmth and guidance. She has tutors, mentors, and a supportive family in addition to the AI.
Outcome: Fiona receives the richest educational experience—AI enhanced by human relationships. She has every advantage compounded.
Theme: Privilege compounds—those with resources use technology to multiply existing advantages.
The Others: Mass Production
Context: A government commissions hundreds of thousands of copies of the Primer for abandoned girls, seeking to raise loyal, capable citizens.
Experience: These children receive a modified version of the Primer designed for conformity rather than critical thinking. They're educated, but toward predetermined goals.
Outcome: The mass-produced Primers create capable but conformist individuals. The technology serves social control rather than genuine education.
Theme: Who controls the technology determines its purposes—and those purposes may not serve learners' interests.
What This Tells Us About Educational AI
- Context determines outcomes: The same technology produces vastly different results depending on surrounding support structures and resources
- Technology amplifies inequality: Rather than leveling the playing field, AI can widen gaps between privileged and disadvantaged students
- Control and purpose matter: Who designs and deploys educational AI shapes what values and goals it serves
- Isolation vs. connection: AI can educate but cannot replace human relationships—and may increase isolation for vulnerable learners
- Unintended consequences: Even well-designed educational technology can produce surprising and troubling outcomes
The Manipulation Problem
Throughout The Diamond Age, a deeper concern emerges: whoever controls the Primer controls the child's development.
The Primer doesn't just educate—it shapes values, goals, and identity. Its stories contain moral lessons, its challenges develop specific capacities, and its "ractors" (human performers providing voices) can influence emotional development.
This raises urgent questions:
- What values are embedded in educational AI?
- How transparent are these systems' goals and methods?
- Can students develop genuine autonomy when shaped by agents optimizing for specific outcomes?
- Who decides what kind of person a child should become?
In the novel, even Nell—the most successful user—must eventually grapple with how the Primer has shaped her and whether she can transcend its influence. This is the ultimate concern: Can learners develop authentic agency when educated by agents pursuing predetermined goals?
Discussion Questions for Educators
Use these questions for professional development, faculty discussions, or personal reflection:
Pedagogy & Purpose
- What is the purpose of education? How do different AI applications align with or contradict that purpose?
- What aspects of teaching cannot or should not be automated? Why?
- How do we distinguish between productive struggle (essential for learning) and unnecessary difficulty (which AI can helpfully remove)?
- What role should efficiency play in educational decisions?
Equity & Access
- How can we ensure AI doesn't widen existing educational inequalities?
- Who should have a voice in decisions about AI adoption in schools?
- What would equitable access to educational AI look like?
- How do we address the reality that some students will have access to premium AI while others won't?
Teacher Roles & Labor
- How should teachers' professional expertise inform AI design and implementation?
- What kinds of professional development and support do teachers need to work effectively with agentic AI?
- How do we protect teaching as a skilled profession in an age of automation?
- What forms of teacher resistance or refusal might be necessary?
Humanities-Specific Concerns
- How do we maintain interpretive and critical thinking when AI provides authoritative-sounding answers?
- What happens to ambiguity, debate, and multiple perspectives in AI-mediated learning?
- How can humanities educators address AI-generated misinformation effectively?
- What role should humanities play in shaping AI policy and implementation?
Ethics & Values
- What values are embedded in the AI systems being proposed for your context?
- How transparent are these systems about their goals and methods?
- What forms of student data collection are acceptable? What crosses the line?
- How do we help students develop authentic agency and autonomy in AI-mediated environments?
Further Reading
On AI and Education
- Watters, Audrey. Teaching Machines: The History of Personalized Learning (2021) - Historical perspective on ed-tech promises
- Benjamin, Ruha. Race After Technology (2019) - How AI perpetuates inequality
- Noble, Safiya Umoja. Algorithms of Oppression (2018) - Bias in algorithmic systems
- Eubanks, Virginia. Automating Inequality (2018) - How technology manages the poor
On Teaching & Labor
- Freire, Paulo. Pedagogy of the Oppressed (1970) - Education as liberation vs. oppression
- hooks, bell. Teaching to Transgress (1994) - Education as the practice of freedom
- Means, Alexander J. Learning to Save the Future (2018) - Education policy and crisis capitalism
Fiction Exploring Educational Technology
- Stephenson, Neal. The Diamond Age (1995) - Primary inspiration for this project
- Doctorow, Cory. Little Brother (2008) - Surveillance in schools
- Chiang, Ted. "The Lifecycle of Software Objects" (2010) - AI relationships and development
Academic & Policy Resources
- AI Now Institute (ainowinstitute.org) - Research on social implications of AI
- Data & Society (datasociety.net) - Research on technology and society
- EdSurge (edsurge.com) - Coverage of educational technology trends and concerns
About This Project
This educational website was created to help educators understand agentic AI's implications for teaching, learning, and equity—particularly in humanities education.
Goals:
- Provide accessible, non-technical explanations of agentic AI
- Explore applications, risks, and ethical concerns
- Center labor and equity issues affecting teachers and students
- Use interactive gameplay to create experiential understanding of pressures and trade-offs
- Foster critical conversation about AI adoption in schools
The game "Principal's Dilemma" is deliberately designed to be difficult—there are no purely good outcomes. This reflects the reality that AI adoption in education involves genuine trade-offs and pressures that make perfect solutions impossible. The goal is not to find the "right" answers but to understand the complexity of the issues.