Labor & Equity Concerns
How agentic AI reshapes teaching and who benefits
The Changing Role of Teachers
As agentic AI enters classrooms, the nature of teaching is being redefined—often without teachers' input.
From Instructor to Monitor
Before AI Agents:
Ms. Rodriguez designs her curriculum, creates lessons, assesses understanding, adapts in real time to student needs, builds relationships, and makes pedagogical decisions.
With AI Agents:
An AI system designs the curriculum, delivers personalized content, and assesses students automatically. Ms. Rodriguez monitors dashboards, troubleshoots technical issues, and manages AI-flagged concerns. Her pedagogical expertise is sidelined.
Impact: Deskilling of the profession. Teachers become technicians managing systems they didn't design and don't fully control.
Loss of Pedagogical Autonomy
A history teacher wants to spend three weeks on a civil rights unit, using primary sources and student-led discussions. But the AI system's pacing algorithm determines students are "behind" on other standards. The system pressures the teacher to move on, sends alerts to administrators, and adjusts students' individual curricula automatically.
Impact: Teachers lose control over how and what they teach. Algorithms make pedagogical decisions.
Emotional Labor Intensified
Students form attachments to their AI tutors. When Mr. Kim tries to redirect a student, the student protests, "But the AI said I'm doing great!" Students expect instant AI-level feedback from their human teacher. Parents compare Mr. Kim's response time unfavorably to the 24/7 AI agent.
Impact: Teachers compete with AI for authority and trust while managing unrealistic expectations of availability and instant feedback.
The "AI Manager" Role
The district mandates that teachers use an AI system. Ms. Chen spends hours learning the platform, interpreting data dashboards, correcting AI errors, and answering questions about why the AI did something. This work is unpaid and unrecognized, yet failure to manage the system effectively reflects poorly on her evaluations.
Impact: Additional unpaid labor managing technology. Expertise in AI management becomes more valued than pedagogical skill.
Economic Pressures
AI adoption in schools is often driven by cost-cutting rather than educational improvement.
💰 The Cost-Cutting Narrative
"Why pay for small class sizes when AI can personalize learning at scale?"
School boards facing budget shortfalls see AI as a way to reduce staffing costs while claiming to maintain or improve educational quality.
What Actually Happens:
- Teacher positions cut or left unfilled
- Class sizes increase
- Remaining teachers stretched thinner
- AI subscription costs prove higher than projected
- Hidden costs: infrastructure, training, maintenance, troubleshooting
📊 The Metrics Trap
AI agents excel at optimizing for measurable outcomes. This shifts educational priorities toward what can be measured:
Devalued (Hard to Measure):
- Critical thinking
- Creativity and exploration
- Collaboration skills
- Ethical reasoning
- Student well-being
- Love of learning
Prioritized (Easy to Measure):
- Test scores
- Content completion rates
- Time on task
- Engagement metrics (clicks, views)
- Efficiency gains
Impact: Humanities education, which values interpretation, debate, and ambiguity, is particularly poorly served by optimization metrics.
Educational Inequity
Access to agentic AI is creating new forms of educational stratification.
The Three-Tier System
Different levels of access produce very different outcomes:
Tier 1: No Access
Who: Underfunded schools, students without devices or internet at home
Experience: Traditional education with large class sizes and limited resources. Students are at a disadvantage when competing with AI-using peers.
Outcome: A growing achievement gap. Students lack the AI literacy that is increasingly expected.
Tier 2: Basic AI Access
Who: Most public schools adopting AI systems
Experience: Generic AI tutors with limited customization. Frequent system bugs and errors. Overwhelmed teachers managing too many students and AI issues.
Outcome: Mixed results. Some students benefit, others struggle. Teaching quality varies widely. AI systems often reflect biases in training data.
Tier 3: Premium AI + Human Support
Who: Wealthy private schools and families who can afford premium AI subscriptions
Experience: Sophisticated AI agents customized to individual students, plus small classes with expert teachers, plus human tutors. AI enhances rather than replaces human instruction.
Outcome: Significant advantages compound over time. These students develop both AI literacy and strong human mentorship relationships.
The Diamond Age Parallel
In Neal Stephenson's The Diamond Age, an interactive AI book called the Primer creates vastly different outcomes:
- Nell (a poor girl) uses it independently, developing extraordinary abilities through struggle and creativity
- Fiona (a wealthy girl) has the Primer plus her father performing the voices and guiding her, creating the richest learning environment
- Other children receive mass-produced versions, learning conformity rather than critical thinking
The lesson: The same technology produces radically different outcomes depending on implementation, support systems, and existing advantages. AI doesn't level the playing field—it can amplify existing inequities.
Case Study Comparisons
Consider these two students using AI tutoring systems:
Case A: Marcus
Context: Public school with district-mandated AI system. Class size: 35 students. One device cart shared among classes.
Case B: Sophia
Context: Private school. Premium AI system. Class size: 12 students. Personal device. Teacher trained extensively in AI integration.
The difference isn't the students—it's the context. Marcus and Sophia might have equal potential, but their experiences with AI are worlds apart. Over years of schooling, this gap compounds.
Who Decides?
A critical equity question: Who has a voice in how AI is deployed in education?
🏢 Tech Companies
Design systems, set defaults, control updates, collect data. Motivated by profit and growth.
Power level: Very High
🏛️ Government/Districts
Mandate adoption, control budgets, set policy. Motivated by cost savings and political pressures.
Power level: High
👨‍👩‍👧 Wealthy Parents
Can opt out, purchase premium alternatives, influence private schools. Motivated by advantage for their children.
Power level: Medium-High
👩‍🏫 Teachers
Implement systems, manage consequences, understand students. Motivated by student learning and working conditions.
Power level: Low
👨‍👩‍👧‍👦 Low-Income Families
Receive systems chosen by others, can't opt out, limited alternatives. Motivated by children's opportunities.
Power level: Very Low
👦 Students
Experience the systems directly, develop relationships with AI, and have their futures shaped by the results. Motivated by learning, belonging, and their futures.
Power level: Nearly None
Those with the least power in decision-making are those most affected by the consequences. This is a fundamental equity problem.
What's at Stake for Humanities?
These labor and equity concerns are especially urgent for humanities education:
- Critical thinking requires human mentorship and Socratic dialogue—hard to outsource to agents
- Interpretation and debate are central to humanities but don't fit optimization models
- Historical consciousness and ethical reasoning require context and nuance AI struggles to provide
- Voice and perspective in humanities disciplines reflect who has access to education—equity shapes whose stories get told
- Teachers as intellectual guides, not just content deliverers, a role threatened by deskilling
If agentic AI reshapes education primarily through cost-cutting and efficiency, humanities education may be fundamentally transformed or diminished. The values central to humanities—complexity, ambiguity, multiple perspectives, ethical reasoning—don't optimize easily.