1.6. A Human-Centered Approach to AI
Hou I (Esther) Lau
In this chapter, we step back from technical details to ask a central question: how do we keep humans at the center of AI use? Artificial intelligence is not an independent force; it reflects human choices, values, and intentions. By grounding AI in human-centered principles, educators and professionals can ensure these tools support rather than displace judgment, creativity, and care.
Human-Centered AI Philosophy
Technology does not drive change; people do. AI can extend what we do, but only if it is aligned with human purpose. Five guiding principles shape a human-centered approach:
- Collaboration: Treat AI as a companion, not a replacement.
- Intention: Begin with purpose ("Start with Why") so AI use aligns with goals rather than novelty.
- Expertise: Humans provide disciplinary knowledge, context, and judgment that AI cannot supply.
- Scaffolding: Use AI to support learning and creativity, not substitute for them.
- Ethical decision-making: Keep accountability and responsibility firmly in human hands.
AI in Education and Society
AI can expand human capacity, but its benefits and risks depend on how we apply it.
- Cognitive support: AI extends memory, recall, and processing, but humans interpret and decide.
- Personalized learning: AI can tailor pacing, resources, and feedback to individual learners.
- Retrieval-Augmented Generation (RAG): Connects models to external knowledge for more accurate, better-grounded outputs (a minimal sketch follows this list).
- Limits of AI agents: Autonomous agents cannot ensure deep learning without human oversight.
- Efficiency vs. meaning: Productivity gains must be balanced with reflection, engagement, and ethics.
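The RAG bullet above describes the idea only at a high level; the sketch below illustrates the basic pattern in plain Python. The in-memory knowledge base, the keyword-overlap retriever, and the function names (`retrieve`, `build_prompt`) are illustrative assumptions for this chapter, not a production pipeline; real systems typically use vector embeddings and a dedicated retrieval index before calling a language model.

```python
# Minimal, illustrative sketch of the RAG pattern: retrieve relevant reference
# text first, then ground the model's prompt in it. All names and the naive
# scoring method are assumptions for illustration, not a specific library's API.

def retrieve(question: str, documents: list[str], top_k: int = 2) -> list[str]:
    """Rank documents by naive keyword overlap with the question."""
    question_words = set(question.lower().split())
    scored = sorted(
        documents,
        key=lambda doc: len(question_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(question: str, documents: list[str]) -> str:
    """Combine retrieved context with the question so the model answers
    from supplied sources rather than from memory alone."""
    context = "\n".join(f"- {doc}" for doc in retrieve(question, documents))
    return (
        "Answer using only the context below. Say so if it is insufficient.\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )

# Example use with a tiny in-memory "knowledge base".
knowledge_base = [
    "The SAMR model describes four levels of technology integration.",
    "TPACK integrates technological, pedagogical, and content knowledge.",
    "Retrieval-augmented generation grounds model outputs in external sources.",
]
print(build_prompt("What is the SAMR model?", knowledge_base))
# The resulting prompt would then be sent to whichever language model you use.
```

The point of the pattern, consistent with a human-centered approach, is that a person still chooses the sources the model is allowed to draw on and reviews the grounded output.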
Educational Research Models for AI Integration
Several educational frameworks help us evaluate when and how to use AI effectively. Together, they remind us that intentionality, balance, and innovation must work hand in hand.
Start with Why (Sinek)
Simon Sinek's "Start with Why" framework asks us to articulate purpose before choosing methods or tools. Applied here, it means clarifying why AI belongs in our teaching or work at all, so adoption is driven by goals rather than novelty.
TPACK Model
The TPACK model highlights that effective teaching requires integrating three domains of knowledge:
- Technological Knowledge (TK): the tools we use for learning.
- Pedagogical Knowledge (PK): the strategies for how we teach.
- Content Knowledge (CK): the subject matter itself.
When introducing AI, TPACK helps us ask: Does this tool support the content and align with sound pedagogy, or is it distracting from the core goals?
SAMR Model (Puentedura)
The SAMR model helps us think about the depth of technology integration:
- Substitution: Technology replaces a tool with no functional change (e.g., typing an essay instead of handwriting it).
- Augmentation: Substitution with a functional improvement (e.g., using AI for grammar correction).
- Modification: The task itself is redesigned (e.g., co-writing a story with an AI).
- Redefinition: Technology enables tasks previously unimaginable (e.g., AI simulating historical dialogues).
Values and Ethics in AI Use
AI use inevitably reflects human priorities. What we choose to automate or augment shows what we value: efficiency, empathy, creativity, or care.
- Risks of outsourcing: Over-reliance may erode responsibility, accountability, or creativity.
- Ethical use in education: Balance innovation with equity, transparency, and student well-being.
- Human vs. machine strengths: Humans bring judgment, creativity, and emotional intelligence; machines bring speed, scale, and statistical pattern recognition.
Weekly Reflection Journal
Consider a task in your teaching or professional work that you could imagine delegating to AI. What part of that task must remain human-led, and why? How do your values influence this decision?
Quick Self-Check
Match each human-centered principle with its description.