In December 2025, independent school leaders gathered in Houston for the NAIS Symposium on AI and the Future of Learning. The agenda reflected the gravity of this moment in education: student agency, trusted tools, ethics, equity, and what it means to lead schools through disruption.
I was invited to facilitate two “Dialogue by Design” sessions, created to help attendees process the symposium’s big ideas, surface what educators are thinking about AI in schools, and identify insights that could shape future programming and research.
My goal was to create a space where educators could name what issues are lingering for them and why. What emerged was remarkably consistent across roles and school contexts: a shared concern about over-relying on AI at the expense of authentic learning; a desire to preserve the essence of student inquiry and skill-building; and a recognition that engagement matters. The consensus was that AI paired with high student engagement can spark deeper learning, while AI used in low-engagement environments risks cognitive offloading.
Hopes and Fears
As each session concluded and participants reflected on the discussion, I invited them to share a “hope” or “fear” about how AI might shape the future of education. Several powerful themes and key takeaways emerged.
Leaders want learning that is more human, not just more efficient.
One of the clearest hopes was that AI might push schools away from transactional content coverage and toward relational human development.
“I hope that AI will reshape education to focus less on content and more on what is necessary to raise healthy, critically thinking, and connected humans,” says Alex Helwink, director of learning support at The Potomac School (VA).
Leaders are not only asking, “What can AI do?” They are asking, “What is learning for, and where does the human relationship matter most?”
Equity and access are not side issues; they are the issue.
When access to AI tools is restricted, monetized, or inconsistent, equity gaps don't just persist; they accelerate. Many leaders are already noticing how rapidly an "optional" tool can evolve into an unspoken requirement, creating new barriers for students.
“I fear that it will not be made available democratically: you get the access you can pay for,” says Manfredo Grellert, teacher at Wildwood School (CO).
Chris Kaye, visual arts design teacher at Rye Country Day School (NY), named the contradiction directly. “AI is supposed to be the great equalizer in education, but the AI companies hide the best features behind paywalls. I hope we find ways to remove those barriers so that everyone has access without having to pay for it.”
Leaders are not just debating tools. They are debating whether the next era of learning amplifies opportunity or reinforces stratification.
The “AI conversation” is really a school design conversation.
AI in schools marks an existential turning point—for how learners will learn, work, and live, and for how education must evolve its approach to teaching and learning. Educators see both the opportunity and the tensions among productive struggle, student agency, and human relationships. Schools feel pressure to ensure “future readiness” even as higher education and the labor market remain deeply uncertain. Amid this complexity, school leaders want to move thoughtfully rather than react with performative urgency.
Several of the “hot takes” from our conversation were more structural than technical.
- “Traditional school structures and systems will never keep up with the pace of AI; education must be re-evaluated.”
- “The real conversation about AI in education has nothing to do with technology.”
- “Students need AI literacy [that is] more than prompt-engineering.”
- “I hope that AI will offer pathways for all students to find their voice—not be their voice.”
The real question is not whether we should allow AI; it’s what we believe schools should cultivate when information is ubiquitous and cognition can be easily offloaded.
Five Practical Moves Schools Can Make
The symposium sessions reinforced something I see consistently in my work with schools and learning organizations: Leaders are ready to change, but they need shared language, structure, and space to think together. Because there is no roadmap yet, independent schools get to help design one.
If AI accelerates topic complexity, then curiosity becomes a leadership competency for both students and the adults guiding them. Based on what I heard at the symposium, here are five things schools can start doing now.
1. Create a “community of sensemaking” before you create a policy.
Create spaces for listening through 1:1 interviews, forums, or structured dialogues with students, faculty, and parents. Use movies or social events to provoke input and conversation and get a pulse on the ways AI is already integrated into their lives. Use protocols that surface disagreement safely and turn it into shared learning, like the Game of 35 and the structured panel used in the Dialogue by Design sessions. And involve students, parents, educators, administrators, employers, and tech partners while you're still grappling with questions, so they can witness the challenges in the process rather than receive a clean, sterile synthesis report.
2. Start with the real design question (not the tool question).
With AI adoption, it's easy to get distracted by a tool or specific use case. As educators, we need to identify the problem we are actually trying to solve with AI. Are we aiming to create efficiency by optimizing a process, or using tech to manage time we don't have or can't resource? Rather than using tech to do things we can't do but would like to, we might assign AI the tasks that translate most naturally to it, making room for spaces where students practice initiative, judgment, and authorship with real humans.
3. Name what you want to protect (and why).
Leaders are hungry for learning that builds healthy, critically thinking, and connected humans. That is not a technology agenda. It is a school mission agenda. Use “productive struggle” as a design lens: Where should students wrestle, and where is support appropriate? Design for curiosity, agency, and connection. These are our essential criteria as we evaluate and evolve our learning environments and strategies.
4. Stay curious while exploring new solutions.
This moment is defined by anxiety about the unknown. A thoughtful design process, supported by structures, tools, and methods such as deep research with stakeholders, identifying their real challenges and problems, exploring many possible solutions through ideation, and testing new approaches through defined experiments, can invite stakeholders into the work as co-creators. It grounds us in a process that builds clarity, drives meaningful change, and moves us toward our desired future.
5. Advocate for a seat at the table.
Leaders are already worried about paywalls and unequal access. Perhaps most palpably, they feel absent from key decisions big tech is making, while still managing the consequences of social media's influence from an earlier era. Find ways to represent and advocate for education at the tech table to shape how solutions are adapted to our context: Be persistent with product teams about expectations of what technology should and shouldn't do, and be deliberate about the lines that technology won't cross in your school (and why).
When leaders, administrators, and teachers make choices in your school or classrooms, you are designing the experience for students, parents, and communities. The choices of others may feel like they are being exerted upon your campus, but your design choices put structure and limits on their impact. Educators, alongside parents and students, can co-design how AI will shape the future.