
Navigating Uncertainty About AI In Education By Leaning On What Works

Forbes 1 day ago

Katy Knight is Executive Director and President of Siegel Family Endowment, a foundation focused on the impact of technology on society.

As an Ivy League graduate and one of the few Black women at Google, I “beat the odds” of what public education in America is supposed to produce (thanks, in no small part, to the support of Prep for Prep, among many others). I now lead a foundation that is applying the scientific method to drive more equitable outcomes across society—and the application of artificial intelligence (AI) and technology is core to our work.

As a longtime investor in education, I can see clearly that we are still not getting the conversation about technology in classrooms right.

To be sure, there’s a ton of information. Articles, webinars, research, surveys, tools and reports about AI’s impact on everyone—students, teachers, districts, labor markets, even AI itself—abound. Conversations about AI and edtech dominated the stages at major conferences this year like SXSW EDU, which my company has sponsored in the past, and ASU+GSV. And there’s no end in sight.

And yet, a recent Pew Research Center study shows that a third of K-12 teachers (35%) are unsure about the benefits or harms of using AI tools in the classroom. That uncertainty is visibly morphing into fear, escalating into extreme reactions such as demands for the complete removal of technology from classrooms. Such reductive arguments overlook nuance and exacerbate polarization, derailing the sort of collaboration that is critical to improving outcomes—especially in our most marginalized communities.

In a moment when research suggests that more than two-thirds of parents believe the benefits either equal or potentially outweigh the drawbacks, and 72% of students want guidance on how to responsibly use generative AI for schoolwork, we should be wrestling not with whether but with how technology can play a role in preparing young people for an increasingly dynamic world.

Communities and organizations are already working with cross-sector stakeholders to support educators while wrestling with complex issues of student agency, parental rights and community values that don’t lend themselves to simplistic tropes. They understand that incorporating technology into classrooms isn’t about some ephemeral economic imperative. It's not about preparing more kids to build software for major tech companies. It's about our shared responsibility to inform and equip today’s students to thrive as citizens of an increasingly digital democracy.

Of course, the answers are not simple. The politics and perspectives may get messy. But if we aren't willing to engage in a complex conversation, we will never get it right.

The good news is that there are historic guideposts and a growing body of knowledge and experience upon which we can lay the foundation for more nuanced and productive conversations about how to approach AI in education. Here are three places to start:

1. AI That’s Fit For Purpose

Our conversations should focus first on the problems we’re trying to solve and then on the tools, technological or otherwise, needed to solve them, rather than starting with the tools themselves. We have to prioritize solving specific problems over superficial tech integration.

A growing number of nonprofits and edtech providers have created products that are tailored for learners and educators. But to do AI well requires customization, and customization is expensive, which is one reason why we have seen the tech fall short. Quill.org, an AI-powered literacy tool launched long before the generative AI explosion and a Siegel grantee, is a great example of AI that was built to help teachers coach student writing.

Rather than relying on AI to generically pattern-match between “good” and “bad” writing, Quill’s team worked with educators to define custom rubrics for each writing prompt and to direct the AI to pattern-match based on those inputs. Such centering of the educator's voice takes time but solves the problem of providing high-quality, usable feedback at scale. As we get smarter about AI, we should be mindful of when and how to use customization to power useful tools and acknowledge that there are many instances where large language models (LLMs) won’t be the answer.
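For readers curious what “educator-defined rubrics directing the AI” can look like in practice, here is a minimal, purely illustrative sketch. It is not Quill’s actual implementation; the criterion names, coaching messages and simple keyword checks are all hypothetical stand-ins for the model-backed checks a real system would use. The point is structural: feedback comes from educator-authored criteria attached to a specific prompt, not from a generic good/bad classifier.

```python
# Hypothetical sketch of rubric-driven writing feedback (not Quill's code).
# Educators author the criteria and coaching messages per writing prompt;
# the "check" predicates stand in for whatever model powers the matching.
from dataclasses import dataclass
from typing import Callable


@dataclass
class RubricCriterion:
    name: str                      # e.g., "cites the passage"
    check: Callable[[str], bool]   # educator-directed test for this prompt
    feedback: str                  # coaching message shown when the check fails


def review(response: str, rubric: list[RubricCriterion]) -> list[str]:
    """Return the educator-authored feedback for each unmet criterion."""
    return [c.feedback for c in rubric if not c.check(response)]


# A toy rubric for one prompt; keyword checks stand in for a real model.
rubric = [
    RubricCriterion(
        "cites the passage",
        lambda r: "according to" in r.lower(),
        "Try quoting or paraphrasing the passage to support your claim.",
    ),
    RubricCriterion(
        "states a reason",
        lambda r: "because" in r.lower(),
        "Add a 'because' clause explaining why your claim is true.",
    ),
]

print(review("Dogs make good pets.", rubric))
```

Because each criterion carries its own coaching message, the feedback a student sees is written by educators, with the AI (here, a trivial predicate) only deciding which messages apply.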

2. Focusing On Computational Thinking And Digital Literacy

Embracing computational thinking and digital literacy at scale prepares all students for a future where technological fluency is nonnegotiable. By doubling down on enduring skills like problem-solving and creativity, we equip students with the cognitive resiliency to adapt to whatever future jobs might exist.

We should steer the conversation away from prophecies about the nature of specific jobs (e.g., prompt engineer) and toward how we can empower students to shape the technological landscape of tomorrow. Organizations like CSforALL and Scratch, which have beaten this drum for over a decade and are also grantees, have amassed a large body of resources, research and tools for how to do this in an equity-centric way.

3. On-The-Ground Evidence And Stories

Finally, we must reframe the conversation away from hot takes toward on-the-ground stories and empirical, hypothesis-driven research. We should continue to invest in research and research-driven pilots to carefully balance innovation and its unwanted effects. These are most powerful when co-designed and collaboratively generated with communities and educators that reflect the diverse perspectives we want as part of the AI revolution, an approach taken by organizations such as Leanlab Education.

We should also look to (and fund) education media outlets like Education Week, Chalkbeat and The Hechinger Report (all Siegel grantees) that serve the critical role of surfacing case studies and stories from real classrooms to shed light on possible use cases, interrogate the conditions in which they arise and weigh the benefits and harms to schools and communities. Through it all, philanthropy can build the connective tissue between stakeholders to move such learnings across the ecosystem.

To be sure, technology poses complex risks and challenges to education. By leaning on what we know works—purpose-driven innovation, computational thinking and networks of evidence—we can have better conversations about a path forward that honors the true potential of technology to shape the next generation's future.

Forbes Nonprofit Council is an invitation-only organization for chief executives in successful nonprofit organizations.
