Next Gen Learning

As a middle school teacher around 2010, Matt Farber noticed a curious detail in a student essay. Copied at the end of the text were numbered footnotes from Wikipedia, the digital encyclopedia that had risen to become one of the world’s most visited websites.

In those days, Wikipedia was driving conversations around academic integrity and how to hold students accountable in a world where many of the answers were easily available.

Fifteen years later, a much more complex and sophisticated rival, ChatGPT, is inspiring the same kind of conversations. It has emerged as the most well-known artificial intelligence program, but it is only one of many accessible tools that create text, images and audio from a simple prompt. AI can also answer research questions, however accurate or reliable the responses may be, as Google search users have discovered through its Gemini feature.

Looking back at that moment of Wikipedia plagiarism, Farber says the task of holding students accountable isn’t much different now, despite how rapidly AI tools continue to evolve. The student who misused Wikipedia in 2010, he imagines, is very similar to the student who abuses AI tools in 2025.

The challenge for educators now is not to prohibit or penalize AI but to work with it in a productive way.

“I don’t think I would be doing my job teaching teachers by ignoring that AI is out there and that students are using it,” says Farber, now an associate professor of educational technology at the University of Northern Colorado. “I think the conversation is about how to use the tool.”

Setting boundaries

Chris Geanious, a senior instructional designer at Colorado State University’s Institute for Learning and Teaching, says there isn’t much data on student AI usage, so instructors are mostly dealing with anecdotal evidence when it comes to issues like AI-facilitated cheating. Given the increasing availability of AI technology and its integration into web searches, instant messaging services and other online platforms, however, most students have already used these tools, whether they’re seeking them out or not.

In a 2024 report, the academic research firm Wiley found that about 45 percent of students surveyed had used generative AI tools in class in the past year, primarily for writing papers, brainstorming ideas and learning about difficult concepts. Among the students who hadn’t used AI for their classes, 37 percent reported concerns that their instructor would consider it cheating if they used AI, and 36 percent said they didn’t trust AI tools.

At CSU, Geanious works with faculty to set boundaries regarding AI. A key first step, he says, is to set clear expectations early on. Those are important, as nearly half of the students surveyed by Wiley said AI makes it easier to cheat.

“What we are currently trying to impress upon instructors is to communicate in the first week of class their expectations concerning student use of AI in their course,” he says. “This needs to be spelled out specifically and the consequences of misuse explained clearly.”

This puts the onus on instructors to "AI-proof" their assignments or replace those that AI can easily complete, such as straightforward essays. Instructors in the Wiley survey described taking this approach: many said they were creating more personalized assignments and tailoring topics to individual students to make it harder for AI to produce a response.

“I make up wildly hypothetical questions rooted in real life but based on movies or television shows, then I subtly change them each term so it is obvious if a student uses AI. When I do allow the use of internet, they are required to provide references, which AI does poorly in the sciences,” one instructor said.

Another instructor reported assigning more project-based work, including projects that require students to collect their own data.

Students themselves identified a return to traditional, in-person classes as a way to reduce misconduct. Understanding the purpose or value of a course also mattered: students reported being less likely to cheat when they could see the benefits of a class and how it applied to the real world.

Teaching the teachers

Molly Jameson, director of the Center for the Enhancement of Teaching and Learning at UNC, says the university has drafted guidance on certain AI topics, ranging from limiting classroom applications to embracing AI.

“There really isn’t a standard, outside of it being well known that faculty members need to know how to use AI because our students are going to be using AI,” Jameson says. “You need to know what a response from AI looks like. You need to know what it can do.”

In other words, teachers need to know how to use AI so they can recognize when students are overusing it, just like Farber, who knew enough about Wikipedia to recognize the footnotes.

Many educators are also becoming familiar with AI and how it responds through their own professional applications. Jameson surveyed approximately 115 teachers on how comfortable they feel navigating these challenges and applying AI in their practice. She’s found that about 70 percent feel comfortable using AI as a teaching tool and to help with some of their own tasks, like lesson planning and generating ideas.

“Nobody’s really reporting using it to create their entire content, right? That’s where the human part comes in,” Jameson says. “They’re the experts. Sometimes the expert just needs help on the pedagogy side.”

On a statewide level, Patty Quiñones, a senior partner with the Colorado Education Initiative, helped create the first guidelines for K-12 educators on AI education. The effort brought together more than 100 stakeholders from school districts, nonprofits, higher education and the Colorado Department of Education. Much like AI, the guidelines (called the Colorado Roadmap for AI in K-12 Education) are also evolving, even a year after their release.

“Things are happening so quickly that I’m now in the process of putting together a task force to actually review it and update it,” Quiñones says.

This will mean providing professional development opportunities and certifications to ensure that teachers remain at the cutting edge of new technologies. Quiñones' hope is that Colorado teachers will be able to leverage AI tools designed for education, like MagicSchool, Khanmigo and Playlab, to accelerate learning and home in on individual student needs.

“We have a lot of data in schools,” she says of information regarding student needs, performance and how AI can help teachers tailor their content to them. “But the time [it takes] to analyze and use that data appropriately is really hard for a teacher, especially if, at the high school level, they’re teaching 150 kids in one day.”

Helping students use AI

As AI becomes embedded in all aspects of life and learning, Jameson sees a need for educators to shift their mindset around the goals of classroom learning. Rather than viewing fact recall as a sign of success, for example, she encourages teachers to empower students with critical thinking skills. Students will need to become AI literate for their future careers, and that will require the ability to evaluate the dubious or bias-prone information sometimes produced by AI programs.

“We have to think about how to teach students to be critics of information,” she says. “Our goal is to create critical thinkers who can critique and evaluate, not just produce.”

To assess students more effectively, she suggests that teachers do things like move exams back to an in-person setting and veer away from online evaluations that are more vulnerable to cheating. She also recommends engaging students in activities, like debates or presentations, that challenge them to think through ideas and demonstrate their knowledge.

While AI poses new educational challenges, it also creates opportunities when applied thoughtfully. For example, the Colorado Roadmap touts AI’s ability to tailor learning to students’ individual goals. It can also increase accessibility for students with disabilities, promote peer learning through collaborative platforms and facilitate critical thinking through feedback exercises.

Farber reminds instructors that AI tools can complement their teaching, but they’re not there to replace them. No matter how sophisticated it is, AI lacks the complex decision-making skills and emotional capabilities teachers can provide.

“Teaching is a human business,” he says.