Artificial intelligence is reshaping industries across the globe, and school districts on Long Island are no exception. Educators and administrators in the North Shore, Locust Valley and Oyster Bay-East Norwich school districts are grappling with how to integrate AI into the classroom while managing its risks.
Locust Valley and Oyster Bay-East Norwich are in the early stages of navigating AI’s role in education, though the technology’s spread through the educational software they already use is noticeable.
Ken Packert, Locust Valley’s executive director of administrative operations and technology, explained that AI has been quietly integrated into existing applications like Grammarly and IXL, platforms that provide real-time feedback and adapt to students’ individual needs.
“It’s not so much that we’re using AI directly in the form of ChatGPT, but rather, AI is built into the products we already use,” Packert explained.
He added that he sees great potential in AI’s ability to tailor education to individual students’ interests and needs, including those with disabilities. But he echoed concerns about privacy, security and the ethical implications of relying too heavily on AI.
“There’s always the question of bias and misinformation — what are these suggestions based on, and are they safe for our students?” Packert said.
At Oyster Bay-East Norwich, Superintendent Francesco Ianni emphasized that the district is still in the learning phase, focused on providing educators with the necessary professional development to help students use AI responsibly and effectively.
“Right now, teachers are trying to figure out how to use AI in ways that enhance learning, rather than just as a tool for students to complete assignments,” Ianni said.
North Shore is in its second year of exploring AI’s potential to enhance learning, while simultaneously developing policies to address concerns about academic integrity and student well-being. Superintendent Chris Zublionis explained that while educators see promise in AI for personalizing learning, the district is proceeding cautiously.
“Teachers were concerned about academic integrity, whether the work students were submitting was original or AI-generated,” Zublionis said, noting that those concerns arose early in the 2022-23 school year, after the release of popular AI tools like ChatGPT.
“AI has a lot of promise when it comes to differentiating instruction and making learning more accessible for students who may need additional support,” he said. But he stressed that the district has not yet implemented any formal AI tools for students or teachers. “We are still in the exploration stage, experimenting with what’s possible before putting anything into formal practice.”
One tool that is generating buzz among district leaders is OpenAI’s GPT-4o, which can perform tasks such as solving complex algebraic equations step by step when presented with an image of a problem. Zublionis called the platform’s potential “amazing,” but he also emphasized that any AI-generated content must be scrutinized for accuracy.
“It’s like the internet — it’s not infallible, so you always have to approach it with a healthy skepticism,” he said.
Zublionis acknowledged the risks of students using AI to complete assignments dishonestly. While the district uses detection software such as Turnitin to flag potentially AI-generated writing, he stressed that the real solution is designing assignments that can’t be completed simply by entering a prompt.
“We’re looking to do more deliberate professional development on it and adopt policies at the district level,” he said.
Currently, North Shore High School is drafting its own academic integrity policy addressing AI, and the district plans to follow with one of its own.
Zublionis also raised concerns about the potential misuse of AI outside of academics, particularly in cyberbullying. “The idea of students using AI to create deepfakes or other harmful content is very scary,” he said. He described how easy it is for AI programs to generate fake videos that can be used maliciously, adding that the issue has already surfaced in other parts of the country.
Despite these concerns, Zublionis said he believed that AI, when used responsibly, could help bridge educational gaps. He said that the district’s technology committee, composed of parents, educators and administrators, is actively researching how AI can be safely integrated into the curriculum, and plans to present its findings to the Board of Education in January.
“For students who can’t afford tutors, AI has the potential to level the playing field,” Zublionis said. “It can provide support with homework, and help make complex texts more accessible to students with different reading levels.”
A second committee, focused on social media, smartphones and student well-being, is also looking at how AI impacts life outside the classroom. Zublionis said that the district is developing a technology white paper to formalize its stance on AI and inform future policy decisions.
“We can’t ignore this,” he said. “The wall between what happens in school and out of school is thinner than ever, so we need to address AI both academically and socially.”
As AI continues to evolve, the challenge for school districts across the North Shore is to harness its potential while safeguarding academic integrity and student safety. For North Shore, Locust Valley and Oyster Bay-East Norwich, the goal is to ensure that AI remains a tool for learning, not a crutch for shortcuts or a vehicle for harm.
“We’re working toward a future where AI enhances learning, but doesn’t replace it,” Zublionis concluded.