Artificial Intelligence (AI) Policies in Schools: Resources and Requirements as of March 2026
AI is becoming ubiquitous in schools and classrooms. The Association has collected resources to support school leaders in understanding requirements and building out their own school-based AI policies.
What are the current laws regulating AI use in New York?
As of Spring 2026, neither New York State nor the NY State Education Department has passed laws or regulations that specifically govern or prohibit the use of AI by teachers or students in schools. While there are 28 state AI-related bills across the country that may impact children, they typically regulate general use or AI companies themselves.
Current proposed laws typically aim to limit the harm from AI use by focusing on the following:
- Regulating or limiting access to AI tools to protect children (e.g., requiring parental controls or age verification);
- Requiring companies to transparently indicate what input data is used to train the AI tool; and/or
- Curbing AI use in schools, particularly in kindergarten through 8th grade.
In this legislative season, NY passed the GenAI Warning Bill (A3411/S934), which requires AI systems to notify users that outputs may be inaccurate. Two other significant bills are advancing in the NY Senate: S9051, which aims to limit minors' exposure to unsafe features of AI chatbots, and S7263, which aims to prevent AI tools from impersonating specific licensed professionals (e.g., psychiatrists).
With Gov. Kathy Hochul's endorsement, two additional bills were introduced to specifically limit AI use in schools. A9190 aims to limit the use of AI in classrooms prior to high school except for specific purposes. A7029 would require the Education Department to recommend AI literacy curriculum for all grade levels.
The NY Board of Regents and staff from the Education Department shared their perspectives on AI use in schools in March 2024. More recently, the NYC Department of Education released its full guidance on how schools should use AI.
Does my school need to create AI Policies?
There is no legal requirement for schools to have specific AI policies at this time. Nevertheless, schools can proactively define expectations and protect students by developing AI policies. The Association suggests the following steps when drafting policies.
First, identify your audience(s). For which stakeholders does the school need to outline expectations and regulate use? This could include:
- Students (split by grade band)
- Teachers
- Leaders
- Operations and other staff
- Vendors/service providers
Second, define inappropriate use of AI for each stakeholder group. List and define what uses are off-limits (e.g., searching for or generating explicit or violent content, bullying, or exceeding a set amount of usage time). Consider which staff roles and responsibilities are and are not appropriately delegated to AI.
More importantly, define appropriate use of AI by each stakeholder group. What tools will students and staff be allowed to access? For what purpose? When? Will there be a list of approved/allowable tools? How do these tools support the school's mission, model, and instructional priorities?
Next, outline how the school selects and monitors AI tools. How will the school vet the data privacy practices of selected platforms and services? Under what circumstances would the school share student data with an AI-enabled tool (e.g., for what purpose, what type of data, and how frequently)? How will personally identifiable information (PII) be protected, and what would the school do if an AI tool is no longer safe or appropriate?
Finally, ensure the AI policies align with the school's other relevant policies. Review the language in all technology, cell phone, and data use policies to ensure alignment, and update those policies as needed.
Proactively developing AI policies enables charter schools to define when and how they intend to use AI in support of their charter's mission and vision, and to set the standard for other schools across the state.
Resources