AI Primer for K-12 leaders


Informing decision-making about AI for your school/district

As a K-12 leader you will need to make decisions about how uses of AI may be allowed – or even promoted – in your school/district. This webpage provides a quick introduction to key issues we encourage you to take into consideration when making these decisions.

Click on each of the topics listed below for some brief considerations/recommendations, followed by links to additional “user-friendly” and concise resources we selected to provide support and/or elaboration to those points.  These curated resources were produced based on a review of existing literature, combined with data collected through interviews and surveys during the 2023-24 school year from over 150 K-12 leaders in Western New York.

These materials are based upon work supported by the National Science Foundation under Grant No. 2333764. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Getting Started

Setting appropriate expectations

Unknowns and rapid changes in AI technology call for decisions that are:

  • temporary and continually reviewed and modified
  • accompanied by experimentation and professional learning.  

All constituencies should share these expectations.

You can find what other K-12 leaders have shared about these challenges in this 4-page article, titled “An AI Wishlist from School Leaders.”

What you need to know about AI:

As you approach decisions about AI, consider that:

  1. Most jobs will increasingly require some use of AI.
  2. Some students and educators are already using AI – whether or not their school provides access to it.
  3. There are no tools that can reliably detect AI-generated work.
  4. AI tools can produce amazing results – but are also prone to inaccuracies and biases.
  5. The quality of AI-generated output depends on our input.
  6. AI can be used to assist or replace humans – with different results and implications.

Many of these points are elaborated and supported with evidence in this 2024 report from MIT

For now, there are just a few types of AI tools for K-12 education you should know about:

  1. General tools – like ChatGPT, Microsoft Copilot, or Google Gemini – that can be used for multiple applications with appropriate prompting.
  2. Tools created for K-12 schools to do specific tasks with minimal prompting – like MagicSchool and Diffit.
  3. Existing tools enhanced with AI features – like the Adobe suite.

Becoming familiar with even just one tool within each of these categories can give you a good sense of the possibilities.

Watch any of these short demo videos to get an idea of what these tools can do: Copilot (1:36) – MagicSchool (6:00) – Diffit (6:14) – Adobe Firefly (1:00)

The best way to understand AI’s affordances and limitations and inform your decisions is to try using an AI tool to help you with some authentic task – and then reflect on that experience.

New to AI? You can try ChatGPT without even having to start an account by using this website, created by metaLAB (at) Harvard.

You may want to start by entering in the “send a message” box: “What should I see in a 1-day visit to [a location you know]?” Then try the alternative prompt: “I would like suggestions about what to see in a 1-day visit to [same location]. Ask me questions until you have enough information to give me your suggestions. Ask me one question at a time.” Respond to each of ChatGPT’s questions, engaging in a dialogue. Notice how the process and output differ depending on your prompts.

AI uses in K-12 schools:

K-12 educators can benefit from using AI as an “assistant/ thinking partner” for:

  • Routine everyday tasks – such as responding to emails, writing memos or letters of recommendation;
  • Specific instructional tasks – such as writing learning objectives or creating/differentiating lesson plans;
  • Improving back-office/school operations – such as scheduling or budgeting;
  • Supporting decision-making.

This may not only save time, but also improve the end product through some new functionalities – provided the output is carefully checked and the AI tool is used safely and ethically.

More information about risks and benefits of specific uses of AI by educators can be found in this report, based on interviews with K-12 leaders. 

While some students may use AI to do their work for them (i.e., “cheating”), there are also powerful ways in which AI could support student learning when used appropriately – here are some examples:

  • Reading: get help to understand complex texts. 
  • Writing: get feedback on initial ideas or a first draft; help to edit a final draft.
  • Problem solving: help identify key information; ask for clarifications; provide hints when stuck; provide feedback on proposed solution(s).
  • Research: research a topic, while evaluating the sources of the information provided.
  • Remediation/acceleration: Using AI tutors that personalize learning.
  • Increasing access for ELLs: get immediate translations for readings and directions; provide alternatives to demonstrate learning.

So, rather than trying to detect and punish students’ inappropriate uses of AI, we should teach students how to use AI effectively and ethically. Each teacher should also be explicit about which uses of AI are allowed for specific tasks, and how students should report on their use of AI.

This 2024 article provides valuable suggestions about how to provide students with a “decision tree” to help guide their use of AI.

Surveys conducted in 2024-25 reported:

  • Compared to the previous year, nearly twice as many educators are using AI regularly, while the number of those who rarely or never use it has dropped substantially.
  • 58% of administrators use AI frequently or all the time – a higher rate than among teachers.
  • Three out of five teachers have talked with their students about appropriate ways to use AI, up from two out of five the previous year – a 20-percentage-point increase.

Students’, teachers’, and administrators’ opinions about whether and how AI should be used in K-12 education are still quite divided.

To learn more, look at this report from Carnegie Learning on 2025 survey data collected from 49 states and Puerto Rico.

Things to consider:

Fear of sensitive data breaches and cybersecurity risks has been the main reason many schools have banned the use of AI. These risks are real – but they should not stand in the way of realizing the many benefits of AI.

Precautions that can help mitigate these risks include:

  • Vet AI tools to ensure they comply with New York’s Ed Law 2-d.
  • Determine how developers will use the data you input into their AI tools.
  • (most important!) Provide training about safe and ethical use of AI to all potential users.

Additional considerations can be found in this 2024 blog post.

When making decisions about proposed AI-related innovations, keep in mind that AI could help address as well as exacerbate current inequities in K-12 education, as:

  • Students’ unequal access to AI technology and/or ability to use AI effectively will exacerbate the digital divide.
  • AI can help level the playing field for ELLs and students with disabilities.
  • Biases built into current AI systems/tools may negatively affect specific groups, unless awareness and strategies are developed to counter these biases.
  • Using AI can make school-family communications more accessible.

Support for these claims can be found in this paper (8 pages), based on 2023-24 K-12 leaders’ interview and survey data.

Realizing AI’s potential benefits while controlling its related risks will require better-prepared K-12 educators – thus calling for timely, high-quality professional learning opportunities for all K-12 constituencies. This should include:

  • Developing awareness of AI’s potential and risks for K-12 education.
  • Learning to use AI tools effectively and ethically.
  • Gaining experience using AI in authentic tasks.

Survey results about teachers’ desires and opportunities for professional development about AI can be found in this short 2024 EducationWeek article.

Taking action:

As of 2024, very few districts had developed an AI policy, and many K-12 leaders cautioned against doing so prematurely. They suggested instead:

  • Reinterpreting existing policies (e.g., code of conduct, student privacy, acceptable use) to address the use of AI.
  • Creating guidance documents that could be more easily modified based on new developments and experimentation.

The rationale for these recommendations can be found in this 4-page 2024 article.

Students’ easy access to AI tools – outside as well as inside schools – challenges many current assumptions and practices about WHAT students should learn and how that learning can be measured.

This calls for radical rethinking of learning goals as well as how those goals can be assessed, beyond adding units on AI literacy and curtailing students’ possible cheating.

Some ideas about how using AI can more radically transform teaching and assessment are offered in this 2023 blog post.