
The Presidential AI Challenge and West Virginia's AI Guidance: An Alignment for Administrators and District Leaders

Introduction

Hampshire County Schools, like all K-12 institutions and homeschools in the United States, has an opportunity to participate in the Presidential AI Challenge, an initiative aligned with West Virginia's official AI Guidance (WVDE AI Guidance 1.2, March 2025) and state computer science standards (WVBE Policy 2520.14). This post explains how the challenge embodies the state's vision for responsible, student-centered AI use in K-12 education.


If you're an administrator evaluating whether the challenge fits your district's goals and policies, this guide will help you understand the alignment and address key implementation questions.


Executive Summary: Three Core Alignments

The Presidential AI Challenge aligns with West Virginia's AI guidance in three fundamental ways:

  1. People, Not AI, Are in Charge. The challenge keeps humans in the loop at every stage—students drive decisions, teachers maintain oversight, and AI serves as a supporting tool.

  2. AI Literacy is Woven Into Authentic Learning. Students learn computational thinking (a cornerstone of computer science) through solving real problems, meeting state standards for technology education.

  3. Privacy, Academic Integrity, and Responsible Use Are Built In. The challenge includes consent processes, prohibits PII entry, requires citation of tools, and encourages critical evaluation of AI outputs.


Section 1: The WVDE AI Guidance Framework and How the Challenge Fits


What the WVDE Guidance Says (And Why It Matters)

In March 2025, the West Virginia Department of Education released comprehensive guidance on AI use in schools. The guidance is centered on one core principle: "It cannot be overstated that the fundamental principle of using AI to bolster educational efforts must be a balanced and people-centered endeavor." The document emphasizes this through the concept of "humans in the loop"—meaning people remain in control, make decisions, and are accountable for outcomes.


The WVDE guidance identifies AI's potential to:

  • Enhance personalized learning and accessibility

  • Streamline administrative tasks

  • Provide instantaneous feedback

  • Equip students with workforce-ready skills in computer science and data literacy


At the same time, it names realistic risks:

  • Over-reliance on AI that reduces human judgment

  • Challenges to independent and creative thinking

  • Privacy and data safety concerns

  • Plagiarism and academic integrity violations

  • Reinforcement of bias and misinformation


The guidance's response is not to avoid AI, but to use it responsibly and intentionally, with clear policies and practices that keep people in charge.


How the Presidential AI Challenge Embodies This Approach

The challenge is structured to keep people in charge at every stage:


Problem Identification (Student-Led). Students identify a real problem they notice in their classroom or community. This is entirely student-driven. The teacher's role is to facilitate discussion, not to select problems for students.


Brainstorming with AI (Supervised). Students use an AI chatbot (Poe) to explore ideas and refine their thinking. However:

  • The teacher is present and guides the conversation

  • Students ask the questions and direct the brainstorming

  • The AI provides suggestions; students evaluate and decide whether to use them

  • Everything is documented in a chat that shows the students' authentic thinking

The WVDE guidance warns against "overreliance on AI technologies" that "decrease human discretion and oversight." The challenge structure prevents this by requiring students to review, evaluate, and make intentional choices about AI suggestions.


Solution Design (Student-Owned). Students specify what their app should do and how it should work. They are designing the solution, not copying one from AI. The app concept reflects their judgment about what would actually help.


Testing and Reflection (Critical Thinking). Students test their solution idea and reflect on whether it would work, whether it's fair, and what problems might arise. This mirrors the WVDE's emphasis on "critical analysis and discernment" and ensures students don't blindly accept their first idea.


Citation and Transparency (Academic Integrity). The final submission requires students to cite all tools used, including Hampshire County AI's AI Challenge Helper. This directly aligns with the WVDE guidance, which states: "Any use of AI to aid assignments, projects, or research must be declared." The guidance explicitly recommends: "Whenever a teacher or student uses an AI system, it is imperative to openly acknowledge and describe its usage."


Result: Students demonstrate that they are "actively engaged in their learning processes, developing abilities that are uniquely human and that cannot be outsourced to AI, thereby fostering deeper understanding and intellectual growth." (WVDE Guidance, p. 16)


Section 2: Computational Thinking and Computer Science Standards


WVBE Policy 2520.14: What the State Expects

West Virginia Board of Education Policy 2520.14 (West Virginia College- and Career-Readiness Standards for Technology and Computer Science) requires that all students receive grade-appropriate computer science instruction throughout the K-12 experience.


The policy specifies that our youngest learners (K-2) should learn about algorithms, data, staying safe online, and how computing is changing over time. By high school, students should understand algorithms, how computing works, and how to use coding and computers to advance society.


The policy also emphasizes that students should become:

  • Empowered learners (mastering their education)

  • Digital citizens (understanding online safety and ethics)

  • Knowledge constructors (building understanding through critical analysis)

  • Innovative designers (solving problems creatively)

  • Computational thinkers (breaking problems into parts, recognizing patterns, developing algorithms)

  • Creative communicators (producing digital content)

  • Global collaborators (connecting with peers and ideas)


How the Challenge Teaches Computational Thinking

Computational thinking is a problem-solving process that involves breaking down complex problems, recognizing patterns, abstracting key information, developing step-by-step solutions, and evaluating outcomes. The WVDE guidance identifies five key components:


1. Decomposition (Breaking Problems Into Parts). Students begin by identifying a complex problem ("Our library is disorganized") and breaking it into smaller parts: What's the core issue? Who is affected? What needs to change?

The AI chatbot helps students articulate these parts, but students decide what matters most. This is decomposition in action.


2. Pattern Recognition. As students explore their problem, they notice similar issues in other contexts ("People also struggle to find things in our supply closet"). Recognizing patterns across situations is foundational to computational thinking and to designing generalizable solutions.


3. Abstraction (Focusing on What Matters). Not every detail of a problem is relevant. Students learn to focus on the essential elements. For the library example: the core issue is discoverability, not the specific shelf locations. Abstraction helps students think in terms that can scale.


4. Algorithm Development (Step-by-Step Thinking). When students specify what their app should do ("When a student searches for a book, the app shows where it is"), they're designing an algorithm—a series of steps a computer would follow. This is explicit computational thinking; a short, purely illustrative sketch of such an algorithm appears after this list.


5. Evaluation. Students reflect: Does this solution actually solve the problem? Is it fair? What could go wrong? This evaluation step is critical—it's how students learn that first ideas aren't always best, and that iteration is part of design.
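

The algorithm-development step above is the most directly technical part of the challenge, so a minimal, hypothetical sketch may help administrators picture it. The short Python example below shows the kind of "search the catalog, report the shelf" logic a student's library-app idea describes. The find_book function and the sample catalog are illustrative assumptions only, not part of the official challenge materials; students in the challenge express this logic in plain language, not in code.

# A minimal, hypothetical sketch of a "find my book" algorithm.
# The catalog maps each title to the shelf where it lives.
catalog = {
    "Charlotte's Web": "Shelf A2",
    "The Wild Robot": "Shelf B1",
    "Hatchet": "Shelf C4",
}

def find_book(title):
    """Return where a book is shelved, or a friendly message if it is not listed."""
    # Step 1: Normalize the search so capitalization and stray spaces don't matter.
    query = title.strip().lower()
    # Step 2: Check each catalog entry for a match.
    for known_title, shelf in catalog.items():
        if known_title.lower() == query:
            # Step 3: Report the location when a match is found.
            return f"{known_title} is on {shelf}."
    # Step 4: Explain clearly when no match exists.
    return f"Sorry, '{title}' is not in the catalog."

print(find_book("the wild robot"))   # The Wild Robot is on Shelf B1.
print(find_book("Holes"))            # Sorry, 'Holes' is not in the catalog.

Each numbered comment corresponds to one step the student specified, which is exactly the decomposition-and-sequencing thinking the standards describe.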


The Pedagogical Benefit

By engaging in authentic problem-solving that naturally incorporates computational thinking, students don't just learn about algorithms and systems—they think like computer scientists. They understand that every app, every system, every piece of technology is built on deliberate choices about how to break down problems and solve them step by step.

This meets state standards while also building the 21st-century skills employers expect: problem-solving, critical thinking, and the ability to design solutions that actually work.


Section 3: Privacy, Data Safety, and Compliance

The WVDE's Privacy Framework

The WVDE guidance emphasizes that student privacy is a foundational ethical concern. The guidance states:

"AI systems, especially those used in education, require extensive data to operate efficiently. In an educational setting, these data could include sensitive information about students like learning patterns, academic achievements, and personal details. The foremost ethical concern is student privacy, and districts need to ensure that such sensitive data is handled with utmost care."

The guidance also notes: "Any data information inputted into the AI model, including prompts, has the potential to be incorporated into the model's future iterations and potentially shared with other users. Hence, strict policies, consideration, and discretion are essential when integrating AI in educational settings."


How the Challenge Addresses Privacy


No Personal Information in AI Tools. The challenge explicitly prohibits entering personally identifiable information (PII) into Poe, the AI chatbot students use. Students don't type their names, classmate names, teacher names, or identifiable details about individuals.

Instead, students keep their prompts general: "Some students said the library is crowded" rather than "Emma and Marcus said..."

This practice ensures that:

  • No sensitive student data is uploaded to an external AI system

  • Student privacy is protected even if the AI tool's data practices change

  • The focus remains on the problem and solution, not on individuals


Consent at Registration. The Presidential AI Challenge requires parental consent before any student participates. Teachers collect signed Parental and Legal Guardian Consent and Media Release forms for elementary youth. These forms are available on the official eligibility page.


This consent process ensures families understand:

  • That students will participate in an AI-based learning activity

  • What tools are being used

  • What data, if any, is being collected


For more details on the consent process, districts and teachers should contact the challenge team at AI.Challenge@science.doe.gov.


Built-In Safeguards. By design, the challenge uses Hampshire County AI's tools (the AI Challenge Helper and AI_App_Ideator), which are designed with student privacy in mind. These tools are vetted alternatives to consumer-grade AI platforms that may have less stringent privacy standards.


Alignment with WVBE Policy 2460

WVBE Policy 2460 (Educational Purpose and Acceptable Use of Electronic Resources, Technologies, and the Internet) requires that schools ensure compliance with federal privacy laws including:

  • COPPA (Children's Online Privacy Protection Act) for children under 13

  • CIPA (Children's Internet Protection Act)

  • FERPA (Family Educational Rights and Privacy Act)


The Presidential AI Challenge's design—no PII in AI tools, consent collection, use of vetted platforms—supports districts in meeting these compliance requirements.


Section 4: Academic Integrity and Responsible AI Use


The WVDE's Academic Integrity Framework

The WVDE guidance recognizes that AI presents both a challenge and an opportunity for academic integrity. Rather than banning AI, the guidance recommends advancing academic integrity through transparent, ethical use.

The guidance states:

"The arrival of AI in education presents a unique opportunity to revisit and even reinforce the values of honesty, trust, fairness, and responsibility in academic work. AI tools have the potential to be used to educate students about the nuances of ethical research and writing, helping them understand the importance of originality and the consequences of plagiarism."

Key recommendations include:

  1. Transparency: Any use of AI must be openly acknowledged

  2. Citation: Students must reference AI tools they used, following citation style guidelines (MLA, APA, Chicago)

  3. Authentic Work: Students must give credit to sources and be honest about work that is genuinely their own

  4. Clear Expectations: Teachers must clarify when and how AI tools may be used for different assignments


The guidance also advises: "As of the release of this guide, it is advised that educators refrain from using programs that claim to detect the use of generative AI for cheating or plagiarism purposes due to concerns about their reliability."


How the Challenge Supports Academic Integrity

Citation Requirement. The Presidential AI Challenge requires students to cite all tools used in their submission. For Hampshire County AI's tools, students generate a proper citation using the citation button in the AI_App_Ideator tool.

This requirement:

  • Teaches students that using a tool is not the same as doing the thinking

  • Models professional practice (scientists, engineers, and writers always cite their methods and tools)

  • Makes the role of AI explicit and transparent

  • Demonstrates that the student directed the thinking while using a tool


Structural Prevention of Plagiarism. The challenge is designed so that AI-generated content cannot simply be passed off as a student's own thinking:

  • Students must submit a narrative that explains their problem and their solution, not a generic description

  • They must include visuals of their design, not an AI image

  • They must submit the Poe chat showing their brainstorming, which demonstrates their active participation

  • The final submission requires students to reflect on their testing and learning


A student could not simply ask an AI to write the entire project and submit it. The project requires authentic student thinking at every stage.


Teaching Academic Integrity. By requiring citation and showing students how to properly attribute tool use, the challenge teaches academic integrity in action. Students learn that using tools ethically means acknowledging them—and that doing so doesn't diminish their work; it clarifies what they contributed.


Section 5: Addressing Bias and Ensuring Equitable Use


The WVDE's Stance on Bias

The guidance warns: "AI tools trained on human data will inherently reflect societal biases in the data. Risks include reinforcing stereotypes, recommending inappropriate educational interventions, or making discriminatory evaluations."


The recommended response is: "Staff and students will be taught to understand the origin and implications of societal bias in AI; additionally, AI tools will be evaluated for the accuracy of their training data and transparency, and humans will review all AI-generated outputs before use."


How the Challenge Builds Bias Awareness


Student Evaluation of AI Suggestions. As students brainstorm with the AI chatbot, they're encouraged to ask: "Does this idea make sense? Is it fair? Could it hurt someone? Could it help everyone?"


By regularly evaluating AI suggestions, students develop the critical thinking needed to spot bias. For example:

  • If an AI suggests a solution that disproportionately affects one group, students can identify this and reject it

  • If an AI suggestion reflects an unfair assumption, students can notice and correct it

  • If an AI output includes stereotypes, students learn to recognize and question them


This practice aligns with the WVDE guidance's emphasis on teaching students to "critically analyze and discern AI-generated materials."


Teacher Oversight. Teachers are directly engaged in all AI interactions and can flag suggestions that reflect bias or stereotypes. This "human in the loop" approach helps keep biased suggestions out of the final project.


Inclusive Problem-Solving. Because students identify real problems they notice in their own communities, the solutions they design naturally consider the actual people affected. A student designing a solution for their library thinks about all the students who use that library, not a generic or stereotypical user. This grounds AI use in equity from the start.


Section 6: Age-Appropriateness and Student Access

The WVDE's Guidance on Student Age

The WVDE guidance notes:

"Specialists in the field of classroom AI have advised that students should ideally be at least 13 years old to utilize AI technology. Nevertheless, acquiring AI literacy and developing a comprehension of the technology remains crucial for learners of all ages in our increasingly digital world."

The guidance also emphasizes: "WVBE Policy 2520.14 notes that technology tools must be age-appropriate, specifically when used with the youngest learners."


How the Challenge Adapts for K-5 Students

K-2 Adaptation: Teacher-Guided Brainstorming. For students in K-2, the teacher leads the brainstorming conversation with the AI chatbot, while students direct the thinking. The teacher types prompts based on student ideas, and students hear and respond to the AI's suggestions.


Students at this age are learning about AI and how it works, not using AI directly. This meets the WVBE requirement that technology be age-appropriate while still building foundational AI literacy.


3-5 Adaptation: Supervised Student Interaction. Students in grades 3-5 may interact more directly with the AI tool, typing at the keyboard or reading responses from a shared monitor, but always with direct teacher supervision and support.


This approach aligns with the WVDE guidance: "Ensuring accessibility to those with disabilities" is a core requirement, and the challenge's design allows for adaptation.


Section 7: Implementation Considerations for Districts


Consent and Registration

The Presidential AI Challenge requires parental consent and registration before any student participates; the consent forms and process are described in Section 3.


For specific questions about consent, data use, or form requirements, teachers and administrators should contact the challenge team: AI.Challenge@science.doe.gov


Data Practices Aligned with State Requirements

The challenge's design supports compliance with state and federal data privacy laws:

  • No PII is entered into AI tools, reducing the risk of sensitive data exposure

  • Consent is collected before participation, meeting parental notification requirements

  • Students learn about data privacy by practicing it (not sharing PII)

  • Tools are vetted for security and privacy, reducing liability


Professional Development

Teachers leading the challenge benefit from understanding:

  • How the challenge teaches computational thinking (meeting WVBE 2520.14)

  • How to facilitate student thinking (keeping AI as a supporting tool, not a replacement)

  • How to help students evaluate AI suggestions for bias and accuracy

  • How to document student thinking (the Poe chat as evidence of authentic engagement)

  • How to properly cite AI tools in student work

Hampshire County AI provides resources and guidance to support teachers in these areas.


Addressing Community Questions

Families and community members may have questions about AI use in schools. Key talking points:

  • AI is a tool. It supports students' thinking; it doesn't replace human teachers or student thinking.

  • Humans are in charge. Students and teachers make all decisions about how to use AI.

  • Privacy is protected. Students don't enter personal information into AI tools, and parental consent is required.

  • This teaches real skills. Students learn computational thinking and AI literacy—skills they'll need for college and careers.

  • The work is authentic. Students identify real problems, make design decisions, and reflect on their work. The AI chat is evidence of their thinking.


Section 8: Alignment Summary

The Presidential AI Challenge aligns with West Virginia's AI guidance and computer science standards in the following ways:

Each item below pairs a WVDE/WVBE requirement with how the challenge aligns:

  • People, not AI, are in charge: Students drive all decisions; teachers provide oversight; AI is a brainstorming tool.

  • Students engage in authentic, meaningful learning: Students solve real problems they identify; AI supports brainstorming, not completion.

  • Privacy is protected: No PII is entered into AI tools; consent is collected; student names are not used in prompts.

  • Academic integrity is transparent: Tools are cited in the final submission; student thinking is documented in the Poe chat.

  • Bias is addressed: Students evaluate AI suggestions for fairness; the teacher reviews all outputs.

  • Age-appropriate use: K-2 is teacher-guided; 3-5 is supervised; all ages learn AI literacy.

  • Computational thinking is taught: The challenge naturally incorporates decomposition, pattern recognition, abstraction, algorithm development, and evaluation.

  • Computer science standards are met: The challenge aligns with WVBE Policy 2520.14 (all students receive grade-appropriate CS instruction).

  • Students are prepared for future careers: Students develop the problem-solving, critical thinking, and technology skills employers expect.

Conclusion: Why This Matters

The Presidential AI Challenge represents a thoughtful, deliberate approach to AI in education—one that aligns with West Virginia's vision. Rather than avoiding AI or treating it as magic, the challenge helps students develop a clear-eyed understanding of what AI can and cannot do, while keeping human judgment and creativity at the center.

Students emerge from the challenge with:


  • Real problem-solving experience

  • Understanding of how algorithms and systems work

  • Practice evaluating AI outputs critically

  • Knowledge of how to use tools ethically and transparently

  • Confidence in their ability to shape technology, not just use it


For administrators and district leaders, the challenge offers a structured, compliant, pedagogically sound way to integrate AI into K-5 classrooms while meeting state standards and protecting student privacy.


Resources for Administrators and District Leaders

Official Presidential AI Challenge:


West Virginia Standards and Guidance:


Hampshire County AI:


Questions? Reach out to Hampshire County AI or contact the Presidential AI Challenge team for clarification on any aspect of the program.


[Image: An elementary classroom where students and a teacher gather around a laptop, discussing and pointing at an AI chatbot interface. The students direct the conversation while the teacher supervises; the AI tool is visible but secondary, supporting the students' thinking rather than replacing it.]
