3/8/2026
Built at HuddleHive's WIT Hackathon #5
Teachers face an impossible task: monitor 20–30 students simultaneously while delivering detailed lessons. When a student struggles, communicates a need, or becomes overwhelmed, teachers often miss it, especially if the student raises their hand while the teacher is focused on another student.
The consequences are real:
A student's distress escalates because their raised hand is missed, turning a simple need into a behavioral incident
Teachers miss critical moments where early intervention could have prevented a student from shutting down
Parents are left in the dark about how their child communicated and what support they needed during the school day
Teachers feel guilty and overwhelmed, knowing they can't be everywhere at once
This is especially harmful for students with developmental, emotional, or sensory needs — the very students who benefit most from immediate, responsive support.
Additionally, schools currently need to purchase multiple specialized communication boards to meet different student needs: one for choice and expression, another for individual needs, another for daily routines. This fragmentation can be expensive, confusing, and may leave some students without the right tool.
HeardFirst is a real-time digital communication board that ensures students are always heard and teachers are always aware — with multiple specialized boards unified in one platform.
For students: A visual, symbol-based interface where they can articulate needs into simple sentences ("I don't understand," "I'm overstimulated," "I need help"). The app is designed with large icons to minimize cognitive load, making it accessible even in moments of distress. HeardFirst also supports free-form communication - students can type or speak incomplete thoughts, broken sentences, or raw ideas that don't fit the pre-built prompts, capturing the full richness of student voice.
Modular board system: Rather than purchasing separate tools, teachers and parents can choose from multiple board types within HeardFirst, whichever suits their child best:
Choice & Expression Board: Helps students make choices and express preferences
Individual Needs Board: Communicates personal, emotional, or regulatory needs
Daily Routine Board: Manages transitions, schedules, and daily activities
Custom Boards: Teachers can create boards tailored to specific students
Teachers configure which boards each student has access to, ensuring the right tool for the right student, all in one platform.
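The per-student board configuration described above can be sketched in TypeScript. This is a minimal illustration, not the actual schema: the `BoardType` names mirror the four board types listed, and `setBoardAccess` is a hypothetical helper.

```typescript
// Hypothetical model of per-student board access. Board type names
// follow the four board categories described above; field names are illustrative.
type BoardType = "choice-expression" | "individual-needs" | "daily-routine" | "custom";

interface BoardConfig {
  studentId: string;
  enabledBoards: BoardType[];
}

// Teachers toggle boards on or off for each student.
function setBoardAccess(config: BoardConfig, board: BoardType, enabled: boolean): BoardConfig {
  const boards = new Set(config.enabledBoards);
  if (enabled) boards.add(board);
  else boards.delete(board);
  return { ...config, enabledBoards: [...boards] };
}
```

Keeping the configuration immutable (returning a new object) makes it easy to sync board changes to the backend and to the student's device without partial updates.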
For teachers: Instant, real-time alerts when a student needs support, no missed signals. HeardFirst first guides students through self-regulation strategies (breathing techniques, sensory breaks, grounding exercises) before escalating to the teacher. This reduces teacher workload and empowers students to become more independent in managing their own emotional regulation.
For parents: Daily communication summaries and full transcripts with timestamps showing exactly how their child communicated and what support they needed. This transforms school-home communication, giving parents visibility into their child's learning experience and creating a shared understanding of their emotional needs. It also reduces teacher workload by removing the need to update every family individually.
HeardFirst is built as a full-stack TypeScript application with a modern, scalable architecture:
A full-stack AAC (Augmentative & Alternative Communication) platform where non-verbal students communicate via symbol boards or free-form input, teachers get real-time alerts, and parents get daily summaries — all enhanced by AI.
Stack at a Glance
| Layer | Technology |
| --- | --- |
| Frontend | React 19 + TypeScript + Vite |
| Styling | Tailwind CSS v4 |
| Animations | Framer Motion |
| Charts | Recharts |
| Backend | Node.js + Express + tsx |
| Database | SQLite + Better-SQLite3 |
| Auth | Firebase Authentication |
| AI | Google Gemini API |
| Real-time | WebSockets |
| Deployment | Vercel (frontend) |
User Flow
Student opens app
→ Board loads from backend (REST API)
→ Student taps symbol OR types free-form input
→ Backend logs it to SQLite
→ Gemini analyzes in parallel
→ Self-regulation suggestion shown to student
→ Student self-regulates OR requests help
→ WebSocket fires alert to teacher in real-time
→ All interactions logged → Parent dashboard updated
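The flow above can be sketched as a single handler with its dependencies injected. Everything here is a hypothetical stand-in: `logToDb`, `analyzeWithGemini`, and `alertTeacher` represent the SQLite, Gemini, and WebSocket services, and the analysis shape is illustrative.

```typescript
// Hypothetical orchestration of the student-input flow. The injected
// dependencies stand in for the real SQLite, Gemini, and WebSocket layers.
interface Deps {
  logToDb: (studentId: string, input: string) => void;
  analyzeWithGemini: (input: string) => Promise<{ suggestion: string; urgent: boolean }>;
  alertTeacher: (studentId: string, context: string) => void;
}

async function handleStudentInput(deps: Deps, studentId: string, input: string): Promise<string> {
  deps.logToDb(studentId, input);                       // log first, so nothing is lost
  const analysis = await deps.analyzeWithGemini(input); // AI analysis of the raw input
  if (analysis.urgent) {
    deps.alertTeacher(studentId, input);                // real-time escalation over WebSocket
  }
  return analysis.suggestion;                           // self-regulation prompt shown to the student
}
```

Injecting the dependencies keeps the orchestration testable without a running database or live API keys.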
The 3 Interfaces:
Student View
Large symbol/icon boards (configured per student)
Free-form text input for more expressive students
Framer Motion animations keep it engaging but not overwhelming
Gemini-generated self-regulation prompts shown inline (e.g. breathing exercises)
Teacher View
Live WebSocket feed — instant alert when a student communicates distress or needs help
Full context sent with each alert (what was communicated, Gemini's interpretation, confidence score)
Recharts dashboards showing communication patterns over time
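The alert context listed above (what was communicated, Gemini's interpretation, confidence score) might arrive at the teacher view as a typed WebSocket payload. The field names here are assumptions for illustration, not the actual message schema.

```typescript
// Hypothetical shape of the alert payload pushed to the teacher view.
interface TeacherAlert {
  studentId: string;
  message: string;        // what the student communicated
  interpretation: string; // Gemini's reading of it
  confidence: number;     // confidence score (0..1) shown to the teacher
  timestamp: string;
}

// Parse an incoming WebSocket frame, rejecting malformed alerts
// rather than letting them crash the live feed.
function parseAlert(raw: string): TeacherAlert | null {
  try {
    const data = JSON.parse(raw);
    if (typeof data.studentId !== "string" || typeof data.confidence !== "number") return null;
    return data as TeacherAlert;
  } catch {
    return null;
  }
}
```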
Parent View
Daily summaries of what their child communicated
Trend insights generated by Gemini (e.g. "Struggled with transitions 4x this week")
Read-only — no ability to intervene directly
Challenge 1: AI Hallucinations in a High-Stakes Environment
Working with vulnerable students, we can't afford AI-generated suggestions that are inaccurate or harmful. Early testing showed Gemini could generate plausible but unfounded self-regulation tactics.
Solution: We implemented a human-in-the-loop validation system. Every AI-generated suggestion is cross-referenced against a curated library of evidence-based interventions (backed by child psychology research). Only suggestions that match validated techniques are presented to students. Teachers can also see the confidence score and reasoning behind each suggestion. This reduces AI's speed advantage but ensures safety.
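The cross-referencing step can be sketched as a simple matcher. The intervention library entries and keyword-matching approach below are illustrative assumptions; the real curated library would be far richer.

```typescript
// Hypothetical human-in-the-loop filter: an AI-generated suggestion is only
// surfaced if it matches a curated, evidence-based intervention.
interface Intervention {
  id: string;
  keywords: string[]; // phrases that identify this validated technique
}

// Illustrative entries; the real library is curated from child psychology research.
const VALIDATED: Intervention[] = [
  { id: "box-breathing", keywords: ["breathing", "breath"] },
  { id: "sensory-break", keywords: ["sensory", "quiet space"] },
];

// Returns the matched intervention id, or null if the suggestion is unvalidated
// and must therefore be withheld from the student.
function validateSuggestion(suggestion: string): string | null {
  const text = suggestion.toLowerCase();
  const match = VALIDATED.find((i) => i.keywords.some((k) => text.includes(k)));
  return match ? match.id : null;
}
```

Anything Gemini proposes that falls outside the library is dropped, trading some of the AI's flexibility for safety.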
Challenge 2: Capturing Authentic Student Voice Without Overwhelming the Interface
We wanted to support free-form communication (incomplete sentences, broken thoughts), but we also needed to keep the interface simple for students in distress. Too many options = cognitive overload.
Solution: We created a progressive disclosure design. Students first see the structured boards. After they've communicated, a subtle "Tell me more" button appears, inviting free-form input if they want it. This captures the raw voice without forcing it, and the data flows into the analysis pipeline for patterns.
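The progressive disclosure behavior reduces to a small UI state machine: free-form input is only offered after a structured communication. State and event names below are illustrative, not the actual component code.

```typescript
// Hypothetical state machine for progressive disclosure in the student view.
type UiState = "boards-only" | "tell-me-more-visible" | "free-form-open";
type UiEvent = "symbol-tapped" | "tell-me-more-clicked";

// The "Tell me more" button only appears after the student has communicated
// via the structured board; free-form input is never the first thing shown.
function nextState(state: UiState, event: UiEvent): UiState {
  if (state === "boards-only" && event === "symbol-tapped") return "tell-me-more-visible";
  if (state === "tell-me-more-visible" && event === "tell-me-more-clicked") return "free-form-open";
  return state;
}
```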
Challenge 3: Real-Time Notifications Without Notification Fatigue
Teachers would get overwhelmed if every student input triggered an alert. We needed to distinguish between "student is using self-regulation" and "teacher needs to step in now."
Solution: We built smart escalation logic. Only urgent flags (escalations, repeated failed self-regulation attempts, or keywords indicating crisis) trigger immediate notifications. Routine communications are batched into a teacher dashboard summary. Teachers can customize escalation rules per student.
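The escalation decision can be sketched as a predicate over recent student activity. The threshold, keyword list, and field names are illustrative assumptions; in the product, teachers customize these rules per student.

```typescript
// Hypothetical escalation rules: only urgent signals interrupt the teacher;
// routine communications are batched into the dashboard summary instead.
interface StudentState {
  failedSelfRegAttempts: number; // self-regulation attempts that didn't calm things down
  lastMessages: string[];        // recent raw communications
}

// Illustrative crisis keywords; the real list would be curated and per-student.
const CRISIS_KEYWORDS = ["hurt", "scared", "can't breathe"];

function shouldEscalate(state: StudentState, maxFailedAttempts = 2): boolean {
  if (state.failedSelfRegAttempts >= maxFailedAttempts) return true;
  return state.lastMessages.some((m) =>
    CRISIS_KEYWORDS.some((k) => m.toLowerCase().includes(k))
  );
}
```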
Challenge 4: Privacy & Data Governance in Education
Student communication data is deeply sensitive. We needed robust governance, especially around AI use and data retention.
Solution: We established clear data policies:
All student data is encrypted at rest and in transit
Parents have full visibility into what the app collects and how Gemini is used
Students are never profiled without awareness
Quarterly audits of AI recommendations to catch bias or drift
Data retention policies aligned with GDPR; parents can request full deletion
Technical Accomplishments:
Designed a real-time architecture that handles multiple concurrent users with sub-second notification latency
Integrated Gemini API with rigorous safeguards to prevent harmful outputs in a child-facing application
Built a modular board system that's flexible enough to meet diverse student needs while remaining simple enough for students in distress to use
Created a comprehensive audit logging system for compliance and accountability
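One way an audit log can support accountability is by hash-chaining entries so tampering with history is detectable. This is a minimal sketch of that idea, not the actual implementation; the entry fields and "genesis" anchor are assumptions.

```typescript
import { createHash } from "node:crypto";

// Hypothetical append-only audit log: each entry's hash covers the previous
// entry's hash, so altering any past entry breaks the chain.
interface AuditEntry {
  timestamp: string;
  action: string;   // e.g. "suggestion-shown", "alert-sent"
  prevHash: string;
  hash: string;
}

function appendEntry(log: AuditEntry[], action: string): AuditEntry[] {
  const prevHash = log.length ? log[log.length - 1].hash : "genesis";
  const timestamp = new Date().toISOString();
  const hash = createHash("sha256").update(prevHash + timestamp + action).digest("hex");
  return [...log, { timestamp, action, prevHash, hash }];
}

// Verify the chain: every entry must reference the hash of the entry before it.
function verifyChain(log: AuditEntry[]): boolean {
  return log.every((e, i) => e.prevHash === (i === 0 ? "genesis" : log[i - 1].hash));
}
```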
Pedagogical Insights:
Self-regulation as a feature, not a side effect: Early conversations with teachers revealed that many students just need a moment and a tool to calm down; they don't always need adult intervention. Baking in self-regulation coaching changes the classroom dynamic.
Incomplete sentences are data: Teachers were surprised by how much they could learn from broken, unfiltered student communication. A student typing "can't...too much...head" tells you more than a structured "I need a break."
Parents want transparency: Schools often don't share communication logs with parents. When we showed sample transcripts to parents, they were grateful and asked for more detail. This simple feature can transform home-school relationships.
Modular boards matter: Different students have wildly different communication needs. One solution doesn't fit all, but one platform can.
User Research Learnings:
Tested low-fidelity prototypes with 3 special education classrooms
Gathered feedback from teachers on escalation logic, notification frequency, and board design
Observed that students with AAC (Augmentative and Alternative Communication) needs responded best to the visual board interface, while older students preferred free-form input
Immediate (Next 1-2 months):
Beta testing in 5+ real classrooms to validate the self-regulation coaching module and modular board system with diverse learners
Refine AI safeguarding: Work with special education experts and privacy advocates to audit Gemini outputs and build an external advisory board
Parent dashboard MVP: Deliver basic communication summaries and transcript access to parents
Short-term (2-4 months):
AI-powered insights: Deploy Gemini-based pattern analysis to identify emerging student needs, learning gaps, and emotional trends from free-form communication
Personalized coaching expansion: Train student-specific ML models to provide increasingly personalized self-regulation suggestions (with human oversight)
Teacher analytics: Build dashboards showing which students struggle most, which escalation types are most common, which self-regulation tactics work best
Integration roadmap: Begin conversations with school behavior tracking systems for unified student view
Long-term (6+ months):
Expand accessibility: Voice input improvements, symbol libraries in multiple languages, integration with AAC devices
School-district licensing: Move from classroom pilot to district-wide rollout with administrator dashboards
Research & publication: Document learnings from real classrooms and publish findings to support adoption
Community: Build a teacher community around board templates and self-regulation tactics sharing
How can you improve it?
Deepen AI governance: Build an external ethics board to audit all AI recommendations quarterly
Expand evidence base: Partner with academic researchers to validate which self-regulation tactics work best for which students
Accessibility depth: Continue iterating with students who have complex communication needs
Parent engagement: Move beyond visibility to active participation in setting student communication goals
The core belief: Every student deserves to be heard, not just their structured needs, but their authentic voice. Technology should capture that, learn from it, and help teachers respond - safely, transparently, and ethically.