3/8/2026
Built At HuddleHive's WIT Hackathon #5
What is the problem you are trying to solve? Who does it affect?
We are tackling the problem of patients getting lost in the referral system and reception teams being overwhelmed, which delays care for everyone.
Our goal is to reduce referral black holes and reception overload by using an AI assistant that helps patients before they reach the hospital. This assistant guides them through key questions and generates a clear, structured report of their symptoms and history. Hospital staff can then quickly review, prioritise, and sort patient information in a much clearer and simpler way.
This benefits two main groups:
Healthcare staff, who gain time, clarity, and better‑organised data.
Patients, who experience a smoother journey and faster access to the care they deserve, instead of facing a long, confusing process.
What is your idea? How does it fix the problem?
Our solution, ClearCare, has three core components:
The first feature is a personalised AI medical assistant. Patients answer simple, guided questions and the assistant generates a structured, clinically relevant report that can be shared with GPs or reception staff before the appointment. This means clearer information, less back‑and‑forth, and fewer opportunities for referrals to fall through the cracks.
The second feature is a motion‑sensor pain mapper. Instead of struggling to explain “it’s sort of here, but sometimes it moves there,” patients can point to the exact area on their body. The app translates those gestures into a precise visual and text description of pain and movement, especially helping children, neurodivergent patients, and anyone who finds verbal descriptions difficult.
The third feature is a live translation layer. Patients select their preferred language, and the app provides real‑time translation support for key questions, answers, and instructions. This reduces misdiagnosis risk and improves safety in multilingual settings, where language barriers are known to harm outcomes.
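One way to sketch the language-selection layer is a prompt catalogue with an English fallback. This is only an illustration: the real app would call a translation service rather than ship static strings, and the `PROMPTS` keys and languages here are assumptions.

```python
# Illustrative prompt catalogue; a real deployment would call a
# translation service instead of shipping static strings.
PROMPTS = {
    "en": {"where_pain": "Where is the pain?"},
    "es": {"where_pain": "¿Dónde está el dolor?"},
    "fr": {"where_pain": "Où est la douleur ?"},
}

def prompt(key: str, language: str, fallback: str = "en") -> str:
    """Look up a question in the patient's preferred language,
    falling back to English when no translation exists."""
    catalogue = PROMPTS.get(language, PROMPTS[fallback])
    return catalogue.get(key, PROMPTS[fallback][key])
```

The fallback matters for safety: an untranslated question should still appear in a known language rather than disappear.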
How do all the pieces fit together? Does your frontend make requests to your backend? Where does your database fit in?
Our system is built as a mobile and web application (frontend) connected to a secure server (backend) and a clinical database.
The frontend handles the user experience: patients interact with the AI assistant, select their language, and use the motion‑sensor interface to indicate pain.
The backend receives these inputs via API requests, runs the AI models to generate structured reports and translations, and processes motion‑sensor data into clinically useful descriptions.
A secure database stores anonymised or consented patient records, symptom reports, and language preferences, so that staff can access and review them before or during appointments.
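As a sketch of how that record store could look, here is a minimal SQLite table. The table and column names are assumptions, not the team's actual schema, and a production system would add encryption, consent tracking, and access control.

```python
import sqlite3

def init_db(path: str = ":memory:") -> sqlite3.Connection:
    """Create the (hypothetical) symptom-report table if it is missing."""
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS symptom_reports (
               id INTEGER PRIMARY KEY,
               patient_ref TEXT,   -- anonymised reference, never a name
               language TEXT,      -- preferred language for translation
               report_json TEXT,   -- structured AI-generated report
               created_at TEXT DEFAULT CURRENT_TIMESTAMP
           )"""
    )
    return conn

def save_report(conn: sqlite3.Connection,
                patient_ref: str, language: str, report_json: str) -> None:
    """Insert one report row for staff to review later."""
    conn.execute(
        "INSERT INTO symptom_reports (patient_ref, language, report_json) "
        "VALUES (?, ?, ?)",
        (patient_ref, language, report_json),
    )
    conn.commit()
```

Storing the report as JSON keeps the prototype flexible while the clinical question set is still changing.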
Hardware Integration: We integrated the plyer library to access the smartphone's native accelerometer, translating physical movement in the device's coordinate system into actionable symptom data.
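A minimal sketch of that sensor path, assuming plyer's `accelerometer` facade; the axis-to-gesture mapping and threshold below are illustrative stand-ins, since the team's real calibration is not described. The import is guarded so the mapping logic runs even without a device.

```python
import math

# plyer only works on a device; guard the import so the pure mapping
# function below still runs elsewhere.
try:
    from plyer import accelerometer
except ImportError:
    accelerometer = None

def classify_motion(x: float, y: float, z: float,
                    threshold: float = 2.0) -> str:
    """Map raw accelerometer axes to a coarse gesture label.

    The labels and threshold are assumptions for illustration only.
    """
    magnitude = math.sqrt(x * x + y * y + z * z)
    if magnitude < threshold:
        return "still"
    # The dominant axis decides the reported direction of movement.
    axis = max(("x", abs(x)), ("y", abs(y)), ("z", abs(z)),
               key=lambda pair: pair[1])[0]
    return {"x": "sideways", "y": "up-down", "z": "toward-away"}[axis]

def read_gesture():
    """Poll the device sensor once, if one is available."""
    if accelerometer is None:
        return None
    x, y, z = accelerometer.acceleration
    if x is None:  # plyer returns (None, None, None) before the first sample
        return None
    return classify_motion(x, y, z)
```

On a device, `accelerometer.enable()` would be called once at startup before polling.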
Backend Communication: To bridge this mobile frontend with the GP's system, we implemented lightweight, real-time backend communication using Python's urllib and json modules.
Data Pipeline: The moment a physical symptom is registered, the app fires an HTTP POST request to a webhook, delivering a structured, real-time report directly to the GP Dashboard.
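The POST step can be sketched with the standard-library `urllib` and `json` modules named above. The webhook URL and the report fields are placeholders, not the team's actual schema.

```python
import json
import urllib.request

# Placeholder endpoint; the real webhook URL is configured per deployment.
WEBHOOK_URL = "https://example.com/gp-dashboard/webhook"

def build_report(patient_ref: str, symptom: str, gesture: str) -> dict:
    """Assemble the structured symptom report for the dashboard.

    Field names are illustrative assumptions.
    """
    return {
        "patient_ref": patient_ref,
        "symptom": symptom,
        "gesture": gesture,
    }

def post_report(report: dict, url: str = WEBHOOK_URL):
    """Fire the HTTP POST carrying the JSON-encoded report."""
    data = json.dumps(report).encode("utf-8")
    request = urllib.request.Request(
        url,
        data=data,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # Network errors are deliberately left to the caller in this sketch.
    return urllib.request.urlopen(request)
```

Using only the standard library kept the prototype dependency-free, at the cost of the security and retry handling a production client would need.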
Hospital staff access a dashboard that displays these reports in a clear, prioritised format, integrating into existing workflows with minimal disruption.
What did you struggle with? How did you overcome it?
-Challenge: As engineers just starting our tech journey, we faced multiple technical difficulties, including unfamiliarity with certain programming languages and a limited understanding of key concepts such as backend–frontend integration, database design, and working with AI models.
-Solution: We asked for help from more experienced peers and mentors whenever we hit a wall, which saved time and prevented us from getting stuck for too long.
-Challenge: The app crashed due to strict Android hardware permission policies when we tried to run the live camera and motion sensors simultaneously.
-Solution: Instead of forcing unstable hardware connections, we redesigned the UX around a branching path: "visible symptoms" (camera) vs. "internal pain" (motion). We implemented a safe mock-up viewfinder for the camera to ensure 100% app stability for the demo, whilst maintaining live, real-time data processing for the motion sensors.
What did you learn? What did you accomplish?
Building ClearCare taught us that the real challenge in healthcare isn't technology; it's understanding what both sides of the consultation actually need. We spent time thinking deeply about two very different people: an anxious patient, possibly unable to communicate in English, and a GP with 10 minutes and no context. That insight shaped everything we built, from the multilingual voice interface for patients to the structured summary for GPs. The accomplishment we're most proud of isn't the code, it's correctly identifying that the gap between patient and GP isn't just language, it's communication itself.
Moreover, we deepened our technical skills in mobile development with Python and real-time data transmission via webhooks. More importantly, we learned the value of agile problem-solving: prioritising the user experience over forcing a fragile technical approach.
What are the next steps for your project? How can you improve it?
User testing with patients and staff to gather feedback on usability, clarity, and perceived value, then iterating on the interface and question flow.
Clinical input and validation from healthcare professionals to refine the question sets, report structure, and risk‑flagging so that outputs align with real clinical needs.
Technical improvements, including optimising the AI models, enhancing motion‑sensor accuracy, replacing the current webhook prototype with a secure, encrypted database, and strengthening the translation engine for medical terminology.
Integration planning with existing hospital and GP systems, starting with simple export formats and, in the future, exploring interoperability standards (e.g. FHIR) for smoother adoption.
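As an illustration of the "simple export formats" idea above, a patient-reported symptom could be mapped to a minimal FHIR R4 Observation. Only a hedged subset of fields is shown, and the free-text `code` would eventually need proper clinical coding (e.g. SNOMED CT) for real interoperability.

```python
def to_fhir_observation(patient_ref: str, symptom_text: str) -> dict:
    """Sketch of exporting one symptom report as a FHIR R4 Observation.

    Minimal fields only; a production export would add coded concepts,
    timestamps, and provenance.
    """
    return {
        "resourceType": "Observation",
        "status": "preliminary",          # not yet clinician-verified
        "code": {"text": "Patient-reported symptom"},
        "subject": {"reference": f"Patient/{patient_ref}"},
        "valueString": symptom_text,
    }
```

Starting with a plain JSON export like this lets a GP system ingest reports long before full FHIR conformance is reached.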
In the longer term, we aim to pilot the tool in a real clinical setting, gather outcome data on referral times and staff workload, and scale the solution to more clinics and hospitals.