NeuronFRAMES is an AI-assisted rehabilitation platform that combines computer vision, sensor-based input, and gamified therapy to make recovery more accessible, engaging, and measurable.

Uses MediaPipe to track body landmarks through a standard webcam, enabling movement analysis without specialized equipment.
Interactive browser-based games transform repetitive exercises into engaging activities that increase patient motivation and adherence.
Arduino-powered sensors such as force-sensitive resistors connect physical rehabilitation tools to the digital therapy platform.
NeuronFRAMES is a research and innovation project developing a web-based AI-assisted rehabilitation system. It combines computer vision pose detection, sensor-based input devices, gamified therapy exercises, speech therapy tools, and progress tracking into a single accessible platform.
Globally, millions of patients cannot access consistent rehabilitation therapy due to cost, geographic distance, or limited clinical resources. NeuronFRAMES aims to address these challenges by delivering therapy tools through a standard web browser, requiring only a webcam and internet connection.
Built with HTML, CSS, JavaScript, MediaPipe, and Arduino sensors, the platform is designed to be affordable and deployable in diverse settings, from clinical facilities to patients' homes.
Runs in any modern browser; no software installation or expensive hardware is required.
Data-driven session records help clinicians and patients monitor improvement over time.
Game mechanics keep patients engaged and encourage consistent practice between sessions.
Open web technologies and affordable sensors make the system accessible in resource-limited settings.
Each mode addresses a specific rehabilitation need, from physical movement to speech recovery, using accessible web-based technology.
A pose-detection rehabilitation mode that tracks patient movement using camera-based body landmark detection. It provides visual guidance through pose overlays, automatically counts repetitions, and evaluates movement accuracy, giving patients real-time feedback during exercise sessions.
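To make the idea concrete, here is a minimal sketch of how repetitions could be counted from landmark positions. The joint names, angle thresholds, and two-phase state machine are illustrative assumptions, not the project's actual implementation; in practice the landmark coordinates would come from MediaPipe's per-frame results.

```javascript
// Angle at joint B (in degrees) formed by landmarks A-B-C,
// where each landmark is {x, y} in normalized image coordinates.
function jointAngle(a, b, c) {
  const ab = Math.atan2(a.y - b.y, a.x - b.x);
  const cb = Math.atan2(c.y - b.y, c.x - b.x);
  const deg = Math.abs((ab - cb) * 180 / Math.PI);
  return deg > 180 ? 360 - deg : deg;
}

// Simple two-phase state machine: one repetition is counted when the
// joint bends below `flexed` degrees and then extends back above
// `extended` degrees. Thresholds here are assumed example values.
class RepCounter {
  constructor(flexed = 70, extended = 150) {
    this.flexed = flexed;
    this.extended = extended;
    this.phase = 'extended';
    this.reps = 0;
  }
  update(angleDeg) {
    if (this.phase === 'extended' && angleDeg < this.flexed) {
      this.phase = 'flexed';
    } else if (this.phase === 'flexed' && angleDeg > this.extended) {
      this.phase = 'extended';
      this.reps += 1;
    }
    return this.reps;
  }
}
```

The hysteresis between the two thresholds prevents small tracking jitter near a single cutoff from being miscounted as extra repetitions.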
A game-based therapy mode where patients perform rehabilitation movements through interactive browser games, including reaction tasks, card matching, and motion-based challenges. Designed to increase motivation and adherence by making repetitive exercises feel engaging and rewarding.
A sensor-integrated rehabilitation system that connects hardware devices, such as grip sensors built from force-sensitive resistors and Arduino microcontrollers, to interactive therapy games. Patients interact with therapy exercises through physical input devices, bridging the gap between tangible rehabilitation tools and digital feedback.
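One way the browser side of such a pipeline could interpret sensor data is sketched below. It assumes the Arduino streams one 10-bit ADC reading (0 to 1023) per line, which is typical of simple Arduino sketches but not confirmed from the project itself; the transport (for example, the Web Serial API) and the action thresholds are likewise assumptions.

```javascript
// Convert a raw FSR reading (one line of serial text) into a
// normalized grip strength in [0, 1]. Assumes a 10-bit ADC.
function normalizeGrip(raw, adcMax = 1023) {
  const n = Number(raw);
  if (!Number.isFinite(n)) return null; // ignore malformed lines
  return Math.min(Math.max(n / adcMax, 0), 1);
}

// Map grip strength onto discrete game actions via example thresholds.
function gripToAction(strength, light = 0.2, firm = 0.6) {
  if (strength === null || strength < light) return 'rest';
  return strength < firm ? 'light-squeeze' : 'firm-squeeze';
}
```

Keeping the parsing and thresholding as small pure functions makes them easy to unit test independently of any physical hardware.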
A speech rehabilitation module that uses browser-based speech recognition to support pronunciation practice, vocabulary exercises, and communication training. Patients receive immediate feedback on their speech accuracy through the Web Speech API, making speech therapy more accessible outside clinical settings.
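As an illustration of the feedback step, the sketch below scores a recognized transcript against a target word using edit-distance similarity. This is a hypothetical scoring approach, not necessarily the module's own; in the browser, the `heard` string would come from a Web Speech API recognition result.

```javascript
// Levenshtein edit distance between two strings.
function editDistance(a, b) {
  const d = Array.from({ length: a.length + 1 }, (_, i) =>
    Array.from({ length: b.length + 1 }, (_, j) =>
      i === 0 ? j : j === 0 ? i : 0)
  );
  for (let i = 1; i <= a.length; i++) {
    for (let j = 1; j <= b.length; j++) {
      const sub = a[i - 1] === b[j - 1] ? 0 : 1;
      d[i][j] = Math.min(d[i - 1][j] + 1, d[i][j - 1] + 1, d[i - 1][j - 1] + sub);
    }
  }
  return d[a.length][b.length];
}

// Similarity score in [0, 1]: 1 means an exact match between the
// target word and what the recognizer heard.
function pronunciationScore(target, heard) {
  const t = target.toLowerCase().trim();
  const h = heard.toLowerCase().trim();
  if (t.length === 0 && h.length === 0) return 1;
  return 1 - editDistance(t, h) / Math.max(t.length, h.length);
}
```

A score like this can drive immediate visual feedback, for example a progress bar or a pass/retry prompt after each attempt.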
NeuronFRAMES is developed through academic research, innovation competitions, and collaboration with rehabilitation professionals who inform every design decision.
The project is grounded in research on computer vision for movement analysis, gamification in healthcare, and accessible rehabilitation technology. Each module is informed by existing evidence on effective therapy delivery.
Physiotherapists, occupational therapists, and speech-language professionals provide ongoing guidance to ensure the system's exercises, metrics, and interfaces align with real clinical workflows and patient needs.
NeuronFRAMES has been developed through iterative prototyping cycles, incorporating user testing, hardware integration experiments, and software architecture improvements at each stage.
NeuronFRAMES continues to evolve through research activities, innovation competitions, and technical development milestones.
EPT, GPT, ITS, and SIT modules fully prototyped with working web-based interfaces and real-time feedback systems.
Successfully integrated camera-based body landmark detection for real-time movement tracking and accuracy evaluation.
Designed and built custom sensor input devices using force-sensitive resistors and Arduino microcontrollers for the ITS module.
Received the People's Choice Award at the 20th ACM/IEEE International Conference on Human-Robot Interaction.
Developed multiple interactive therapy games including reaction tasks, card matching, and motion-based challenges.
Implemented browser-based speech recognition for pronunciation practice and vocabulary training using the Web Speech API.
Chacharin Lertyosbordin, Maythus Tangprapa, Nuntipat Jiwasurat. Presented at ACM/IEEE HRI 2025 and awarded the People's Choice Award.
Chacharin Lertyosbordin, Maythus Tangprapa, Nuntipat Jiwasurat. IEEE Xplore listing of the HRI 2025 conference paper.
Nuntipat Jiwasurat, Maythus Tangprapa, Filippo Sanfilippo. Presents a gamified rehabilitation system combining mirror therapy with FSR-based input for stroke recovery.
Whether you are a rehabilitation professional, researcher, educator, or student interested in accessible therapy technology, we welcome your questions, feedback, and collaboration ideas.