The Facial-Orientation and Concentration Understanding System (FOCUS) addresses the growing challenge of maintaining attention among young students in learning environments, particularly those with ADHD or other focus-related difficulties. Many students struggle to sustain concentration during independent work, and few accessible, real-time tools exist that provide immediate, personalized feedback on attention levels. This project develops a webcam-based application that analyzes facial orientation, gaze direction, and behavioral indicators to detect inattention and deliver timely reminders. By supporting students, educators, and caregivers with actionable insights, FOCUS aims to improve engagement, learning outcomes, and overall academic performance.
The system is implemented as a real-time desktop application in Python, with computer vision powered by MediaPipe and OpenCV. Facial landmarks are extracted from webcam input to estimate head pose via the Perspective-n-Point (PnP) algorithm, while additional metrics such as eye gaze, eye aspect ratio (EAR), and mouth aspect ratio (MAR) are computed to detect signs of distraction, fatigue, and yawning. These per-frame measurements are aggregated over time into higher-level indicators such as PERCLOS (the percentage of time the eyes are closed) and yawn frequency, which reflect the user's level of alertness. A rule-based algorithm then combines these features to classify the user's attentional state and provide immediate feedback through an intuitive user interface built with PySide6.
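To make the metric pipeline concrete, the sketch below shows one common way to compute EAR from six eye landmarks and to maintain a rolling PERCLOS estimate over recent frames. The landmark ordering, the EAR threshold of 0.21, and the 90-frame window are illustrative assumptions for this sketch, not the project's actual parameters.

```python
from collections import deque


def eye_aspect_ratio(eye):
    """EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|) over six (x, y) eye landmarks.

    p1/p4 are the horizontal eye corners; the other pairs span the eyelids,
    so the ratio drops toward zero as the eye closes.
    """
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    p1, p2, p3, p4, p5, p6 = eye
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))


class PerclosTracker:
    """PERCLOS: fraction of recent frames in which the eyes were closed."""

    def __init__(self, window=90, ear_threshold=0.21):
        self.ear_threshold = ear_threshold
        self.frames = deque(maxlen=window)  # 1 = eyes closed, 0 = eyes open

    def update(self, ear):
        """Record one frame's EAR and return the current PERCLOS value."""
        self.frames.append(1 if ear < self.ear_threshold else 0)
        return sum(self.frames) / len(self.frames)
```

In a real loop, `eye_aspect_ratio` would be fed landmark coordinates from the MediaPipe face mesh each frame, and the tracker's output would be one input to the rule-based classifier described above.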
The project resulted in a fully functional desktop application that monitors and evaluates user attention in real time using webcam-based facial analysis. The system integrates multiple attention-detection algorithms with an interactive GUI that supports two study modes, Free Study and FOCUS Mode, allowing users to customize their study experience while receiving live feedback and notifications. Key deliverables include the real-time attention classification system, a user-friendly interface for session control, and a post-session analytics dashboard that summarizes attentiveness metrics. Overall, the project demonstrates an end-to-end solution that combines computer vision, user interaction, and behavioral analysis to support improved study habits.
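A minimal sketch of how a rule-based classifier of the kind described above might combine the aggregated metrics into an attention label is shown below. The specific thresholds, rule ordering, and state names are illustrative assumptions, not the project's actual rules.

```python
def classify_attention(perclos, yawns_per_min, head_yaw_deg):
    """Map one window's metrics to a coarse attention label.

    perclos        -- fraction of recent frames with eyes closed (0..1)
    yawns_per_min  -- yawn frequency from MAR events
    head_yaw_deg   -- head rotation from the PnP pose estimate, in degrees
    """
    if perclos > 0.4 or yawns_per_min >= 3:
        return "drowsy"        # eyes mostly closed or frequent yawning
    if abs(head_yaw_deg) > 30:
        return "distracted"    # head turned well away from the screen
    return "attentive"
```

Ordering the rules so fatigue checks fire before head-pose checks means a drowsy user who has also slumped sideways is still reported as drowsy, which is the more actionable label for a reminder system.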