2024 Medical Technologies Congress, TIPTEKNO 2024, Muğla, Türkiye, October 10-12, 2024 (Full Paper)
This paper introduces a smart home control system that uses gesture-based interaction to improve the accessibility and usability of home automation technologies, particularly for users with physical disabilities. Traditional control mechanisms, such as touchscreens and remote controls, often fail to meet these users' needs. This research addresses the gap by developing an intuitive, non-verbal communication system that interprets human hand gestures as commands using deep learning techniques. A ceiling-mounted camera detects and tracks occupants, and a fine-tuned YOLOv8 model provides precise hand detection and gesture classification. Experimental results on HaGRID (HAnd Gesture Recognition Image Dataset), comprising 554,800 images from 37,583 subjects across varied scenes and lighting conditions, demonstrate the system's effectiveness and accuracy. The system is therefore suitable for extended smart home automation and human-computer interaction, transforming gestures into commands that control various devices and improving accessibility for individuals with disabilities.
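To illustrate the final step described above, mapping a recognized gesture to a device command, a minimal sketch is given below. The gesture class names, device actions, and confidence threshold are illustrative assumptions, not the paper's actual class set or control logic; a detector such as a fine-tuned YOLOv8 would supply the (label, confidence) pairs.

```python
# Minimal sketch: dispatch detector output (gesture label + confidence)
# to smart-home commands. Gesture names and device actions below are
# hypothetical placeholders, not the paper's actual classes.
from typing import Callable, Dict

# Hypothetical gesture-class -> command table
COMMANDS: Dict[str, Callable[[], str]] = {
    "palm": lambda: "lights_on",
    "fist": lambda: "lights_off",
    "thumb_up": lambda: "thermostat_up",
    "thumb_down": lambda: "thermostat_down",
}

def dispatch(gesture: str, confidence: float, threshold: float = 0.5) -> str:
    """Issue a command only for confident detections of known gestures."""
    if confidence < threshold or gesture not in COMMANDS:
        return "no_op"  # ignore low-confidence or unknown detections
    return COMMANDS[gesture]()

print(dispatch("palm", 0.92))  # confident known gesture -> lights_on
print(dispatch("fist", 0.31))  # below threshold -> no_op
```

Thresholding on the detector's confidence keeps accidental hand poses from triggering devices, which matters for a system aimed at users with limited motor control.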