International Journal of Control, Automation, and Systems 2025; 23(2): 459-466
Published online February 1, 2025  https://doi.org/10.1007/s12555-024-0555-3
© The International Journal of Control, Automation, and Systems
Yundong Kim, Jirou Feng, Taeyeon Kim, Gibeom Park, Kyungmin Lee, and Seulki Kyeong*
Hannam University
This study aims to enhance assistive technologies by predicting head movement intentions in real time using surface electromyography (sEMG) signals and machine learning algorithms. The primary motivation is to improve the responsiveness and accuracy of gaze-tracking systems for individuals with physical disabilities. Six healthy adult males participated in the experiments; their head and neck muscle activities were recorded using a high-speed optoelectronic motion capture system (Vicon Vero/Nexus) and wireless sEMG sensors (Delsys Trigno). Reflective markers were positioned on the subjects' heads and shoulders, and miniature sEMG sensors were placed around the eyes and on the neck muscles. During the experimental procedure, participants sat 1.5 m from a visual guide, performed head movements in four directions, and held each position for three seconds. Four EMG feature sets were created for analysis, combining signals from different muscles and time intervals. Several machine learning models, including kernel naïve Bayes, Gaussian naïve Bayes, bagged trees, and subspace k-nearest neighbors (KNN), were applied to predict head movement states. The subspace KNN model applied to EMG set 3 achieved the highest classification accuracy of 78.0%. The study demonstrates the potential of sEMG combined with advanced computational techniques to substantially improve the real-time prediction of head movement intentions, with valuable applications in human-computer interaction, virtual reality, and assistive technologies.
Keywords: Head movement, intent prediction, neural network, surface electromyography.
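The paper does not publish source code, so the following is a minimal illustrative sketch, not the authors' actual pipeline, of the classification stage described in the abstract: time-domain features are extracted from windowed sEMG channels and fed to a random-subspace KNN ensemble (here built with scikit-learn's BaggingClassifier). The channel count, window length, synthetic data, and all parameter values are assumptions for demonstration only.

import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

def window_rms(emg, win=200):
    """RMS of each non-overlapping window of `win` samples, per channel."""
    n = emg.shape[0] // win
    segments = emg[: n * win].reshape(n, win, emg.shape[1])
    return np.sqrt((segments ** 2).mean(axis=1))

# Hypothetical raw sEMG: 8 channels (eye and neck muscles), synthetic noise
# standing in for real recordings.
raw = rng.normal(size=(120_000, 8))
X = window_rms(raw, win=200)             # one feature row per window
y = rng.integers(0, 4, size=X.shape[0])  # 4 head-movement directions (dummy labels)

# Random-subspace KNN: each KNN learner is trained on a random subset of the
# feature columns; predictions are aggregated by majority vote.
subspace_knn = make_pipeline(
    StandardScaler(),
    BaggingClassifier(
        estimator=KNeighborsClassifier(n_neighbors=5),
        n_estimators=30,
        max_features=0.5,   # each learner sees half of the features
        bootstrap=False,    # keep all windows; randomize features only
        random_state=0,
    ),
)

print(f"mean CV accuracy: {cross_val_score(subspace_knn, X, y, cv=5).mean():.3f}")

With real labeled sEMG windows in place of the synthetic arrays, the same cross-validation call would yield an accuracy comparable in kind (not in value) to the 78.0% reported for subspace KNN on EMG set 3.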