Abstract—A gaze-controlled wheelchair is an assistive technology designed to empower individuals with mobility impairments. The system uses eye tracking to interpret the user's gaze and translate it into control commands for the wheelchair. By integrating sensors and gaze-estimation algorithms, the wheelchair responds accurately to the user's eye movements, allowing intuitive navigation. This technology enhances the user's independence and addresses the challenges faced by those with limited or no use of their limbs. Gaze-controlled wheelchairs therefore hold considerable potential for improving the quality of life of individuals with diverse mobility needs.
I. INTRODUCTION
The "Gaze Controlled Wheelchair" project represents a cutting-edge advancement in assistive technology, aiming to enhance the mobility and independence of individuals with physical disabilities. Leveraging the power of computer vision, OpenCV, and MediaPipe, this innovative system enables users to navigate a wheelchair effortlessly through the manipulation of their gaze. The integration of OpenCV facilitates real-time eye tracking, while MediaPipe's pose detection algorithms enable precise and responsive control. The project also incorporates a specially designed chassis to translate the gaze-based commands into seamless movements, ensuring a smooth and intuitive user experience. By merging these technologies, the Gaze Controlled Wheelchair not only showcases the potential of computer vision in assistive devices but also opens new avenues for personalized and accessible mobility solutions.
II. LITERATURE REVIEW
The literature provides a comprehensive foundation for the advancements achieved in gaze-controlled wheelchairs, tracing a trajectory from fundamental principles to practical implementation. "Control of pursuit eye movement" [1] established fundamental principles of eye-movement control, serving as a cornerstone for subsequent developments. Building on this groundwork, "Eye controlled electric wheelchair" [2] demonstrated a practical implementation of eye-tracking technology for mobility. "Eye Pupil Controlled Transport Riding Wheelchair" [3] examined the precision attainable from pupil movements, shedding light on the potential of subtle eye movements for fine-grained control. Addressing the technical intricacies, "Implementation of an Eye Gaze Tracking System for the Disabled People" [4] delved into system design and calibration for eye-gaze tracking.
Two further contributions, "Using an Eye Gaze New Combined Approach to Control a Wheelchair Movement" [5] and "A Novel Eye-Gaze-Controlled Wheelchair System for Navigating Unknown Environments" [11], introduced combined approaches and novel systems for enhanced control and adaptability. Practical applications of eye control were exemplified in "Eye Controlled Wheel Chair" [6], while "Smart wheelchair based on eye tracking" [7] integrated eye tracking into an intelligent mobility platform, emphasizing user-centered design. "Eye And Voice Controlled Wheel Chair" [8] offered a multimodal interaction paradigm, highlighting the potential of combining input modalities for richer control and a better user experience.
"Experimental study report on Opto-electronic sensor based gaze tracker system" [9] provided experimental validation of gaze-tracking hardware, underlining the importance of rigorous testing in research and development. "A Video Processing Based Eye Gaze Recognition Algorithm for Wheelchair Control" [10] demonstrated advances in video-processing techniques for precise, responsive control. "Oculus Supervised Wheelchair Control for People with Locomotor Disabilities" [12] explored headset-supervised wheelchair control, highlighting the potential of immersive technologies to improve accessibility, and "Eye-Controlled Wheelchair" [13] captured diverse applications and user experiences. The Gaze Controlled Wheelchair project presented here builds on this literature, combining OpenCV for real-time video capture, MediaPipe-based eye-motion detection, and a customized mobility chassis into a robust assistive-technology solution.
III. METHODOLOGY
The methodology integrates several technologies to enable intuitive, responsive control from user gaze. OpenCV captures real-time video frames from the webcam, providing the primary input for gaze analysis. The gaze_tracking library, integrated with MediaPipe, performs eye-motion detection: each captured frame is passed to the gaze.refresh() method, which updates the gaze state in real time. Conditional statements then interpret the user's gaze direction, distinguishing left, right, and center gazes as well as blinking, and the annotated frame is updated with the corresponding text to give the user visual feedback.
The system also tracks the pupils by retrieving the coordinates of the left and right pupil centers. These coordinates are drawn on the annotated frame, offering further insight into the user's visual focus.
Gaze information is translated into wheelchair control signals through a controller module, invoked as cnt.led(gaze). This module bridges the gaze analysis and the wheelchair's hardware, allowing control responses to be customized for specific gaze patterns. During execution, the system continuously captures frames, analyzes gaze dynamics, and adjusts wheelchair movement accordingly. A termination condition exits the application when the 'Esc' key is pressed, after which the webcam is released and all OpenCV windows are closed, concluding the system's operation cleanly.
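The paper does not detail the internals of the controller module invoked as cnt.led(gaze), so the following is only a minimal, hypothetical sketch of that bridge. It assumes a chassis whose motor driver accepts single-byte commands over a serial link; the module name controller.py, the port /dev/ttyUSB0, the baud rate, and the command bytes are all illustrative assumptions, not the authors' protocol. The gaze predicates mirror the checks listed in Section IV.

    # controller.py -- hypothetical sketch of the controller module that the
    # methodology invokes as cnt.led(gaze). The port, baud rate, and one-byte
    # command protocol are assumptions; the paper does not specify them.
    import serial  # pyserial

    # Assumed serial link to the chassis motor driver.
    ser = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)

    def led(gaze):
        """Map the current gaze state to a drive command for the chassis."""
        if gaze.is_blinking():
            command = b"S"  # stop
        elif gaze.is_right():
            command = b"R"  # turn right
        elif gaze.is_left():
            command = b"L"  # turn left
        elif gaze.is_center():
            command = b"F"  # move forward
        else:
            command = b"S"  # no reliable gaze detected: stop for safety
        ser.write(command)

Defaulting to a stop command whenever no gaze state is detected is a deliberate safety choice: the chair should never keep moving on stale or ambiguous input.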
IV. ALGORITHM
1. Initialize the GazeTracking object and webcam.
2. Create a continuous loop for real-time processing:
a. Acquire a new frame from the webcam.
b. Send the frame to the GazeTracking object for analysis.
c. Refresh the gaze information based on the analyzed frame.
d. Retrieve the annotated frame from the gaze analysis.
e. Initialize an empty string for textual feedback.
3. Analyze Gaze and Update Annotated Frame
a. If the user is blinking, set the text to "Blinking."
b. Else, if the user is looking to the right, set the text to "Looking right."
c. Else, if the user is looking to the left, set the text to "Looking left."
d. Else, if the user is looking at the center, set the text to "Looking center."
4. Display Gaze Information on the Annotated Frame
a. Place the text on the annotated frame at coordinates (90, 60).
b. Utilize the cv2.putText function with appropriate formatting parameters.
5. Display Pupil Coordinates on the Annotated Frame
a. Retrieve the coordinates of the left and right pupils.
b. Display the left pupil coordinates on the annotated frame at coordinates (90, 130).
c. Display the right pupil coordinates on the annotated frame at coordinates (90, 165).
6. Execute Controller Module for Wheelchair Control
a. Call the led function from the controller module, passing the gaze object.
7. Display the Annotated Frame
a. Show the annotated frame with gaze information and pupil coordinates using cv2.imshow.
8. Check for Termination
a. Wait for a key event (cv2.waitKey) with a delay of 1 millisecond.
b. If the key pressed is 'Esc' (ASCII code 27), break out of the loop.
9. Release Webcam Resources
a. Release the webcam resources using webcam.release().
10. Close OpenCV Windows
a. Close all OpenCV windows using cv2.destroyAllWindows().
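Assembled end to end, the steps above correspond to the Python sketch below. It assumes the open-source gaze_tracking library (whose GazeTracking class provides refresh(), annotated_frame(), the is_blinking()/is_right()/is_left()/is_center() predicates, and the pupil coordinate getters) together with the hypothetical controller module sketched in Section III; the font, scale, and color passed to cv2.putText are illustrative choices, not values taken from the paper.

    # main.py -- minimal sketch of the Section IV algorithm, assuming the
    # open-source gaze_tracking library; "controller" is the hypothetical
    # module sketched in Section III.
    import cv2
    from gaze_tracking import GazeTracking
    import controller as cnt

    gaze = GazeTracking()                # step 1: gaze tracker and webcam
    webcam = cv2.VideoCapture(0)

    while True:                          # step 2: real-time processing loop
        _, frame = webcam.read()         # 2a: acquire a new frame
        gaze.refresh(frame)              # 2b/2c: analyze the frame, refresh gaze state
        frame = gaze.annotated_frame()   # 2d: frame with pupils highlighted
        text = ""                        # 2e: textual feedback

        # Step 3: classify the gaze and choose the feedback text.
        if gaze.is_blinking():
            text = "Blinking"
        elif gaze.is_right():
            text = "Looking right"
        elif gaze.is_left():
            text = "Looking left"
        elif gaze.is_center():
            text = "Looking center"

        # Step 4: draw the gaze label at (90, 60).
        cv2.putText(frame, text, (90, 60),
                    cv2.FONT_HERSHEY_DUPLEX, 1.6, (147, 58, 31), 2)

        # Step 5: draw the pupil coordinates at (90, 130) and (90, 165).
        cv2.putText(frame, "Left pupil:  " + str(gaze.pupil_left_coords()),
                    (90, 130), cv2.FONT_HERSHEY_DUPLEX, 0.9, (147, 58, 31), 1)
        cv2.putText(frame, "Right pupil: " + str(gaze.pupil_right_coords()),
                    (90, 165), cv2.FONT_HERSHEY_DUPLEX, 0.9, (147, 58, 31), 1)

        cnt.led(gaze)                    # step 6: forward gaze state to the chassis
        cv2.imshow("Gaze Controlled Wheelchair", frame)  # step 7: display

        if cv2.waitKey(1) == 27:         # step 8: 'Esc' (code 27) exits the loop
            break

    webcam.release()                     # step 9: release the webcam
    cv2.destroyAllWindows()              # step 10: close all OpenCV windows

cv2.waitKey(1) both services the display window and returns the code of any pressed key, so comparing its result with 27 implements the termination check of step 8.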
V. RESULT AND DISCUSSION
The Gaze-Controlled Wheelchair project demonstrates a practical advance in assistive mobility, employing OpenCV and the gaze_tracking library for real-time gaze analysis and intuitive control. The system interprets user eye movements reliably, distinguishing blinks from left, right, and center gazes, with pupil tracking providing additional insight into the user's visual focus. The integrated controller module translates gaze information into wheelchair control signals, offering a responsive and natural interface. The system is applicable in healthcare, rehabilitation, and home environments, targeting individuals with physical disabilities who seek greater independence. Its benefits include increased autonomy, non-intrusive interaction, adaptability, and enhanced safety. Future enhancements may add obstacle detection and per-user customization. In conclusion, the Gaze-Controlled Wheelchair represents a meaningful step in assistive technology, promising a more inclusive and empowering mobility solution for its users.
REFERENCES
[1] M. S. Sugathadasa, W. P. Dayawansa and C. F. Martin, "Control of pursuit eye movement."
[2] B. Thakur and K. Kulshrestha, "Eye controlled electric wheelchair."
[3] J. M. Akanto, M. K. Islam, A. Hakim, M. A. H. Sojun and K. Shikder, "Eye Pupil Controlled Transport Riding Wheelchair."
[4] J. Park, T. Jung and K. Yim, "Implementation of an Eye Gaze Tracking System for the Disabled People."
[5] D. Cojocaru, L. F. Manta, I. C. Vladu, A. Dragomir and A. M. Mariniuc, "Using an Eye Gaze New Combined Approach to Control a Wheelchair Movement."
[6] P. Y. Reddy, N. Motaparthi, R. G. Balakrishna, K. L. Sai Chaitanya, N. G. Kumar and L. Ganti, "Eye Controlled Wheel Chair."
[7] N. Wanluk, S. Visitsattapongse, A. Juhong and C. Pintavirooj, "Smart wheelchair based on eye tracking."
[8] H. B H, S. R C, V. P. N K, V. R and Y. R, "Eye And Voice Controlled Wheel Chair."
[9] A. Nanditha Sree and A. Balaji Ganesh, "Experimental study report on Opto-electronic sensor based gaze tracker system."
[10] S. R. Rupanagudi, V. G. Bhat, S. K. Gurikar, S. P. Koundinya, M. S. Sumedh Kumar, R. Shreyas, S. Shilpa, N. M. Suman, R. R. Bademi, M. Koppisetti and V. Satyananda, "A Video Processing Based Eye Gaze Recognition Algorithm for Wheelchair Control."
[11] M. A. Eid, N. Giakoumidis and A. El Saddik, "A Novel Eye-Gaze-Controlled Wheelchair System for Navigating Unknown Environments: Case Study With a Person With ALS."
[12] M. R. Tejonidhi and A. M. Vinod, "Oculus Supervised Wheelchair Control for People with Locomotor Disabilities."
[13] S. Plesnick, D. Repice and P. Loughnane, "Eye-Controlled Wheelchair."
[14] R. Hyder, S. S. Chowdhury and S. A. Fattah, "Real-time non-intrusive eye-gaze tracking based wheelchair control for the physically challenged."