In Human-Computer Interaction (HCI), the mouse remains one of the most influential input devices. Although wireless and Bluetooth mice are in demand today, they still have drawbacks in areas such as cost and power consumption. This paper therefore puts forward an approach that controls mouse movements with the hand, using a real-time camera. Our system aims to improve the recognition of human hand postures in an HCI application, reduce computation time, and improve user comfort. We have developed an application for computer mouse control; based on the proposed algorithm and the selected hand features, the application achieves good real-time performance. Users find the system easier to operate because the proposed hand postures are combined with a voice assistant. Voice assistants are a significant innovation in AI that can change how people interact with their devices, and many devices are becoming smarter at interacting with humans in natural language. Desktop voice assistants are programs that recognize human speech and respond through an integrated voice system.
I. INTRODUCTION
Gesture Controlled Virtual Mouse simplifies human-computer interaction by making use of hand gestures and voice commands. The computer requires almost no direct contact: all I/O operations can be controlled virtually using static and dynamic hand gestures along with a voice assistant.
This project makes use of state-of-the-art machine learning and computer vision algorithms to recognize hand gestures and voice commands, and it works smoothly without any additional hardware. It leverages models such as the CNN implemented by MediaPipe, running on top of pybind11. It consists of two modules: one that works directly on the bare hand using MediaPipe hand detection, and another that uses gloves of any uniform color. Currently, it runs on the Windows platform.
With this technology, the real-time mouse system will move the cursor according to the hand gestures and voice commands fed into the system.
II. APPLICATIONS
Amidst the COVID-19 situation, touching shared devices is unsafe because it can spread the virus, so the proposed AI virtual mouse can be used to control PC mouse functions without a physical mouse.
The system can be used to control robots and automation systems without physical input devices.
The AI virtual mouse can be used to play virtual-reality- and augmented-reality-based games without wired or wireless mouse devices.
Persons with impairments in their hands can use this system to control the mouse functions of a computer.
III. METHODOLOGY
Tran et al. proposed a novel virtual mouse method using RGB-D images and fingertip detection [3].
Another study presents two methods for tracking the fingers: one using colored caps and the other using hand-gesture detection. It involves three main steps: finger detection using color identification, hand-gesture tracking, and implementation of an on-screen cursor [4]. A further paper presents a design for a fully operational hand-gesture-controlled wearable mouse, a simple plug-and-play device that needs no special driver or other software support [5]. In [6], a gesture recognition model is designed that recognizes the hand gestures down, up, left, right, and cross from the input signal of a three-axis accelerometer and an integrated circuit; a Java program was also created to handle the mouse interaction.
That system allows the user to navigate the computer cursor with a hand bearing color caps or tapes, while left-clicking and dragging are performed using different hand gestures [7].
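The color-cap tracking described above can be sketched in plain Python. The HSV range below is illustrative (not taken from the cited papers), and a real implementation would operate on OpenCV/NumPy arrays rather than nested lists; the logic of masking cap-colored pixels and taking their centroid as the cursor anchor is the same.

```python
import colorsys

# Hypothetical HSV range for a red color cap (illustrative values,
# not from the cited papers): hue near 0, high saturation and value.
HUE_MAX = 0.05      # colorsys hue lies in [0, 1)
SAT_MIN = 0.6
VAL_MIN = 0.5

def is_cap_pixel(r, g, b):
    """Return True if an RGB pixel (0-255 channels) falls in the cap's HSV range."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    # Red hue wraps around 0, so accept hues near 0 or near 1.
    return (h <= HUE_MAX or h >= 1.0 - HUE_MAX) and s >= SAT_MIN and v >= VAL_MIN

def cap_centroid(frame):
    """frame: list of rows of (r, g, b) tuples. Return the (x, y) centroid
    of the pixels matching the cap color, or None if no pixel matches."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            if is_cap_pixel(r, g, b):
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```

The centroid, tracked frame to frame, gives the point the on-screen cursor follows.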
A. Limitations of Existing Systems and Research Gap
Many previous studies on hand-gesture recognition have been conducted using colored gloves or markers. Despite remarkable successes, recognition remains challenging, due to the complexity of using gloves, markers, or variable glove sizes for users. Consequently, many recent efforts have focused on camera-based interfaces.
In recent years, traditional camera-based approaches that detect the area of the hand and recognize hand gestures have been developed.
These approaches had obvious detection difficulties when light levels changed or a complex background was used, and they required a fixed distance between the camera and the user [4].
The first challenge was to correctly detect the hand with a webcam. This required learning about skin-detection techniques and image-processing techniques such as background subtraction, image smoothing, and noise removal and reduction [3].
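To make the background-subtraction and noise-removal steps concrete, here is a minimal pure-Python sketch on grayscale frames represented as lists of 0-255 intensities. A real pipeline would use OpenCV (e.g. `cv2.absdiff` with a median blur or morphological opening), but the idea is the same: mark pixels that differ from a stored background, then filter out isolated noise.

```python
def subtract_background(background, frame, threshold=30):
    """Naive background subtraction: a pixel is foreground (1) when its
    intensity differs from the stored background by more than `threshold`."""
    mask = []
    for bg_row, fr_row in zip(background, frame):
        mask.append([1 if abs(b - f) > threshold else 0
                     for b, f in zip(bg_row, fr_row)])
    return mask

def smooth(mask):
    """3x3 majority filter: a crude stand-in for the noise-removal step.
    A pixel stays foreground only if most of its neighborhood is foreground."""
    h, w = len(mask), len(mask[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            neigh = [mask[j][i]
                     for j in range(max(0, y - 1), min(h, y + 2))
                     for i in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = 1 if sum(neigh) * 2 > len(neigh) else 0
    return out
```

Thin one-pixel streaks (typical sensor noise) are suppressed by the majority filter, while solid regions such as a hand silhouette survive.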
B. Algorithms Used to Track Hand Gestures
OpenCV: OpenCV stands for Open Source Computer Vision, a computer-vision library written in C++. It is cross-platform and free to use. It provides real-time and GPU-accelerated features used in wide-ranging areas such as 2D and 3D feature toolkits and face- and gesture-recognition systems.
AutoPy: AutoPy is a cross-platform GUI automation module for Python, used here to drive the on-screen cursor from the tracked fingertips. The gesture module reports whether each finger is up or down as a 0/1 output, and OpenCV visualizes this output on the image frame.
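Before AutoPy can move the cursor, fingertip coordinates from the camera frame must be mapped to screen coordinates and smoothed against hand jitter. The sketch below shows one common way to do this; the margin value and smoothing factor are illustrative choices, not parameters from the paper.

```python
def map_to_screen(x, y, cam_size, screen_size, margin=100):
    """Linearly map a fingertip position in the camera frame to screen
    coordinates. A margin around the frame edge is excluded so the cursor
    can reach the screen corners without the hand leaving the camera's view."""
    cam_w, cam_h = cam_size
    scr_w, scr_h = screen_size
    nx = (x - margin) / (cam_w - 2 * margin)
    ny = (y - margin) / (cam_h - 2 * margin)
    nx = min(max(nx, 0.0), 1.0)   # clamp to the frame's active region
    ny = min(max(ny, 0.0), 1.0)
    return nx * (scr_w - 1), ny * (scr_h - 1)

class CursorSmoother:
    """Exponential smoothing to damp hand jitter before the coordinates
    are handed to the automation layer (e.g. autopy.mouse.move(x, y))."""
    def __init__(self, alpha=0.3):
        self.alpha = alpha   # smaller alpha = smoother but laggier cursor
        self.pos = None
    def update(self, x, y):
        if self.pos is None:
            self.pos = (x, y)
        else:
            px, py = self.pos
            self.pos = (px + self.alpha * (x - px), py + self.alpha * (y - py))
        return self.pos
```

With a 640x480 camera and a 1920x1080 screen, a fingertip at the margin corner maps to the screen origin, so the whole screen stays reachable.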
MediaPipe: MediaPipe is a cross-platform, open-source framework from Google for building multimodal machine-learning pipelines. The MediaPipe Hands model makes hand detection possible by learning from thousands of annotated inputs to detect hands precisely. MediaPipe Hands can form the basis for sign-language understanding and hand-gesture control, and it can also enable the overlay of digital content on the physical world in augmented reality (AR).
MediaPipe uses machine learning (ML) to infer 21 3D landmarks of a hand from a single frame. MediaPipe Hands combines several models in an ML pipeline.
A palm detection model operates on the full image and returns an oriented hand bounding box, and a hand landmark model then operates on the cropped image region to locate the landmarks. The resulting pipeline can run on various platforms, allowing scalability on mobile and desktop.
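Once the 21 landmarks are available, gestures can be classified from simple geometric rules. The sketch below uses the documented MediaPipe Hands fingertip indices (4, 8, 12, 16, 20) and a common heuristic: a finger counts as raised when its tip lies above the joint below it in image coordinates, with the thumb compared horizontally instead. The function names and the exact rule are illustrative, not the paper's implementation.

```python
# MediaPipe Hands landmark indices for the five fingertips and the
# joints below them (thumb IP joint and the finger PIP joints).
TIP_IDS = [4, 8, 12, 16, 20]
PIP_IDS = [3, 6, 10, 14, 18]

def fingers_up(landmarks, right_hand=True):
    """landmarks: 21 (x, y) pairs in normalized image coordinates
    (origin at top-left, so a smaller y means higher in the frame).
    Returns five 0/1 flags: [thumb, index, middle, ring, pinky].
    A finger is 'up' when its tip is above its PIP joint; the thumb
    is compared on x, which is a common simplification."""
    flags = []
    # Thumb: compare x of tip (4) and joint (3); direction depends on hand.
    tip_x, joint_x = landmarks[4][0], landmarks[3][0]
    flags.append(1 if (tip_x < joint_x) == right_hand else 0)
    # Remaining fingers: tip above the PIP joint means raised.
    for tip, pip in zip(TIP_IDS[1:], PIP_IDS[1:]):
        flags.append(1 if landmarks[tip][1] < landmarks[pip][1] else 0)
    return flags
```

The resulting flag vector (e.g. only the index finger up for cursor movement, index and middle up for a click) is what the gesture-to-action mapping consumes.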
IV. FUTURE SCOPE
We can work to create more gestures, thus increasing the functionality of the virtual mouse.
Furthermore, the proposed method can be extended to handle keyboard functionality virtually along with the mouse functionality, which is another future scope of Human-Computer Interaction (HCI).
V. RESULTS AND DISCUSSIONS
VI. CONCLUSION
1) Gesture recognition enables natural interaction between human and machine.
2) Gesture recognition is also important for developing alternative human-computer interaction modalities.
3) Furthermore, the proposed method can be extended to handle keyboard functionality virtually along with the mouse functionality, which is another future scope of Human-Computer Interaction (HCI).
REFERENCES
[1] K. Patel, S. Solaunde, S. Bhong, and S. Pansare, "Virtual Mouse Using Hand Gesture and Voice Assistant," International Journal of Innovative Research in Technology (IJIRT), 2022, ISSN 2349-6002.
[2] S. Salian, D. Serai, and P. Ganorkar, "Hand Gesture Recognition and Cursor Control," ResearchGate, 2015. doi:10.13140/RG.2.1.3185.0082
[3] D.-S. Tran, N.-H. Ho, H.-J. Yang, S.-H. Kim, and G. S. Lee, "Real-time virtual mouse system using RGB-D images and fingertip detection," Multimedia Tools and Applications, 2020. doi:10.1007/s11042-020-10156-5
[4] V. V. Reddy, T. Dhyanchand, G. V. Krishna, and S. Maheshwaram, "Virtual Mouse Control Using Colored Finger Tips and Hand Gesture Recognition," 2020 IEEE-HYDCON. doi:10.1109/hydcon48903.2020.9242
[5] T. Barot and S. Karhadkar, "Development of Platform Independent Hand Gesture Controlled Wearable Mouse," 2018 3rd International Conference for Convergence in Technology (I2CT). doi:10.1109/i2ct.2018.8529544
[6] M. S. Ghute, M. Anjum, and K. P. Kamble, "Gesture Based Mouse Control," 2018 Second International Conference on Electronics, Communication and Aerospace Technology (ICECA). doi:10.1109/iceca.2018.8474905
[7] S. S. Abhilash, L. Thomas, N. Wilson, and C. Chaithanya, "Virtual Mouse Using Hand Gesture," International Research Journal of Engineering and Technology (IRJET), 2018.