Ijraset Journal For Research in Applied Science and Engineering Technology
Authors: Anubhav Sharma, Shikhar Srivastava, Vinay Kanaujiya, Uttam Kumar, Tushar Gupta, Vaibhav Tyagi
DOI Link: https://doi.org/10.22214/ijraset.2023.49381
One of the marvels of Human-Computer Interaction (HCI) technology is the mouse. A wireless or Bluetooth mouse is still not entirely device-free: it needs a battery for power and a dongle to connect it to the PC, and it takes up additional space. In the proposed system, the computer is controlled virtually with hand gestures instead of a physical mouse, performing left-click, right-click, scrolling, and cursor-movement tasks.
I. INTRODUCTION
The devices we use every day are getting smaller thanks to advances in Bluetooth and wireless technology as well as in the field of augmented reality. In this research, a computer-vision-based AI virtual mouse system is proposed that uses hand motions and hand-tip detection to perform mouse functions. The primary goal of the proposed system is to replace the typical mouse device with a web camera or the computer's built-in camera for performing mouse-pointer and scroll tasks. Hand gestures and hand tips are recognised through computer vision as an HCI [1] with the computer. Using a built-in or web camera, the AI virtual mouse system tracks the fingertip of a hand gesture, moves the cursor along with it, and performs cursor operations and scrolling.
The Python programming language is used to implement the AI virtual mouse system, and OpenCV, a library for computer vision, is used within it. Additionally, the NumPy, Autopy, and PyAutoGUI packages were used to move the pointer around the screen and to perform actions such as left click, right click, and scrolling within the proposed AI virtual mouse using hand gestures.
The model uses the Python MediaPipe package for detecting the hands and tracking the fingertips. Consequently, the proposed model exhibits high accuracy under ideal conditions and can run well in simple applications without using the computer's GPU.
Major corporations are developing technologies based on hand gesture systems for a variety of applications.
The goal is to create and put into use an alternative mouse cursor control system. One such approach uses a webcam and a colour-detection technique to recognise hand gestures. The ultimate goal of this work is a system that uses any computer's colour-detection technology to detect hand gestures and control the mouse cursor.
A. Problem Description and Overview
The proposed AI virtual mouse system can be used to solve real-world problems, such as situations where there is not enough room to use a physical mouse, or for people with hand impairments who are unable to handle one.
While using a Bluetooth or wireless mouse, several pieces of equipment are required: the mouse itself, a dongle to connect it to the computer, and a battery to power the mouse. In the proposed system, by contrast, the user controls all mouse operations using only the computer's built-in or web camera and hand gestures.
B. Objective
The main goal of the proposed AI virtual mouse system is to create a replacement for the conventional mouse system that can perform and control mouse functions. This can be done with the aid of a web camera that records hand gestures and hand tips and then processes these frames to perform the specific mouse function, such as the left click, right click, and scrolling function.
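As a sketch of this control flow, a recognised gesture can be mapped to a mouse action before any operating-system call is made. The gesture labels and action names below are illustrative assumptions, not the paper's actual identifiers; the real system would invoke PyAutoGUI or Autopy instead of returning strings:

```python
# Hypothetical mapping from a detected gesture label to the mouse action
# the system would trigger. All names here are illustrative placeholders.
GESTURE_ACTIONS = {
    "index_up": "move_cursor",
    "index_and_middle_up": "left_click",
    "middle_up": "right_click",
    "thumb_up": "scroll",
}

def dispatch(gesture: str) -> str:
    """Return the mouse action for a recognised gesture, or 'none'."""
    return GESTURE_ACTIONS.get(gesture, "none")
```

In a full implementation, each frame's recognised gesture would pass through such a dispatcher so that recognition logic stays separate from the code that actually moves the cursor.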
II. RELATED WORK
There have been related studies on virtual mice that use gloved hands to detect hand gestures and colour-tipped fingers to recognise gestures, but these approaches do not improve on mouse functionality. Wearing gloves reduces recognition accuracy, some users may be unable to use them, and the occasional failure to identify colour tips likewise reduces accuracy. There have also been several attempts at camera-based hand-gesture interface detection.
III. ALGORITHM USED FOR HAND TRACKING
The Media Pipe framework is utilised for hand tracking and gesture detection, and the OpenCV library is used for computer vision. The method utilises machine learning principles to monitor and identify hand movements and hand tips.
A. Media Pipe
Developers use the MediaPipe framework to build and analyse systems as graphs and to develop systems for application-related purposes; many models built this way have been incorporated into applications. The ML pipeline used by MediaPipe Hands consists of several interconnected models.
A MediaPipe-integrated model functions in a pipeline-like manner, consisting mainly of graphs, nodes, streams, and calculators. The steps in a MediaPipe-based system are carried out in a pipeline configuration, which can operate at several scales and on mobile devices. The framework is built on three basic components: performance evaluation, a system for retrieving sensor data, and a collection of reusable components called calculators.
A pipeline is a graph of calculators, where each calculator is connected by streams through which the data packets flow.
For live and streaming media, MediaPipe provides open-source, cross-platform, and configurable ML solutions, which is advantageous in a variety of circumstances.
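The graph-of-calculators idea described above can be illustrated with a toy pipeline in plain Python. This is a conceptual sketch of the pipeline model only, not the real MediaPipe API:

```python
# Conceptual sketch of MediaPipe's pipeline model: a graph of "calculators"
# connected by streams through which data packets flow. Each calculator
# consumes the upstream packet and emits a new one for the next stage.
from typing import Callable, List

class Pipeline:
    def __init__(self, calculators: List[Callable]):
        self.calculators = calculators

    def process(self, packet):
        # Pass the packet through each calculator in graph order.
        for calc in self.calculators:
            packet = calc(packet)
        return packet

# Example: a toy two-stage graph (scale the input, then threshold it).
pipeline = Pipeline([lambda x: x * 2, lambda x: x > 5])
```

In real MediaPipe graphs the calculators are C++ components and the streams carry timestamped packets, but the dataflow principle is the same.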
B. Open CV
OpenCV is a computer vision and machine learning software library that is available for free use. More than 2500 optimised algorithms are available in the collection, covering a wide range of both classic and state-of-the-art computer vision and machine learning techniques. Accessed from the Python programming language, this library aids in the creation of computer vision applications; in this approach, OpenCV is used for image and video processing as well as for analysis tasks such as face and object detection. Hand-gesture recognition can be developed with Python and OpenCV using the theory of hand segmentation and a hand-detection system based on the Haar-cascade classifier [11].
IV. METHODOLOGY
Pre-processing, or more specifically image handling, is an early step in computer vision. Its goal is to transform an image into a form suitable for further analysis. Operations such as exposure correction, colour adjustment, image-noise reduction, and sharpening are very significant and demand careful attention in order to achieve suitable results.
The flowchart explains the many actions and conditions that are employed in the system:
A. Camera Used in the Virtual Gesture Mouse project
In the proposed model, the system runs on frames captured by either the built-in camera or a separate web camera. OpenCV, a Python library, is used to open the camera window and begin video capture. These video frames are then passed from the camera to the AI virtual mouse system.
B. Detecting Which Finger Is Up and Performing the Particular Mouse Function
Using the tip Id of the specific finger located with MediaPipe, and the corresponding coordinates of the fingers that are up, as shown in Figure 5, the system determines which finger is raised. The specific mouse function is then carried out accordingly.
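A minimal sketch of this finger-up test, assuming MediaPipe's 21-landmark hand model with tip ids 4, 8, 12, 16, and 20 and image coordinates where y grows downward; the thumb is simplified to a right-hand x comparison:

```python
# MediaPipe hand landmark tip ids: thumb, index, middle, ring, pinky.
TIP_IDS = [4, 8, 12, 16, 20]

def fingers_up(landmarks):
    """landmarks: list of 21 (x, y) pairs in image coordinates.
    Returns a list of 5 ints (thumb..pinky), where 1 means the finger is up."""
    fingers = []
    # Thumb: compare x of the tip and the preceding joint (right-hand assumption).
    fingers.append(1 if landmarks[4][0] > landmarks[3][0] else 0)
    # Other fingers: the tip lying above (smaller y than) the PIP joint
    # two landmarks below it means the finger is extended.
    for tip in TIP_IDS[1:]:
        fingers.append(1 if landmarks[tip][1] < landmarks[tip - 2][1] else 0)
    return fingers
```

The resulting five-element list is what the system would inspect to decide between cursor movement, clicking, and scrolling.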
C. Detecting the Fingertips and Performing the Mouse Cursor Movements
In this step, once the system has detected which finger is up, it uses the coordinates of that fingertip, found with MediaPipe, together with the set of raised fingers to carry out the corresponding mouse function, just as a physical mouse would.
V. EXPERIMENTAL RESULTS AND EVALUATION
The suggested AI virtual mouse system presents the idea of employing computer vision to advance human-computer interaction. Only a limited number of datasets are available for verifying and testing the system, so hand tracking, fingertip identification, and gesture recognition were all evaluated under a variety of lighting conditions and camera distances: the webcam was placed at various distances from the user in order to evaluate hand-gesture and fingertip detection in each lighting situation. Each person tested the AI virtual mouse system 10 times in normal light, 5 times in faint light, 5 times close to the webcam, and 5 times far from the webcam. The test was carried out 25 times by each of 5 people, yielding 600 manually labelled gestures across the different lighting conditions and distances from the screen.
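As an illustration of how such per-condition results could be tallied from the manually labelled gestures, the counts below are made-up placeholders, not the paper's measured figures:

```python
def accuracy(correct: int, total: int) -> float:
    """Percentage of correctly recognised gestures in one test condition."""
    return 100.0 * correct / total

# Placeholder counts for illustration only (correct recognitions / attempts).
results = {
    "normal_light": accuracy(48, 50),
    "dim_light": accuracy(22, 25),
}
```

Aggregating such per-condition percentages is one straightforward way to report how lighting and camera distance affect recognition.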
VI. FUTURE SCOPE
The work at hand is to build computer-intelligence systems that can handle real-world problems and accomplish goals for organisations, everyday tasks, and individuals. There is formal study of game-playing machines, speech recognition, language detection, computer vision, expert systems, robotics, and so on. The suggested AI virtual mouse has several drawbacks, including a small reduction in right-click precision and some difficulty clicking and dragging to select text; we will try to address these drawbacks in our upcoming research. In addition, keyboard functionality may be added so that keyboard and mouse operations can be imitated simultaneously, which shows the potential for the future.
VII. APPLICATIONS
Artificial intelligence has many uses in contemporary society. It is becoming essential in our time since it can efficiently deal with complex challenges in a variety of fields, such as health care, entertainment, finance, and education. Computer intelligence is making our daily lives easier and faster.
VIII. CONCLUSION
The primary goal of the AI virtual mouse system is to replace the use of a physical mouse with hand gestures for controlling mouse-cursor functions. The suggested system can be implemented using a webcam or an integrated camera that recognises hand movements and hand tips and processes these frames to carry out the corresponding mouse actions. After testing, we determined that the suggested virtual mouse system outperformed the models previously offered and discussed in the related work in both performance and accuracy, and that it overcomes the shortcomings of those systems. This means the suggested AI-based virtual mouse technology can be applied in real-world settings and in real time. The model has some limitations, such as a small decrease in accuracy in the right-click function and some difficulty clicking and dragging to select text. We will therefore work next to overcome these limitations by improving the fingertip-detection algorithm to produce more accurate results.
[1] J. Katona, "A review of human-computer interaction and virtual reality research domains in cognitive infocommunications," Applied Sciences, vol. 11, no. 6, p. 2646, 2021.
[2] D.-S. Tran, N.-H. Ho, H.-J. Yang, S.-H. Kim, and G. S. Lee, "Real-time virtual mouse system using RGB-D pictures and fingertip detection," Multimedia Tools and Applications, vol. 80, no. 7, pp. 10473-10490, 2021.
[3] K. P. Vinay, "Cursor control via hand gestures," International Journal of Critical Accounting, vol. 0975-8887, 2016.
[4] P. Nandhini, J. Jaya, and J. George, "Computer vision system for food quality evaluation—a review," in Proceedings of the 2013 International Conference on Current Trends in Engineering and Technology (ICCTET), Coimbatore, India, July 2013.
[5] S. U. Dudhane, "Hand gesture detection for cursor control system," IJARCCE, vol. 2, no. 5, 2013.
[6] D.-H. Liou, D. Lee, and C.-C. Hsieh, "A real time hand gesture recognition system using motion history image," in Proceedings of the 2nd International Conference on Signal Processing Systems, IEEE, 2010.
[7] D. L. Quam, "Gesture recognition with a DataGlove," in Proceedings of the IEEE Conference on Aerospace and Electronics, vol. 2, 1990, pp. 755-760.
[8] K. H. Shibly, S. Kumar Dey, M. A. Islam, and S. Iftekhar Showrav, "Design and development of hand gesture based virtual mouse," in Proceedings of the 2019 1st International Conference on Advances in Science, Engineering and Robotics Technology (ICASERT), Dhaka, Bangladesh, May 2019, pp. 1-5.
Copyright © 2023 Anubhav Sharma, Shikhar Srivastava, Vinay Kanaujiya, Uttam Kumar, Tushar Gupta, Vaibhav Tyagi. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Paper Id : IJRASET49381
Publish Date : 2023-03-03
ISSN : 2321-9653
Publisher Name : IJRASET