Ijraset Journal For Research in Applied Science and Engineering Technology
Authors: Lokesh Gagnani, Hetvi Patel, Sakshi Chaturvedi, Rajeshwari Jaiswal
DOI Link: https://doi.org/10.22214/ijraset.2022.47077
This project advocates a Human-Computer Interaction method in which cursor movement is controlled through a real-time camera by recognizing human hand postures. The method is an alternative to current techniques, such as manually pressing buttons or using a physical computer mouse; instead, it uses a camera and computer vision software to manage different mouse events and can carry out any action that a conventional computer mouse can. The Virtual Mouse color-recognition application continuously gathers real-time images, which then pass through several conversions and filters. Once converted, the application uses image processing to extract the coordinates of a specified color's position from the converted frames. The process then compares the current color scheme within the frames against a list of color combinations, where each combination corresponds to a particular set of mouse operations. If the current color scheme matches, the application performs the corresponding mouse command, which is translated into a real mouse command on the user's computer. In addition, the authors have developed a voice assistant to improve user productivity by managing the user's routine tasks and providing information from online sources, since a voice assistant is effortless to use.
I. INTRODUCTION
This project suggests a method for moving the cursor with just the hands, without the aid of any additional gadgets. The proposed system is designed to perform operations such as right and left click, double click, drag and drop, multiple-item selection, and volume control. The only input device the suggested system requires is a webcam. Python and OpenCV are the two pieces of software needed to put the suggested method into practice. The camera's output is shown on the system's screen so that the user may adjust it further. NumPy, math, wx, and mouse are the Python dependencies that will be used to build this system.
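As a rough illustration of the cursor-movement step (our own sketch, not the authors' code), the fingertip position detected in the camera frame must be scaled to the screen resolution, and successive positions are typically smoothed to reduce cursor jitter before being passed to the mouse library's move function:

```python
# Illustrative helpers for camera-to-screen coordinate mapping and
# exponential smoothing. Function names and the smoothing factor are
# our own assumptions, not taken from the paper.

def map_to_screen(cam_x, cam_y, cam_w, cam_h, screen_w, screen_h):
    """Scale a point (cam_x, cam_y) in the camera frame to screen space."""
    return (cam_x * screen_w / cam_w, cam_y * screen_h / cam_h)

def smooth(prev, new, alpha=0.3):
    """Exponentially smooth successive cursor positions (0 < alpha <= 1);
    smaller alpha gives a steadier but laggier cursor."""
    return (prev[0] + alpha * (new[0] - prev[0]),
            prev[1] + alpha * (new[1] - prev[1]))
```

In a full system the smoothed point would then be handed to `mouse.move`.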
The other functionality offered by this project is a voice assistant. Voice searches have begun to dominate text searches; analysts predict that 50% of searches will be made by voice by 2024. Voice assistants are turning out to be smarter than ever. A voice assistant helps ease day-to-day tasks such as showing the date and time, performing searches on Google, finding a location on Google Maps, opening an application, and so on. The voice assistant can take commands via text or by voice. An activating word, also known as a wake word, is required before a voice-based intelligent assistant will accept an order. For this project the wake word is "quantum".
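A minimal sketch of the wake-word gating, assuming the speech has already been transcribed to text by a recognizer; the helper name is ours, and the paper only specifies that the wake word is "quantum":

```python
# Extract the command that follows the wake word "quantum" from a
# transcribed utterance; anything before the wake word is ignored.

WAKE_WORD = "quantum"

def extract_command(transcript):
    """Return the command following the wake word, or None if the wake
    word is absent or nothing follows it."""
    words = transcript.lower().split()
    if WAKE_WORD in words:
        idx = words.index(WAKE_WORD)
        return " ".join(words[idx + 1:]) or None
    return None
```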
II. LITERATURE SURVEY
III. BACKGROUND
A. Scope
Our objective is to offer more gestures so that users may complete more tasks quickly in the future. This proposal suggests a system that only makes use of the proper hand when making gestures. As a result, future improvements to the technique currently in use will allow for the use of both hands for various gestures. Many applications have been substantially improved through the rapid development of hand gesture recognition systems.
Projects should accomplish the goal for which they were created, demonstrating the effectiveness of their execution. Through the voice assistant feature, the authors have automated tasks such as finding locations on Google Maps, navigating documents, launching and stopping gesture recognition, performing Google searches, and putting the voice assistant to sleep or waking it up. This functionality saves the user time and effort, and also makes computers more accessible to people who are blind or otherwise disabled.
B. The Root of Problem being Studied
The proposed gesture-controlled mouse system can overcome real-world problems such as situations where there is no space to use a physical mouse, and it can help those who have hand impairments and are unable to use one. A physical mouse also has other drawbacks: it is subject to mechanical wear and tear, it requires special hardware and a surface to operate on, and it is not easily adaptable to different environments, with performance that varies depending on the surroundings.
Even in current operating contexts, the mouse's capabilities are limited, and every wired or wireless mouse has a finite lifespan; the proposed system avoids these problems because hand-gesture and fingertip detection are used to operate the PC mouse through a webcam or an internal camera. Similarly, voice assistants can address the speed and error problems of typed searches. Voice is reputed to be about four times faster than writing: whereas one can write about 40 words per minute, one can speak approximately 150 words per minute.
IV. PROPOSED ALGORITHM
A. Gesture Controlled Mouse
Binary values given to different gestures are:
FIST = 0
PINKY = 1
RING = 2
MID = 4
LAST3 = 7
INDEX = 8
FIRST2 = 12
LAST4 = 15
THUMB = 16
PALM = 31
V_GEST = 33
TWO_FINGER_CLOSED = 3
PINCH_MAJOR = 35
PINCH_MINOR = 36
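The codes above can be read as a five-bit mask with one bit per extended finger (pinky = 1, ring = 2, middle = 4, index = 8, thumb = 16), composite gestures being bitwise ORs of those bits; values of 33 and above appear to be reserved for special gestures outside the mask. A sketch of this interpretation (our reading, not the authors' code):

```python
# One bit per extended finger; composite gestures are bitwise ORs.
PINKY, RING, MID, INDEX, THUMB = 1, 2, 4, 8, 16

FIST   = 0                                   # no fingers extended
LAST3  = PINKY | RING | MID                  # 7
FIRST2 = INDEX | MID                         # 12
LAST4  = PINKY | RING | MID | INDEX          # 15
PALM   = PINKY | RING | MID | INDEX | THUMB  # 31
```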
4. Step-4: Recognition and Execution of Hand Gestures: Finally, after successful mapping and extraction of the different landmarks of the hand, meaningful hand gestures are recognized and executed. This is done with the help of both the OpenCV and MediaPipe modules. Hand gestures are predicted depending on how many, and which, fingers are detected. For example, if five fingers are detected it is classified as a neutral gesture, if two fingers are detected it is classified as a move-cursor gesture, if the right finger is pulled down it is classified as a right click, and so on.
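A hedged sketch of this classification step, assuming the per-finger extended/folded states have already been derived from the MediaPipe landmarks; the mapping follows the examples in the text and the mouse functions named elsewhere in the paper, but the function and label names are ours:

```python
def classify(fingers_up):
    """Map extended-finger flags [thumb, index, middle, ring, pinky]
    to a coarse gesture label, following the examples in the text."""
    count = sum(fingers_up)
    if count == 5:
        return "neutral"                 # open palm: do nothing
    if count == 2 and fingers_up[1] and fingers_up[2]:
        return "move cursor"             # index + middle raised
    if count == 1 and fingers_up[2]:
        return "right click"             # index pulled down, middle raised
    if count == 1 and fingers_up[1]:
        return "left click"              # middle pulled down, index raised
    return "unknown"
```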
B. Voice Assistant
The work started with analyzing the audio commands given by the user through the microphone. This can be anything like getting any information, operating the computer’s internal files, etc.
The following algorithm shows how the voice assistant works.
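As a rough sketch of such a command loop (assuming the audio has already been transcribed; the verbs follow tasks listed in the paper, and the handler functions are hypothetical stubs):

```python
# Hypothetical command dispatcher: split a transcribed command into a
# verb and its argument, then route it to the matching handler.

def search_google(query):
    return f"searching Google for: {query}"

def find_location(place):
    return f"locating on Google Maps: {place}"

HANDLERS = {
    "search": search_google,
    "locate": find_location,
}

def dispatch(command):
    """Dispatch a transcribed command; unknown verbs get a fallback reply."""
    verb, _, arg = command.strip().partition(" ")
    handler = HANDLERS.get(verb.lower())
    return handler(arg) if handler else "command not recognized"
```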
V. IMPLEMENTATION
VI. ACKNOWLEDGMENT
This research paper would not have been possible without the guidance, assistance, and suggestions of many individuals. This acknowledgment goes beyond mere convention in expressing appreciation to all those behind the scenes who directed and encouraged us in the accomplishment of this project. We would like to express our deepest appreciation and wholehearted thanks to Lokesh Gagnani, who has been a perpetual source of direction throughout the course of this research paper.
In this paper, the authors have discussed an alternative to the conventional physical mouse that provides mouse functions with the help of computer vision: a web camera recognizes fingers and hand gestures, processes the captured frames, and uses a machine-learning algorithm to execute the defined mouse functions such as moving the cursor, right click, left click, and scrolling. After testing, the authors have concluded that the proposed virtual mouse system works exceedingly well and with great accuracy, and that it overcomes the drawbacks of other systems. Through the voice assistant, the authors have automated various services with single-line commands, easing most of the user's tasks such as searching the web, navigating files, finding a location on Google Maps, opening an application, and so on. Future plans include integrating Quantum with mobile devices using React Native to provide a synchronized experience between the two connected devices. The system is designed to minimize human effort and to control the device with just the human voice.
Copyright © 2022 Lokesh Gagnani, Hetvi Patel, Sakshi Chaturvedi, Rajeshwari Jaiswal. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Paper Id : IJRASET47077
Publish Date : 2022-10-14
ISSN : 2321-9653
Publisher Name : IJRASET