Abstract—The complexity of controlling an automobile increases as more significant capabilities are incorporated. Numerous research efforts focus on designing automobile-control frameworks that let users teach the car simply by demonstrating what it should do; yet, for non-expert users, operating the car with a switch or remote is difficult. In light of this, this paper proposes an Arduino-based automobile-control framework that eliminates the need for physical car control. This work makes two main contributions. First, we demonstrate that hand gestures may be used to drive the car, with the vehicle responding to the hand's position and movement. The hand-gesture framework uses an Arduino Nano, an accelerometer, and a radio-frequency (RF) transmitter. The accelerometer, attached to a hand glove, senses the acceleration forces produced by hand movement and passes its readings to the Arduino Nano mounted on the glove. After receiving the data, the Arduino Nano converts it into unique angle values between 0 and 450° and transmits it through the RF sender to the RF receiver of the Arduino Uno mounted on the vehicle. Second, the proposed automobile framework can also be operated by an Android-based mobile application with distinct modes. The hand-gesture framework was extended with a Bluetooth module to form the mobile-application framework. In this case, whenever the user sends a command, the Arduino Uno receives the corresponding signal. Once the signal is acknowledged, the Arduino compares it with its pre-programmed commands for braking, left, right, forward, and backward motion before instructing the motor module to move the vehicle in that direction.
I. INTRODUCTION
In the realm of technology, robotics is currently becoming one of the most sophisticated fields. A robot is an electromechanical device controlled by a computer program, and robots can be semi-autonomous. A gesture-controlled robot is one kind of robot that you can operate with hand motions rather than conventional buttons. All you have to do is carry a small hand-held transmitting device with an accelerometer, which enables the robot to receive the appropriate command and perform the desired action. In the transmitting apparatus, the ADXL335 accelerometer captures the data, which is sent out via an RF transmitter module. An RF receiver module at the receiving end accepts the data, which is decoded and forwarded as a control signal to the L293D motor driver. A microcontroller uses this information and, through the motor driver, ultimately operates the motors. To simplify the design and keep it error-free, the work is divided into several modules; accordingly, our design was separated into transmitter and receiver parts. Robotics is used extensively in the automotive, medical, construction, and defence industries. Robots are also employed in firefighting to assist individuals affected by fires. Nonetheless, operating a robot with a remote or a switch involves a fair amount of complexity.
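As a rough illustration of the transmitter side, the function below converts a raw 10-bit ADC reading of one ADXL335 axis into an acceleration in g. The 5 V ADC reference, 1.65 V zero-g bias, and 330 mV/g sensitivity are typical datasheet values assumed here, not figures taken from this paper.

```cpp
#include <cmath>

// Convert a 10-bit ADC count (0-1023) of one ADXL335 axis to acceleration
// in g. Assumes a 5 V ADC reference and the sensor's typical 1.65 V
// zero-g bias and 330 mV/g sensitivity (3.3 V supply) -- datasheet
// "typical" values, so a real build would calibrate these.
double adcToG(int adcCount) {
    const double kVref      = 5.0;    // assumed ADC reference voltage
    const double kZeroGV    = 1.65;   // typical zero-g output voltage
    const double kVoltsPerG = 0.330;  // typical sensitivity
    double volts = adcCount * kVref / 1023.0;
    return (volts - kZeroGV) / kVoltsPerG;
}
```

On an actual Arduino the `adcCount` argument would come from `analogRead()` on the pin wired to the axis output.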
Thus, a gesture-controlled robot based on an accelerometer is designed. The main purpose of this design is to employ hand gestures, sensed by an accelerometer, to control the robot's movement. Generally speaking, a robot is an electromechanical device that can carry out activities automatically. Some robots require a degree of guidance, which can be provided via a computer interface or a remote control.
II. LITERATURE SURVEY
The research in [1] investigates a hand-gesture-based concept for mobile robot control. Hand gestures that transmit control signals can make the mobile robot move. Gesture recognition is achieved through image processing, image contour processing, and other techniques. The mobile robot is then controlled according to the identified and decoded data.
The paper in [2] describes a method of operating a robot with hand gestures using the Arduino LilyPad. A motion sensor attached to a glove is used to capture the intended gesture. The main goal of this technique is to control the robot through hand gestures.
In [3], a hand-gesture-based control interface for manipulating a car robot is presented. The user's hand movements are recorded using a three-axis accelerometer, and the readings are transmitted wirelessly to a microcontroller. The received signals are then converted into one of six commands for navigating the car robot.
The main goal of the project in [4] is to use an accelerometer/gyroscope-based gesture controller to control the movement of a robotic arm. This method is far more convenient than using a joystick or console. The paper's primary aim is to develop a simple and robust system for object placement, demonstrated live on the robot. Experimental results are used to evaluate the proposed object-detection algorithm and gesture controller.
The study in [5] describes the use of hand gestures to control a robot. The authors demonstrated a novel user hand-detection strategy together with a hand-gesture detection method that uses the robot's camera to recognize the hand in successive frames. They succeeded in making the robot follow the detected hand. In future work, the hand-detection rate will be improved.
III. EXISTING SYSTEM
In the early days of robotics, robots were controlled by cables, which served as the physical link between the robot and the user's controls. The cable's length determined the range. To overcome this flaw, a wireless connection is established.
With a wireless link and a remote control, the robot is operated remotely. Because this technology uses infrared transmission, the robot and control panel must be in line of sight. After remote control, gesture recognition with an image-capture system was introduced. Commands were given by the user's hand and recorded by the camera; the captured images were then processed and matched against a library. If the image has previously been registered in the library, the robot receives the corresponding command; if the library offers a matching command, the command is acknowledged and carried out. Building a reliable library of hand gestures is the most challenging aspect of this idea.
IV. IMPLEMENTED SYSTEM
Natural hand gestures are used in the proposed technique to operate the robot continuously. The sensor in the transmitter circuit detects the tilt of the user's hand, generates a distinct analogue reading, and communicates it to the receiver via the RF transmitter. The AT89C51 MCU on the receiving end is responsible for transferring these values to the robot's motors, allowing the robot to move left, right, forward, and backward. To better understand the concept behind the hand-gesture-controlled robot, the design can be divided into three parts. In the first stage, the MPU6050 accelerometer/gyroscope sensor provides the data to the Arduino. Based on user-specified parameters, the Arduino continuously gathers data from the MPU6050 and passes it to the RF transmitter. An essential component of the architecture is the wireless RF link between the transmitter and receiver: the RF transmitter receives data from the Arduino and sends it to the RF receiver via RF communication. Finally, the data must be decoded; the required signals are received from the RF receiver and sent to the motor driver IC, which drives the robot's motors.
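The first stage above can be sketched as a pure conversion step. Assuming the MPU6050's default ±2 g full-scale range (16384 LSB per g, per the sensor's register map), a raw accelerometer reading can be turned into a tilt angle; the function name and choice of axes are illustrative, not from the paper.

```cpp
#include <cmath>
#include <cstdint>

// Convert raw MPU6050 accelerometer counts into a tilt angle in degrees.
// At the default +/-2 g full-scale range the sensor outputs 16384 LSB/g.
// rawAxis is the tilting axis (e.g. X); rawZ is the vertical axis.
double rawToTiltDegrees(int16_t rawAxis, int16_t rawZ) {
    const double kLsbPerG = 16384.0;        // MPU6050 +/-2 g scale factor
    const double kPi = 3.14159265358979323846;
    double a = rawAxis / kLsbPerG;          // axis acceleration in g
    double z = rawZ / kLsbPerG;             // gravity component in g
    return std::atan2(a, z) * 180.0 / kPi;  // tilt relative to level
}
```

On the real hardware the two raw values would be read over I2C from the MPU6050 before the angle is forwarded to the RF transmitter.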
V. METHODOLOGY
A microcontroller is used to design the gesture-controlled robot. Hand gestures serve as the input signals for driving the robot in various directions. According to the hand gesture, the microcontroller sends a signal to the robot's motor driver to move in the desired direction. The control gestures correspond to right, left, forward, and backward motion and to stopping, respectively. You can move your hand in the desired direction and drive the robot as you wish.
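The gesture-to-direction mapping described above can be sketched as a small decision function. The ±20° dead zone and the command names are assumed for illustration; the paper does not specify the thresholds.

```cpp
#include <cstdint>

// The five drive commands named in the methodology: right, left,
// forward, backward, and stop.
enum class Command : uint8_t { Stop, Forward, Backward, Left, Right };

// Map a hand tilt (pitch/roll in degrees) to a drive command.
// The +/-20 degree dead zone is an assumed threshold, not a value
// taken from the paper; inside it the hand counts as level.
Command gestureToCommand(int pitchDeg, int rollDeg) {
    const int kThreshold = 20;                     // assumed tilt threshold
    if (pitchDeg >  kThreshold) return Command::Forward;
    if (pitchDeg < -kThreshold) return Command::Backward;
    if (rollDeg  >  kThreshold) return Command::Right;
    if (rollDeg  < -kThreshold) return Command::Left;
    return Command::Stop;                          // hand held level
}
```

Checking pitch before roll means a strongly tilted-forward hand wins over a slight sideways lean, which keeps the behaviour predictable when both axes exceed the threshold.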
Figure 1 shows the main features and the entire functioning of the suggested robot motor vehicle for ease of analysis. In the system flow, I/P and O/P stand for input and output, respectively. The robot motor vehicle can be controlled in two ways. The hand-gesture system is the primary one. First, the accelerometer on the hand captures acceleration values from the hand's orientation and transmits them to the Arduino Nano attached to the hand. After collecting the data, the Arduino Nano transforms it into distinct values ranging from 0 to 450 degrees and transmits it via the RF sender to the RF receiver of the Arduino Uno, which is mounted on the robot motor vehicle.
After the information is accepted, the vehicle's Arduino Uno compares the received values to a predefined set of values and sends a signal to the motor module, which causes the robot motor vehicle's wheels to move in the direction indicated by the values. Note that the values are predetermined for the vehicle's wheels to go forward, backward, away from obstructions, and to the right. Another method of operation involves driving the robot car with an Android mobile application, which can be downloaded from the Google Play Store. Here, when the user sends a signal to the car via the phone's built-in Bluetooth, the signal reaches the Arduino Uno on the vehicle. When the Arduino receives the signal command, it compares it to a predefined instruction and forwards the corresponding signal to the motor module, which moves the robot car's wheels in response. When the ultrasonic sensor detects an object within a given range, the robot stops and looks right to check for an obstacle; if no obstacle is detected, the robot turns right and moves forward in that direction. If there is an obstacle on the right side, it looks left; if no obstacle is detected there, the robot turns left and moves forward in that direction. If obstacles are detected in all three directions, the robot rotates 180 degrees and goes forward. Additionally, the vehicle can be operated via voice commands through the mobile interface.
VI. FUTURE SCOPE
Hand-gesture-controlled devices are expected to find widespread use in the military, business, healthcare, and other fields. A future physician will be able to identify a problem by studying the human body through image processing, or by using gestures to move a small robot within a mock-up. Such devices will also enable people with disabilities to accomplish more tasks.
CONCLUSION
The project's goal is to control a toy automobile using hand gloves with attached accelerometer sensors. The sensors are meant to replace the standard remote control for operating the vehicle. With the same controls, we are able to move left and right as well as forward and backward, and the accelerometer sensor regulates the vehicle's throttle according to the hand gestures. The hardware was assembled from the aforementioned parts, forming a robot. The software component was created in the Arduino IDE, where the actual direction was determined by analyzing the hand gestures.
REFERENCES
[1] M. B. S. M. Sofiane Techoketech Kebiri, "Gesture Control of Mobile Robot Based Arduino," in Proc. 8th International Conference on Modelling, Identification and Control.
[2] R. T. V. R. S. S. B. S. Kantaravan Seker, "Hand Gesture Controlled Robot," International Journal of Engineering Research and Technology, vol. 09, no. 11, 2020.
[3] W. M. N. N. A. Parimala, "Gesture Control Robot Using Arduino," International Journal of Advanced Science and Technology, pp. 4196-4203, 2020.
[4] R. S. B. B. D. N. K. P. Chetan Bulla, "Gesture Control Robot," International Journal of Research in Electronics and Computer Engineering, vol. 7, no. 2, 2019.
[5] S. C. Narsingoju Adithya, "Hand Gesture Controlled Robot," International Journal of Recent Technology and Engineering (IJRTE), vol. 8, no. 1S4, 2019.