Ijraset Journal For Research in Applied Science and Engineering Technology
Authors: Ashutosh Kabra, Abhishek Jagtap, Jayaprakash Dharmavaram, Shital Raut
DOI Link: https://doi.org/10.22214/ijraset.2023.52690
Interest in developing smart luggage tracking systems built on embedded platforms such as the Raspberry Pi and camera modules has grown in recent years. These systems let travellers track and monitor their bags in real time, making journeys safer and more convenient. This research paper describes the design and development of a smart luggage monitoring system that makes use of a Raspberry Pi and a tracking camera module. The paper also examines the challenges and limitations of implementing such a system and suggests workable solutions. According to the study's results, the proposed system can successfully track luggage, reducing the likelihood of loss or theft while also improving travellers' overall travel experience.
I. INTRODUCTION
The development of advanced technology has greatly improved the travel industry, increasing comfort and convenience for passengers. One of the most common problems faced by travellers is carrying heavy luggage. This can make the travel experience stressful and exhausting. To address this problem, a team of researchers developed a smart luggage system that leverages state-of-the-art technologies such as Raspberry Pi and camera modules to provide an innovative and practical solution.
The intelligent luggage system has an automated decision-making engine that uses sensors to gather data about user movements. A Raspberry Pi processes this data and sends commands to the luggage's motors and wheels, allowing the luggage to follow the user automatically. This eliminates the need for users to lift and carry heavy luggage, making travel more comfortable and convenient.
The system's mobile control feature allows users to manually control their luggage via a mobile app, giving them greater control over their luggage. This further enhances the user experience and offers a more personalized and convenient travel experience.
The smart luggage system is lightweight, durable and easy to use, making it suitable for a variety of travel scenarios. Furthermore, the system can be adapted to different user preferences and requirements, for example by adjusting the luggage speed, changing the following distance, or setting a different target for the camera module. In addition, the intelligent luggage system uses advanced sensors and algorithms to ensure the luggage follows the user smoothly and precisely, avoiding collisions and ensuring safety. The system is also energy efficient, as it uses low-power components and long-lasting batteries.
The development of intelligent luggage systems demonstrates the technology's potential to improve various industries and provide innovative solutions to everyday challenges. The use of state-of-the-art technology such as Raspberry Pi and camera modules in the system represents a major advance in the field of embedded systems and robotics. In summary, intelligent luggage systems are an innovative and practical solution to the challenges travelers face when carrying heavy luggage. The system uses advanced sensors, algorithms and technology to ensure a seamless and comfortable travel experience, making travel more accessible and enjoyable for everyone.
II. LITERATURE REVIEW
A great deal of research has been carried out on this kind of robot, which is categorized under "human-helping robots." Researchers have used many different techniques and lines of reasoning to realize their design concepts, with the primary objective of building a robot that can follow a target or person. There has also been some research on IoT-based Wi-Fi- or Bluetooth-controlled robot cars.
P. L. Santhana Krishnan [1] aimed to reduce the stress of moving luggage, along with the associated security concerns, by creating a smart luggage carrier system with theft prevention and real-time tracking built on a Nano Arduino structure. The compact, lightweight autonomous carrier follows the user wherever they go using signals received from the wearer's smartwatch. With this technique, the bags can be moved either manually or automatically with ease.
For the detection process, researchers have used a range of sensors, including a laser sensor to find a person in front of the system [2] and a camera module to detect leg movements. To measure the distance between a human and the robot, so that the robot can follow its user at a fixed distance, an ultrasonic distance sensor is used [3].
The ultrasonic sensor emits sound waves that are reflected by nearby people and picked up again by the sensor. For that investigation, the researchers built a robot that can autonomously follow humans within a defined area (Amit K. Sharma, 2021).
Ultrasonic and infrared sensors are employed for human following [4]. An ultrasonic sensor is a device that emits ultrasonic sound waves which reflect off a target object, allowing the sensor to calculate how far away the object is. Most human-following robots are outfitted with a variety of sensors for target recognition and detection [5], including lasers, IR, RFID, cameras, range sensors, and wireless transmitters and receivers. Robots that can follow moving objects can, among other uses, serve as automated luggage carriers for suitcases [6].
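As a brief illustration of the time-of-flight principle described above, the round-trip echo time can be converted into a one-way distance. The sketch below (in Python) is illustrative only, since the cited systems read the echo time from sensor hardware; the speed-of-sound constant and example timing are assumptions.

```python
# Illustrative sketch of the ultrasonic time-of-flight principle; the
# echo time would normally come from the sensor hardware.

SPEED_OF_SOUND = 343.0  # metres per second in air at roughly 20 degrees C

def echo_time_to_distance(echo_seconds: float) -> float:
    """Convert a round-trip echo time into a one-way distance in metres."""
    # The pulse travels to the target and back, so halve the path length.
    return (echo_seconds * SPEED_OF_SOUND) / 2.0

# Example: a 5.8 ms echo corresponds to roughly 1 metre.
print(f"{echo_time_to_distance(0.0058):.2f} m")  # ~0.99 m
```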
The Wi-Fi-controlled robot using a NodeMCU (R. K. Mistri) [7] employed an ESP8266 Wi-Fi module and an Arduino UNO development board to operate the robot remotely. The Blynk app for Android phones is increasingly used to control robots manually; Blynk lets users design control interfaces that suit their needs. The robot car can also be driven by voice commands, using an app that converts spoken commands into text [8]. The robot can move left, right, forward, and backward either by voice command or manually.
Ultrasonic sensors, the Global Positioning System (GPS) [9], the Global System for Mobile Communication (GSM), a Bluetooth module [10], a mobile application, and a power bank are among the components used in this line of research, and the locking mechanism relies on fingerprints. However, there is no mechanism in place to prevent theft [11]. The bag connects to the user's phone via an onboard Bluetooth module, and a power bank is included so users can charge their own devices. Relying only on ultrasonic sensors means the bag cannot always follow its owner reliably; it might even start following another person. If the bag is lost, GSM and GPS [12] can locate it, but only while it remains within Bluetooth range.
The method in this case uses an Arduino board, a GPS module, and an alarm [13]. All other components are connected to the Arduino board. To locate the bag if it disappears, a map is created and synchronized, and the alarm informs the user when the object has left their field of vision. The warning has the benefit of making it easier for the bag's owner to locate and identify the bag [14]. The drawback is that once the bags leave the portion of the map that the server has designated, they can no longer be tracked. Interacting with an application is also challenging when using an Arduino [15], so the system ends up complex and difficult to operate.
In another investigation, radio frequency identification (RFID) is used to identify customers and their bags. RFID tags are attached to luggage and are also present on some traveller tickets, and RFID scanners monitor the customers' baggage [16]. Three levels of testing are applied in this scenario: acceptance testing, the final stage recommended to users and stakeholders, ensures the system is free of errors, while system testing verifies that the components work together harmoniously. The approach can only be put into practice at airports [17], and every location in the airline's network is affected.
Object detection is the process of locating an object in an image and determining its position. Identifying an object involves two steps: first, computer vision is used to find the object [18]; the object is then followed through the video frames as it moves, with its previous positions drawn along the way. There are several techniques for determining whether an object is in the frame, sliding windows being one example [19]. It is one of the simplest methods: the image is divided into smaller patches, and each patch is classified into one of two groups depending on whether or not it contains the object [20].
III. METHODOLOGY
The project aims to create a human following camera using Arduino and a webcam mounted on a servo motor. The camera will track the face of a person and move the servo motor in the direction of the person's face to keep them in the center of the camera's view. This project will use the inbuilt function TrackerCSRT_create() in OpenCV to track the face of the person, and the coordinates of the face will be used to control the servo motor. In this methodology, we will discuss the steps required to build the human following camera.
A. Hardware Requirements
Arduino board, USB webcam, servo motor, jumper wires, and a stand on which the servo motor and webcam are mounted.
B. Software Requirements
Arduino IDE, Python, and the OpenCV computer vision library (the contrib build that includes the legacy tracking API).
Step 1: Setting Up the Hardware
The first step is to connect the webcam and servo motor to the Arduino board using jumper wires. The servo motor will be used to move the camera, and the webcam will capture the person's face. Once the connections are made, the webcam should be mounted on the servo motor.
OpenCV is an open-source computer vision library that can be used to track faces. The library needs to be installed on the system, and the appropriate header files should be included in the program.
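Since the legacy tracker classes used later live in OpenCV's contrib build, a quick way to confirm the installation is a minimal sanity check like the one below; the package name reflects the typical Python setup and is an assumption, not a requirement stated by the authors.

```python
# Minimal sanity check that the tracking API is available; assumes the
# opencv-contrib-python package, where the legacy tracker classes live:
#   pip install opencv-contrib-python
import cv2

print(cv2.__version__)
tracker = cv2.legacy.TrackerCSRT_create()  # raises AttributeError if missing
print("CSRT tracker created:", tracker is not None)
```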
Step 2: Writing the Code
The tracking code runs on the computer and uses the OpenCV library to track the person's face, while a companion sketch written in the Arduino IDE drives the servo motor. The tracking code initializes the webcam, draws a rectangle around the person's face, and tracks the face's coordinates. These coordinates are used to control the servo motor and move the camera so that the person's face stays in the center of the camera's view.
Step 3: Testing the Human Following Camera
Once the code is written, the camera should be tested to verify that it tracks the person's face correctly. The camera should be placed in front of the person, who should then move around to confirm that the camera follows their face.
OpenCV provides different tracking algorithms for following objects in a video stream. Two common approaches are mouse-initialized tracking and the CSRT tracker. In mouse-initialized tracking, the user draws a bounding box around the object of interest with the mouse cursor, and the object is then tracked based on the movement of that bounding box. This approach is easy to implement and is useful for basic tracking applications.
On the other hand, the CSRT (Channel and Spatial Reliability Tracker) is a more advanced tracking algorithm that uses a combination of color channels and spatial reliability to track an object. The CSRT tracker is based on a discriminative correlation filter and is known for its high accuracy and robustness. It uses both the spatial and color information of the object to track it. The tracker adapts to changes in appearance, scale, and rotation of the object, making it suitable for complex tracking scenarios.
The webcam captures the video, which is sent to the computer through USB. The OpenCV tracker created with cv2.legacy.TrackerCSRT_create() processes the video and tracks the human face inside a bounding box, which the user traces over the face of the person to be followed using the cursor.
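A minimal sketch of this pipeline, assuming a default webcam at index 0 and hypothetical window names, might look like the following; it is illustrative rather than the authors' exact code.

```python
# Sketch of the tracking pipeline described above: the user drags a bounding
# box over the target face with the mouse, then CSRT follows it frame by
# frame. Camera index 0 and the window names are assumptions.
import cv2

cap = cv2.VideoCapture(0)  # default webcam
ok, frame = cap.read()
if not ok:
    raise RuntimeError("Could not read from webcam")

# cv2.selectROI opens a window where the user draws the box with the cursor.
bbox = cv2.selectROI("Select face", frame, showCrosshair=True)
cv2.destroyWindow("Select face")

tracker = cv2.legacy.TrackerCSRT_create()
tracker.init(frame, bbox)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    ok, bbox = tracker.update(frame)
    if ok:
        x, y, w, h = map(int, bbox)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("Tracking", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```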
To control the servo motor's movements, the Arduino board receives values through USB. These values are calculated from the coordinates of the tracked object and the size of the video frame. As the x-coordinate ranges from 10 to (frame width minus bounding-box width), it is proportionally mapped to the servo motor's angle of rotation: an x-coordinate of 10 corresponds to 0 degrees, while an x-coordinate of (frame width minus bounding-box width) corresponds to 180 degrees. This ensures that the servo motor rotates smoothly and accurately to follow the tracked object. The Arduino board rotates the webcam to follow the human face, with the angle of rotation determined by the Arduino code from the coordinates received over the serial port. The servo motor and webcam are mounted together on a stand, and communication between the computer, webcam, and Arduino board takes place over USB.
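The proportional mapping described above can be sketched as follows. The frame width, serial port name, and baud rate are assumptions, and the Arduino sketch is assumed to read one angle value per line and write it to the servo.

```python
# Sketch of the x-coordinate-to-angle mapping described above, with the
# angle sent to the Arduino over the serial port via pyserial. Port name,
# baud rate, and frame width are assumptions.
import serial  # pyserial

FRAME_WIDTH = 640  # assumed capture width in pixels
X_MIN = 10         # left edge of the usable range, per the text

def x_to_angle(x: float, box_width: float) -> int:
    """Map the bounding-box x-coordinate to a 0-180 degree servo angle."""
    x_max = FRAME_WIDTH - box_width
    # Clamp to the valid range, then linearly interpolate between endpoints.
    x = max(X_MIN, min(x, x_max))
    return round((x - X_MIN) / (x_max - X_MIN) * 180)

arduino = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)  # hypothetical port
angle = x_to_angle(199, box_width=100)
arduino.write(f"{angle}\n".encode())
```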
To further enhance the functionality of the camera, a dataset of people is used to identify specific individuals. This is achieved by extracting relevant features from the human face using OpenCV's HOG (Histogram of Oriented Gradients) algorithm. These features are then used as input to a Support Vector Machine (SVM) for training. Once trained, the SVM can recognize specific individuals from their facial features; since SVMs are particularly effective at two-class classification, the system can reliably distinguish between two users.
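A sketch of this HOG-plus-SVM step, using OpenCV's HOGDescriptor and scikit-learn's SVC, might look like the following. The random stand-in images, labels, and window size are placeholders for the real labelled face crops, which the paper does not specify.

```python
# Sketch of the HOG feature extraction and SVM classification described
# above. Random images stand in for the real labelled face crops of the
# two users; the 64x128 HOG window is the OpenCV default.
import cv2
import numpy as np
from sklearn.svm import SVC

hog = cv2.HOGDescriptor()  # default 64x128 detection window

def hog_features(face_img: np.ndarray) -> np.ndarray:
    """Resize a face crop to the HOG window and extract its descriptor."""
    resized = cv2.resize(face_img, (64, 128))
    return hog.compute(resized).flatten()

# Stand-in training data: random grayscale "face crops" for two users.
rng = np.random.default_rng(0)
crops = [rng.integers(0, 255, (120, 90), dtype=np.uint8) for _ in range(20)]
labels = [0] * 10 + [1] * 10  # 0 = user A, 1 = user B

X = np.array([hog_features(c) for c in crops])
y = np.array(labels)

clf = SVC(kernel="linear")  # a two-class split suits an SVM well
clf.fit(X, y)

print(clf.predict([hog_features(crops[0])]))  # typically predicts [0]
```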
IV. RESULTS AND DISCUSSION
The results of this project showed that the use of the TrackerCSRT_create() function provided a highly reliable method for tracking a human face, which was crucial for the success of the camera. In addition, the use of HOG feature extraction and SVM classification allowed the camera to identify specific individuals based on their facial features, which could have potential applications in security and surveillance systems.
The project also demonstrated the potential of using Arduino and servo motors for building custom tracking systems. The ability to convert coordinates into servo motor angles allowed for precise control of the camera's position and orientation, resulting in smooth and accurate tracking of the human face.
Position | (x, y) coordinates | Center coordinates | Servo motor angle
Center of frame | (199, 167) | (209.5, 178) | 96
Leftmost of frame | (50, 172) | (148, 183) | 68
Rightmost of frame | (240, 161) | (334, 173) | 154
V. CONCLUSION
In this project, we have successfully designed and implemented a smart luggage system that uses advanced technologies such as Raspberry Pi and camera modules to automatically follow the user. This system aims to provide a more convenient and comfortable travel experience for passengers by reducing the hassle of carrying heavy luggage. The automatic decision-making module, mobile-controlled mode, and object tracking using camera modules are the key features of this system, and their integration has the potential to make the travel industry more efficient and user-friendly.
This smart luggage system can be further improved by incorporating additional features such as obstacle avoidance, real-time tracking of the luggage, and integration with GPS services. With these features, the system can provide a more personalized travel experience for passengers. Furthermore, the system can be integrated with other smart systems in airports and other travel-related areas to create a more comprehensive and interconnected smart travel ecosystem. The same technology can also be applied in other industries, such as logistics and transportation, to automate the movement of goods and make the process more efficient. Overall, the smart luggage system has immense potential for future development and innovation. The accuracy of the tracking can be increased further if multiple camera angles of the same scene are available and a stationary background is maintained, since the target's x-coordinate can then be extracted more reliably, yielding more accurate servo angle values.
REFERENCES
[1] P. L. S. Krishnan, R. Valli, R. Priya and V. Pravinkumar, "Smart Luggage Carrier System with Theft Prevention and Real Time Tracking Using Nano Arduino Structure," 2020 International Conference on System, Computation, Automation and Networking (ICSCAN), 2020, pp. 1-5, doi: 10.1109/ICSCAN49426.2020.9262445.
[2] D. F. Glas, T. Kanda, H. Ishiguro and N. Hagita, "Simultaneous People Tracking and Localization for Social Robots Using External Laser Range Finders," 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2009, pp. 846-853, doi: 10.1109/IROS.2009.5354198.
[3] A. K. Sharma, A. Pandey, M. A. Khan, A. Tripathi, A. Saxena and P. K. Yadav, "Human Following Robot," 2021 International Conference on Advance Computing and Innovative Technologies in Engineering (ICACITE), 2021, pp. 440-446, doi: 10.1109/ICACITE51222.2021.9404758.
[4] Sowmya B J and Supriya M, "Robot Controlled Car Using Voice and Wi-Fi Module," International Research Journal of Engineering and Technology (IRJET), vol. 08, issue 08, Aug. 2021.
[5] P. L. Santhana Krishnan, R. Valli, R. Priya and V. Pravinkumar, "Smart Luggage Carrier System with Theft Prevention and Real Time Tracking Using Nano Arduino Structure," IEEE conference publication.
[6] Afrin Khan, Bandini Nalwade, Neha Kharshinge and Sonali Kamble, "Smart Luggage System," International Research Journal of Engineering and Technology (IRJET), 2019.
[7] Sebin J. Olickal, Amal Yohannan, Manu Ajayan and Anjana Alias, "Smart Bag (It Can Follow You)," International Research Journal of Engineering and Technology, April 2017.
[8] Shrinidhi Gindi, Irshad Ansari, Kamal Khan and Farooqui Bilal, "Smart Bag Using Solar and RFID Technology," Imperial Journal of Interdisciplinary Research (IJIR), issue 5, 2016.
[9] Sudha Senthilkumar, Brindha K., Rathi R., Charanya R. and Mayank Jain, VIT, Vellore, Tamil Nadu, India, "Luggage Tracking System Using IoT."
[10] Deepti Mishra and Alok Mishra, "Improved Baggage Tracking, Security and Customer Service with RFID in Airline Industry," Acta Polytechnica Hungarica, Feb. 2010.
[11] Akansha Bathija and Grishma Sharma, "Visual Object Detection and Tracking Using YOLO and SORT," International Journal of Engineering Research and Technology (IJERT), November 2019.
[12] Weiyao Lin et al., "A Heat-Map-Based Algorithm for Recognizing Group Activities in Videos," IEEE Transactions on Circuits and Systems for Video Technology, vol. 23, no. 11, November 2013.
[13] Nilesh J. Uke, "Efficient Method for Detecting and Tracking Moving Objects in Video," 2016 IEEE International Conference on Advances in Electronics, Communication and Computer Technology (ICAECCT), 2016.
[14] Chandan G, Ayush Jain, Harsh Jain and Mohana, "Real Time Object Detection and Tracking Using Deep Learning and OpenCV," International Conference on Inventive Research in Computing Applications, 2018.
[15] Jinsu Lee, Junseong Bang and Seong-Il Yang, "Object Detection with Sliding Window in Images Including Multiple Similar Objects," 2017 International Conference on Information and Communication Technology Convergence (ICTC), December 2017.
[16] M. Sahasri and C. Gireesh, "Object Motion Detection and Tracking for Video Surveillance," International Journal of Engineering Trends and Technology (IJETT), 2017.
[17] Ibrahim Masri and Erdal Erdal, "Review Paper on Real Time Image Processing: Methods, Techniques, Applications," ResearchGate, June 2019.
[18] Alexey Bochkovskiy, Chien-Yao Wang and Hong-Yuan Mark Liao, "YOLOv4: Optimal Speed and Accuracy of Object Detection," arXiv:2004.10934v1 [cs.CV], 23 Apr 2020.
[19] Amit Chaturvedi and Vikas Kumar, "An Analysis of Region Growing Image Segmentation Schemes," International Journal of Emerging Trends & Technology in Computer Science, April 2016.
[20] Om Prakash Verma et al., "A Simple Single Seeded Region Growing Algorithm for Color Image Segmentation Using Adaptive Thresholding," 2011 International Conference on Communication Systems and Network Technologies.
Copyright © 2023 Ashutosh Kabra, Abhishek Jagtap, Jayaprakash Dharmavaram, Shital Raut. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Paper Id : IJRASET52690
Publish Date : 2023-05-21
ISSN : 2321-9653
Publisher Name : IJRASET