Ijraset Journal For Research in Applied Science and Engineering Technology
Authors: Amal Alex, Rajesh Kannan Megalingam
DOI Link: https://doi.org/10.22214/ijraset.2022.45392
Pipelines are mainly constructed to transport all kinds of fluids and gases. Many accidents have occurred from fluid leaks caused by cracks and corrosion in pipelines. To eliminate or minimize such accidents, pipelines should be inspected periodically. Analysis and control of an autonomous robot for pipeline inspection require the design and modelling of a robot equipped with the proper sensors. This project deals with the design, modelling (software) and simulation of a pipeline inspection robot. The robot is an autonomous mobile robot: it is placed at the entrance of the pipe and, on receiving suitable commands, moves forward and inspects the inside of the pipe for defects. The robot includes a camera for visual inspection to identify cracks and corrosion in the pipe, and it captures images of the pipe interior for further investigation. It also carries a LIDAR sensor for mapping the pipe. Mapping and navigation are done in ROS (Robot Operating System) with the help of Gazebo and Rviz. The design of the robot is created in Fusion 360 and then transferred into Gazebo. The simulation of the robot is carried out in circular pipes.
I. INTRODUCTION
Pipelines are used to transport all kinds of fluids: some toxic, some highly flammable, and others largely unreactive. In every scenario it is important that, under ideal conditions, the transported fluid remains contained within the pipeline. However, depending on the material from which it is made, every pipe deteriorates progressively with time and becomes prone to cracks and corrosion. Many accidents have occurred from fluid leaks due to cracks and corrosion in pipelines. An autonomous pipe inspection method is introduced to improve efficiency and reduce the manpower required for the inspection process.
One effective way of addressing this is to perform regular or periodical inspection of pipelines. The inspection results are analyzed to identify the areas or points of possible fluid leaks and to plan maintenance operations for the pipeline. In this way the pipe condition is monitored and accidents are reduced to a minimum.
Pipe inspection is an essential method for fitness-for-service (FFS) assessments of surfaces, piping of nuclear plants, oil and gas terminals, refineries, industrial sites, cased pipeline crossings, etc.
This pipeline inspection robot has the ability to move from one point to another within the pipeline, and its camera detects defects inside the pipe. With its active revolute mechanism the robot can move in pipes and ducts with round and square cross sections. The LIDAR sensor gives a depth map of the inner surroundings of the pipe.
The simulation is done in Gazebo and Rviz. In Rviz, the camera module, LIDAR module and navigation module are added, and the environment and robot model are transferred to Gazebo. The robot is controlled through a command known as teleop_twist_keyboard. The platform used is Ubuntu, with commands issued through ROS (Robot Operating System).
II. PROBLEM STATEMENT
To maintain a pipe, it needs to be inspected regularly or periodically, both from the inside and from the outside.
Manual inspection is a challenging task for humans. Certain areas are inaccessible, and the pipe diameter may be so small that no person can reach inside. Sometimes the oxygen level inside the pipe is low, making human entry difficult, and a bystander must remain near the pipe for assistance. To overcome these problems, this robot can seamlessly enter the pipe and inspect it for any defects. The robot can easily move in circular pipes.
In a real-life scenario the robot can reduce human intervention and achieve more accurate results. It can also move autonomously in difficult areas where human presence is impossible. The robot uses a camera sensor to visualize defects and captures images for further investigation, while the LIDAR sensor maps the entire pipe for inspection.
III. RELATED WORKS
The authors of [1] modelled and simulated the design of a pipeline inspection robot that can climb vertical and horizontal pipes. The stress simulation and analysis were done in ADAMS software, including displacement, velocity, spring force and torque curves. The design shown in the paper has stable motion and can climb vertical pipes without much difficulty. The authors of [2] introduced a pipeline inspection robot composed of active and passive compliant joints. Their robot, AIRo-5.1, consists of two passive compliant joints and a single active compliant joint driven by a series elastic actuator (SEA). To sense the joint torques, an improved durable polyurethane rubber spring is installed, and to smoothly pass through T-branches, the angle trajectory of the middle joints is calculated based on the pipe geometry and interpolated using a cosine curve. In [6] the authors proposed an intelligent and efficient path-guiding robot to assist visually impaired people in their movements, a novel device intended to replace guide dogs. The robot can move along multiple paths and then remember and retrace all of them, making it a suitable substitute for a guide dog, which is often not affordable for the ninety percent of blind people living in low-income settings. The research work in [9] examines the flexibility of a SLAM-based mobile robot for mapping and navigating an indoor environment. It is based purely on the Robot Operating System (ROS); the robot model is created in Gazebo and simulated in Rviz, and the mapping is done using the open-source GMapping algorithm.
The authors of [4] reviewed different types of pipeline robots based on characteristics such as inspection accuracy, size and adaptability to shape, flexibility, vertical mobility, scalability, cost, speed, design complexity, impact on the pipe and motion efficiency, and compared the analyses of different pipeline inspection robots. The main aim of [5] was to study the differences between the planned path and the travelled path of a virtual differential drive robot in the Gazebo-ROS simulator while using the ROS navigation stack. An environment was created in Gazebo for the simulation, and for the path comparison experiment the differential drive robot was autonomously navigated to four fixed destinations in the environment using a Python command. The authors of [13] presented the modelling and simulation of a mobile robot for pipeline inspection using MATLAB and V-REP software. The mechanical structure of the robot was described with a focus on pedipulators, which change the pose of the track drive modules to adapt to different pipe sizes and shapes. Modelling of the pedipulators was carried out in the MATLAB environment, the models were verified using V-REP and MATLAB co-simulations, and finally the operation of a prototype was demonstrated on a test rig. The robot utilizes joint-space trajectories and mathematical models for the pedipulators. In [8] a gamer's steering wheel is used to control the movement and speed of the robot in various schemes. Integrating the robot with Gazebo requires a well-equipped robot model that provides all the transforms from the ROS base_link to all other ROS child links.
IV. DESIGN AND IMPLEMENTATION
A. Architecture
A robotic system is a collection of sensors and actuators that interact and communicate with the environment to accomplish a variety of tasks. The main objective of a robotic system is to accomplish a specific set of tasks, but there are often many difficult sub-tasks that must be handled to ensure that the robot operates in a safe and efficient way. A well-built architecture, together with programming tools, helps to manage this complexity. Currently there is no single architecture that is best for all applications; each architecture has its own advantages and disadvantages. Fig. 1 shows the block diagram of the robot. The robot consists of a camera to record and identify defects inside the pipe and a LIDAR sensor for depth mapping of the pipe. Mapping, path planning and localization are done in Gazebo and Rviz using ROS (Robot Operating System) on Ubuntu. Different Gazebo plugins and packages are used for the simulation; the Skid Steering Drive plugin provides the four-wheel drive of the robot inside the pipe.
B. Design of the Robot
Robotic design is the creation of a plan for the construction or modelling of a robotic system, and it is an important part of building a robot. The design and modelling of this robot are done in Fusion 360, a powerful software package in which simulation, modelling, manufacturing and design can all be carried out. The robot has a simple design consisting of a camera, a LIDAR and four wheels mounted on the body, as shown in Fig. 2. The four wheels are modelled as revolute joints, while the camera and LIDAR are attached with rigid joints. The body material is ABS plastic and the wheels are nitrile rubber. The camera is used to identify defects and to take photographs for future reference, and the LIDAR sensor is used for depth mapping of the pipe. At first the parts were created as separate components and later assembled into a single component. The robot model should be placed above the axis of the ground plane, or it will not behave properly in simulation. The wheel diameter is 80 mm and the length of the body is 170 mm.
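These joint assignments carry over to the URDF that Gazebo consumes. Below is a minimal URDF sketch of one wheel joint and the camera mount, assuming illustrative link names (base_link, wheel_front_left, camera_link) and offsets that are not taken from the exported model; freely rotating wheels are usually declared as "continuous" joints, and visual, collision and inertial elements are omitted for brevity.

<robot name="pipe_inspection_robot">
  <!-- Chassis, one wheel and the camera mount (visual/collision/inertial omitted) -->
  <link name="base_link"/>
  <link name="wheel_front_left"/>
  <link name="camera_link"/>

  <!-- Each of the four wheels rotates freely about its axle -->
  <joint name="wheel_front_left_joint" type="continuous">
    <parent link="base_link"/>
    <child link="wheel_front_left"/>
    <origin xyz="0.055 0.05 0" rpy="0 0 0"/>
    <axis xyz="0 1 0"/>
  </joint>

  <!-- The camera (and likewise the LIDAR) is rigidly attached with a fixed joint -->
  <joint name="camera_joint" type="fixed">
    <parent link="base_link"/>
    <child link="camera_link"/>
    <origin xyz="0.085 0 0.03" rpy="0 0 0"/>
  </joint>
</robot>

The remaining three wheels follow the same pattern with mirrored origins.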
C. Programming of Robot
Robot programming is the development of a control scheme that governs how the robot interacts with its environment and achieves its tasks. A variety of programming languages can be used to program robots, such as C/C++, Python and Java. Robotics combines electronics, mechanics and software to do specific jobs. Programming is an essential part of a robot; without code the robot cannot perform a particular task. Robots are not self-learning machines like humans, so instructions must be given to them in the form of programs. In this robot, different packages and plugins are used, such as the skid steering drive, the camera sensor plugin, the LIDAR sensor plugin, and the navigation and mapping packages. All of this programming is done through ROS commands on the Ubuntu platform. With these programs loaded, the robot can autonomously perform the tasks given by the user. Fig. 3 shows the program for the camera sensor plugin, through which the camera identifies defects in the pipe. Fig. 4 shows the program for the skid steering drive, which moves the robot inside the pipe. Fig. 5 shows the program for the LIDAR sensor, which provides depth mapping of the pipe.
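Since Figs. 3-5 reproduce the listings as images, the sketch below shows representative Gazebo (ROS 1) plugin blocks of the same three kinds that would sit inside the robot's URDF; the joint names, topic names and numeric values are assumptions for illustration, not the exact parameters used in this work.

<!-- Skid steering drive: subscribes to /cmd_vel and drives the four wheel joints -->
<gazebo>
  <plugin name="skid_steer_drive" filename="libgazebo_ros_skid_steer_drive.so">
    <leftFrontJoint>wheel_front_left_joint</leftFrontJoint>
    <rightFrontJoint>wheel_front_right_joint</rightFrontJoint>
    <leftRearJoint>wheel_rear_left_joint</leftRearJoint>
    <rightRearJoint>wheel_rear_right_joint</rightRearJoint>
    <wheelSeparation>0.10</wheelSeparation>
    <wheelDiameter>0.08</wheelDiameter>
    <commandTopic>cmd_vel</commandTopic>
    <odometryTopic>odom</odometryTopic>
    <robotBaseFrame>base_link</robotBaseFrame>
  </plugin>
</gazebo>

<!-- Camera sensor: publishes the images used for visual defect inspection -->
<gazebo reference="camera_link">
  <sensor type="camera" name="camera">
    <update_rate>30.0</update_rate>
    <camera>
      <horizontal_fov>1.39</horizontal_fov>
      <image><width>640</width><height>480</height></image>
    </camera>
    <plugin name="camera_controller" filename="libgazebo_ros_camera.so">
      <cameraName>camera</cameraName>
      <imageTopicName>image_raw</imageTopicName>
      <cameraInfoTopicName>camera_info</cameraInfoTopicName>
      <frameName>camera_link</frameName>
    </plugin>
  </sensor>
</gazebo>

<!-- 2D LIDAR: publishes /scan for mapping (ray scan/range parameters omitted) -->
<gazebo reference="lidar_link">
  <sensor type="ray" name="lidar">
    <update_rate>10</update_rate>
    <plugin name="lidar_controller" filename="libgazebo_ros_laser.so">
      <topicName>scan</topicName>
      <frameName>lidar_link</frameName>
    </plugin>
  </sensor>
</gazebo>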
D. Simulation of Robot
The simulation environment is set up in Gazebo and Rviz through ROS commands on the Ubuntu platform. The pipe model is first created in Fusion 360 and then transferred into Gazebo in URDF format. The pipe diameter is 100 mm, as shown in Fig. 6. By adjusting the axes of the plane in Gazebo, the robot is placed inside the pipe as shown in Fig. 7; the blue rays show the LIDAR sensor scanning the whole pipe for mapping.
Rviz is used for mapping and navigation of the robot, as shown in Fig. 8. Rviz contains the camera module of the robot, which shows the inside of the pipe. All the programming is done in ROS. Rviz should be launched at the same time as Gazebo; only then is the connection between the two established. The environment in Gazebo is also added in Rviz. The robot is controlled through the ROS command rosrun teleop_twist_keyboard teleop_twist_keyboard.py, as shown in Fig. 9. The camera unit of the robot is shown in Fig. 10.
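The requirement that Gazebo and Rviz start together can be captured in a single launch file. The sketch below is an assumed roslaunch layout (the package name pipe_inspection_robot, the world file and the Rviz configuration are placeholders not named in this paper): it starts Gazebo with the pipe world, spawns the robot from its URDF and opens Rviz, after which teleop_twist_keyboard can be run in a separate terminal.

<launch>
  <!-- Gazebo with the pipe environment exported from Fusion 360 -->
  <include file="$(find gazebo_ros)/launch/empty_world.launch">
    <arg name="world_name" value="$(find pipe_inspection_robot)/worlds/pipe.world"/>
  </include>

  <!-- Load the robot description and spawn the model inside the pipe -->
  <param name="robot_description"
         textfile="$(find pipe_inspection_robot)/urdf/robot.urdf"/>
  <node name="spawn_robot" pkg="gazebo_ros" type="spawn_model"
        args="-urdf -param robot_description -model pipe_robot -x 0 -y 0 -z 0.05"/>

  <!-- Publish TF from the URDF so Rviz can display the sensor frames correctly -->
  <node name="robot_state_publisher" pkg="robot_state_publisher"
        type="robot_state_publisher"/>

  <!-- Rviz with the camera, LIDAR and navigation displays -->
  <node name="rviz" pkg="rviz" type="rviz"
        args="-d $(find pipe_inspection_robot)/rviz/inspection.rviz"/>
</launch>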
V. EXPERIMENT AND RESULTS
For the experiment, the robot is brought into Gazebo and Rviz, as shown in Fig. 11. The robot travels inside the pipe using the teleop_twist_keyboard command in ROS, and during the run the camera detects defects inside the pipe. The robot completed the pipe inspection by identifying defects through the camera, it was able to move autonomously inside the pipe, and the LIDAR sensor produced a depth map of the pipe. However, the robot was not able to make turns inside the pipe and there were flaws in the navigation; several attempts failed before a run could be completed. Despite this, the robot achieved almost 80% accuracy in identifying defects, and apart from these issues its movement inside the pipe was smooth. The robot can also move in square as well as circular pipes.
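As a hedged sketch of the mapping step, the scans published by the simulated LIDAR can be fed to the GMapping package to build the 2D map displayed in Rviz; the frame names below are common ROS defaults and are assumptions, not values taken from this work.

<launch>
  <!-- GMapping subscribes to the /scan topic published by the LIDAR plugin -->
  <node name="slam_gmapping" pkg="gmapping" type="slam_gmapping">
    <param name="base_frame" value="base_link"/>
    <param name="odom_frame" value="odom"/>
    <param name="map_frame"  value="map"/>
  </node>
</launch>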
VI. FUTURE WORKS
This robot currently consists of a camera, a LIDAR sensor and a four-wheel drive mechanism. Future work can include ultrasonic sensors for better detection of defects inside the pipe, and tilted and guide wheels for movement through curves and bends in pipes. Long-range sensors can be added to capture the 3D structure of the environment. The robot can also be adapted as a borewell rescue robot, and a graphical user interface (GUI) can be created for a more user-friendly approach.
VII. ACKNOWLEDGMENT
To our most beloved Amma, whose impeccable love and grace were the invisible key behind my success. I pray to be worthy of her grace, already bestowed. I am forever indebted to my mother, because without her I would not even be doing this course. I express my heartfelt gratitude to my project guide Dr. Rajesh Kannan Megalingam, Director, HuT Labs, Amrita School of Engineering, Amritapuri Campus, for his expert guidance, constant encouragement and creative suggestions during this project. His enthusiasm, knowledge and attention have been an inspiration and kept my work progressing from my first encounter to the final draft of this paper. His generosity and expertise have improved this study in innumerable ways and saved me from many errors; those that inevitably remain are entirely my own responsibility.
Last but not least, I thank the Supreme, the Almighty, for literally everything.
VIII. CONCLUSION
Pipeline inspection robots can be used effectively as a tool to carry out inspection in hazardous and unreachable work environments. The design goal of a pipeline inspection robot that adapts to different pipe diameters is accomplished. The design and modelling of the robot are done in Fusion 360, and the model is transferred from Fusion 360 to Gazebo in URDF format. The simulation is done in Gazebo and Rviz using ROS on the Ubuntu platform. The robot can travel autonomously with the help of the navigation and GMapping packages, and it can detect defects such as rust, leakage, and wear and tear through the camera. The LIDAR sensor on the robot is used for depth mapping of the pipe. The robot can move in square and circular pipes and is developed for small-scale applications. Different packages and plugins are used for the robot to navigate autonomously, and the robot can be fitted with different sensors for more precise and accurate operations. The robot is limited in several ways and can be extended to broaden its features and applications. With the help of this robot the labour force can be minimized and accurate results can be obtained, and the robot can work in difficult environments where human presence is impossible.
REFERENCES
[1] Carl Kenneth F. Flores, John Ira C. Nagar, Zachary Raphael B. Origenes et al., "Design, Modelling, and Simulation of a Wheeled, Wall-Pressed, In-Pipe Inspection Robot for Pipes with 6-8 inches Inside Diameter," 2021 IEEE International Conference on Automatic Control & Intelligent Systems (I2CACIS), 26 June 2021, vol. 6.
[2] Atsushi Kakogawa and Shugen Ma, "A Multi-link In-pipe Inspection Robot Composed of Active and Passive Compliant Joints," IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 24 Jan. 2021, vol. 8.
[3] Karthik CH and Pramod Sreedharan, "Design and Development of Pipe-inspection Robot with Vision 360 Degree," Journal of Physics: Conference Series (ICCIEA), vol. 2062, 2021, 012015.
[4] Prasanthi Ambati, K. M. Suman Raj and Joshuva, "A Review on Pipeline Inspection Robot," 3rd International Conference on Frontiers in Automobile and Mechanical Engineering (FAME 2020), AIP Conference Proceedings 2311, 07 December 2020, vol. 5.
[5] Rajesh Kannan Megalingam, Anandu Rajendraprasad and Sakthiprasad Kuttankulangara Manoharan, "Comparison of Planned Path and Travelled Path Using ROS Navigation Stack," 2020 International Conference for Emerging Technology (INCET), Belgaum, India, Jun. 5-7, 2020.
[6] Rajesh Kannan Megalingam, Souraj Vishnu, Vishnu Sasikumar and Sajikumar Sreekumar, "Autonomous Path Guiding Robot for Visually Impaired People," Cognitive Informatics and Soft Computing, Advances in Intelligent Systems and Computing, vol. 768, Springer, 2019.
[7] M. Roussialian, H. Al Zanbarakji, A. Khawand, A. Rahal and M. Owayjan, "Design and Development of a Pipeline Inspection Robot," Mechanisms and Machine Science, pp. 43-52, doi:10.1007/978-3-319-89911-44, 2018.
[8] Rajesh Kannan Megalingam, Deepak Nagalla, Ravi Kiran Pasumarthi, Vamsi Gontu and Phanindra Kumar Allada, "ROS Based, Simulation and Control of a Wheeled Robot using Gamer's Steering Wheel," 4th International Conference on Computing Communication and Automation (ICCCA), IEEE, 2018.
[9] Rajesh Kannan Megalingam, Chinta Ravi Teja, Sarath Sreekanth and Akhil Raj, "ROS based Autonomous Indoor Navigation Simulation Using SLAM Algorithm," International Journal of Pure and Applied Mathematics, vol. 118, 2018.
[10] M. N. Mohammed, V. Shini Nadarajah, N. F. Mohd Lazim, N. Shazwany Zamani, O. I. Al-Sanjary, M. A. M. Ali and S. Al-Youif, "Design and Development of Pipeline Inspection Robot for Crack and Corrosion Detection," IEEE Conference on Systems, Process and Control (ICSPC), doi:10.1109/spc.2018.8704127, 2018.
[11] Rajesh Kannan Megalingam, Sricharan Boddupalli and K. G. S. Apuroop, "Robotic Arm Control through Mimicking of Miniature Robotic Arm," International Conference on Advanced Computing and Communication Systems (ICACCS), Jan. 06-07, 2017.
[12] Rajesh Kannan Megalingam, Deepak Nagalla, Pasumarthi Ravi Kiran, Ravi Teja Geesala and Katta Nigam, "Swarm based Autonomous Landmine Detecting Robots," (ICICI), IEEE Xplore Compliant, Part Number: CFP17L34-ART, ISBN: 978-1-5386-4031-9, 2017.
[13] Michał Ciszewski, Lukasz Mitka, Tomasz Buratowski and Mariusz Giergiel, "Modelling and Simulation of a Tracked Mobile Inspection Robot in MATLAB and V-REP Software," Journal of Automation, Mobile Robotics & Intelligent Systems, vol. 11, 18 February 2017.
[14] H. Gopinath, V. Indu and M. M. Dharmana, "Development of Autonomous Underwater Inspection Robot under Disturbances," International Conference on Technological Advancements in Power and Energy (TAP Energy), Kollam, India, 2017.
[15] M. Hoshina, T. Mashimo and S. Toyama, "Development of Spherical Ultrasonic Motor as a Camera Actuator for Pipe Inspection Robot," IEEE/RSJ International Conference on Intelligent Robots and Systems, doi:10.1109/iros.2009.5354315, 2009.
Copyright © 2022 Amal Alex, Rajesh Kannan Megalingam. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Paper Id : IJRASET45392
Publish Date : 2022-07-06
ISSN : 2321-9653
Publisher Name : IJRASET