IJRASET Journal for Research in Applied Science and Engineering Technology
Authors: Chandana J, Priya Kumari, Prof. Rekha B S
DOI Link: https://doi.org/10.22214/ijraset.2023.53660
Human anatomy, the study of the internal organs of the body from cells to organ systems, is a central topic of human biology. Studying the internal structure from 2D models is difficult and tiresome. The purpose of this study is to develop a 3D augmented reality system for learning human anatomy. When learning anatomy through textbooks, puppets, or 2D models, it can be difficult to understand the internal structure. The solution developed here is an Android application that uses augmented reality: an interactive menu in the application allows the user to scan an image and see a 3D model in real space. This study uses augmented reality technology to create a human body learning system. It is hoped that this approach will let students visualize the parts of the human body in 3D with ease and learn more quickly.
I. INTRODUCTION
Technology is used everywhere owing to its rapid advancement. Augmented reality (AR) incorporates virtual objects into the actual world, combining real and virtual objects in real environments and running interactively in real time [1]. The goal of this research is to create an AR application that aids the learning of human body anatomy. To help students comprehend [2] and study the human body's anatomy with an interactive 3D model, the difficulty of visualizing and imagining complex organs of the human body must be addressed.
The problem statement identifies the issue to be addressed and the remedy to be applied: a 2D platform for human body anatomy presents a visualization problem [3], which is solved by creating a 3D model of the human anatomy and visualizing it through an AR application.
Visualization is the main component of learning human anatomy, and the anatomical structure is difficult for students to visualize [4]. Using an augmented reality (AR) based mobile application, the difficulty of visualizing and imagining complex human anatomy components is addressed, helping students comprehend and study the human body through an interactive 3D model.
The proposed solution to the visualization issue of a 2D platform for human body anatomy is therefore a 3D human anatomy model with an interactive user interface, visualized through an Android application [6].
II. LITERATURE REVIEW
The literature survey covers the work done by other authors and researchers in developing and using AR to visualize 3D models of human anatomy.
Rita Layona and others [1] created an embeddable web application. The 3D objects were built with Google SketchUp and 3ds Max 2011, and the application was developed with ActionScript 3.0 and a Windows Presentation Foundation (WPF) template. The web application uses the Kinect for Xbox 360 and an AR marker as physical instruments. The study shows how this makes the anatomy of the human body easier to comprehend.
Rahul Yogesh Karekar and others [2] used dot image mapping technology, where the image with the most dots helps superimpose the 3D object. A 3D model overlay places the 3D object correctly onto the image to be superimposed, and the image sensor is triggered after the 3D objects are layered on it. A Particle Photon, an Internet of Things device, is connected to send data to the user, and a pulse heart-rate monitor measures and displays the heart rate. The major goal of this study is Cardiac, a real-time augmented heart-rate monitor application.
The goal of the work by Saad Badie Younis and others [3] is to develop an augmented reality application that enables students to visualize the complex architecture of the human body and comprehend total hip replacement (THR) surgery. The AR application consists of three parts: the quiz, the surgery, and the anatomy sections. The total hip replacement procedure uses marker-based AR, and every object used in the app was built with Blender and 3ds Max.
In [4], Sanket Patil and others offered a 3D picture of medical organs from every angle, combined with precise labelling and information in the virtual visualization. Zooming is activated simply by moving the device forward and backward. The goal is to examine how augmented reality (AR) can be used to make learning more effective. The 3D model provides better virtual visualization: it can be viewed from all angles, zoomed in and out to examine minute details, and carries labels and other specific information.
Michael H. Kurniawan and others [5] used marker-based AR technology, with the pictures kept in a database. In this study, the Floating Euphoria Framework and an SQLite database were combined into one piece of software. The Android application was created with Java and the Android Studio platform, and the AndAR AR engine is employed along with the mobile-optimized Unity and Vuforia AR frameworks. The resulting application offers interactive 3D anatomical models, making the models concise and straightforward to understand.
Joshia Felim Efraim and others [6] first used Google Forms to gather the data needed for their research, and then built an AR app using Unity 2018 with the Vuforia extension. When the camera is pointed at the marker, which takes the form of an image, the augmented reality view can be used to study the human digestive system. The goal of this study is to further the application of mobile technologies, specifically augmented reality and visualization, to biology teaching and the study of the human digestive system, and to provide an interactive learning tool for visualizing it.
In [7], Dhias Fajar Widya Permana, Tegar Bhakti Sandhiyudha, and others developed ARMY (Augmented Reality Human Anatomy), an application installed on Android.
A wall or plain background is prepared, the application is set to the human anatomy, musculoskeletal, or skeletal mode, and the phone's camera is used to display and rotate the ARMY 3D animation.
The goal of this project was to make it easier for students, teachers, and scientists to learn about the anatomical structure of the human body using augmented reality technology; ARMY is one way individuals can use their smartphones to study anatomy.
In [8], the authors I. Sonjaya and R. Fadlurahman created a three-dimensional model of the human digestive system in Autodesk Maya 2018 using the Luther method, and textured the model with the UV Editor. The main goals were to create applications that can run on Android-based platforms and to create interactive learning media applications. The end product contains a visual object feature that includes a 3D model of the human digestive system.
In [9], Sumitra Nuanmeesri and others carried out a demographic survey and sample selection, and then designed and implemented the system's AR model (Noonchu, 2016).
Being able to scan pictures with a smartphone or tablet improves the process of remembering the shape of the heart. An augmented reality (AR) study of the anatomy of the human heart was planned and created as part of research on developing AR as a teaching tool.
III. METHODOLOGY
The methodology used to construct the AR application is as follows:
1. The 3D model was created from a reference image. The model was built in Blender using a variety of parts, textures, cubes, cylinders, spheres, and angled pipes. The model surface is made smooth and flexible using various transformation, triangulation, and subdivision techniques.
2. The 3D model is visualised and displayed using marker-based augmented reality technology.
3. The application was created using Unity 3D, the Vuforia framework, and C# scripts. The camera takes a picture of the marker and then assesses its size and location.
4. The next step is to track (identify) the marker by taking a snapshot of it.
5. The identification tag allows the marker to be identified and checked against the stored target database.
6. In the data appearance or visualisation display stage, the application shows the current data as 3D models together with the coordinates of the part models.
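As a rough illustration of steps 4 to 6, the following C# sketch shows how a script attached to the image-target object could show or hide the 3D model when the marker is found or lost. It assumes that Vuforia's event handler on the target exposes "target found" and "target lost" events that can be wired to these methods in the Unity Inspector; the field and method names are illustrative and do not come from the paper.

```csharp
using UnityEngine;

// Attached to the image-target GameObject; its public methods are wired to
// the AR framework's "on target found" / "on target lost" events in the Inspector.
public class AnatomyMarkerHandler : MonoBehaviour
{
    [SerializeField] private GameObject anatomyModel;   // e.g. the digestive-system prefab (illustrative name)

    public void OnMarkerFound()
    {
        // Visualisation stage: show the 3D model anchored to the marker.
        anatomyModel.SetActive(true);
    }

    public void OnMarkerLost()
    {
        // Hide the model when the camera loses sight of the marker.
        anatomyModel.SetActive(false);
    }
}
```

Enabling and disabling the already-placed model object, rather than instantiating it each time, keeps the marker-to-model anchoring stable between detections.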
The architectural diagram in Fig. 1 defines the flow of the steps described in the methodology.
A 3D model is created in the initial step using Blender.
The relevant Vuforia packages are downloaded after the file is imported into Unity.
By developing several AR scenes in Unity, various 3D anatomical structures can be obtained in various settings.
The scenes are adapted to the marker-based AR technology by replacing the prefabs in the AR scenes and adding C# scripts.
To reflect the architecture of the AR application and to represent its working flow, the sequence diagram in Fig. 2 is incorporated in the high-level design.
The user (student) signs in to the AR application for the first time; the login must be authenticated against the database.
When the user turns on the camera, the downloaded marker image can be scanned and shown in 3D.
After scanning, the marker image is brought into focus. Following evaluation of the marker against the database, the display receives the relevant 3D model [11], and the camera then shows the 3D perspective of the model.
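The "evaluation of the marker against the database" step is not detailed in the paper; the following hypothetical C# sketch illustrates one simple way it could work, with the recognised target name looked up in a small dictionary that maps markers to model prefabs. All names here are placeholders.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch of the "evaluate marker against stored models" step:
// the recognised target name is looked up and the matching prefab is returned.
public class ModelDatabase : MonoBehaviour
{
    [SerializeField] private GameObject digestivePrefab;    // illustrative prefab references
    [SerializeField] private GameObject circulatoryPrefab;

    private Dictionary<string, GameObject> models;

    private void Awake()
    {
        models = new Dictionary<string, GameObject>
        {
            { "DigestiveMarker", digestivePrefab },      // assumed image-target names
            { "CirculatoryMarker", circulatoryPrefab }
        };
    }

    // Called with the name of the image target that the AR camera recognised.
    public GameObject GetModelFor(string markerName)
    {
        return models.TryGetValue(markerName, out var prefab) ? prefab : null;
    }
}
```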
IV. IMPLEMENTATION
The implementation covers the creation of the 3D model and of the AR application. The first stage is to create a 3D anatomical model of the human body covering the circulatory (cardiovascular) and digestive systems.
The application is built on an augmented reality platform that places 3D visual objects in a real-world setting. It visualizes the organs of the digestive and circulatory systems in three dimensions and provides information on how each organ works [12].
Using various transformation, triangulation, and subdivision operations, the model surface is made smooth and flexible, and each organ is colored and named accordingly. Unity Engine 2020.3.43f1 (64-bit) was installed, and the created file was imported into the Unity Engine as an FBX file.
The animations and the required C# scripts were then written, producing a 3D model with working animation and voiceover.
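As a minimal sketch of how the animation and voiceover might be started together once the model appears, the following script assumes an Animator trigger named "Play" and an AudioSource holding the narration clip; both are assumptions for illustration rather than details taken from the implementation.

```csharp
using UnityEngine;

// Starts the organ animation and the narration as soon as the model object is enabled.
[RequireComponent(typeof(Animator), typeof(AudioSource))]
public class ModelNarration : MonoBehaviour
{
    private Animator animator;
    private AudioSource narration;

    private void Awake()
    {
        animator = GetComponent<Animator>();
        narration = GetComponent<AudioSource>();
    }

    private void OnEnable()
    {
        animator.SetTrigger("Play");   // "Play" is an assumed trigger driving, e.g., the digestion animation
        narration.Play();              // voiceover clip assigned in the Inspector
    }
}
```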
In marker-based augmented reality, a static image known as a marker or target image is necessary; the user scans it with their mobile device using the AR app [13]. Fig. 3 shows the application's main menu constructed in Unity. Scanning with the mobile device triggers the additional content prepared in advance (video, animation, 3D models, or other media) to appear on top of the marker.
Two marker images are used for the 3D visualization of the digestive system and the circulatory system. When a marker image is scanned, the corresponding 3D model is shown in the application interface; without these marker images, the interface will not display the 3D model. The marker images can be downloaded from the application interface. In marker-based AR technology, focusing the camera on the marker picture is crucial [14].
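The paper only states that the marker images can be downloaded from the interface; one hypothetical way to wire this to a button is sketched below, where the URL is a placeholder and Application.OpenURL simply hands the image to the device browser for saving or printing.

```csharp
using UnityEngine;

// Sketch of a "download marker" button; the URL and approach are assumptions for illustration.
public class MarkerDownloadButton : MonoBehaviour
{
    [SerializeField]
    private string markerImageUrl = "https://example.com/digestive_marker.png"; // placeholder URL

    // Hooked up to a UI Button's OnClick event in the Inspector.
    public void DownloadMarker()
    {
        Application.OpenURL(markerImageUrl);  // opens the image in the browser so the user can save or print it
    }
}
```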
V. RESULTS AND OUTCOMES
The outcomes demonstrate the potential of a 3D model of the human circulatory and digestive systems for augmented reality applications. The operation of the digestive system is one of the application's components. The user first needs to download the marker; after the marker is scanned, the 3D model and the voice narration become available. Zooming in and out allows users to easily change the size of the 3D model.
The application launches on the primary menu screen. The first step in the AR anatomy learning mode is to focus on and capture the marker image: download the image and then scan it. The software recognizes the marker using the image stream from the camera; during the marker identification process, the application identifies the marker and compares it to the stored models.
The 3D model of the digestive system is shown in Fig. 4; it appears on the application interface in accordance with the model that was created and stored in the database. The model is shown in the application along with its position relative to the marker. The application also allows user input: it responds to the user's touch to display the 3D model properly and clearly by zooming in and out of the model.
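The zoom interaction is not given as code in the paper; the sketch below shows a common pinch-to-zoom pattern in Unity that scales the model transform between assumed minimum and maximum values.

```csharp
using UnityEngine;

// Illustrative pinch-to-zoom handler for the displayed anatomy model;
// the scale limits and speed are arbitrary values, not taken from the paper.
public class PinchZoom : MonoBehaviour
{
    [SerializeField] private float zoomSpeed = 0.01f;
    [SerializeField] private float minScale = 0.5f;
    [SerializeField] private float maxScale = 3.0f;

    private void Update()
    {
        if (Input.touchCount != 2)
            return;

        Touch t0 = Input.GetTouch(0);
        Touch t1 = Input.GetTouch(1);

        // Distance between the two fingers in the previous and current frame.
        float prevDistance = Vector2.Distance(t0.position - t0.deltaPosition,
                                              t1.position - t1.deltaPosition);
        float currDistance = Vector2.Distance(t0.position, t1.position);

        // Grow or shrink the model proportionally to the pinch gesture, within limits.
        float scale = transform.localScale.x + (currDistance - prevDistance) * zoomSpeed;
        scale = Mathf.Clamp(scale, minScale, maxScale);
        transform.localScale = Vector3.one * scale;
    }
}
```

Clamping the scale keeps the model from collapsing to zero or overwhelming the camera view during an energetic pinch.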
The software displays the image captured by the Android device's camera, and the camera then detects the presence of the marker. Which 3D human body model is shown when the marker is recognised depends on the user's selection from the main menu. The heart, stomach, and esophagus of the 3D model respond to touches on the screen, and the model displays the pertinent information for the selected organ or organ system [15].
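One plausible way to make organs respond to touch is a raycast from the touch position against colliders on the organ meshes, as in the following sketch; the UI text field and the use of the object name as the information source are assumptions for illustration.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of organ selection by touch: each organ mesh (heart, stomach, esophagus, ...)
// is assumed to carry a collider, and the selection is reported in a UI Text element.
public class OrganSelector : MonoBehaviour
{
    [SerializeField] private Camera arCamera;   // the AR camera rendering the scene
    [SerializeField] private Text infoText;     // UI element showing the organ description

    private void Update()
    {
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
            return;

        Ray ray = arCamera.ScreenPointToRay(Input.GetTouch(0).position);
        if (Physics.Raycast(ray, out RaycastHit hit))
        {
            // The organ's GameObject name stands in for a lookup of its stored description.
            infoText.text = "Selected organ: " + hit.collider.gameObject.name;
        }
    }
}
```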
Developing the application involves assembling and combining all of the developed assets. According to the application's design, the main menu in Unity 3D uses buttons to enable switching to the other scenes. The C# scripts were added in accordance with the defined functional requirements, as reflected in the application interface.
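A minimal sketch of the main-menu buttons switching scenes with Unity's SceneManager is shown below; the scene names are placeholders rather than the actual names used in the project.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Main-menu controller: each public method is assigned to a UI Button's OnClick event.
public class MainMenu : MonoBehaviour
{
    public void OpenDigestiveScene()
    {
        SceneManager.LoadScene("DigestiveAR");    // assumed scene name
    }

    public void OpenCirculatoryScene()
    {
        SceneManager.LoadScene("CirculatoryAR");  // assumed scene name
    }

    public void QuitApplication()
    {
        Application.Quit();
    }
}
```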
Interactive learning media applications that run on an Android-based platform have been successfully created. The final software includes a visual object feature in the form of 3D models of the human circulatory and digestive systems, as well as information about these systems' organs [16]. The designed software is more user-friendly, entertaining, and easier to understand than the traditional way of learning human anatomy. In this application, augmented reality (AR) technology can be used as a replacement for textbooks and props for understanding human anatomy [17], and it piques students' interest in studying. The mobile application could be improved in the following ways:
1. Menu selection and interaction without the use of a marker;
2. Animations for additional human anatomical systems;
3. Additional visualisation material featuring more multimedia, including video;
4. Explanations in several languages;
5. More detailed 3D texture graphics that can be retrieved from the scanned object;
6. Better instruction for novice users.
REFERENCES
[1] Rita Layona, Budi Yulianto, Yovita Tunardi, "Web based Augmented Reality for Human Body Anatomy Learning", 3rd International Conference on Computer Science and Computational Intelligence, 2018.
[2] Rahul Yogesh Karekar, Jai Rajendra Patel, Prof. Yogita Mane, "Cardiac: Augmented Reality", International Journal of Engineering Research & Technology (IJERT), April 2019.
[3] Saad Badie Younis, Emad H. Al-Hemiary, "Augmented Reality for Learning Human Body Anatomy and Total Hip Replacement Surgery", Iraqi Journal of Information & Communications Technology, August 2022.
[4] Sanket Patil, Anup Rao, Neeraj Pardesh, "Augmented Reality in Anatomy AR in Anatomy", International Journal of Engineering Research & Technology (IJERT), April 2021.
[5] Michael H. Kurniawan, Suharjito, Diana, Gunawan Witjaksono, "Human Anatomy Learning Systems Using Augmented Reality on Mobile Application", 3rd International Conference on Computer Science and Computational Intelligence, 2018.
[6] Joshia Felim Efraim, Hafiz Elfia Wedo Putra, Erika Tania Michelle, "Augmented Reality for Human Digestive Anatomy Biology Learning", June 2021.
[7] Fajar Awang Irawan, Anies Setiowati, Dhias Fajar Widya Permana, Tegar Bhakti Sandhiyudha, "Augment Reality Human Anatomy (ARMY) as Learning Media in Sport Science", 5th International Conference on Physical Education, Sport, and Health (ACPES), 2019.
[8] I. Sonjaya, R. Fadlurahman, "Learning media for human digestive system based on augmented reality", International Conference of Computer and Informatics Engineering (IC2IE), 2018.
[9] Sumitra Nuanmeesri, Preedawon Kadmateekarun, Lap Poomhiran, "Augmented Reality to Teach Human Heart Anatomy and Blood Flow", TOJET: The Turkish Online Journal of Educational Technology, January 2019.
[10] M. Fernandez, "Augmented virtual reality: How to improve education systems", Higher Learning Research Communications, 2017.
[11] Yuan Zhou, Jiejun Hou, Qi Liu, Xu Chao, Nan Wang, Yu Chen, Jianjun Guan, Qi Zhang, Yongchang Diwu, "VR/AR Technology in Human Anatomy Teaching and Operation Training", March 2021.
[12] Zakia Nurhasanah, Riandi Riandi, Ari Widodo, "Augmented reality to facilitate students' biology mastering concepts and digital literacy", JPBI (Jurnal Pendidikan Biologi Indonesia), November 2019.
[13] A. Syawaludin, Gunarhadi, P. Rintayati, "Development of Augmented Reality-Based Interactive Multimedia to Improve Critical Thinking Skills in Science Learning", International Journal of Instruction, 2019.
[14] Rusnida Romli, Fatin Nur Hazwani Binti Mohd Wazir, Amar Raaj Singh A/L Gurdial Singh, "AR Heart: A Development of Healthcare Informative Application using Augmented Reality", The 1st International Conference on Engineering and Technology (ICoEngTech), 2021.
[15] Andayani, M. F. Syahputra, M. A. Muchtar, M. Sattar, Santi Prayudani, F. Fahmi, "3D Modelling Intestine Anatomy with Augmented Reality for Interactive Medical Learning", IOP Publishing, 2019.
[16] Akinyokun Oluyomi, Aladeselu Oluwamodupe, "Augmented Reality (AR)-based Human Digestive System Mobile Learning Platform", International Journal of Computer Applications, April 2021.
[17] Z. A. Aida Zamnah, M. Siti Azreena, Muhammad Biki Saputra, "C-Heart: Augmented Reality of 3D Heart Anatomy", International Journal of Advanced Trends in Computer Science and Engineering, 2020.
[18] U. Andayani, B. Siregar, Sapri Hernina Pulungan, M. F. Syahputra, M. A. Muchtar, D. Arisandi, "A Visualisation of 3D Lung Anatomy with Augmented Reality as Interactive Medical Learning", 3rd International Conference on Computing and Applied Informatics, 2019.
[19] F. Siti, S. Wawan, J. Enjun, "Development of Smart Content Model-based Augmented Reality to Support Smart Learning", Journal of Science Learning, 2019.
[20] A. Macauda, "Augmented Reality Environments for Teaching Innovation", Sciendo - Research on Education and Media, 2018.
Copyright © 2023 Chandana J, Priya Kumari, Prof. Rekha B S. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Paper Id : IJRASET53660
Publish Date : 2023-06-03
ISSN : 2321-9653
Publisher Name : IJRASET