Ijraset Journal For Research in Applied Science and Engineering Technology
Authors: Omkar G Shet, Vamsi Krishna Ponduri, Prem Gupta, Rohan H, Roopa M S
DOI Link: https://doi.org/10.22214/ijraset.2023.49212
The healthcare industry is experiencing rapid innovation. Among the many new challenges, the pace of technological change, the demand for more patient-centred experiences, growing medical expenditure, and the need to extend outreach to remote locations stand out. Augmented reality (AR) is revolutionising the healthcare sector with the rising usage of connected devices, computer vision, and artificial intelligence. The delivery of healthcare services is changing, and the overall standard of patient care is rising, as a result of novel interactions between the real world and virtual objects enabled by mobile phones, wearables, and head-mounted displays.
I. INTRODUCTION
Augmented reality (AR) is a digitally augmented depiction of the real-world physical environment, created with the help of digital visual elements, audio, or other sensory stimuli. Amid ongoing data gathering and analysis, one of the primary goals of AR is to highlight particular features of the physical environment, increase awareness of those features, and extract intelligent, accessible information that can be utilised in real-world applications.
In the 1960s, the first prototype of AR was developed by Ivan Sutherland and his students at Harvard University and the University of Utah to present 3D graphics [1]. As research progressed through the 1970s and 80s, mobile devices and digital watches were introduced, which made wearable computing possible. Because the technological needs of augmented reality are greater than those of virtual worlds, progress in the former field has been slower. Although there have been advancements to the original model proposed by Sutherland, the key components required to make an AR system remain the same [2]. There are three major methods for visualising augmented reality. Video see-through is the technique most comparable to virtual reality, in that the virtual environment is replaced with a video feed of reality and AR is superimposed on the digitised images [2]. Another method, which follows Sutherland's notion more closely, is optical see-through, which uses transparent mirrors and lenses to leave the real-world view unchanged while showing just the AR overlay. The third way involves projecting the AR overlay onto real-world objects, which results in projective displays [2].
AR is divided into two types: triggered augmentation and view-based augmentation [3]. Triggers are the stimuli that initiate the augmentation. There are four main kinds of triggers. The first is marker-based AR, which needs a marker, a real-world physical object, to initiate augmentation. Another type of trigger is location: the GPS position of the device is matched against required locations in order to present relevant details. The next trigger is dynamic augmentation, which responds to the varying view of the object. The last trigger is complex augmentation, a combination of marker-based or location-based AR and dynamic augmentation [3], as sketched below. The second category of augmentation, view-based, includes indirect augmentation and non-specific digital augmentation. Indirect augmentation augments a static view of the world, while non-specific digital augmentation acts on a dynamic view of the world without taking into account what is being viewed [3]. By offering robust and user-friendly ways to examine and interact with digital medical data, and by integrating data into the real world to create realistic and interactive virtual experiences [4], augmented and virtual reality are modernising healthcare. These technologies immerse users in simulated and accurate three-dimensional digital environments using lightweight stereoscopic HMDs, demonstrating substantial benefits from the seamless integration of digital data into the experiences of healthcare practitioners and patients [4].
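To make the trigger taxonomy concrete, the following minimal Python sketch (illustrative only, not taken from [3]) shows how an AR runtime might dispatch the four trigger types; the marker name and point-of-interest coordinates are hypothetical placeholders.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Optional

@dataclass
class Context:
    detected_marker: Optional[str] = None   # recognised real-world object
    gps_position: Optional[tuple] = None    # (latitude, longitude) of the device
    view_changed: bool = False              # the object is seen from a new angle

def marker_trigger(ctx: Context) -> Optional[str]:
    return f"overlay for marker '{ctx.detected_marker}'" if ctx.detected_marker else None

def location_trigger(ctx: Context) -> Optional[str]:
    poi = (48.2082, 16.3738)                # hypothetical point of interest
    if ctx.gps_position and max(abs(a - b) for a, b in zip(ctx.gps_position, poi)) < 0.01:
        return "overlay for nearby point of interest"
    return None

def dynamic_trigger(ctx: Context) -> Optional[str]:
    return "overlay updated for the new viewpoint" if ctx.view_changed else None

def complex_trigger(ctx: Context) -> Optional[str]:
    base = marker_trigger(ctx) or location_trigger(ctx)   # marker- or location-based
    return f"{base}, tracked dynamically" if base and ctx.view_changed else base

TRIGGERS: Dict[str, Callable[[Context], Optional[str]]] = {
    "marker": marker_trigger, "location": location_trigger,
    "dynamic": dynamic_trigger, "complex": complex_trigger,
}

ctx = Context(detected_marker="anatomy_poster", view_changed=True)
for name, handler in TRIGGERS.items():
    print(name, "->", handler(ctx))
```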
In this study, we aim to discuss the various applications of augmented reality in the field of healthcare and some of the associated challenges. This paper surveys literature involving the use of augmented reality in healthcare. We have analysed the different approaches used in the proposed solutions in those works and present them in a simplified manner. The following section provides a review of related applications of AR in healthcare. The last section concludes with a summary of the discussion.
II. LITERATURE SURVEY
Bichlmeier et al. [5] have proposed an augmented reality system for the operating theatre to aid vertebroplasty. The approach is to integrate an AR system into the head-mounted display available in the operating room (OR). The system explores percutaneously attaching tracking targets to the vertebrae; it tracks rigid targets but has yet to provide adequate registration quality for deformable anatomy.
Throughout their daily work, healthcare professionals require regular access to information systems but need both hands to perform their duties. Klinker et al. [6] have developed a smart-glass prototype that allows hands-free usage by healthcare professionals to document procedures during wound treatment. The prototype uses voice commands and/or physical interactions for documentation. Although this benefits healthcare workers during documentation, it does not consider the patients' perspective and cannot predict their reaction in a real-world setting.
Hsiao and Rashvand [7] have proposed an AR-based image-sensing system to address students' deteriorating health. To produce interactive activities, the gestures and body motions of students are recorded via a camera and blended with pre-designed virtual visuals. Of all the health aspects considered, cardiorespiratory endurance proved the most effective and was used to design the activities. These activities were designed by fitness specialists and follow Taiwan's national physical fitness criteria.
da Silva et al. [8] have put forward an AR-based interactive tool for teaching anatomical structures. An augmented reality mobile application provides a high-resolution 3D visualisation of human or animal anatomy. Differentiated teaching methods are shown in two case studies, one on the human humerus and the other on a cat's liver. The application currently lacks holographic visualisation, which is identified as a possible improvement. Sik-Lanyi [9] has proposed a virtual reality healthcare consultation system, combining augmented and virtual reality, that can be used by nurses and doctors for either immersive e-health applications or interactive realities such as m-health consultations. The technology was tested on 14 subjects, 12 of whom were successfully detected. Although such applications still have drawbacks, such as graphic errors that can occur if the user's software is weak, they have come a long way and can contribute to the future of healthcare consultations.
Carvalho et al. [10] have proposed a technique in which the software uses a person's body movements, captured by on-sight inputs, to coordinate several functions in an augmented reality environment, including mirroring hand movements for navigation and selection. Several drawbacks were observed during testing, such as a lack of precision due to hand tremor, incorrect gestures, and failures of comprehension in the virtual environment. These techniques were implemented with the support of Microsoft Kinect.
The increasing availability of AR headsets is encouraging their use in the medical field. However, technological and human-related constraints continue to impede their broad adoption in clinical practice. Cutolo et al. [11] have proposed a software framework for developing AR applications for custom-made HMDs used to assist high-precision manual activities. Several studies were carried out to assess the platform's efficiency; the average error in a digital incision performed with and without the AR headset was found to be negligible. The object-oriented software framework was created on Linux using C++ and also makes use of the CUDA architecture; it can deliver both video and optical see-through augmentations. Jakl et al. [12] proposed a way to enhance patient education with the use of augmented reality. The technology and procedures used in healthcare are so complex that it is often hard for patients to understand their own health issues, and computer-aided instruction is easier to follow than text reports. EPAR is a mobile application designed for patient education around eye surgery; patients can view their health problem as three-dimensional visualisations, which also help in the surgery and recovery process.
The Cardiolens system, developed by McDuff et al. [13], enables wearers to see their own heart rate at any time while wearing the device, as well as the heart rate and heart rate variability of another person just by glancing at them. The technology uses mixed reality to concurrently measure the vital signs of both people. Cardiolens combines imaging photoplethysmography and imaging ballistocardiography, using the camera mounted on the device, to capture both the wearer's and the partner's heart rates.
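As a rough illustration of the imaging-photoplethysmography principle behind Cardiolens (not the authors' implementation), the sketch below estimates heart rate from the mean green-channel intensity of a skin region: blood-volume changes modulate that signal, so its dominant frequency in the 0.7–4 Hz band approximates the pulse.

```python
import numpy as np

def heart_rate_from_frames(frames: np.ndarray, fps: float) -> float:
    """frames: array of shape (n_frames, h, w, 3) holding an RGB skin region."""
    signal = frames[..., 1].mean(axis=(1, 2))          # mean green value per frame
    signal = signal - signal.mean()                    # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)             # 42-240 beats per minute
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0                            # convert Hz to bpm

# Example with a synthetic 72 bpm pulse superimposed on noise.
fps, seconds, bpm = 30.0, 10, 72
t = np.arange(int(fps * seconds)) / fps
pulse = 1.0 + 0.02 * np.sin(2 * np.pi * (bpm / 60.0) * t)
frames = np.random.rand(len(t), 8, 8, 3) * 0.01 + pulse[:, None, None, None]
print(round(heart_rate_from_frames(frames, fps), 1), "bpm")
```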
Hemanth et al. [14] proposed a work focused on biomedical and health informatics. The main objective is to deal with heart diseases by analysing heart sounds and helping diagnose disease. Doctors assess the pathological condition of the heart by analysing its sounds, which can reveal diseases such as arrhythmia and heart valve abnormalities. A mobile application called MY HEART was developed which takes a heart-sound recording as input, processes it, creates graphs, analyses them, and displays the diagnosis results.
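The paper summarised above gives few implementation details of MY HEART, so the following is only a generic sketch of a basic phonocardiogram pipeline such an application might use: band-pass filtering, an amplitude envelope, and peak detection of the heart sounds.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert, find_peaks

def heart_sound_rate(pcg: np.ndarray, fs: float) -> float:
    b, a = butter(4, [25, 150], btype="bandpass", fs=fs)    # keep the heart-sound band
    filtered = filtfilt(b, a, pcg)
    envelope = np.abs(hilbert(filtered))                    # amplitude envelope
    # Require peaks at least 0.4 s apart so one beat is not counted twice.
    peaks, _ = find_peaks(envelope, distance=int(0.4 * fs),
                          height=envelope.mean() + envelope.std())
    if len(peaks) < 2:
        return float("nan")
    beat_interval = np.mean(np.diff(peaks)) / fs            # seconds per beat
    return 60.0 / beat_interval

# Synthetic example: 75 bpm "thumps" of 50 Hz bursts buried in noise.
fs, seconds = 2000, 10
t = np.arange(int(fs * seconds)) / fs
pcg = 0.05 * np.random.randn(len(t))
for beat_start in np.arange(0, seconds, 60 / 75):
    idx = (t >= beat_start) & (t < beat_start + 0.1)
    pcg[idx] += np.sin(2 * np.pi * 50 * t[idx])
print(round(heart_sound_rate(pcg, fs), 1), "bpm approx.")
```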
Estimates suggest that one person dies by suicide every 40 seconds. Lush et al. [15] reported an augmented reality system to support mental health self-assessment. The aim of the system is to monitor the user's mental health status and direct them to resources relevant to that status; in turn, the resources offer professional support. The system combines the myGRaCE self-assessment tool with AR technology to achieve this goal.
Physiological measures are the gold standard for assessing gait across all populations. AR devices allow human motion to be evaluated in more authentic and interactive contexts, but the unknown accuracy of such systems for quantifying human movement is a hurdle to integrating AR into healthcare. Koop et al. [16] have suggested a technique for determining the accuracy of the HoloLens (HL) in analysing gait in comparison to three-dimensional motion capture. The combination of measuring technologies within the HL offers a unique opportunity to objectively evaluate motion while the user engages with an AR environment, using rotation and position data from the device. Because the trial participants were primarily middle-aged and healthy, further research involving persons of all ages is being planned.
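As a hedged illustration of how head-pose data from an HMD can yield simple gait measures (the validation in [16] uses its own pipeline), the sketch below derives walking speed from horizontal displacement and cadence from the vertical oscillation of the head.

```python
import numpy as np
from scipy.signal import find_peaks

def gait_measures(positions: np.ndarray, fs: float):
    """positions: (n, 3) head positions in metres sampled at fs Hz."""
    horizontal = positions[:, [0, 2]]                       # ignore the vertical axis
    path = np.linalg.norm(np.diff(horizontal, axis=0), axis=1)
    walking_speed = path.sum() / (len(positions) / fs)      # metres per second
    vertical = positions[:, 1] - positions[:, 1].mean()
    # Each step produces roughly one vertical oscillation of the head.
    steps, _ = find_peaks(vertical, distance=int(0.3 * fs))
    cadence = len(steps) / (len(positions) / fs) * 60.0     # steps per minute
    return walking_speed, cadence

# Synthetic example: walking at 1.2 m/s with a ~110 steps/min head bob.
fs, seconds = 60.0, 20
t = np.arange(int(fs * seconds)) / fs
pos = np.stack([1.2 * t, 1.7 + 0.02 * np.sin(2 * np.pi * (110 / 60) * t),
                np.zeros_like(t)], axis=1)
speed, cadence = gait_measures(pos, fs)
print(f"speed {speed:.2f} m/s, cadence {cadence:.0f} steps/min")
```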
Negative emotional health is a growing concern, as it can lead to social or mental illness. To address this issue, Tivatansakul and Ohkura [17] proposed an augmented reality healthcare system that displays virtual objects in a real-world environment. The approach is an application focused on deep-breathing techniques to help people facing stress and negative emotions. A virtual music box is displayed to guide breathing exercises, and by integrating Microsoft Kinect the user can also interact with the virtual world. Heart rate and emotions are detected and monitored using ECG sensors.
Dulf et al. [18] have implemented a system using augmented reality to administer the dosage of medicine given to patients. The approach is to segment pills according to the specific weight recommended by the physician. A MATLAB environment with optical character recognition and Otsu's algorithm was used for pill weight detection and segmentation, respectively. The result is then displayed, with the help of a webcam, as an augmented overlay showing the cut that has to be made on the pill. 3D scanning of the pill and a high-resolution camera could increase the accuracy of the segmentation and provide better results.
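Otsu's algorithm itself is a standard global-thresholding technique; the sketch below shows the equivalent segmentation step in Python with OpenCV rather than MATLAB, using a hypothetical webcam frame and a simple overlay standing in for the prescribed cut line.

```python
import cv2
import numpy as np

frame = cv2.imread("pill_frame.png")                 # hypothetical webcam frame
if frame is None:
    raise SystemExit("no input frame available")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
blurred = cv2.GaussianBlur(gray, (5, 5), 0)          # suppress sensor noise

# Otsu's method chooses the threshold automatically (the 0 passed here is ignored).
_, mask = cv2.threshold(blurred, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Assume the largest connected contour is the pill.
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
pill = max(contours, key=cv2.contourArea)
x, y, w, h = cv2.boundingRect(pill)

# Example overlay: a vertical line splitting the pill in half, standing in for
# the fraction prescribed by the physician.
cv2.drawContours(frame, [pill], -1, (0, 255, 0), 2)
cv2.line(frame, (x + w // 2, y), (x + w // 2, y + h), (0, 0, 255), 2)
cv2.imwrite("pill_segmented.png", frame)
```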
Mather et al. [19] have proposed an AR setup called Helping Hands to help students and clinicians practise infrequent medical procedures. The setup comprises a projection technology integrated into eyeglasses or mounted on a helmet with a display screen. The learner’s performance is monitored by a remotely situated instructor, who provides guidance during the procedure.
Visualisation of the instructor's superimposed hands near the learner's eye aids correct procedure completion. Audibility was hampered by the bulkiness and lack of flexibility of the headset, which could not accommodate a wide variety of head sizes, and the screen size was also a concern among users. Minimally invasive surgery is one of the key advancements in surgical methods aimed at giving patients more benefits. Nicolau et al. [20] developed a system using augmented reality to enhance the intra-operative vision of surgeons. The system was successful in generating 3D pathological and anatomical structures and registering the generated visuals onto the actual patient in real time. The preoperative information in the surgeon's field of view is presented directly on the patient's skin via projection-based AR, which helps the surgical team diagnose the patient in a more flexible way.
Many elderly people find it difficult to predict how home improvements will affect their way of life. Focusing on the objectives of elderly clients, Lo Bianco et al. [21] presented an augmented reality prototype for home-modification design. The prototype is an iPad application operated by elderly clients and connected directly to occupational therapists, who are expected to use the app to discuss and investigate proposed environmental improvements in order to prevent falls.
Mixed reality lets users interact with three-dimensional virtual items. This technology is used by Maasthi et al. [22] in the field of rhinoplasty to preview improvements to the appearance of the nose and other facial features. The system runs on the Unity platform with mixed reality toolkits. First, the patient's face is captured, followed by initial texturing and touch-up effects; the resulting image is imported into the MR toolkit and animated in Unity. The doctor starts the operation once the patient is happy with the previewed appearance.
Wang et al. [23] proposed an image registration method using augmented reality for oral surgery. A virtual scene is formed and superimposed on reality in order to provide guidance during the surgery. An image of the patient's teeth is acquired using a three-dimensional scanner, and the tooth shape is registered with a dedicated three-dimensional stereo-matching algorithm. The registration method does not depend on the patient's external facial features; it works automatically to maintain a properly aligned AR scene, overcoming the difficulty of misalignment, which makes it safe and useful in oral and maxillofacial surgery.
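The registration idea can be illustrated generically: once corresponding 3D points on the scanned and reconstructed tooth surfaces have been matched, the optimal rigid transform follows in closed form from an SVD (the Kabsch method). This sketch is illustrative only and is not the authors' stereo-matching algorithm.

```python
import numpy as np

def rigid_transform(source: np.ndarray, target: np.ndarray):
    """source, target: (n, 3) arrays of corresponding points; returns R, t."""
    src_c, tgt_c = source.mean(axis=0), target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)                     # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])   # avoid reflections
    R = Vt.T @ D @ U.T
    return R, tgt_c - R @ src_c

# Example: recover a known rotation/translation applied to a small point cloud.
rng = np.random.default_rng(0)
model = rng.random((50, 3))
angle = np.deg2rad(20)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
observed = model @ R_true.T + np.array([5.0, 2.0, -1.0])
R_est, t_est = rigid_transform(model, observed)
print(np.allclose(R_est, R_true, atol=1e-6))
```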
Corrective jaw surgery is performed on the jaw bones to correct dental misalignment. It is difficult for dentists to diagnose because of the limited space in their patients' mouths. Basnet et al. [24] proposed a solution to this issue using augmented reality. Since dental surgery involves complex procedures, there is always a substantial danger of the surgeon harming nerve pathways or tooth roots during surgery. Using AR, the virtual jaw is superimposed over the real jaw, giving the surgeon a real-time 3D image. This provides the surgeon with realistic information in the surgical environment to locate the disease by finding the nerve pathway.
Wu et al. [25] proposed an AR system to aid in spinal surgery. Pre-operative three-dimensional images are superimposed onto the patient so that surgeons get a more realistic anatomical view during diagnosis. This is achieved with a technique referred to as the "visible patient", which generates a three-dimensional representation of the patient. The main drawback of this technology is that the image displayed by the projector can block the surgeon's line of sight while the surgery is being performed; similarly, the surgeon may block the projected image.
Liang et al. [26] have proposed a study exploring the use of augmented reality to train healthcare practitioners in simulated stroke assessment. To test this idea, a simulation programme for a mixed reality device was developed that displays a human face with facial drooping, a stroke symptom, onto a computerised training mannequin. A fiducial marker is used as an AR sticker to indicate the projection location. The study's drawbacks include a limited sample size and the lack of a comparison group.
Arpaia et al. [27] have proposed an architecture to enhance patient health monitoring during surgical procedures. The patient's health status is shown to the surgeon in real time through a video see-through headset. The system includes a machine-learning model that uses the patient's previous medical records and vitals to predict changes in health condition and issue alerts. Parameters from the monitor and ventilators are sent to the headset through a laptop that acts as a Wi-Fi client. Ergonomics and VR sickness have not yet been studied.
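The prediction model in [27] is not detailed here, so the following is only a generic sketch: a classifier trained on labelled windows of vital signs (all values below are synthetic) that raises an alert when a new reading is predicted to deteriorate.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic training windows: heart rate, SpO2, mean arterial pressure.
rng = np.random.default_rng(1)
n = 500
stable = np.column_stack([rng.normal(75, 5, n), rng.normal(98, 1, n), rng.normal(90, 5, n)])
deteriorating = np.column_stack([rng.normal(110, 10, n), rng.normal(92, 2, n), rng.normal(70, 8, n)])
X = np.vstack([stable, deteriorating])
y = np.array([0] * n + [1] * n)                       # 0 = stable, 1 = deteriorating

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# New reading streamed from the patient monitor (hypothetical values).
reading = np.array([[105.0, 93.0, 72.0]])
if model.predict(reading)[0] == 1:
    print("ALERT: predicted deterioration - notify the surgical team")
```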
Accurate glenoid placement is critical in reverse total shoulder arthroplasty (RSA) to ensure a satisfactory functional result and prosthesis durability. Placing the components optimally can be problematic, especially in cases of significant glenoid deformity. In this feasibility study, Kriechling et al. [28] have proposed a new technique for surgical navigation employing AR through a head-mounted display, with the goal of improving surgical planning. The study has some drawbacks: it exclusively uses 3D-printed scapula models without soft tissue and with the scapula fully visible, which is only a method-oriented approximation of the real surgical scenario.
Umeda et al. [29] have proposed the use of AR in medical training. The application recognises an augmented reality marker in real time through the system's web camera and displays virtualised images, integrating 3D objects seamlessly into the user's real-time surroundings. The major drawback is that DICOM data is stored in 2D format, so an application is needed to convert it into 3D format.
Leary et al. [30] have proposed a fresh approach to CPR instruction by incorporating a more participatory learning experience. The approach involves superimposing computer-generated graphics on users’ views of the actual world to mimic interactive training scenarios. The setup includes a manikin and a head-mounted AR headset to provide the trainees with audio-visual feedback in real time. The trainee is presented with a holographic image of the circulatory system while performing CPR on the manikin. The paper does not include a comparison of this system with traditional CPR training systems.
Extending earlier work on the KinectFusion algorithm, Macedo et al. [31] proposed a way to improve real-time face tracking for augmented reality when the algorithm fails. Initially, a 3D model is created as a reference and the head position is determined. In cases of fast motion, the iterative closest point (ICP) algorithm fails; to resolve this, a head-pose estimator supplies an initial guess for ICP, which compensates for the misalignment caused by fast movement and makes the algorithm more robust to ICP failures. However, the accuracy of the system for use in applications is not specified.
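A minimal point-to-point ICP sketch (not the authors' pipeline) makes the mechanism concrete: an external head-pose estimate seeds the alignment so that the iterations start close enough to converge even after fast motion.

```python
import numpy as np
from scipy.spatial import cKDTree

def fit_rigid(src, dst):
    # Closed-form rigid fit between corresponding points (same SVD method as the
    # registration sketch given earlier).
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    U, _, Vt = np.linalg.svd((src - sc).T @ (dst - dc))
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, dc - R @ sc

def icp(source, target, R, t, iters=30):
    tree = cKDTree(target)
    for _ in range(iters):
        _, idx = tree.query(source @ R.T + t)     # closest target point for each source point
        R, t = fit_rigid(source, target[idx])     # refit the transform using those matches
    return R, t

# Example: reference face model versus a frame captured after fast head motion.
rng = np.random.default_rng(2)
model = rng.random((300, 3))
a = np.deg2rad(35)
R_true = np.array([[np.cos(a), 0, np.sin(a)], [0, 1, 0], [-np.sin(a), 0, np.cos(a)]])
frame = model @ R_true.T + np.array([0.05, 0.0, 0.02])
a0 = np.deg2rad(30)                                # coarse guess from the pose estimator
R0 = np.array([[np.cos(a0), 0, np.sin(a0)], [0, 1, 0], [-np.sin(a0), 0, np.cos(a0)]])
R, t = icp(model, frame, R0, np.zeros(3))
print("mean alignment error:", np.linalg.norm(model @ R.T + t - frame, axis=1).mean())
```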
Samavi et al. [32] have proposed an AR system for mobile devices that provides 3D images of brain tumours in real time. The system analyses facial features to track the individual in the scene. Instead of calibrating the camera with a series of checkerboard photographs each time the app is installed on a new smartphone, the system performs camera resectioning based on the subject's facial size. The camera's 3D pose is estimated by determining its location and orientation from a set of 3D points and their associated 2D projections. Based on the estimated camera pose, a reconstructed brain tumour model is displayed in the same place as the subject's true anatomy.
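The pose-estimation step described above corresponds to the classic Perspective-n-Point problem; a hedged sketch with OpenCV's solvePnP follows, where the 3D landmark positions, 2D detections, and intrinsics are hypothetical placeholders (the focal length is approximated from the image width, standing in for the face-size-based resectioning).

```python
import cv2
import numpy as np

object_points = np.array([            # rough 3D facial landmarks, millimetres
    [0.0, 0.0, 0.0],                  # nose tip
    [0.0, -63.0, -12.0],              # chin
    [-43.0, 32.0, -26.0],             # left eye outer corner
    [43.0, 32.0, -26.0],              # right eye outer corner
    [-28.0, -28.0, -22.0],            # left mouth corner
    [28.0, -28.0, -22.0],             # right mouth corner
], dtype=np.float64)

image_points = np.array([             # detected 2D landmarks, pixels
    [359, 391], [399, 561], [337, 297], [513, 301], [345, 465], [453, 469]
], dtype=np.float64)

h, w = 720, 1280
camera_matrix = np.array([[w, 0, w / 2],    # focal length approximated by image width
                          [0, w, h / 2],
                          [0, 0, 1.0]])
dist_coeffs = np.zeros(4)                   # assume negligible lens distortion

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)
R, _ = cv2.Rodrigues(rvec)                  # rotation matrix of the camera pose
print("camera position:", (-R.T @ tvec).ravel())   # where to render the tumour model
```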
Maier-Hein et al. [33] proposed a way to visualise 3D medical images on a patient using augmented reality. A time-of-flight (TOF) camera is attached to any portable device and constantly monitors the patient’s view. The information from the static 3D data sets is taken and transferred to the portable viewer. These data are then combined, and the physician can move the camera to any part of the patient to visualise the data. This is a markerless system that provides proper visualisation of 3D data sets without any image acquisitions or bulky equipment.
Nicolau et al. [34] have proposed an AR system for hepatic therapeutic guidance to simplify the process of radio-frequency ablation. The system uses superimposition of 3D reconstructions and virtual models onto the patients’ views. Many automatic procedures have been developed to overcome the difficulties with ablation. One of the main advantages of this technology is the small amount of material required. In fact, unlike other current systems that need specialised tracking materials, this one requires simply a PC, a video capture card, two cameras, and a printed square.
In augmented reality (AR) applications, polygon models are often used to enrich the real-life view. In the large majority of healthcare uses, however, volumetric data must be presented. Giraldi et al. [35] have put forward a methodology for truly interactive visualisation of such health information. It entails creating and tracking a three-dimensional view of the patient's region of interest using the Kinect depth feed. Based on the estimated camera pose, a doctor can then see the volumetric data within the patient's anatomy.
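The first step of such a pipeline, back-projecting a Kinect depth frame into a 3D point cloud with the pinhole camera model, can be sketched as follows; the intrinsics are typical Kinect values, not taken from [35].

```python
import numpy as np

def depth_to_points(depth: np.ndarray, fx: float, fy: float, cx: float, cy: float):
    """depth: (h, w) array of depth values in metres; returns an (m, 3) point cloud."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx          # back-project with the pinhole camera model
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]          # drop pixels with no depth reading

# Example with a synthetic flat surface one metre from the camera.
depth = np.full((480, 640), 1.0)
cloud = depth_to_points(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
print(cloud.shape)
```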
III. CONCLUSION
We have examined many augmented reality applications in healthcare that assist clinical and medical personnel. The reviewed works demonstrate how effectively augmented reality can be used, particularly in surgical applications. Due to its unique characteristics, AR is one of the most promising technologies in the healthcare industry: instead of merely imagining, one can observe what occurs inside the human body through screens and sensors, and medical professionals can readily explain surgical processes to patients. For future research and teaching purposes, as well as for medical practitioners, augmented reality will be a godsend.
[1] H. Tamura, "Steady steps and giant leap toward practical mixed reality systems and applications," in Proceedings of the International Status Conference on Virtual and Augmented Reality, pp. 3–12, 2002.
[2] D. Van Krevelen and R. Poelman, "A survey of augmented reality technologies, applications and limitations," International Journal of Virtual Reality, vol. 9, no. 2, pp. 1–20, 2010.
[3] A. Edwards-Stewart, T. Hoyt, and G. Reger, "Classifying different types of augmented reality technology," Annual Review of CyberTherapy and Telemedicine, vol. 14, pp. 199–202, 2016.
[4] M. R. Desselle, R. A. Brown, A. R. James, M. J. Midwinter, S. K. Powell, and M. A. Woodruff, "Augmented and virtual reality in surgery," Computing in Science & Engineering, vol. 22, no. 3, pp. 18–26, 2020.
[5] C. Bichlmeier, B. Ockert, S. M. Heining, A. Ahmadi, and N. Navab, "Stepping into the operating theater: ARAV—augmented reality aided vertebroplasty," in 2008 7th IEEE/ACM International Symposium on Mixed and Augmented Reality, pp. 165–166, IEEE, 2008.
[6] K. Klinker, M. Wiesche, and H. Krcmar, "Digital transformation in health care: Augmented reality for hands-free service innovation," Information Systems Frontiers, vol. 22, no. 6, pp. 1419–1431, 2020.
[7] K.-F. Hsiao and H. Rashvand, "Using sensor enabled augmented reality for healthcare," in IET Conference on Wireless Sensor Systems (WSS 2012), pp. 1–5, IET, 2012.
[8] I. C. da Silva, G. Klein, and D. M. Brandao, "Segmented and detailed visualization of anatomical structures based on augmented reality for health education and knowledge discovery," Advances in Science, Technology and Engineering Systems Journal, vol. 2, no. 3, pp. 469–478, 2017.
[9] C. Sik-Lanyi, "Virtual reality healthcare system could be a potential future of health consultations," in 2017 IEEE 30th Neumann Colloquium (NC), pp. 000015–000020, IEEE, 2017.
[10] M. A. S. Selleh and A. Saudi, "Augmented reality with hand gestures control for electronic medical record," in 2019 IEEE 10th Control and System Graduate Research Colloquium (ICSGRC), pp. 146–151, IEEE, 2019.
[11] F. Cutolo, B. Fida, N. Cattari, and V. Ferrari, "Software framework for customized augmented reality headsets in medicine," IEEE Access, vol. 8, pp. 706–720, 2019.
[12] A. Jakl, A.-M. Lienhart, C. Baumann, A. Jalaeefar, A. Schlager, L. Schöffer, and F. Bruckner, "Enlightening patients with augmented reality," in 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), pp. 195–203, IEEE, 2020.
[13] D. McDuff, C. Hurter, and M. Gonzalez-Franco, "Pulse and vital sign measurement in mixed reality using a HoloLens," in Proceedings of the 23rd ACM Symposium on Virtual Reality Software and Technology, pp. 1–9, 2017.
[14] J. D. Hemanth, U. Kose, O. Deperlioglu, and V. H. C. de Albuquerque, "An augmented reality-supported mobile application for diagnosis of heart diseases," The Journal of Supercomputing, vol. 76, no. 2, pp. 1242–1267, 2020.
[15] V. Lush, C. Buckingham, S. Wileman, S. Edwards, and U. Bernardet, "Augmented reality for accessible digital mental healthcare," in 2019 5th Experiment International Conference (exp.at'19), pp. 274–275, IEEE, 2019.
[16] M. M. Koop, A. B. Rosenfeldt, J. D. Johnston, M. C. Streicher, J. Qu, and J. L. Alberts, "The HoloLens augmented reality system provides valid measures of gait performance in healthy adults," IEEE Transactions on Human-Machine Systems, vol. 50, no. 6, pp. 584–592, 2020.
[17] S. Tivatansakul and M. Ohkura, "Healthcare system focusing on emotional aspects using augmented reality—implementation of breathing control application in relaxation service," in 2013 International Conference on Biometrics and Kansei Engineering, pp. 218–222, IEEE, 2013.
[18] A. G. Berciu, E. H. Dulf, and I. A. Stefan, "Flexible augmented reality-based health solution for medication weight establishment," Processes, vol. 10, no. 2, p. 219, 2022.
[19] C. Mather, T. Barnett, V. Broucek, A. Saunders, D. Grattidge, and W. Huang, "Helping hands: Using augmented reality to provide remote guidance to health professionals," in Context Sensitive Health Informatics: Redesigning Healthcare Work, pp. 57–62, IOS Press, 2017.
[20] S. Nicolau, L. Soler, D. Mutter, and J. Marescaux, "Augmented reality in laparoscopic surgical oncology," Surgical Oncology, vol. 20, no. 3, pp. 189–201, 2011.
[21] M. Lo Bianco, S. Pedell, and G. Renda, "A health industry perspective on augmented reality as a communication tool in elderly fall prevention," in Proceedings of the International Symposium on Interactive Technology and Ageing Populations, pp. 1–11, 2016.
[22] M. J. Maasthi, H. L. Gururaj, V. Janhavi, K. Harshitha, and B. H. Swathi, "An interactive approach deployed for rhinoplasty using mixed reality," in 2020 International Conference on COMmunication Systems & NETworkS (COMSNETS), pp. 680–682, IEEE, 2020.
[23] J. Wang, Y. Shen, and S. Yang, "A practical marker-less image registration method for augmented reality oral and maxillofacial surgery," International Journal of Computer Assisted Radiology and Surgery, vol. 14, no. 5, pp. 763–773, 2019.
[24] B. R. Basnet, A. Alsadoon, C. Withana, A. Deva, and M. Paul, "A novel noise filtered and occlusion removal: Navigational accuracy in augmented reality-based constructive jaw surgery," Oral and Maxillofacial Surgery, vol. 22, no. 4, pp. 385–401, 2018.
[25] J.-R. Wu, M.-L. Wang, K.-C. Liu, M.-H. Hu, and P.-Y. Lee, "Real-time advanced spinal surgery via visible patient model and augmented reality system," Computer Methods and Programs in Biomedicine, vol. 113, no. 3, pp. 869–881, 2014.
[26] C.-J. Liang, C. Start, H. Boley, V. R. Kamat, C. C. Menassa, and M. Aebersold, "Enhancing stroke assessment simulation experience in clinical training using augmented reality," Virtual Reality, vol. 25, no. 3, pp. 575–584, 2021.
[27] P. Arpaia, M. Cicatiello, E. De Benedetto, C. A. Dodaro, L. Duraccio, G. Servillo, and M. Vargas, "A health 4.0 integrated system for monitoring and predicting patient's health during surgical procedures," in 2020 IEEE International Instrumentation and Measurement Technology Conference (I2MTC), pp. 1–6, IEEE, 2020.
[28] P. Kriechling, S. Roner, F. Liebmann, F. Casari, P. Fürnstahl, and K. Wieser, "Augmented reality for base plate component placement in reverse total shoulder arthroplasty: A feasibility study," Archives of Orthopaedic and Trauma Surgery, vol. 141, no. 9, pp. 1447–1453, 2021.
[29] R. Umeda, M. A. Seif, H. Higa, and Y. Kuniyoshi, "A medical training system using augmented reality," in 2017 International Conference on Intelligent Informatics and Biomedical Sciences (ICIIBMS), pp. 146–149, IEEE, 2017.
[30] S. Balian, S. K. McGovern, B. S. Abella, A. L. Blewer, and M. Leary, "Feasibility of an augmented reality cardiopulmonary resuscitation training system for health care providers," Heliyon, vol. 5, no. 8, p. e02205, 2019.
[31] M. C. de Farias Macedo, A. L. Apolinário, and A. C. dos Santos Souza, "A robust real-time face tracking using head pose estimation for a markerless AR system," in 2013 XV Symposium on Virtual and Augmented Reality, pp. 224–227, IEEE, 2013.
[32] Q. Shan, T. E. Doyle, R. Samavi, and M. Al-Rei, "Augmented reality based brain tumor 3D visualization," Procedia Computer Science, vol. 113, pp. 400–407, 2017.
[33] L. Maier-Hein, A. M. Franz, M. Fangerau, M. Schmidt, A. Seitel, S. Mersmann, T. Kilgus, A. Groch, K. Yung, T. R. dos Santos, et al., "Towards mobile augmented reality for on-patient visualization of medical images," in Bildverarbeitung für die Medizin 2011, pp. 389–393, Springer, 2011.
[34] S. Nicolau, A. Garcia, X. Pennec, L. Soler, and N. Ayache, "An augmented reality system to guide radio-frequency tumour ablation," Computer Animation and Virtual Worlds, vol. 16, no. 1, pp. 1–10, 2005.
[35] M. C. Macedo, A. L. Apolinário, A. C. Souza, and G. A. Giraldi, "A semi-automatic markerless augmented reality approach for on-patient volumetric medical data visualization," in 2014 XVI Symposium on Virtual and Augmented Reality, pp. 63–70, IEEE, 2014.
Copyright © 2023 Omkar G Shet, Vamsi Krishna Ponduri, Prem Gupta, Rohan H, Roopa M S. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Paper Id : IJRASET49212
Publish Date : 2023-02-22
ISSN : 2321-9653
Publisher Name : IJRASET