Ijraset Journal For Research in Applied Science and Engineering Technology
Authors: Vishal Jangale, Dr. R. N. Awale
DOI Link: https://doi.org/10.22214/ijraset.2022.46961
Mobile phones have made augmented reality (AR) technology easily available to a large audience, and developers are building applications for it, but interaction with mobile AR is largely limited to touchscreen gestures. Hand-based gestures can be used, but they require additional sensors such as a depth sensor and greater processing power to work properly, which lower-end mobile phones cannot provide; external hardware can also be built around the phone, but it adds cost and is not affordable for everyone. In this project a prototype application is developed to demonstrate the use of image targets to design a virtual controller that can be used to interact with AR content and to perform basic tasks such as touching and moving virtual objects in the AR space. The paper further discusses the benefits and drawbacks of such an implementation and its future usage.
I. INTRODUCTION
Augmented reality (AR) is an immersive technology in which digital information, in the form of images, is overlaid on the real world to enhance it with digital details. It is one of the biggest technology trends today, although it has existed since the 1990s, and it has now entered the market through head-mounted displays (HMDs) such as the Microsoft HoloLens and through mobile phone applications. AR can be classified as marker-based or markerless. Marker-based AR uses markers such as image targets, which can be QR codes or other images; the AR system traces them using computer vision and displays AR content on them accordingly. Markerless AR does not use markers; instead it takes advantage of GPS or depth sensors, along with computer vision techniques, to track surfaces or a particular location. Research in this technology should address the following aspects: development of cheaper hardware to reduce the cost of AR systems, development of efficient algorithms and software that enable seamless integration of the real and virtual environments, and improvement of human-computer interaction with various devices. This paper focuses on the human-computer interaction aspect of AR development.
A lot of hardware that supports this technology has been built, such as HMDs (head-mounted devices), i.e., AR glasses and lenses, but these are expensive and not affordable for a large number of consumers, whereas the wide availability of mobile devices capable of handling the technology has made it accessible to a large audience. Mobile devices have good cameras and processors for AR, and many applications have been made for them, one of the most popular being the game Pokemon Go; this type of AR is known as mobile AR. However, unlike HMD-based AR, which has dedicated controllers for interaction, mobile AR is limited to touchscreen gestures; hand gestures are not widely used, and as of now no extra hardware devices are built for mobile AR, so there is a need to develop more interaction methods around it. Previous research on interaction with mobile AR has used external hardware such as an AR pen containing markers, used to select objects and draw in space [5]. The use of another phone as an external controller has also been proposed in a few studies [2,7,10]; these controller phones are used with an HMD or with another mobile phone and allow additional sensors and a UI interface to be used, but such implementations will not reach a large audience. There are also interaction techniques such as touchscreen gestures, which clutter the user's screen with UI elements, and hand gestures, which need an additional depth sensor to work properly. Virtual buttons have also been used as virtual controllers [1,4,8], but they are not very interactive and are used only to show a few animations on a single image target; they could instead be used with multiple image targets to enhance the interactive experience. This technology has various application areas such as manufacturing, military, medical, education, and gaming, and applications such as a maintenance work support system for industrial use and electronics or medical study applications have been built, but they are limited in interactivity.
In this paper we implement a prototype AR application for electronic circuits, built using the Unity game engine and the Vuforia SDK [13] for AR. We use multiple image targets, one to show the AR content and another that acts as a virtual controller, and we demonstrate this through two demonstrations, performing interactions such as touching and moving virtual objects to achieve the tasks given in each demo.
II. INTERACTION METHODS
Interaction in AR can be hardware based or gesture based, depending on the viewing device. HMD devices can have hardware-based controllers as well as gesture-based interaction because these are specifically designed for them, which makes them expensive; mobile phones, on the other hand, have the advantage of a large user base, so interaction for AR on mobile phones deserves attention.
A. Hardware Controller Based Interaction
This type of interaction has been used in head-mounted AR glasses such as the HoloLens but not in mobile AR, as an extra hardware component is required; to reach a large number of consumers, this type of interaction method is not feasible.
B. Gesture Based Interaction
Such interaction uses hand tracking, with certain gestures assigned to interact with virtual objects. This type of interaction is not widely used in mobile AR as of now because it requires good hardware and a depth sensor.
C. Touch Screen Based Interaction
This is the most widely used interaction method for mobile AR so far, as it allows the touchscreen gestures of the mobile phone to be used to interact with AR objects. Its disadvantage is that it can clutter the user's AR view and does not give as tangible an experience as gesture-based interaction.
III. SYSTEM DESIGN
In this project two image targets, also known as AR markers, are printed on paper and used. One image target is used for the AR scene, i.e., all the AR content to be interacted with is displayed on it, and the second image target is used to display the AR controller, which is programmed accordingly and used to interact with the AR scene. The project uses a Redmi Note 5 Pro smartphone as the viewing device, running the Android application developed in this project; no extra external hardware is used, and the AR scene is virtual and can only be seen through the viewing device.
IV. AR PROTOTYPE APPLICATION
A prototype AR application called Electronics AR is developed for mobile phones using Unity with the Vuforia SDK. The mobile phone used here is a Redmi Note 5 Pro running Android Nougat on a Snapdragon 636 chipset. The application consists of two demonstrations, each with a specific task; the demonstrations are based on an electronic circuit simulator, as shown in the menu screen figure, and involve interactions such as touching and moving virtual objects, performed using one of the image targets as the virtual controller we have created. The main idea is to test interaction using the virtual controller built for these demonstrations.
V. VIRTUAL CONTROLLER AND VIRTUAL BUTTONS
A. Virtual Controller
A virtual controller is essentially like a hardware remote controller: it has virtual buttons that have been programmed to perform specific functions. In the proposed system we have created the controllers using Vuforia's virtual button functionality, placing the buttons on one of the image targets. Two-button and three-button controllers are tested; by using these controllers in the demos we achieve common AR interactions such as touching and moving objects. Fig. 1 shows the two-button and three-button virtual controller configurations. In the three-button configuration, the third button is used to switch the other controller buttons to a different set of functions, which helps to reuse the assigned area of the image target.
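A rough C# sketch of this three-button idea is given below. It is not the project's actual script: the mode names and placeholder action methods are assumptions, and it presumes the VirtualButtonBehaviour press callbacks of recent Vuforia Engine releases. It only illustrates how a third virtual button can swap the functions bound to the other two buttons so that the same image target area is reused.

```csharp
using UnityEngine;
using Vuforia;

// Sketch of the three-button virtual controller: two action buttons plus a
// "switch" button that swaps what the action buttons do (e.g. from
// touch/move to increase/decrease supply voltage). Names are illustrative.
public class ThreeButtonController : MonoBehaviour
{
    [SerializeField] private VirtualButtonBehaviour buttonA;   // action button 1
    [SerializeField] private VirtualButtonBehaviour buttonB;   // action button 2
    [SerializeField] private VirtualButtonBehaviour switchBtn; // mode switch button

    private bool altMode = false; // false: touch/move mode, true: voltage +/- mode (assumed modes)

    void Start()
    {
        buttonA.RegisterOnButtonPressed(_ => OnActionA());
        buttonB.RegisterOnButtonPressed(_ => OnActionB());
        switchBtn.RegisterOnButtonPressed(_ => altMode = !altMode); // reuse the same two button areas
    }

    private void OnActionA()
    {
        if (altMode) IncreaseSupplyVoltage();
        else TouchSelectedObject();
    }

    private void OnActionB()
    {
        if (altMode) DecreaseSupplyVoltage();
        else MoveSelectedObject();
    }

    // Placeholder interaction hooks; a real demo would wire these to the
    // circuit objects displayed on the scene image target.
    private void TouchSelectedObject()   { Debug.Log("Touch object"); }
    private void MoveSelectedObject()    { Debug.Log("Move object"); }
    private void IncreaseSupplyVoltage() { Debug.Log("Voltage +"); }
    private void DecreaseSupplyVoltage() { Debug.Log("Voltage -"); }
}
```

Keeping the number of simultaneously active buttons small, and reusing them through the mode switch, follows the observation in the results that additional buttons increase rendering lag.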
B. Virtual Buttons
Virtual buttons add a mechanism of interactivity to the image target. Using Unity and Vuforia we can create a virtual button object on an image target; a virtual button is essentially an area assigned on the image target which, when hidden from the camera by any obstruction such as a hovering finger, registers a button press. In this paper we take advantage of this functionality to create the virtual controller: several virtual buttons, as shown in Fig. 4, are assigned functions and used to build controllers with two-button and three-button configurations.
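As a minimal illustration of this mechanism (not the exact project code), the C# sketch below registers press and release callbacks on a Vuforia virtual button in Unity and highlights an assumed circuit object while the button area is occluded. It assumes the RegisterOnButtonPressed/RegisterOnButtonReleased callbacks of recent Vuforia Engine versions (older versions use an IVirtualButtonEventHandler interface instead), and the field names are placeholders.

```csharp
using UnityEngine;
using Vuforia;

// Attached to the controller image target; reacts when the user's finger
// occludes the area of the image assigned to a virtual button.
public class TouchButtonHandler : MonoBehaviour
{
    // Assigned in the Inspector to the virtual button child object of the controller target.
    [SerializeField] private VirtualButtonBehaviour touchButton;

    // An AR object on the scene image target that should react to the press (placeholder).
    [SerializeField] private GameObject selectedComponent;

    void Start()
    {
        // Vuforia fires these callbacks when the button area is covered/uncovered.
        touchButton.RegisterOnButtonPressed(OnButtonPressed);
        touchButton.RegisterOnButtonReleased(OnButtonReleased);
    }

    private void OnButtonPressed(VirtualButtonBehaviour vb)
    {
        Debug.Log("Virtual button pressed: " + vb.name);
        // Example interaction: highlight the selected circuit component.
        if (selectedComponent != null)
            selectedComponent.GetComponent<Renderer>().material.color = Color.yellow;
    }

    private void OnButtonReleased(VirtualButtonBehaviour vb)
    {
        if (selectedComponent != null)
            selectedComponent.GetComponent<Renderer>().material.color = Color.white;
    }
}
```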
VI. RESULT AND DISCUSSION
A. Demonstration 1
2. Diode Circuit Demonstration: In the diode circuit demo the user is able to touch each virtual object with the controller and see its details, and by pressing the interaction button the user can see the current and voltage values of the component. The user can also increase or decrease the DC supply voltage through the controller after switching the controller to the increase and decrease buttons. In this demo the user can additionally use a touchscreen gesture, tapping the DC supply object on the phone screen, to get a popup of its voltage value; hence the touchscreen gesture and the virtual controller were used together. It was observed that adding extra buttons created some lag in rendering, so a greater number of buttons may worsen the problem, and a smaller number of buttons should therefore be used. This project used the third button to switch between controllers having different buttons, and the switching was smooth, with no lag observed. Buttons in vertical alignment created another problem: the lower buttons sometimes got triggered unintentionally because they are not physical, so the user experiences some discomfort in holding the controller and hovering over the buttons.
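The touchscreen gesture used together with the controller in this demo can be realised with a standard Unity raycast from a screen tap, independent of Vuforia; the sketch below is a generic illustration, and the tag and popup object names are placeholders rather than the project's actual identifiers.

```csharp
using UnityEngine;

// Complements the virtual controller: a screen tap raycasts into the AR scene
// and, if it hits the DC supply object, shows its voltage popup.
public class TapToInspect : MonoBehaviour
{
    [SerializeField] private Camera arCamera;         // the AR camera rendering the scene
    [SerializeField] private GameObject voltagePopup; // hypothetical popup object with the voltage value

    void Update()
    {
        if (Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Cast a ray from the tapped screen point into the 3D AR scene.
        Ray ray = arCamera.ScreenPointToRay(touch.position);
        if (Physics.Raycast(ray, out RaycastHit hit))
        {
            // "DCSupply" is a placeholder tag for the supply object in the demo.
            if (hit.collider.CompareTag("DCSupply"))
                voltagePopup.SetActive(true);
        }
    }
}
```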
VII. ACKNOWLEDGMENT
This project received immense support from Dr. R. N. Awale; it would not have been possible to complete it without his support, valuable suggestions, encouragement, and guidance. The author is also grateful to all other teaching and non-teaching staff members of the Electrical Engineering department for directly or indirectly helping in the completion of this project and for the resources provided.
VIII. CONCLUSION
Building upon previous work done with hardware-based or hand-gesture-based controllers, the implementation shown in this paper makes use of AR virtual objects and virtual buttons, with only image targets, to interact with AR content and to implement interactions such as touching and moving virtual objects. No external hardware was used, the interaction gave the user a sense of depth, and the virtual buttons were very responsive. One of the big concerns with this type of implementation is the user experience of holding the image targets and reaching the buttons, which needs attention to create a better experience; improving virtual button sensitivity and preventing unintended activation of virtual buttons whenever a finger passes over them are also issues for future work. Such an implementation can be used to build large-scale AR applications for various fields; as seen in this paper, it is used to create an electronics simulator application of the kind usually found in computer simulation software used by electronics enthusiasts, such as Multisim. In the future a simulator could be built fully for mobile phone AR, making it easily available to a large audience with interaction that does not need external hardware. Such an interaction method can also be combined with touchscreen gestures and hand-based gestures to expand the interaction space further.
REFERENCES
[1] Khanna and V. M., "Augmented Reality Based IoT Controller," 2019 International Conference on Vision Towards Emerging Trends in Communication and Networking (ViTECoN), 2019.
[2] Rishi Vanukuru and Amarnath Murugan, "Dual Phone AR: Exploring the Use of Phones as Controllers for Mobile Augmented Reality," VRST '20: 26th ACM Symposium on Virtual Reality Software and Technology, November 2020.
[3] J. He et al., "The research and application of the augmented reality technology," 2017 IEEE 2nd Information Technology, Networking, Electronic and Automation Control Conference (ITNEC), 2017.
[4] Kim, E. A. Widjojo and J. Hwang, "Dynamic hierarchical virtual button-based hand interaction for wearable AR," 2015 IEEE Virtual Reality, 2015.
[5] Philipp Wacker, Oliver Nowak, Simon Voelker and Jan Borchers, "ARPen: Mid-Air Object Manipulation Techniques for a Bimanual AR System with Pen & Smartphone," CHI '19: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, 2019.
[6] Kim, M. Lorenz, S. Knopp and P. Klimant, "Industrial Augmented Reality: Concepts and User Interface Designs for Augmented Reality Maintenance Worker Support Systems," 2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), 2020.
[7] R. Budhiraja, G. A. Lee and M. Billinghurst, "Using a HHD with a HMD for mobile AR interaction," 2013 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2013.
[8] S. Martin et al., "Design of an Augmented Reality System for Immersive Learning of Digital Electronic," 2020 XIV Technologies Applied to Electronics Teaching Conference (TAEE), 2020.
[9] Umeda, M. A. Seif, H. Higa and Y. Kuniyoshi, "A medical training system using augmented reality," 2017 International Conference on Intelligent Informatics and Biomedical Sciences (ICIIBMS), 2017.
[10] Fengyuan Zhu and Tovi Grossman, "BISHARE: Exploring Bidirectional Interactions Between Smartphones and Head-Mounted Augmented Reality," CHI '20: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, April 2020.
[11] Zhenyi He and Xubo Yang, "Hand-based interaction for object manipulation with augmented reality glasses," VRCAI '14: Proceedings of the 13th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and its Applications in Industry, November 2014.
[12] D. Wolf, J. J. Dudley and P. O. Kristensson, "Performance Envelopes of in-Air Direct and Smartwatch Indirect Control for Head-Mounted Augmented Reality," 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 2018.
[13] https://developer.vuforia.com/
[14] https://unity.com/
[15] https://www.blender.org/
Copyright © 2022 Vishal Jangale, Dr. R. N. Awale. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Paper Id : IJRASET46961
Publish Date : 2022-10-02
ISSN : 2321-9653
Publisher Name : IJRASET