A new revolution is taking place in the field of human-computer access. Since the inception of the desktop, companies have spent millions, and continue to do so, developing highly attractive GUIs and state-of-the-art interaction systems for the typical user. The system presented here is intended for people with physical disabilities, who can operate it with their eyes alone. Most of us are fortunate enough to operate a computer easily with our hands; paralyzed patients, however, may have lost movement and speech while their brain and vision remain functional, so they cannot apply their intelligence and are left without access. The proposed eye-based system helps to resolve this problem.
I. INTRODUCTION
Eye tracking is a technique used in a range of fields, including neuroscience, psychology, cognitive science, human-computer interaction, and more. Its earliest use, however, focused on eye trackers as a means of studying the cognitive processes of the brain. Using an eye tracker as an input device for computer control is one of the less studied areas of research, and has focused primarily on helping people with motor impairments for whom hands-on alternatives are not available. Several eye-gaze tracking methodologies exist, but most of them are rather uncomfortable for the user. One of the more invasive techniques involves placing a contact lens containing a magnetic coil against the user's cornea and holding it in place with suction.
II. LITERATURE REVIEW
In the existing system, interaction between the computer and the human is carried out through eye tracking and blink detection. In this concept, a human-computer interface system tracks the direction of the human eye: the motion and direction of the iris are used to drive the interface by positioning the mouse cursor accordingly. Iris localization is performed in batch mode: the frames are stored on a permanent storage device and retrieved one by one, and each frame is processed to find the position of the iris and place the mouse cursor accordingly. Such a system, which detects the iris position from still images, provides an alternative input modality to support computer users with severe disabilities.
In this paper, a human-computer interface system using eye motion tracking is introduced. Traditionally, human-computer interfaces use the mouse and keyboard as input devices. The proposed vision-based virtual interface, however, controls the system through various eye movements such as blinking. This virtual multimodal interface provides a vision-based mechanism for communication between the human and the computer, instead of conventional interaction through mouse and keyboard. For motion tracking, eye recognition is explored through an optical flow technique, and to minimize errors caused by lighting variation, histogram equalization and max-min normalization are applied to every frame. The result is an innovative system for user-computer interaction based on the user's eye-gaze behavior.
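As an illustration of the per-frame preprocessing described above (histogram equalization followed by max-min normalization), a minimal Python/OpenCV sketch is given below. The function name and use of OpenCV/NumPy are assumptions for illustration, not the cited authors' implementation.

```python
# Illustrative sketch of the per-frame preprocessing described above.
# OpenCV/NumPy and the function name are assumptions, not the cited authors' code.
import cv2
import numpy as np

def preprocess_frame(frame_bgr):
    """Reduce lighting variation: grayscale -> histogram equalization -> max-min normalization."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    equalized = cv2.equalizeHist(gray)                        # spread intensities over the full range
    g = equalized.astype(np.float32)
    normalized = (g - g.min()) / (g.max() - g.min() + 1e-6)   # max-min normalization to [0, 1]
    return normalized
```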
In this paper, we briefly describe some representative studies in the field of eye tracking, covering different types of devices, algorithms for pupil detection, image processing and data filtering, as well as some well-known applications in assistive technology, human-computer interaction, virtual reality, psychology and e-learning. As a general tendency, we can conclude that eye-tracking approaches will remain a hot subject for researchers, as evidenced by established conferences, international projects, books, scientific papers and technical reports. For example, the Eye Tracking Research & Applications (ETRA) conference, held once every two years, brings together companies and researchers involved in eye-tracking technologies and highlights new hardware and software solutions. Among many other research groups, the Eye-Com Corporation is an advanced center for eye-tracking research and development dedicated to creating innovative eye-tracking technology to improve and save lives, support the advancement of research, and revolutionize human-technology interaction. Special attention should be paid to experimental procedures that evaluate the usability, accuracy and reliability of eye-tracking systems.
This research provides a system that triggers mouse movements to control an interface for people suffering from severe physical disabilities who cannot operate a computer with their hands. The system tracks eye movements efficiently and accurately using the pupil region and can detect eye blinks, whether voluntary or involuntary, tracking the eye region with 90% detection accuracy. The system has been extended to work in real time using recorded videos. It is purely non-intrusive, as no hardware device is attached to the human body, making it user-friendly and easy to configure. Some aspects of the system are still under experimental conditions and development, but the project proved an overall success and achieved the goals and requirements proposed in the system specification. Several aspects can form part of future work toward a more efficient and robust eye-tracking system: the system can be shifted from recorded videos to a live webcam feed with some modifications to make it a live system, it can be developed to detect eye gaze and act accordingly, and a mouse action can be triggered when a blink is detected, improving efficiency and making it a more dynamic system.
III. SYSTEM DIAGRAM
IV. ALGORITHM
The goal of the eye-tracking algorithm is first to locate the eyes of the user from an image and then use the location information to perform certain functions. Static images are retrieved from an image library and are used to initiate the system. In the first stage, an efficient image enhancement sharpening filter is employed. This is followed by a simple method to segment the eyes. Following this, an iris detection method is used to find the direction of the user’s gaze and finally the computed direction information of eye movements is used to drive the computer interface. Each step will be explained in detail in the following sections.
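To make the flow of the algorithm concrete, the sketch below composes the four stages described above into a single driver function. It is a minimal Python/OpenCV skeleton written for illustration; the stage functions are hypothetical placeholders (sketches for each are given in the corresponding subsections below), not the implementation reported in the paper.

```python
# High-level skeleton of the pipeline described above; the stage callables are
# hypothetical placeholders supplied by the caller, sketched in the subsections below.
import cv2

def run_pipeline(path, enhance, segment_eyes, detect_gaze, drive_interface):
    """Run the four stages described above on one static image from the library."""
    image = cv2.imread(path, cv2.IMREAD_GRAYSCALE)  # retrieve the input image
    sharpened = enhance(image)                      # Step 1: unsharp sharpening filter
    eyes = segment_eyes(sharpened)                  # Step 2: eye segmentation / boundary tracing
    lok = detect_gaze(eyes)                         # Step 3: iris detection -> gaze code (1=left, 2=right, 3=centre)
    drive_interface(lok)                            # Step 4: move or click the mouse cursor
```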
A. Step 1. Image Enhancement
The first step after retrieving the input image is to enhance it. This increases the image definition by improving contrast. In the presence of noise, sharpening and smoothing of the image are important pre-processing steps; they are the usual precursors to operations such as object recognition, edge detection, feature extraction and pattern recognition (Liu et al., 2002). Smoothing removes noise but typically also blurs edges, so to facilitate edge detection and similar processes, deblurring (sharpening) of the image is required. After experimenting with several enhancement schemes, it was found that the unsharp filter provided results closest to those desired. The unsharp filter is created from the negative of the Laplacian filter, and certain parameters are tuned to provide improved results.
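A minimal sketch of this sharpening step is given below, assuming the commonly used 3x3 unsharp kernel built from the negative of the Laplacian (as in MATLAB's fspecial('unsharp')). The alpha value is an illustrative tuning parameter, not one reported in the paper.

```python
# Sketch of the unsharp sharpening step; the kernel is the standard 3x3 unsharp
# filter derived from the negative Laplacian, and alpha is an assumed tuning value.
import cv2
import numpy as np

def enhance_image(gray, alpha=0.2):
    """Sharpen a grayscale image with an unsharp kernel (sums to 1, so brightness is preserved)."""
    k = np.array([[-alpha,    alpha - 1, -alpha   ],
                  [alpha - 1, alpha + 5, alpha - 1],
                  [-alpha,    alpha - 1, -alpha   ]], dtype=np.float32) / (alpha + 1)
    return cv2.filter2D(gray, -1, k)   # convolve; boosts edges and fine detail
```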
B. Step 2. Boundary Tracing
Tracing the boundaries of the eyes is important, since finding the outline of the eyes makes it computationally easier to localize the position of the irises. The eye boundaries in the binary image were found by tracing the exterior boundaries of objects as well as the boundaries of holes inside these objects. The boundaries of the outermost objects (parents) are traced along with their children.
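For illustration, the sketch below performs an equivalent parent/child boundary tracing with OpenCV's contour retrieval. The Otsu binarisation threshold and the function name are assumptions rather than details taken from the paper.

```python
# Illustrative boundary tracing via OpenCV contour retrieval with hierarchy;
# the binarisation method is an assumption, not a value from the paper.
import cv2

def trace_eye_boundaries(gray):
    """Binarise the image, then trace outer boundaries (parents) and hole boundaries (children)."""
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, hierarchy = cv2.findContours(binary, cv2.RETR_CCOMP, cv2.CHAIN_APPROX_NONE)
    # hierarchy[0][i] = [next, previous, first_child, parent]; parent == -1 marks an outermost boundary
    parents  = [c for c, h in zip(contours, hierarchy[0]) if h[3] == -1]
    children = [c for c, h in zip(contours, hierarchy[0]) if h[3] != -1]
    return parents, children
```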
C. Step 3. Iris and Pupil Detection
Several calculations were performed on both cropped images in order to detect the actual position of the iris, which in turn indicates the direction in which the user is looking. Eight parameters were calculated, namely: (min_x, y_min_x, max_x, y_max_x, min_y, x_min_y, max_y, x_max_y).
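The sketch below shows one way these eight extreme-point parameters could be computed from an eye or iris contour (an OpenCV contour of shape (N, 1, 2)); the function name is illustrative, not from the paper.

```python
# Extracts the eight extreme-point parameters listed above from a contour.
import numpy as np

def extreme_points(contour):
    """Return (min_x, y_min_x, max_x, y_max_x, min_y, x_min_y, max_y, x_max_y)."""
    pts = contour.reshape(-1, 2)                              # columns: x, y
    i_min_x, i_max_x = pts[:, 0].argmin(), pts[:, 0].argmax()
    i_min_y, i_max_y = pts[:, 1].argmin(), pts[:, 1].argmax()
    min_x, y_min_x = pts[i_min_x]                             # leftmost point and its y
    max_x, y_max_x = pts[i_max_x]                             # rightmost point and its y
    x_min_y, min_y = pts[i_min_y]                             # topmost point and its x
    x_max_y, max_y = pts[i_max_y]                             # bottommost point and its x
    return min_x, y_min_x, max_x, y_max_x, min_y, x_min_y, max_y, x_max_y
```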
D. Step 4. Driving the Interface
Mouse events were triggered based on the calculated value of the lok variable. All mouse events were generated in Visual C. When the value of lok is 1, the cursor moves to the left. Similarly, when the value of lok is 2, the cursor moves to the right. When the user is looking straight ahead, i.e. the iris is in the center and lok is 3, a mouse click is generated at the current position of the cursor.
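The paper generates these mouse events in Visual C; purely for illustration, the sketch below mirrors the same lok logic in Python using pyautogui. The 20-pixel step size is an assumption, not a value from the paper.

```python
# Illustrative Python equivalent of the lok-driven mouse events (the paper used Visual C).
import pyautogui

def drive_interface(lok, step=20):
    """Move or click the cursor according to the computed gaze code."""
    if lok == 1:
        pyautogui.moveRel(-step, 0)   # looking left  -> cursor moves left
    elif lok == 2:
        pyautogui.moveRel(step, 0)    # looking right -> cursor moves right
    elif lok == 3:
        pyautogui.click()             # looking straight ahead -> click at current position
```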
V. CONCLUSION
In this paper, a comprehensive study of gaze-based interaction processes has been carried out and a working system has been implemented in which the mouse pointer is operated using the eyes. We have successfully developed a low-cost system that is affordable for the majority of physically challenged users. This makes the interaction more efficient and enjoyable.
REFERENCES
[1] P. Tangade, S. Musale, G. Pasalkar, M. D. Umale, and S. S. Awate, "A Review Paper on Mouse Pointer Movement Using Eye Tracking System and Voice Recognition," Computer Engineering, DCOER, Pune, India.
[2] S. Azam, A. Khan, and M. S. H. Khiyal, "Design and Implementation of Human Computer Interface Tracking System Based on Multiple Eye Features," Journal of Theoretical and Applied Information Technology (JATIT), vol. 9, no. 2, Nov. 2009.
[3] C. Hennessey and J. Fiset, "Long Range Eye Tracking: Bringing Eye Tracking into the Living Room," IEEE, 2012.
[4] S. Naveed, B. Sikander, and M. S. H. Khiyal, "Eye Tracking System with Blink Detection."
[5] Y. Chen, "Design and Evaluation of a Human-Computer Interface Based on Electro-oculography," 2003, unpublished. URL: vorlon.case.edu/~wsn/theses/yingxichen_thesis.pdf
[6] A. Villanueva, R. Cabeza, and S. Porta, "Eye Tracking System Model With Easy Calibration," IEEE, 2011.
[7] A. E. Kaufman, A. Bandopadhay, and B. D. Shaviv, "An Eye Tracking Computer User Interface," IEEE, 2011.
[8] T. Ohno, N. Mukawa, and S. Kawato, "Just Blink Your Eyes: A Head-Free Gaze Tracking System," IEEE, 2011.
[9] M. Betke, J. Gips, and P. Fleming, "The Camera Mouse: Visual Tracking of Body Features to Provide Computer Access for People With Severe Disabilities," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 10, no. 1, March 2008.