In today's world, when maintaining classes and scheduling time for students in each subject is hard to do dynamically, colleges cannot give their full effort because student attendance data is not well organized, and so they are unable to offer guest lectures, external workshops, and other extracurricular activities to the fullest extent possible. Doing this task manually makes the management very time consuming and difficult. Algorithms such as HOG, CNN, Fisherfaces, and Eigenfaces are among the many methods in use today, and our system uses them to count the number of students in a digital format. Our system identifies each person entering a class and counts the total number of persons present in the classroom. The count is stored in that person's profile or attendance ledger, and we use Google Cloud Platform as our database. We use a Raspberry Pi, which makes the system portable, so it is easy to set it up anywhere in the classroom. The Raspberry Pi is connected to the college Wi-Fi or Ethernet so that our real-time system sends an email to each person who is present; if a person enters the class late, an email informs them that they are late. If a person wants more detail about their attendance, they can check our Android app, which shows the number of days present, the attendance percentage, and the specific days on which they were present or absent.
I. INTRODUCTION
Many studies have shown that the attendance of students in universities has followed a falling trend over the years [1]. The factors behind poor attendance include students disliking a teacher's teaching style, a lack of interest in attending classes, and competing commitments such as part-time jobs or the availability of online course material [1]. Our system therefore aims to solve the problem of poor attendance management, for example by taking student attendance automatically [3].
A. Evolution of Face Recognition
The first attempt at face recognition was made in the 1960s with a semi-automated system, in which photographs were manually marked to locate major features such as the eyes (iris), lips, and nose [2]. The main problem was selecting, from the existing records, the portion to match and verify against the photographic image.
Another approach sought to classify the human face using a combination of gestures and identifying markers [9], but it requires a large number of training faces to achieve decent accuracy. Fischler and Elschlager's approach [8] was to measure different pieces of the face and map them onto a global template; its main problem is that the chosen features cannot adequately represent an adult face. The first fully automated system [10] used very general pattern recognition: it compared faces to a generic face model and created a pattern. This approach failed because it is purely statistical and relies on histograms of grey-scale values [5].
II. SYSTEM OVERVIEW
Our system uses the HOG algorithm for face recognition [6]. It takes the input face from an image frame, converts the region of interest to grey scale, and then uses the HOG algorithm to detect the presence of a face and determine its identity. The method involves the following steps. In the first step, the system is initialized with a set of training faces. When a face is detected, the HOG algorithm checks the identity for that face: it compares the grey-scale image of the detected face against the stored face images in the dataset and determines whether the face is recognized. In the second (optional) step, if a face is detected but its identity is not stored in the database, the person is labelled as "Unknown" [4][6][7].
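A minimal sketch of this recognition flow, assuming the open-source face_recognition library (which provides a HOG-based detector); the file names and the matching tolerance below are illustrative assumptions, not values from our system.

import face_recognition

# Step 1: initialize with a known (training) face from the dataset folder.
known_image = face_recognition.load_image_file("known_face_photos/alice.jpg")  # assumed path
known_encoding = face_recognition.face_encodings(known_image)[0]

# Detect faces in the current camera frame using the HOG model.
frame = face_recognition.load_image_file("current_frame.jpg")  # assumed path
locations = face_recognition.face_locations(frame, model="hog")
encodings = face_recognition.face_encodings(frame, locations)

# Step 2: compare each detected face with the stored identity;
# an unmatched face is labelled "Unknown".
for encoding in encodings:
    match = face_recognition.compare_faces([known_encoding], encoding, tolerance=0.6)
    name = "alice" if match[0] else "Unknown"
    print(name)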
A. Components
The main components used in the system are the open-source OpenCV library; Python 3.8 or above (for the main source code); Google Cloud Platform, on which we create two APIs, the Google Drive API for data storage and the Google Sheets API for displaying data in a table such as a spreadsheet, together with a JSON file holding the credentials of both APIs, which is linked in our code; an Android app in which people can view their attendance live; and a Raspberry Pi 4B connected to a webcam or Pi Camera to capture a frame of each person entering the classroom.
B. Process
The first step is for the student to register in our database. The system captures the student's face image, creates an encoding file used in the recognition stage, registers the student's email ID, and generates an OTP code for using our Android app (the OTP is sent to the registered email ID). Once registration is complete, whenever the user comes near the system it recognizes them: if the user arrives on or before the scheduled time of a lecture or meeting, an email is sent saying they are marked present; if the user arrives after the scheduled time, an email is sent saying they are marked absent. If the user wants a detailed status of their attendance, they can check our Android app, which shows the attendance percentage, the number of days present, and which days they were present or absent. If the user does not attend the lecture or meeting at all, the system automatically marks them absent.
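A minimal sketch of the on-time/late decision and the notification email described above; the lecture start time, SMTP server, and credentials are illustrative assumptions, not values from our system.

import smtplib
from datetime import datetime, time
from email.message import EmailMessage

LECTURE_START = time(9, 0)  # assumed scheduled lecture time

def mark_and_notify(name, user_email, sender, password):
    # Present if on or before the scheduled time, otherwise marked absent (late).
    status = "present" if datetime.now().time() <= LECTURE_START else "absent (late)"
    msg = EmailMessage()
    msg["Subject"] = f"Attendance: you are marked {status}"
    msg["From"] = sender
    msg["To"] = user_email
    msg.set_content(f"Hello {name}, you have been marked {status} today.")
    # Gmail SMTP is an assumption; any SMTP server works the same way.
    with smtplib.SMTP_SSL("smtp.gmail.com", 465) as server:
        server.login(sender, password)
        server.send_message(msg)
    return status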
C. Algorithm
HOG is a simple and powerful feature descriptor. It is used not only for face detection but also widely for object detection, for example of cars, pets, and fruits. HOG is robust for object detection because object shape is characterized by the local distribution of intensity gradients and edge directions [4][6][7]. The matching procedure, sketched in code after the steps below, is as follows:
Step 1: Convert the image to greyscale.
Step 2: Divide the image into small block gradients.
Step 3: Match the block gradients against the stored block gradients of the dataset images.
Step 4: If the image matches, return 1; otherwise return 0.
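A minimal sketch of these four steps using OpenCV's HOGDescriptor; the 64x128 window size and the distance threshold are illustrative assumptions rather than values tuned for our system.

import cv2
import numpy as np

hog = cv2.HOGDescriptor()  # default 64x128 window, 9 orientation bins

def hog_match(query_bgr, stored_bgr, threshold=1.0):
    # Step 1: convert both images to greyscale.
    q = cv2.cvtColor(query_bgr, cv2.COLOR_BGR2GRAY)
    s = cv2.cvtColor(stored_bgr, cv2.COLOR_BGR2GRAY)
    # Resize to the descriptor's window so the block layout matches.
    q = cv2.resize(q, (64, 128))
    s = cv2.resize(s, (64, 128))
    # Steps 2-3: compute block-gradient histograms and compare them.
    q_desc = hog.compute(q)
    s_desc = hog.compute(s)
    distance = np.linalg.norm(q_desc - s_desc)
    # Step 4: return 1 on a match, 0 otherwise.
    return 1 if distance < threshold else 0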
III. SYSTEM IMPLEMENTATION
The implementation of the proposed system involves the following steps:
Load the face images stored in our dataset and save their details in an encoding file in our encoding folder.
Detect and recognize the face that appears in the camera frame.
Compare the real-time image with our dataset images; if no image matches, the person is labelled as "Unknown".
Store the name of the person in our database, i.e., Google Cloud Platform (see the sketch after this list).
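A minimal sketch of the last step, assuming the gspread client for the Google Sheets API authenticated with the service-account JSON credentials file; the credentials file name, spreadsheet name, and column layout are illustrative assumptions.

import gspread
from datetime import date

# Authenticate with the service-account JSON that holds the API credentials.
client = gspread.service_account(filename="credentials.json")  # assumed file name
sheet = client.open("Attendance").sheet1  # assumed spreadsheet name

def store_attendance(name, present):
    # Append one row per recognition event: name, date, present/absent.
    status = "Present" if present else "Absent"
    sheet.append_row([name, date.today().isoformat(), status])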
A. Face Detection And Extraction
The dataset.py file generates the user dataset when a user is not already present in the existing dataset. It calls enroll.py, which processes the person's face image, email, name, and face encoding: the face photos are stored in the "known_face_photos" folder and the corresponding encoding files in the "known_face_encodings" folder. The newly registered user then receives an email OTP, sent by emailing.py, for the registration window of our Android app.
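A minimal enrollment sketch matching the description above, assuming the face_recognition library and NumPy for the encoding files; the six-digit OTP and the exact file naming are illustrative assumptions, and sending the OTP is delegated to an emailing.py-style helper.

import os
import shutil
import secrets
import numpy as np
import face_recognition

def enroll(name, email, photo_path):
    # Compute the face encoding from the enrollment photo.
    image = face_recognition.load_image_file(photo_path)
    encoding = face_recognition.face_encodings(image)[0]

    # Store the photo and its encoding in the two dataset folders.
    os.makedirs("known_face_photos", exist_ok=True)
    os.makedirs("known_face_encodings", exist_ok=True)
    shutil.copy(photo_path, os.path.join("known_face_photos", f"{name}.jpg"))
    np.save(os.path.join("known_face_encodings", f"{name}.npy"), encoding)

    # Generate a 6-digit OTP for the Android app registration window.
    otp = f"{secrets.randbelow(10**6):06d}"
    # emailing.send_otp(email, otp)  # hypothetical call to the emailing.py helper
    return otp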
The main recognition and attendance-sheet updating is done in the main.py file. It uses recognition.py to detect the user's face, and spreadsheet.py to update the user's attendance: if recognition.py returns 1, the attendance is marked present; if it returns 0, the attendance is marked absent. In either case, the status is sent to the user by email from emailing.py.
B. Recognition and Identification
In OpenCV, we have a function called run_recognition(), which implements the recognition of faces. It has three steps, two of which are already done: loading the face images and projecting them onto the subspace. The load_facial_encodings_and_names_from_memory() function loads the stored face encodings into an array. cv2.rectangle draws a box around a detected face, and cv2.putText draws a label with the name below the face. This is done by capturing a frame from the camera module or webcam and comparing the image against our stored dataset images; after successfully recognizing the face, the system emails the user with the status of today's attendance.
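A minimal sketch of such a run_recognition() loop, assuming OpenCV's VideoCapture together with the face_recognition library; the camera index, colours, font settings, and the point where the attendance/email updates would be triggered are illustrative assumptions.

import cv2
import face_recognition

def run_recognition(known_encodings, known_names):
    video = cv2.VideoCapture(0)  # webcam or Pi Camera
    while True:
        ok, frame = video.read()
        if not ok:
            break
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        locations = face_recognition.face_locations(rgb, model="hog")
        encodings = face_recognition.face_encodings(rgb, locations)
        for (top, right, bottom, left), encoding in zip(locations, encodings):
            matches = face_recognition.compare_faces(known_encodings, encoding)
            name = "Unknown"
            if True in matches:
                name = known_names[matches.index(True)]
                # spreadsheet update and email notification would be triggered here
            # Draw a box around the face and a label with the name below it.
            cv2.rectangle(frame, (left, top), (right, bottom), (0, 255, 0), 2)
            cv2.putText(frame, name, (left, bottom + 20),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
        cv2.imshow("Attendance", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    video.release()
    cv2.destroyAllWindows()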
IV. ANALYSIS AND RESULTS
A. Analysis
The analysis process involves the following steps:
Step 1: Face Detection and Extraction: images are captured with the help of the webcam on the user side.
Start: The captured image is processed and its features extracted. The HOG algorithm compares the real-time image with the existing face images stored in the database. If the face matches an existing face in our dataset, the recognition step is complete; otherwise the person is labelled "Unknown".
End
B. Results
The results of the analysis process are presented here.
V. FUTURE SCOPE
Today, along with drones, AI, and IoT, facial recognition technology is also defining our millennium. Facial recognition is a biometric technology used for authentication and examination of individuals by correlating the facial features in an image with a stored facial database. Face recognition is one of the most popular applications of image analysis software and is no longer considered a subject of science fiction. Earlier, this technology was used only for security and surveillance purposes, but in recent times it has safely transitioned into the real world. Today, companies are pitching facial recognition software as the future of everything from retail to policing. A time will come when our faces will be our ID cards; with the advent of facial recognition technology, that time is already here.
VI. CONCLUSION
Google Cloud Platform is used for maintaining and storing student data for the respective college or university. The HOG algorithm serves as our base for identifying faces in the video. Student attendance is updated in a Google Sheet, which helps maintain attendance records automatically; using our system reduces faculty effort and helps manage time effectively. The system records attendance for a particular amount of time, and once that time expires it marks the remaining students absent and sends them a late message. The results show improved performance in the estimation of attendance compared to the traditional pen-and-paper attendance system.
REFERENCES
[1] J. Mehariya, C. Gupta, N. Pai, S. Koul and P. Gadakh, "Counting Students using OpenCV and Integration with Firebase for Classroom Allocation," 2020 International Conference on Electronics and Sustainable Communication Systems (ICESC), 2020, pp. 624-629.
[2] L. Li, X. Mu, S. Li and H. Peng, "A Review of Face Recognition Technology," IEEE Access, vol. 8, pp. 139110-139120, 2020.
[3] N. R. Borkar and S. Kuwelkar, "Real-time implementation of face recognition system," 2017 International Conference on Computing Methodologies and Communication (ICCMC), 2017, pp. 249-255.
[4] M. Khan, S. Chakraborty, R. Astya and S. Khepra, "Face Detection and Recognition Using OpenCV," 2019 International Conference on Computing, Communication, and Intelligent Systems (ICCCIS), 2019, pp. 116-119.
[5] Chenggang Zhen and Yingmei Su, "Research about human face recognition technology," 2009 International Conference on Test and Measurement, 2009, pp. 420-422.
[6] M. Sahu and R. Dash, "Study on Face Recognition Techniques," 2020 International Conference on Communication and Signal Processing (ICCSP), 2020, pp. 0613-0616.
[7] C. Beumier, "3D Face Recognition," 2006 IEEE International Conference on Industrial Technology, 2006, pp. 369-374.
[8] M. A. Fischler and R. A. Elschlager, "The Representation and Matching of Pictorial Structures," IEEE Transactions on Computers, vol. C-22, pp. 67-92, 1973.
[9] S. S. R. Abidi, "Simulating evolution: connectionist metaphors for studying human cognitive behaviour," in Proc. TENCON 2000, vol. 1, pp. 167-173, 2000.
[10] Y. Cui, J. S. Jin, S. Luo, M. Park, and S. S. L. Au, "Automated Pattern Recognition and Defect Inspection System," in Proc. 5th International Conference on Computer Vision and Graphical Image, vol. 59, pp. 768-773, May 1992.