Ijraset Journal For Research in Applied Science and Engineering Technology
Authors: Viruj Thakur, Anuj Maheshwari, Shuchi Sharma
DOI Link: https://doi.org/10.22214/ijraset.2023.57772
This research explores the development of an innovative culinary solution, the Ingredient-Inspired Recipe Recommender. Focused on addressing the global challenge of food wastage, exacerbated by both consumer habits and quality deterioration, our study employs advanced deep learning architectures. A key aspect involves a detailed comparison of ResNet models (ResNet-50, ResNet-101, and ResNet-152) alongside alternative architectures such as DenseNet, VGG, and xResNet. The objective is to identify the most effective neural network for accurately recognizing ingredients and generating insightful recipe recommendations. In response to the contemporary issues of rushed lifestyles, reliance on processed food, and insufficient attention to nutrition, our research aims to empower individuals with a practical, technology-driven tool. By seamlessly integrating deep learning into the culinary landscape, the Ingredient-Inspired Recipe Recommender suggests personalized recipes based on available ingredients, fostering healthier eating habits and contributing to a reduction in food wastage. This paper presents the methodology employed for the comparative analysis, reports experimental results, and discusses broader implications within the realms of food sustainability and technological innovation. In addressing the evolving needs of individuals, our research strives to align technology and gastronomy for positive environmental and societal impact.
I. INTRODUCTION
In a world characterized by an abundance of culinary choices, the intersection of technology and gastronomy presents a unique opportunity to address challenges associated with food wastage and quality degradation. The rapid expansion of digital platforms has changed how people interact, exchange, and discover knowledge around food, creating new opportunities for creative thinking. The main topic of this research article is the creation of an intelligent system, an Ingredient-Inspired Recipe Recommender, intended to generate customized recipes based on ingredients that are readily available. As a pivotal phase in our project, we conduct a thorough comparison of ResNet variants (ResNet-50, ResNet-101, and ResNet-152) to determine which neural network is best for feature extraction in the context of food image recognition. Food waste is a grave worldwide problem that has attracted increasing attention recently. According to the UNEP's Food Waste Index Report, roughly 17% of the food available to consumers is wasted, much of it in households [1]. Concurrently, this problem is made worse by the deterioration in food quality brought on by extended storage, incorrect handling, and ineffective use. To address these problems, our work attempts to improve the effectiveness of ingredient-based recipe recommendations by utilizing cutting-edge deep learning architectures. ResNet, short for Residual Network, has become a potent instrument for image recognition tasks, demonstrating its capabilities across multiple fields such as computer vision and object detection. The ResNet family, characterized by its residual learning framework, has three prominent variants: ResNet-50, ResNet-101, and ResNet-152, which differ in depth and complexity. This work examines how well these architectures perform in the context of food image recognition, assessing how reliably they can identify ingredients and thereby facilitate insightful recipe suggestions.
Extending the comparison, the ResNet architectures have also been evaluated against other counterparts such as DenseNet, VGG, and xResNet. By selecting the optimal neural network architecture for our recipe recommender system, we aim to enhance the accuracy and reliability of ingredient recognition, ultimately contributing to the reduction of food wastage. The seamless integration of technology into the culinary landscape not only addresses the challenges posed by excessive food loss but also empowers individuals to make informed choices about their meals, promoting sustainability and responsible consumption. In the subsequent sections of this paper, we delve into the methodology employed for our comparative analysis, present the experimental results, and discuss the implications of our findings in the broader context of food sustainability and technological innovation. Through this research, we strive to contribute to the ongoing discourse on leveraging artificial intelligence for positive environmental and societal impact.
II. PROBLEM STATEMENT
In the contemporary, fast-paced lifestyles that individuals lead, the conscientious consideration of dietary needs often takes a backseat. The modern world is marked by hectic schedules, time constraints, and an abundance of convenience-driven food choices. Amidst this hurried pace, there exists a concerning disconnect between individuals and their nutritional requirements. Therefore, dietary habits are frequently compromised, leading to an array of challenges such as increased reliance on processed foods, insufficient attention to ingredient quality, and a growing prevalence of food wastage. The rush of daily life often results in limited time for thoughtful meal planning and ingredient selection, contributing to a reliance on pre-packaged or easily accessible meals that may not align with optimal nutritional standards. This shift in dietary patterns not only jeopardizes individual health and well-being but also amplifies broader issues such as the perpetuation of unsustainable consumption practices and the exacerbation of global food wastage. Addressing this problem requires innovative solutions that seamlessly integrate into individuals' busy lives, providing them with tools to make informed and health-conscious dietary choices. The development of an Ingredient-Inspired Recipe Recommender serves as a strategic response to this challenge, aiming to bridge the gap between hectic lifestyles and nutritional mindfulness. By harnessing the power of deep learning, particularly through the comparison of ResNet architectures, our research endeavors to empower individuals with an intelligent system that suggests personalized recipes based on available ingredients, thereby promoting healthier eating habits and contributing to the reduction of food wastage. Through this, we strive to align the intersection of technology and gastronomy in a manner that addresses the evolving needs of individuals within the fast-paced dynamics of contemporary living.
III. PREVIOUS RESEARCH
As a similar piece of work, Maruyama et al. [2], developed a mobile system for Android and iPhones that employs bag-of-features with SURF and colour histogram for food ingredient recognition. The system, utilizing linear kernel SVM and the one-vs-rest strategy, allows users to instantly receive recipe suggestions based on photographed ingredients during grocery shopping or meal preparation. Rodrigues et al. [3], developed RecipeIS, which employed the use of ResNet-50 as their base neural network for training their dataset. In their case, the pre-trained model ResNet-50 was used due to its frequent use in scientific articles and being one of the most used models when the objective is the classification or recognition of a given image.
IV. METHODOLOGY
The work is divided into two portions. The first portion aims to recognize food ingredients using a given dataset. The second portion is a recipe recommendation based on the food ingredients identified in the first portion.
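The paper does not detail the matching logic of the second portion; as a minimal sketch, recipes can be ranked by how many of their required ingredients were recognized in the first portion. The recipe table below is illustrative, not the system's actual database:

```python
def recommend(recognized, recipes, top_k=3):
    """Rank recipes by the fraction of their required ingredients that were
    recognized in the image-recognition stage.

    recognized: set of ingredient labels produced by the classifier.
    recipes: mapping of recipe name -> set of required ingredients.
    Returns the top_k (name, coverage) pairs, best match first.
    """
    scored = []
    for name, needed in recipes.items():
        coverage = len(recognized & needed) / len(needed)  # 1.0 = fully covered
        scored.append((name, coverage))
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:top_k]

# Illustrative data only; the real system draws on a larger recipe database.
recipes = {
    "aloo gobi": {"potato", "cauliflower", "onion", "turmeric"},
    "dal tadka": {"lentils", "onion", "tomato", "garlic"},
    "jeera rice": {"rice", "cumin"},
}
print(recommend({"potato", "onion", "tomato", "garlic"}, recipes))
```

A coverage-based score like this favors recipes the user can cook immediately with what is on hand, which matches the system's food-waste-reduction goal.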
A. Proposed Model for Ingredient Identification
For the subsequent phase of food ingredient identification, the ResNet-101 convolutional neural network (CNN) architecture was chosen. ResNet-101 is an extension of ResNet-50, designed with a deeper structure for enhanced feature representation. While ResNet-50 employs 50 layers, ResNet-101 incorporates 101 layers, providing a more intricate network for improved classification and recognition tasks. Like ResNet-50, the ResNet-101 architecture comprises a series of convolutional blocks organized into stages, each featuring identity and convolution blocks. The identity block, a fundamental component of ResNet, preserves the input size through the block, while the convolution block adjusts sizes when necessary. The key innovation in ResNet, the skip connection, remains integral in ResNet-101: the original input is added to the output of the convolution block, aiding gradient flow and alleviating the vanishing gradient problem. As the network progresses through stages, the number of filters doubles while the spatial dimensions halve, culminating in an Average Pooling layer. The final layers consist of a Fully Connected layer with a softmax activation function, comprising 1000 neurons in the standard ImageNet configuration. ResNet-101, with its extended depth, has approximately 44 million trainable parameters, offering a richer representation of learned features. Throughout the experimentation phase, ResNet-101 demonstrated its potential to capture complex features, and its performance is further elucidated in the results section using the designated dataset.
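The skip connection described above can be made concrete with a NumPy sketch of a bottleneck block. This is an illustrative simplification (1x1 convolutions only, no batch normalization or strides), not the fastai/PyTorch implementation used in the project:

```python
import numpy as np

def conv1x1(x, w):
    # A 1x1 convolution is a per-pixel matrix multiply: (H, W, C_in) @ (C_in, C_out)
    return x @ w

def relu(x):
    return np.maximum(x, 0.0)

def bottleneck_block(x, w_reduce, w_conv, w_expand, w_proj=None):
    """One simplified ResNet bottleneck block: reduce -> transform -> expand.

    The skip connection adds the input back to the block's output. When channel
    counts differ, a 1x1 projection (w_proj) reshapes the identity path, as in
    ResNet's 'convolution block'; with w_proj=None it is an 'identity block'.
    """
    out = relu(conv1x1(x, w_reduce))   # reduce channels
    out = relu(conv1x1(out, w_conv))   # main transform (a 3x3 conv in real ResNet)
    out = conv1x1(out, w_expand)       # expand channels back
    identity = x if w_proj is None else conv1x1(x, w_proj)
    return relu(out + identity)        # residual addition eases gradient flow
```

Because the identity path bypasses the transform entirely, gradients can flow through the addition unchanged, which is what lets 101-layer networks train without vanishing gradients.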
B. Dataset
Our dataset is meticulously crafted with a total of 114 distinct ingredient classes, each represented by 80 to 100 images. This deliberate curation ensures a large and diverse dataset crucial for the effective training of our AI algorithm. The inclusion of a substantial number of classes and images helps the model learn the intricate patterns and variations inherent in ingredient recognition. The dataset's diversity is a key strength, exemplified by the incorporation of images with varying resolutions.
This intentional variation mirrors the real-world conditions the model is likely to encounter, enhancing its adaptability and robustness. The inclusion of images with diverse resolutions is particularly relevant for our application, ensuring that the model can effectively recognize ingredients in various contexts and under different imaging conditions. Significantly, our dataset is strategically tailored to focus on ingredients commonly found in Indian households. This deliberate choice not only captures the nuances of culinary preferences prevalent in the region but also aligns seamlessly with the specific objectives of our proposed system for recipe recommendation. By emphasizing ingredients relevant to Indian cuisine, the dataset optimally prepares the model to cater to the unique culinary landscape and preferences of individuals in this cultural context. In essence, this comprehensive dataset serves as the cornerstone of our research. Its thoughtful composition and emphasis on diversity, resolution variation, and cultural specificity lay the foundation for developing a robust and accurate ingredient-inspired recipe recommender system. This tailored approach ensures that our AI model is finely tuned to the intricacies of Indian culinary practices, meeting the distinct needs of users in this specific cultural context.
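As a small illustration of how such a dataset can be sanity-checked before training, the helpers below count images per class and flag classes outside the intended 80-100 range. The one-folder-per-class layout and helper names are assumptions, since the paper does not describe its storage format:

```python
from collections import Counter
from pathlib import Path

IMAGE_EXTS = {".jpg", ".jpeg", ".png"}

def class_counts(root):
    """Count images per ingredient class, assuming the common layout of one
    sub-folder per class whose name is the label."""
    counts = Counter()
    for path in Path(root).rglob("*"):
        if path.suffix.lower() in IMAGE_EXTS:
            counts[path.parent.name] += 1
    return counts

def check_balance(counts, lo=80, hi=100):
    """Return the classes whose image count falls outside [lo, hi]."""
    return {name: n for name, n in counts.items() if not lo <= n <= hi}
```

Running `check_balance(class_counts("dataset/"))` before training confirms the 114-class, 80-100-images-per-class balance the model relies on.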
C. Application
Our recipe recommendation system comes to life through a sophisticated cross-platform application, expertly crafted using the Flutter [4] framework. Recognized for its prowess in delivering a consistent user experience across Android and iOS platforms, Flutter forms the backbone of our user interface, providing a seamless and user-friendly design. The frontend, thoughtfully designed with Flutter's versatile widgets, ensures an immersive and responsive user experience. To establish a robust link between the frontend and our potent deep learning model, we seamlessly integrate Flask [5], a widely used backend framework built on Python. Flask orchestrates efficient communication, allowing the Flutter frontend to seamlessly interact with the underlying recommendation engine. This synergy between Flutter and Flask ensures a fluid user experience, enabling real-time interactions with our intelligent recipe recommender system. Our machine learning model, trained for ingredient recognition, leverages the computational efficiency of Google Colab's [6] T4 GPU runtime.
The T4 GPU acceleration expedites model training, ensuring swift convergence and heightened learning capabilities from the dataset. Incorporating the prowess of Fastai [7], our deep learning model is underpinned by a pre-trained ResNet architecture. Fastai, a high-level deep learning library, simplifies complex tasks, making it instrumental in implementing and fine-tuning state-of-the-art models.
Within Fastai, the ResNet model is available in various versions, each offering distinct depths such as ResNet-50, ResNet-101, and ResNet-152. Our choice of the ResNet-101 model underscores its optimal balance between depth and computational efficiency, enhancing the accuracy of ingredient recognition. The communication channel between the Flutter frontend and Flask backend relies on APIs, acting as pivotal interfaces for seamless data exchange. These APIs facilitate the flow of user inputs to the backend, triggering the model to generate personalized recipe recommendations. The integration of APIs plays a crucial role in enhancing the responsiveness of our application, ensuring users receive dynamic recipe suggestions in real-time based on their ingredient inputs. In summary, our application seamlessly combines Flutter's cross-platform capabilities, Flask's backend robustness, and the computational efficiency of Google Colab's T4 GPU runtime. The incorporation of Fastai's pre-trained ResNet model adds a layer of sophistication to our deep learning capabilities, contributing to the overall responsiveness and effectiveness of our recipe recommender system. APIs play a pivotal role in orchestrating seamless data exchange, making our application a holistic and powerful tool for personalized recipe recommendations.
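The paper does not publish its API contract; a minimal sketch of the kind of Flask route involved might look like the following, where the endpoint path, the JSON fields, and the stubbed recognizer are all hypothetical:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical stand-in for the trained model; in the real system this would
# run ResNet-101 inference on the uploaded image via the fastai learner.
def recognize_ingredients(image_bytes):
    return ["tomato", "onion"]

@app.route("/recommend", methods=["POST"])
def recommend_endpoint():
    image = request.files.get("image")
    if image is None:
        return jsonify({"error": "no image uploaded"}), 400
    ingredients = recognize_ingredients(image.read())
    # The full system would match these labels against a recipe database;
    # here the recognized labels are simply echoed back.
    return jsonify({"ingredients": ingredients})
```

The Flutter client would post the photographed ingredients as multipart form data and render the returned JSON, giving the real-time interaction loop described above.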
VI. DISCUSSION
A. Precision, Recall and F1 Score
1. ResNet-50: At Epochs = 20, improved with precision reaching 0.844730, recall at 0.839704, and an F1 score of 0.837489. The improvement suggests that the model is continuing to learn and refine its ability to correctly identify ingredients.
2. ResNet-152: At Epochs = 5, achieved a precision of 0.847160, recall of 0.841654, and an F1 score of 0.840861. ResNet-152 starts with high precision and recall, indicating its initial effectiveness.
At Epochs = 10, improved with precision reaching 0.857323, recall at 0.852574, and an F1 score of 0.851366. While there is improvement, it is marginal, and the complexity of ResNet-152 may not provide benefits proportional to its computational cost.
3. ResNet-101: At Epochs = 10, achieved a precision of 0.846647, recall of 0.843214, and an F1 score of 0.841355. The high precision and recall at this stage indicate that ResNet-101 effectively identifies relevant instances while minimizing false positives.
At Epochs = 20, showed further improvement with precision reaching 0.853080, recall at 0.847504, and an F1 score of 0.846169. The ongoing gains demonstrate the model's ability to fine-tune its performance while maintaining a balance between precision and recall.
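The metrics above follow the standard definitions; the sketch below shows how they are computed from predictions. The paper does not state its averaging scheme, so per-class one-vs-rest counting with macro-averaging is an assumption:

```python
def precision_recall_f1(true, pred, positive):
    """Per-class precision, recall, and F1, treating `positive` as the class
    of interest (one-vs-rest counting)."""
    tp = sum(1 for t, p in zip(true, pred) if p == positive and t == positive)
    fp = sum(1 for t, p in zip(true, pred) if p == positive and t != positive)
    fn = sum(1 for t, p in zip(true, pred) if p != positive and t == positive)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

def macro_average(true, pred):
    """Average the per-class scores over all classes, yielding the single
    precision/recall/F1 triple reported per model."""
    classes = sorted(set(true))
    scores = [precision_recall_f1(true, pred, c) for c in classes]
    return tuple(sum(s[i] for s in scores) / len(classes) for i in range(3))
```

With 114 ingredient classes, macro-averaging weights every class equally, so rare ingredients count as much as common ones in the reported figures.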
B. Accuracy and Error Rate
1. ResNet-50: At Epochs = 20, improved accuracy to 0.840874 and reduced the error rate to 0.159126. The improvement suggests that the model is learning from its mistakes and making more accurate predictions.
2. ResNet-152: At Epochs = 5, achieved an accuracy of 0.841654 and an error rate of 0.158346. ResNet-152 starts with high accuracy, indicating its initial effectiveness.
At Epochs = 10, accuracy increased to 0.852574 and the error rate decreased slightly to 0.147426. The gains in accuracy and error rate are marginal and do not justify the model's added complexity.
3. ResNet-101: At Epochs = 10, achieved an accuracy of 0.843214 and an error rate of 0.156786. The high accuracy and low error rate suggest that ResNet-101 is making correct predictions for a large portion of the data.
At Epochs = 20, improved accuracy to 0.847504 and reduced the error rate to 0.152496. While the accuracy gain is slight, the reduction in the error rate is still indicative of the model's effectiveness.
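Accuracy and error rate are complements of one another, which the reported pairs reflect; a minimal sketch:

```python
def accuracy_and_error(true, pred):
    """Accuracy is the fraction of correct predictions; the error rate is its
    complement, so the two always sum to 1."""
    correct = sum(1 for t, p in zip(true, pred) if t == p)
    accuracy = correct / len(true)
    return accuracy, 1.0 - accuracy

# The (accuracy, error rate) pairs reported above obey the same identity:
for acc, err in [(0.840874, 0.159126), (0.852574, 0.147426), (0.847504, 0.152496)]:
    assert abs(acc + err - 1.0) < 1e-9
```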
C. Training Loss and Convergence
1. ResNet-50: At Epochs = 20, training loss continued decreasing to 0.024193, suggesting ongoing convergence. The ongoing decrease in training loss indicates that the model is still benefiting from additional training.
2. ResNet-152: At Epochs = 5, training loss reduced to 0.220987, indicating effective learning. The decreasing training loss suggests that the model is effectively learning from the training data.
At Epochs = 10, continued decreasing to 0.057970, showing ongoing learning and convergence. The continued decrease in training loss indicates that the model is still learning and has not plateaued.
3. ResNet-101: At Epochs = 10, training loss reduced to 0.088803, indicating effective learning. The decreasing training loss demonstrates that the model is learning from the training data.
At Epochs = 20, continued decreasing to 0.037847, showing ongoing learning and convergence. The continued decrease in training loss indicates that the model is still learning and has not plateaued.
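One simple way to make "has not plateaued" precise is to check the relative improvement between consecutive loss checkpoints; the 5% threshold below is an illustrative choice, not the authors' criterion:

```python
def has_plateaued(losses, min_rel_improvement=0.05):
    """Heuristic convergence check: training is considered plateaued when the
    latest checkpoint improves the loss by less than `min_rel_improvement`
    relative to the previous checkpoint."""
    if len(losses) < 2:
        return False
    prev, last = losses[-2], losses[-1]
    return (prev - last) / prev < min_rel_improvement

# ResNet-152's reported losses still fall sharply between checkpoints:
print(has_plateaued([0.220987, 0.057970]))
```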
We also trained other neural networks for comparison, namely xResNet-101, VGG-16, and DenseNet-121. The observations above clearly show that xResNet-101 and VGG-16 fail to match our best network, ResNet-101. DenseNet-121 comes close; however, its increased computational cost leads us to rule it out.
VII. ACKNOWLEDGMENT
We thank Prof. Shuchi Sharma, our project's mentor at Dr. Akhilesh Das Gupta Institute of Professional Studies, whose leadership and support have served as the compass guiding us through the challenging terrain of this research. Her valuable feedback and contributions remarkably enhanced our manuscript.
VIII. CONCLUSION
1) ResNet-101 exhibits a consistent and balanced improvement across precision, recall, F1 score, accuracy, and training loss, making it a robust choice for ingredient identification.
2) ResNet-50 shows improvement, but ResNet-101 maintains a competitive edge, especially in precision and recall.
3) ResNet-152, while effective, does not show substantial improvement over ResNet-101.
[1] UNEP Food Waste Index Report 2021. 4 March 2021. Available online: https://www.unep.org/resources/report/unep-food-waste-index-report-2021
[2] Takuma Maruyama, Yoshiyuki Kawano, Keiji Yanai. Real-Time Mobile Recipe Recommendation System Using Food Ingredient Recognition. IMMPD '12: Proceedings of the 2nd ACM International Workshop on Interactive Multimedia on Mobile and Portable Devices, November 2012, pp. 27–34. [CrossRef: https://dl.acm.org/doi/10.1145/2390821.2390830]
[3] Miguel Simões Rodrigues, Filipe Fidalgo, Ângela Oliveira. RecipeIS – Recipe Recommendation System Based on Recognition of Food Ingredients. Appl. Sci. 2023, 13(13), 7880. [CrossRef: https://www.mdpi.com/2076-3417/13/13/7880]
[4] Flutter Documentation. Available online: https://flutter.dev/
[5] Flask Documentation. Available online: https://flask.palletsprojects.com/en/3.0.x/
[6] Colab Documentation. Available online: https://colab.google/
[7] FastAI Documentation. Available online: https://docs.fast.ai/
[8] The Annotated ResNet-50: Explaining How ResNet-50 Works and Why It Is So Popular. By Suvaditya Mukherjee, Towards Data Science. Available online: https://towardsdatascience.com/the-annotated-resnet-50-a6c536034758 (accessed on 18 December 2023)
[9] Understanding and Implementation of Residual Networks (ResNets). By Raghunandepu, Analytics Vidhya, Medium. Available online: https://medium.com/analytics-vidhya/understanding-and-implementation-of-residual-networks-resnets-b80f9a507b9c (accessed on 18 December 2023)
[10] Dataset. Available online: https://drive.google.com/drive/folders/1t51_bP8ZnNIV5m3DaDrtDEl-rTsiiXXc
[11] GitHub Repository. Available online: https://github.com/VJ1122001/Ingredient-Inspired-Recipe-Recommender
Copyright © 2023 Viruj Thakur, Anuj Maheshwari, Shuchi Sharma. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Paper Id : IJRASET57772
Publish Date : 2023-12-27
ISSN : 2321-9653
Publisher Name : IJRASET