AI meal image analysis software under scrutiny

By Nikki Hancocks

Research from the University of Bern, Switzerland, backs the usability of the 'goFOODLite' software to provide nutritional analysis of meals from a single image but suggests advancements are needed to improve accuracy.

Innovators in the personalised nutrition space are constantly working to simplify, speed up, and improve the accuracy of nutrition intake assessment, as reliance on self-report questionnaires and interviews is inefficient in both time and resources and depends on participants' memory and honesty.

As such, meal image analysis software has been developed to allow smartphone users to understand the nutritional content of their meals with the snap of a photo. This technology has evolved rapidly in the last couple of years, yet existing apps still rely heavily on manual or semi-automatic data entry, particularly for estimating portion sizes. Apps that have attempted to incorporate automatic portion estimation often require phones equipped with depth sensors, or the recording of lengthy videos or multiple frames from different angles, which introduces potential discrepancies in results under different capture conditions.

The authors of the current study previously demonstrated the effectiveness of goFOOD, an AI-based mobile application for automatic dietary assessment, in a controlled setting. However, a significant limitation of the previous system was the requirement for users to capture two images or a short video, which could be perceived as a burden.

They state in the current study: "To overcome such challenges, we propose an enhanced system utilising a single image captured with a smartphone, and we compared this to an adaptation of the previous version. While our previous study demonstrated the adequacy of AI-based dietary assessment in a controlled hospital setting, we now sought to assess its applicability in free living conditions."

Methodology

The study recruited 50 adult participants, who were each given an Android phone with the goFOODLite app installed. 

For a single day, the participants had to record their food and drink items before and after consumption, ensuring that the reference card was placed next to the meal items. To retrospectively analyse the collected data and evaluate the system, the researchers extracted two images from each video, at 90° and 75°, based on the smartphone orientation. When consuming a packaged product, users had the option of capturing a photo of the product’s barcode. Users were also asked to fill out a feedback questionnaire regarding the goFOODLite app.

The following day, a dietitian contacted the participants, asked them about all of the foods and beverages they had consumed over the previous 24 hours, and subsequently analysed and logged the participants' nutrient intake.
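The paper's extraction code is not published; as a rough illustration of the idea, the sketch below picks the video frames whose recorded orientation is closest to the two target angles. It assumes a hypothetical per-frame pitch log (`angles`) saved alongside the video; all names are illustrative.

```python
# Illustrative sketch only: assumes a hypothetical per-frame pitch-angle log
# (`angles`, in degrees) recorded alongside the video by the phone's
# orientation sensor. Not the study's actual extraction code.
import cv2

def extract_frames_at_angles(video_path, angles, targets=(90.0, 75.0)):
    """Save the frame whose recorded pitch is closest to each target angle."""
    cap = cv2.VideoCapture(video_path)
    for target in targets:
        # Index of the frame whose pitch is nearest the target orientation.
        idx = min(range(len(angles)), key=lambda i: abs(angles[i] - target))
        cap.set(cv2.CAP_PROP_POS_FRAMES, idx)
        ok, frame = cap.read()
        if ok:
            cv2.imwrite(f"meal_{int(target)}deg.png", frame)
    cap.release()
```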

Accuracy

For the newly proposed method, which uses a single image, the mean absolute percentage error in kcal estimation per participant was 27.41%. The researchers observed percentage errors of 31.27% for carbohydrate (CHO), 39.17% for protein, and 43.24% for fat estimation compared with the dietitians’ estimations, which used the 24 h recall method.

The previous method, which used two images, gave percentage errors of 31.32% for kcal, 37.84% for CHO, 42.41% for protein, and 51.75% for fat intake estimations.
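For readers unfamiliar with the metric, mean absolute percentage error averages each estimate's relative deviation from the reference value. A minimal sketch, taking the dietitian's 24 h recall as ground truth; the figures below are invented for illustration, not taken from the study.

```python
# Mean absolute percentage error (MAPE): average relative deviation of the
# app's estimates from the dietitian's reference values.
def mape(estimated, reference):
    errors = [abs(e - r) / r * 100 for e, r in zip(estimated, reference)]
    return sum(errors) / len(errors)

# Hypothetical kcal estimates vs dietitian 24 h recall for one participant.
app_kcal = [520, 710, 340]
dietitian_kcal = [600, 650, 420]
print(f"kcal MAPE: {mape(app_kcal, dietitian_kcal):.2f}%")  # ~13.87%
```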

That said, the participant with the highest error for kcal intake (64.53%) recorded only four meals, one of which depicted a glass of water, and in two the reference card was missing. Such issues could introduce inconsistencies and inaccuracies into the overall results.

Participant feedback

The authors note that user preference plays a pivotal role in the adoption and success of any system as users prioritise systems that minimise effort, even if this forfeits accuracy.

Of the 50 participants enrolled in the study, eight did not complete the final feedback questionnaire. Based on the responses, 29 of the 42 respondents (69%) agreed that the logging and recording were good or very good, while 35 (83.3%) found the app easy to use and self-explanatory. Regarding the time needed to complete a recording, 23 of 42 (54.8%) agreed it was good or very good, while 10 (23.8%) expressed no opinion.

However, when the participants were asked, in the “user satisfaction/usage” question, whether they would use the app, 19 (45.2%) remained neutral. On the other hand, 22 of 42 (52.4%) said they would be willing to use the goFOODLite application to track their food intake in their everyday lives, while 30 (71.4%) answered that they would recommend it to their friends.

The authors therefore conclude that whilst the accuracy of the new system was not significantly better than that of the previous version, participant feedback suggests the new method could enhance user satisfaction and adherence by requiring only a single image capture.

Future improvements

Discussing areas for improvement, the authors note that the highest misestimation related to fat.

"This can be attributed mainly to the amount and type of oil and butter present in the food, which varies for each participant and cannot be visually extracted from images. For example, the participant with the highest error in kcal estimation used 50 g of olive oil, which leads to an additional 400 kcal and 45 g of fat. It is important to note that not only do ingredients affect the nutritional content of foods, but different cooking methods also play a role in this matter. 

"In the future, approaches such as incorporating recipes or manual entries could be introduced in the pipeline to tackle these issues."

The researchers will next conduct a second phase of research in which participants will be asked to use the goFOOD system in real time for one week, as well as record their food intake with a food frequency questionnaire (FFQ) and participate in two unannounced 24 h recalls. Users will capture pictures directly from the goFOOD app, eliminating the use of video recording. The users will see the results of the segmentation, recognition, and volume estimation modules, and will be able to change them if needed and manually input the amount of hidden ingredients, such as oil or butter.
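A minimal sketch of that review-and-edit flow, assuming nothing about goFOOD's real internals: one record per meal holds the three module outputs, which the user can then override before nutrients are computed.

```python
from dataclasses import dataclass, field

@dataclass
class MealAnalysis:
    segmentation: list[str]      # region identifiers (masks in a real system)
    recognition: list[str]       # recognised food labels per region
    volume_ml: dict[str, float]  # estimated volume per recognised item
    hidden_ingredients: dict[str, float] = field(default_factory=dict)  # grams of e.g. oil

    def apply_user_edits(self, relabel=None, revolume=None, hidden=None):
        """Apply the user's corrections before nutrients are computed."""
        if relabel is not None:
            self.recognition = relabel
        if revolume:
            self.volume_ml.update(revolume)
        if hidden:
            self.hidden_ingredients.update(hidden)
```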

The researchers write: "Through app nudges, we aim to encourage users not to neglect their meals and actively utilise the reference card. In future investigations, integrating a food weighing method or submitting captured images to a qualified dietitian for meticulous assessment emerges as the most promising alternative to dietitians performing 24 h recall.

"Finally, we must acknowledge the need to refine our system to ensure its effectiveness across various food types and under multiple conditions, such as varying distances and viewing angles. The depth model we used was not specifically trained on closely viewed items; therefore, its effectiveness in accurately estimating volumes for such items may require additional optimisation. These adjustments will contribute to the enhanced accuracy and dependability of the collected data, as well as ultimately improving the performance of our system."

Source: Nutrients

https://doi.org/10.3390/nu15173835

The Nutritional Content of Meal Images in Free-Living Conditions—Automatic Assessment with goFOOD™

Authors: Papathanail, I.; Abdur Rahman, L.; Brigato, L.; Bez, N.S.; Vasiloglou, M.F.; van der Horst, K.; Mougiakakou, S.
