University of Waterloo researchers have developed an AI-powered app that tracks our caloric and nutrient intake while we eat.
Dr. Yuhao Chen, a research assistant professor in the Department of Systems Design Engineering’s Vision and Image Processing (VIP) Lab, said the new tech is aimed at tackling malnutrition in aging populations to ensure that older people get the food they need to support a healthy and active lifestyle.
Unlike traditional nutrition apps that require manual input, this AI-powered app automates the entire process. There is no need to upload food photos, search databases, or measure portions. Simply sit down with your meal, and a hardware-mounted camera records every bite as you eat. A vision-language model (VLM) analyzes the video frame by frame, tracking utensil use and eating stages such as chewing. Meta AI's Segment Anything Model (SAM) then supports precise nutrient analysis by identifying exactly what, and how much, you're consuming.
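The article doesn't publish the team's code, but the description suggests a per-frame loop: a vision-language model decides whether an eating event is happening and where the food is, then SAM segments that food region so its size can feed a portion estimate. The sketch below is a minimal illustration of that idea using Meta AI's publicly released segment-anything package; the describe_frame VLM stub, the checkpoint filename, and the pixel-area-to-grams heuristic are hypothetical placeholders, not the Waterloo lab's implementation.

```python
# Minimal sketch of a frame-by-frame "watch the meal" pipeline.
# Assumptions (not from the article): the describe_frame VLM stub,
# the checkpoint path, and the area-to-grams calibration factor.
import cv2
import numpy as np
from segment_anything import sam_model_registry, SamPredictor

# Load SAM once; the checkpoint filename is a placeholder.
sam = sam_model_registry["vit_b"](checkpoint="sam_vit_b.pth")
predictor = SamPredictor(sam)

def describe_frame(frame) -> dict:
    # Placeholder for the VLM: a real system would detect the utensil,
    # the eating stage (e.g. chewing), the food label, and its location.
    # Returns a fixed answer here so the sketch runs end to end.
    return {"eating": True, "utensil": "spoon", "food": "oatmeal",
            "food_point": (frame.shape[1] // 2, frame.shape[0] // 2)}

def grams_from_mask(mask: np.ndarray, grams_per_pixel: float = 0.02) -> float:
    # Crude portion proxy: mask area scaled by an assumed calibration factor.
    return float(mask.sum()) * grams_per_pixel

video = cv2.VideoCapture("meal.mp4")  # or a live camera index
intake_log = []

while True:
    ok, frame = video.read()
    if not ok:
        break
    info = describe_frame(frame)          # what is happening in this frame?
    if not info.get("eating"):
        continue
    predictor.set_image(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    masks, scores, _ = predictor.predict( # prompt SAM with the food location
        point_coords=np.array([info["food_point"]]),
        point_labels=np.array([1]),
    )
    best = masks[np.argmax(scores)]       # keep the highest-scoring mask
    intake_log.append({"food": info["food"], "grams": grams_from_mask(best)})

video.release()
```

In a full system, the logged food labels and estimated portions would then be mapped to a nutrient database to produce the per-meal calorie and nutrient totals the article describes.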
“The more an app requires user input, the more likely it is that people won’t continue using it long term,” said Chen. “We want to support ongoing nutrition monitoring, so we’re using AI to automate processes and place fewer requirements on the user.”
The system effectively analyzes calories and nutrients for foods eaten with a spoon and is now expanding to handle forks, chopsticks and hands. The team also aims to track other ingested substances, such as supplements and medications, so that the app can analyze everything a person consumes. The focus, though, remains on providing older adults and caregivers with accurate nutrient tracking, filling dietary gaps, and delivering simple, user-friendly recommendations.
Go to "Tackling malnutrition in older adults, bite by bite" for the full story.