
NYU develops AI tool that reads your meal’s nutrition from a photo

Posted: 19 March 2025

NYU researchers develop a new AI tool that analyses food photos to calculate nutrition, offering an effortless way to monitor diets.


Could AI-driven nutrition tracking soon be available to consumers with just the click of a button?

Counting calories and tracking nutrients has long been a tedious process, relying on food diaries and apps that require manual input. But a new AI-powered system from NYU Tandon School of Engineering could make it as simple as taking a photo.

The technology, detailed in a paper presented at the 6th IEEE International Conference on Mobile Computing and Sustainable Informatics, uses deep-learning algorithms to identify food items in images, estimate portion sizes, and calculate nutritional values, including calories, protein, fat and carbohydrates. Designed as a mobile-friendly web tool, it eliminates guesswork and could help millions manage weight, diabetes and other diet-related health concerns.

Cracking the challenges of food recognition

Despite its apparent simplicity, automated food tracking has long been a challenge due to the enormous variety in how meals appear. The same dish can be plated, prepared and presented in countless ways, making it difficult for AI to recognise.

Sunil Kumar, the paper’s co-author, Professor of Mechanical Engineering at NYU Abu Dhabi and Global Network Professor at NYU Tandon, explained:

“The sheer visual diversity of food is staggering. Unlike manufactured objects with standardised appearances, the same dish can look dramatically different based on who prepared it. A burger from one restaurant bears little resemblance to one from another place, and homemade versions add another layer of complexity.”

To overcome this, the NYU team developed an advanced recognition model that categorises similar food items and refines its training data to improve accuracy. The system was trained on 95,000 images spanning 214 food categories, ensuring reliable identification across a wide range of cuisines.

Another major challenge was accurately estimating portion sizes, essential for precise calorie counts. The NYU system incorporates volumetric analysis, using image processing to measure the physical area of food on a plate and correlate it with nutritional data. This innovation enables accurate calorie calculations without requiring users to input portion details manually.
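The paper does not publish its volumetric-analysis code, but the idea of correlating a food's measured area with nutritional data can be sketched in a few lines. Everything below is a hypothetical illustration: the plate diameter, the `grams_per_cm2` areal-density factor, and both function names are assumptions, not the NYU team's implementation.

```python
import math

PLATE_DIAMETER_CM = 26.0  # assumed standard dinner plate used as a scale reference


def portion_grams(food_pixels, plate_pixels, grams_per_cm2):
    """Estimate a food item's mass from its segmented pixel area.

    food_pixels   -- pixels covered by the food item in the image
    plate_pixels  -- pixels covered by the (circular) plate
    grams_per_cm2 -- assumed areal density for this food type
    """
    plate_area_cm2 = math.pi * (PLATE_DIAMETER_CM / 2) ** 2
    # Scale the food's pixel share of the plate to real-world area.
    food_area_cm2 = plate_area_cm2 * (food_pixels / plate_pixels)
    return food_area_cm2 * grams_per_cm2


def calories(grams, kcal_per_100g):
    """Convert an estimated mass to calories via a per-100 g lookup value."""
    return grams * kcal_per_100g / 100.0
```

In this toy model the only user-visible inputs are the image itself; the density and calorie factors would come from a nutrition database keyed by the recognised food category.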

Real-time analysis on any device

Unlike earlier systems that required cloud processing, the NYU tool is lightweight and fast. By integrating YOLOv8 image-recognition technology with ONNX Runtime, the team built a system that runs directly in a web browser, with no need for an app. Users simply visit the website on their smartphone, take a picture of their meal, and receive an instant nutritional breakdown.
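Once the detector has run, the remaining step is to map each recognised item to nutritional values and sum them into a meal summary. The sketch below shows that post-processing stage in Python; the labels, confidence threshold, and nutrition table are illustrative assumptions (the pizza figures echo the article's example, the hot dog's protein, carbs, and fat are invented placeholders), not the NYU team's actual data or code.

```python
# Hypothetical nutrition table: kcal, protein g, carbs g, fat g per detected item.
NUTRITION_PER_ITEM = {
    "pizza_slice": (317, 10, 40, 13),  # values quoted in the article
    "hot_dog": (280, 10, 23, 17),      # kcal from the article; macros assumed
}


def summarise(detections, min_confidence=0.5):
    """Sum nutrition over detections above a confidence threshold.

    detections -- iterable of (label, confidence) pairs, one per detected item
    """
    totals = [0, 0, 0, 0]
    for label, conf in detections:
        if conf < min_confidence or label not in NUTRITION_PER_ITEM:
            continue  # skip low-confidence or unknown detections
        for i, value in enumerate(NUTRITION_PER_ITEM[label]):
            totals[i] += value
    return dict(zip(("kcal", "protein_g", "carbs_g", "fat_g"), totals))
```

A meal with two confident pizza-slice detections would simply double each value; items below the threshold are dropped rather than guessed at.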

In real-world tests, the system delivered highly accurate results. A pizza slice, for example, was calculated at 317 calories, 10 grams of protein, 40 grams of carbohydrates, and 13 grams of fat. When applied to idli sambhar, a South Indian dish, it estimated 221 calories, 7 grams of protein, and 46 grams of carbohydrates, values that closely matched reference standards.

Prabodh Panindre, lead author and Associate Research Professor at NYU Tandon, said:

“One of our goals was to ensure the system works across diverse cuisines and food presentations. We wanted it to be as accurate with a hot dog — 280 calories according to our system — as it is with baklava, a Middle Eastern pastry that our system identifies as having 310 calories and 18 grams of fat.”

Future potential in health and nutrition

Achieving a mean Average Precision (mAP) score of 0.7941, the system can correctly identify and analyse food items with approximately 80 percent accuracy, even when overlapping or partially obscured.

Though currently a proof of concept, the AI-powered tool could have major implications for the healthcare, fitness and food-service industries. By integrating with health-tracking apps, it could give users an effortless way to monitor their dietary intake.

Panindre added:

“Traditional methods of tracking food intake rely heavily on self-reporting, which is notoriously unreliable. Our system removes human error from the equation.”

Accessible via any smartphone, the web-based application means AI-driven nutrition tracking could soon be available to consumers with just the click of a button.
