Google developing calorie-counting robots, so that it will know exactly how much people eat

The company will be able to advertise burgers to you, just as it knows you’re getting hungry for one

Andrew Griffin
Thursday 04 June 2015 05:05 EDT
The robots don't need to use special pictures, so could get to know your habits just through Instagram (@Ibzo)


Google is developing highly intelligent robots that can analyse the amount of food on a plate and then count the calories that people are consuming.

The company hopes that the technology will be used by people to keep closer tabs on what they’re eating, if they’re dieting or looking to restrict how much they eat. A number of similar solutions already exist, but require people either to enter the nutritional information manually or to scan in the packaging of their food and guess how much of it they have eaten.

The system is called Im2Calories, reports Popular Science. It uses machine learning to recognise the individual pieces of food, measures how big they are in relation to the plate, and then converts them into calories.
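The pipeline described above — recognise each food, size it against the plate, convert to calories — can be sketched in a few lines. This is an illustrative toy, not Google's actual method; the calorie figures, the assumed 400 g plate weight, and the `estimate_calories` function are all hypothetical.

```python
# Toy sketch of the Im2Calories idea (not Google's code): detected food
# items are sized as fractions of the plate and mapped to calorie
# estimates via a per-gram lookup table.

# Calories per gram for a few foods (illustrative figures only).
CALORIES_PER_GRAM = {"fries": 3.1, "burger": 2.5, "salad": 0.2}

def estimate_calories(detections, plate_grams=400):
    """detections: list of (food_label, fraction_of_plate) pairs.

    plate_grams is an assumed weight for a full plate; each item's
    weight is its visual fraction of the plate times that figure.
    """
    total = 0.0
    for label, fraction in detections:
        grams = fraction * plate_grams
        total += grams * CALORIES_PER_GRAM.get(label, 0.0)
    return round(total)

# A plate that is half fries and a quarter burger:
print(estimate_calories([("fries", 0.5), ("burger", 0.25)]))  # → 870
```

The real system's hard part, of course, is the first step — recognising the food and its portion size from an ordinary photo — which is where the machine learning comes in.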

The pictures don't need to be specially taken or high resolution, so the robots could do their analysis using only Instagram pictures.

Google already has robots that have learnt to recognise certain things in pictures, using them in its newly released Google Photos app. Competitors including Flickr have launched similar technology, which uses “machine learning” — computers that learn patterns from examples rather than following fixed, hand-written rules.

That means that the system can gradually get better, as human feedback tells it what it has got wrong or right. Machine learning means that the system’s designers don’t need to spend time teaching it how to recognise things, because its users do instead.

The company says that the system doesn’t initially need to be that accurate. As long as it’s good enough to get people using it, the AI will start learning from the corrections.
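The improve-through-corrections loop described above can be sketched simply: record each user correction, and prefer the accumulated feedback over the weak initial model. This is an illustrative sketch only — the class and its behaviour are invented for this example, not how Google's system works.

```python
# Illustrative sketch (not Google's implementation) of learning from
# user corrections: each time a user fixes a label, the correction is
# recorded, and future predictions for that input prefer the
# most-corrected answer over the weak base model.

from collections import Counter, defaultdict

class CorrectableClassifier:
    def __init__(self, base_predict):
        self.base_predict = base_predict          # initial, possibly weak model
        self.corrections = defaultdict(Counter)   # input key -> label counts

    def predict(self, key):
        votes = self.corrections[key]
        if votes:
            return votes.most_common(1)[0][0]     # trust accumulated feedback
        return self.base_predict(key)             # fall back to the base model

    def correct(self, key, true_label):
        self.corrections[key][true_label] += 1    # user feedback accumulates

# A base model that always guesses "salad":
clf = CorrectableClassifier(lambda key: "salad")
print(clf.predict("photo_1"))    # salad (the weak initial guess)
clf.correct("photo_1", "burger")
print(clf.predict("photo_1"))    # burger, after one user correction
```

In a real system the corrections would retrain the underlying model rather than sit in a lookup table, but the principle — a rough model plus a stream of user fixes — is the same.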

“If it only works 30 percent of the time, it's enough that people will start using it, we'll collect data, and it'll get better over time,” said Google research scientist Kevin Murphy, according to Popular Science.

But counting calories has repeatedly been found to be the wrong approach to understanding how healthy a diet is, as The Verge points out. Numerous scientists have said that just eating fewer calories and avoiding fatty foods isn’t a good approach.

“What you eat makes quite a difference,” the lead author of a 2011 study told the New York Times. “Just counting calories won’t matter much unless you look at the kinds of calories you’re eating.”

The Verge also points out that the machine relies on nutritional data from food labels, which has often been shown to be inaccurate.

The company has filed a patent for Im2Calories. Popular Science reports that Murphy and Google wouldn’t say when it would be available.

But eventually Google hopes to use the same technology for wider functions.

“If we can do this for food, that's just the killer app,” Murphy told Popular Science. “Suppose we did street scene analysis. We don't want to just say there are cars in this intersection. That's boring. We want to do things like localize cars, count the cars, get attributes of the cars, which way are they facing.

“Then we can do things like traffic scene analysis, predict where the most likely parking spot is. And since this is all learned from data, the technology is the same, you just change the data.”

Google’s focus has moved towards data analysis and machine learning in recent years. At the recent Google I/O developer conference, the company focused less on its Android mobile operating system and other headline products, instead largely selling itself as a data analysis company.

But that has also made some worried, since the end goal of that analysis is selling the data it generates and using it for ads. If Google knew exactly how much and what food its users were eating, for instance, it might be able to serve special ads for local burger joints when its data indicated that you were probably hungry for one.
