Spoken-language app makes meal logging easier, could aid weight loss

For people struggling with obesity, logging calorie counts and other nutritional information at every meal is a proven way to lose weight. The technique does require consistency and accuracy, however, and when it fails, it's usually because people don't have the time to find and record all the data they need.

A few years ago, a team of nutritionists from Tufts University who had been experimenting with mobile-phone apps for recording caloric intake approached members of the Spoken Language Systems Group at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) with the idea of a spoken-language application that would make meal logging even easier.

Spoken language app

This week, at the International Conference on Acoustics, Speech, and Signal Processing in Shanghai, the MIT researchers are presenting a web-based prototype of their speech-controlled nutrition-logging system. With it, the user verbally describes the contents of a meal, and the system parses the description and automatically retrieves the pertinent nutritional data from an online database maintained by the U.S. Department of Agriculture (USDA).

The data is displayed together with images of the corresponding foods and pull-down menus that allow the user to refine their descriptions, selecting, for example, specific quantities of food. But those refinements can also be made verbally. A user who begins by saying, "For breakfast, I had a bowl of oatmeal, bananas, and a glass of orange juice" can then make the amendment, "I had half a banana," and the system will update the data it displays about bananas while leaving the rest unchanged.
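The amendment behavior described above can be sketched in a few lines. This is a minimal illustration, not the researchers' implementation; the data structure and function names are hypothetical.

```python
# Hypothetical sketch: a parsed meal is a list of food items, and a verbal
# amendment ("I had half a banana") updates only the matching item.

meal_log = [
    {"food": "oatmeal", "quantity": 1.0, "unit": "bowl"},
    {"food": "banana", "quantity": 1.0, "unit": "item"},
    {"food": "orange juice", "quantity": 1.0, "unit": "glass"},
]

def amend(log, food, quantity):
    """Revise the quantity of one food item; all other items stay unchanged."""
    for item in log:
        if item["food"] == food:
            item["quantity"] = quantity
    return log

amend(meal_log, "banana", 0.5)
```

The point of the sketch is the selective update: a follow-up utterance modifies one entry of the already-parsed meal rather than forcing the user to restate the whole description.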

"What [the Tufts nutritionists] have experienced is that the apps that were available to help people try to log meals tended to be a little tedious, and therefore people didn't keep up with them," says James Glass, a senior research scientist at CSAIL, who leads the Spoken Language Systems Group. "So they were looking for methods that were accurate and easy for inputting data."

The new paper's first author is Mandy Korpusik, an MIT graduate student in electrical engineering and computer science. She's joined by Glass, her thesis advisor; her fellow graduate student Michael Price; and Calvin Huang, an undergraduate researcher in Glass's group.

Context sensitivity

In the paper, the researchers report the results of experiments with a speech-recognition system that they developed specifically to handle food-related terminology. But that wasn't the main focus of their work; indeed, an online demo of their meal-logging system instead uses Google's free speech-recognition app.

Their research concentrated on two other problems. One is determining words' functional roles: The system needs to recognize that if the user records the phrase "bowl of oatmeal," nutritional information on oatmeal is pertinent, but that if the phrase is "oatmeal cookie," it isn't. The other problem is reconciling the user's phrasing with the entries in the USDA database. For instance, the USDA data on oatmeal is recorded under the heading "oats"; the word "oatmeal" shows up nowhere in the entry.

To address the first problem, the researchers used machine learning. Through the Amazon Mechanical Turk crowdsourcing platform, they recruited workers who simply described what they had eaten at recent meals, then labeled the pertinent words in the descriptions as names of foods, quantities, brand names, or modifiers of the food names. In "bowl of oatmeal," "bowl" is a quantity and "oatmeal" is a food, but in "oatmeal cookie," "oatmeal" is a modifier. Once they had roughly 10,000 labeled meal descriptions, the researchers used machine-learning algorithms to find patterns in the syntactic relationships between words that would identify their functional roles.
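The labeling task can be illustrated with a toy example. This is not the authors' model; it is a minimal sketch, with a hypothetical three-sentence training set standing in for the ~10,000 crowd-labeled descriptions, showing how even a simple context feature (the following word) distinguishes "oatmeal" as a food from "oatmeal" as a modifier.

```python
# Toy sketch of the sequence-labeling task: tag each token as Food, Quantity,
# Brand, or Modifier (O = none). Prediction uses the most frequent label seen
# in training for the (token, next-token) pair.
from collections import Counter, defaultdict

training = [
    [("bowl", "Quantity"), ("of", "O"), ("oatmeal", "Food")],
    [("oatmeal", "Modifier"), ("cookie", "Food")],
    [("glass", "Quantity"), ("of", "O"), ("orange", "Food"), ("juice", "Food")],
]

counts = defaultdict(Counter)
for sentence in training:
    for i, (token, label) in enumerate(sentence):
        nxt = sentence[i + 1][0] if i + 1 < len(sentence) else "</s>"
        counts[(token, nxt)][label] += 1

def tag(tokens):
    """Assign each token its most frequent training label given its context."""
    out = []
    for i, token in enumerate(tokens):
        nxt = tokens[i + 1] if i + 1 < len(tokens) else "</s>"
        labels = counts.get((token, nxt))
        out.append(labels.most_common(1)[0][0] if labels else "O")
    return out

print(tag(["bowl", "of", "oatmeal"]))  # "oatmeal" tagged as Food
print(tag(["oatmeal", "cookie"]))      # "oatmeal" tagged as Modifier
```

The actual system learns far richer syntactic patterns from its labeled data, but the same principle applies: the functional role of a word depends on its context, not the word alone.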

Semantic matching

To translate between users' descriptions and the labels in the USDA database, the researchers used an open-source database called Freebase, which has entries on more than 8,000 common food items, many of which include synonyms. Where synonyms were lacking, they again recruited Mechanical Turk workers to supply them.
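The matching step amounts to normalizing a user's term through a synonym table before looking up the database heading. The sketch below is a simplified illustration of that idea; the table contents and nutrient figures are illustrative placeholders, not values from Freebase or the USDA database.

```python
# Hypothetical sketch of synonym-based matching: a user's food term is mapped
# to a canonical heading, then that heading is looked up in the nutrient table.

# Synonym table (in the spirit of Freebase entries plus crowd-supplied synonyms).
synonyms = {
    "oatmeal": "oats",            # USDA records oatmeal under "oats"
    "oats": "oats",
    "oj": "orange juice",
    "orange juice": "orange juice",
}

# Stand-in for the USDA nutrient database (illustrative values).
usda = {
    "oats": {"calories_per_100g": 389},
    "orange juice": {"calories_per_100g": 45},
}

def lookup(term):
    """Resolve a user's food term to a database entry via the synonym table."""
    heading = synonyms.get(term.lower())
    return usda.get(heading) if heading else None

print(lookup("Oatmeal"))  # resolves to the "oats" entry
```

Without the synonym layer, a literal lookup of "oatmeal" would fail, since the word appears nowhere in the USDA entry for oats.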

The version of the system presented at the conference is intended chiefly to demonstrate the viability of its approach to natural-language processing; it reports calorie counts but doesn't yet total them automatically. A version that does is in the works, however. When it's complete, the Tufts researchers plan to conduct a user study to determine whether it indeed makes nutrition logging easier.

"I think logging is enormously helpful for many people," says Susan Roberts, director of the Energy Metabolism Laboratory at Tufts' USDA-sponsored Jean Mayer Human Nutrition Research Center on Aging. "It makes people more self-aware of the junk they're eating and how little they actually enjoy it, and the shock of huge portions, et cetera. But currently it is really tedious to log your food. There are several programs like MyFitnessPal where you can enter it manually by hand, but even with shortcuts, it is tedious and not as user-friendly as it needs to be for millions of people to use it routinely."
