There are already a number of devices that let people keep track of what and how much they eat, in order to help them lose weight or maintain a better-balanced diet. Most of these gadgets, however, rely on the user to manually enter data for each meal. The University of Alabama's Dr. Edward Sazonov is working to take user error and deceit out of the equation, by developing a headset-style diet-tracking device that automatically monitors what its wearer eats.

Known as the Automatic Ingestion Monitor (AIM), the 3D-printed prototype device is worn over one ear. Among other things, it incorporates a motion sensor, a tiny camera, and a Bluetooth transmitter.

When the user eats, the sensor detects the distinctive chewing motion of their jaw – it is able to tell the difference between that motion and those that accompany activities such as talking. Once AIM is triggered by the chewing, its camera takes photos of what the user is eating. That data is transmitted by Bluetooth to a paired smartphone.
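AIM's actual signal-processing pipeline has not been published, but the trigger logic described above can be sketched in broad strokes. In this illustrative Python sketch, the feature (dominant jaw-motion frequency plus amplitude), the thresholds, and the debouncing rule are all hypothetical stand-ins for whatever classifier the real device uses:

```python
def classify_jaw_motion(dominant_freq_hz: float, amplitude: float) -> str:
    """Label a window of jaw-motion sensor data.

    Chewing tends to be rhythmic and relatively slow with large jaw
    excursions, while talking produces faster, smaller movements.
    All numeric boundaries below are invented for illustration.
    """
    if amplitude < 0.1:
        return "idle"
    if 0.8 <= dominant_freq_hz <= 2.5 and amplitude >= 0.5:
        return "chewing"
    return "talking"


def should_trigger_camera(labels: list[str], min_consecutive: int = 3) -> bool:
    """Fire the camera only after several consecutive 'chewing' windows,
    so a single stray jaw movement does not trigger a photo."""
    streak = 0
    for label in labels:
        streak = streak + 1 if label == "chewing" else 0
        if streak >= min_consecutive:
            return True
    return False
```

The debouncing step (`min_consecutive`) reflects a common design choice in wearable event detection: requiring a sustained pattern before acting keeps brief talking or head movements from wasting photos and battery.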

There, an app identifies the foods in the photos. It also estimates how much was consumed, based both on the number of chews and on the difference between the amount of food visible in the photos when the chewing started and when it ended. The app then determines the energy content of that meal, and records the information for the user.
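The app's actual model is unpublished, but one plausible way to combine the two signals the article describes – chew count and the before/after difference in visible food – into an energy estimate looks like this. Every constant here (grams per chew, the simple averaging of the two estimates) is invented for the example:

```python
def estimate_energy_kcal(chew_count: int,
                         grams_visible_start: float,
                         grams_visible_end: float,
                         kcal_per_gram: float,
                         grams_per_chew: float = 1.2) -> float:
    """Blend two independent portion estimates, then convert to energy.

    - grams_from_photos: food that disappeared between the first and
      last photos of the chewing episode.
    - grams_from_chews: chew count times a (hypothetical) per-chew
      calibration factor.
    The two are averaged here for simplicity; a real system would
    weight them by their estimated reliability.
    """
    grams_from_photos = max(grams_visible_start - grams_visible_end, 0.0)
    grams_from_chews = chew_count * grams_per_chew
    grams_eaten = (grams_from_photos + grams_from_chews) / 2
    return grams_eaten * kcal_per_gram
```

For example, 100 chews with 300 g visible at the start and 100 g at the end, at 1.5 kcal/g, averages a 200 g photo estimate with a 120 g chew estimate and reports 240 kcal.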

Sazonov plans to test the accuracy of a refined version of the device by comparing it to an established technique, in which caloric intake is estimated by measuring how quickly the body eliminates stable hydrogen and oxygen isotopes added to water that the subject has consumed. If AIM compares well, it could serve as a much quicker, cheaper and easier-to-administer alternative to that water-based method.

AIM would likely first reach the market as a medical device, although a consumer version could follow. The commercially available Bite Counter already estimates caloric intake based on the user's movements – it records the number of times the user's dominant hand raises food to their mouth – although it must be manually activated at the start of each meal.

Source: University of Alabama