Systems, methods, and computer-readable media are disclosed for dynamic nutrition tracking with utensils. Example methods may include receiving a first user input from a user indicative of a meal event initiation, receiving a second user input associated with a first food item, and identifying nutritional information associated with the first food item. The example method may include measuring a weight of a portion of the first food item based at least in part on an upward user gesture indicative of a food consumption event, wherein the weight is measured during the upward gesture, automatically determining that the food consumption event is completed based at least in part on a change in the weight, and calculating a calorie amount indicative of a number of calories in the portion based at least in part on the weight and the nutritional information associated with the first food item.
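The core computation the abstract describes is simple: multiply the weight measured during the upward gesture by the caloric density of the identified food item. A minimal sketch of that calculation follows; it is illustrative only and not from the patent, and all function and parameter names are hypothetical.

```python
# Illustrative sketch: computing a consumed-calorie amount from a measured
# portion weight and per-gram caloric density, as the abstract describes.
# Names and units here are assumptions for illustration.

def consumed_calories(portion_weight_g: float, kcal_per_gram: float) -> float:
    """Calories in a portion, given its weight and the food's caloric density."""
    if portion_weight_g < 0 or kcal_per_gram < 0:
        raise ValueError("weight and caloric density must be non-negative")
    return portion_weight_g * kcal_per_gram

# Example: a 25 g portion of a food with 2.5 kcal/g yields 62.5 kcal.
print(consumed_calories(25.0, 2.5))  # 62.5
```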
Representative Claims
1. A method comprising: identifying, by one or more computer processors coupled to at least one memory, a first gesture indicative of a commencement of a first meal event comprising a first food item identifier associated with a first food item and a second food item identifier associated with a second food item; designating the first food item as a first active food item; identifying a second gesture indicative of a first consumption event during which a first portion of food of the first meal event is to be consumed; triggering a first weight measurement based at least in part on the second gesture; generating a first weight of the first portion based at least in part on the first weight measurement; determining that the first food item is the first active food item; determining that the first consumption event is complete based at least in part on a third gesture; associating the first food item with the first consumption event; identifying a first nutritional information indicator associated with the first active food item based at least in part on the first food item identifier, wherein the first nutritional information indicator comprises information representative of caloric content of the first active food item; generating a first consumed calorie amount based at least in part on the first weight and the first nutritional information indicator; associating the first consumed calorie amount with the first consumption event; determining that the first meal event is complete; and generating a total consumed calorie indicator indicative of a total consumed calorie amount based at least in part on the first consumption event.

2. The method of claim 1, further comprising: designating the second food item as a second active food item; identifying a fourth gesture indicative of a second consumption event during which a second portion of food of the first meal event is to be consumed, wherein the fourth gesture is representative of the second gesture; triggering a second weight measurement based at least in part on the fourth gesture; generating a second weight of the second portion based at least in part on the second weight measurement; determining that the second food item is the second active food item; determining that the second consumption event is complete based at least in part on a fifth gesture representative of the third gesture; associating the second food item with the second consumption event; identifying a second nutritional information indicator associated with the second active food item based at least in part on the second food item identifier, wherein the second nutritional information indicator comprises information representative of caloric content of the second active food item; generating a second consumed calorie amount based at least in part on the second weight and the second nutritional information indicator; and associating the second consumed calorie amount with the second consumption event; wherein the total consumed calorie indicator is based at least in part on the first consumption event and the second consumption event.

3. The method of claim 2, further comprising: determining a target total consumed calorie amount associated with the first meal event; adding the first consumed calorie amount and the second consumed calorie amount to generate a preliminary consumed calorie amount; comparing the preliminary consumed calorie amount to the target total consumed calorie amount; determining that the preliminary consumed calorie amount is equal to or greater than the target total consumed calorie amount; and generating a target consumption notification upon determining that the preliminary consumed calorie amount is equal to or greater than the target total consumed calorie amount.

4. The method of claim 1, further comprising: generating a first timestamp based at least in part on the third gesture; identifying a food consumption pacing time interval associated with the first meal event, the food consumption pacing time interval indicative of a desired length of time between consecutive food consumption events; and generating a pacing notification comprising vibrational feedback upon completion of the food consumption pacing time interval after the first timestamp.

5. The method of claim 1, further comprising: identifying a fourth gesture indicative of an active food designation; determining that the second food item identifier is associated with the fourth gesture; and designating the second food item as a second active food item.

6. The method of claim 5, further comprising: determining that a second food consumption event is complete; and determining that the second food consumption event is associated with the second active food item based at least in part on the fourth gesture, wherein the fourth gesture precedes initiation of the second food consumption event.

7. The method of claim 5, further comprising: determining that the first food item is the first active food item and the second food item is the second active food item; determining that a second food consumption event is complete; and selecting one of the first active food item or the second active food item to associate with the second food consumption event based at least in part on a first user input.

8. The method of claim 1, further comprising: determining a target total consumed calorie amount associated with the first meal event; determining that the total consumed calorie amount is less than the target total consumed calorie amount; and adjusting the target total consumed calorie amount for a second meal event based at least in part on a difference between the total consumed calorie amount and the target total consumed calorie amount.

9. A method comprising: receiving, by one or more computer processors coupled to at least one memory, an image of a first food item and a second food item adjacent to the first food item; identifying the first food item and the second food item; prompting a user to identify a first location of the first food item; receiving a first indication of the first location; associating the first location with the first food item; generating a first geofence about the first location based at least in part on the image; identifying a second location to associate with the second food item based at least in part on the first location and the image; generating a second geofence about the second location based at least in part on the image; determining that a food consumption event is initiated; determining an initiation location of the food consumption event; determining that the initiation location is within the first geofence; and associating the first food item with the food consumption event.

10. The method of claim 9, further comprising: measuring a weight of a portion of the first food item based at least in part on an upward user gesture indicative of initiation of the food consumption event, wherein the weight is measured during the upward gesture; and calculating a calorie amount indicative of a number of calories in the portion based at least in part on the weight and nutritional information associated with the first food item.

11. A food consumption utensil comprising: a food delivery surface comprising a first surface, a second surface, and a third surface, the food delivery surface configured to receive food on the first surface; a member extending from the food delivery surface; a weight sensor configured to measure weight at the food delivery surface; a motion sensor; a Bluetooth radio; a battery configured to power the weight sensor, the motion sensor, and the Bluetooth radio; and a controller in communication with the weight sensor, the motion sensor, the Bluetooth radio, and the battery, the controller comprising at least one memory storing computer-executable instructions and at least one processor communicatively coupled to the at least one memory and configured to access the at least one memory and execute the computer-executable instructions to: associate the second surface of the food delivery surface with a first food item; associate the third surface of the food delivery surface with a second food item; receive a first input at the second surface; designate the first food item to an active state in response to the first input; receive a first indication from the motion sensor indicative of a first gesture; initiate a first food consumption event based at least in part on the first gesture; determine a first weight at the food delivery surface with the weight sensor; receive a second indication from the motion sensor indicative of a second gesture; determine that the first food consumption event is complete based at least in part on the second gesture; determine a first calorie amount to associate with the first food consumption event based at least in part on the first weight and calorie information associated with the first food item based on the active state; and send the first calorie amount to a user device with the Bluetooth radio.

12. The food consumption utensil of claim 11, wherein the controller is further configured to: receive a second input at the third surface; designate the second food item to the active state in response to the second input; designate the first food item to an inactive state; receive a third indication from the motion sensor indicative of the first gesture; determine a second weight at the food delivery surface with the weight sensor; receive a fourth indication from the motion sensor indicative of the second gesture; determine a second calorie amount to associate with the first food consumption event based at least in part on the second weight and calorie information associated with the second food item based on the active state; and generate a total consumed calorie amount based at least in part on the first calorie amount and the second calorie amount.

13. The food consumption utensil of claim 12, wherein the controller is further configured to: determine a target total consumed calorie amount; determine that the total consumed calorie amount is equal to or greater than the target total consumed calorie amount; and generate a target consumption notification upon determining that the total consumed calorie amount is equal to or greater than the target total consumed calorie amount.

14. The food consumption utensil of claim 13, wherein the target consumption notification comprises vibrational feedback.

15. The food consumption utensil of claim 11, wherein the controller is further configured to: generate a first timestamp based at least in part on the second gesture; identify a food consumption pacing time interval; and generate a pacing notification upon completion of the food consumption pacing time interval after the first timestamp.

16. The food consumption utensil of claim 11, wherein the controller is further configured to: determine a frequency at which the first food item is in the active state; determine that the frequency meets a prediction threshold; upon determining that the prediction threshold is met, generate an average time of day at which the first food item is in the active state; and prompt a user to designate the first food item to the active state at the average time of day.

17. The food consumption utensil of claim 11, further comprising a button configured to designate the first food item to the active state.

18. The food consumption utensil of claim 11, further comprising a display, wherein the display is configured to present a target consumption notification and a pacing notification to a user.

19. The food consumption utensil of claim 11, wherein the second surface is positioned at a first tine and the third surface is positioned at a second tine.
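Claim 9 associates a consumption event with a food item by testing whether the event's initiation location falls inside a geofence generated around that item's location. A minimal sketch of such a test, assuming circular geofences in a planar coordinate system (the patent does not specify the geofence geometry), is shown below; all names are hypothetical.

```python
# Illustrative sketch: associating a food item with a consumption event by
# checking whether the event's initiation location lies inside a circular
# geofence around the item, in the spirit of claim 9. Geometry, units, and
# names are assumptions for illustration.
import math

def in_geofence(point, center, radius):
    """True if `point` lies within the circular geofence at `center`."""
    dx, dy = point[0] - center[0], point[1] - center[1]
    return math.hypot(dx, dy) <= radius

def associate_food_item(initiation_location, geofences):
    """Return the food item whose geofence contains the location, else None.

    `geofences` maps food-item identifiers to (center, radius) pairs.
    """
    for food_item, (center, radius) in geofences.items():
        if in_geofence(initiation_location, center, radius):
            return food_item
    return None

# Two food items at different plate locations, each with a 3-unit geofence.
fences = {"rice": ((0.0, 0.0), 3.0), "salad": ((10.0, 0.0), 3.0)}
print(associate_food_item((1.0, 1.0), fences))   # rice
print(associate_food_item((9.0, 0.5), fences))   # salad
```

A weight measured during the upward gesture (claim 10) would then be combined with the returned item's nutritional information to compute the calorie amount for the event.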