What's cooking and Why? Behaviour Recognition during Unscripted Cooking Tasks for Health Monitoring

2017, PerCom 2017 (Work in progress session)

Nutrition-related health conditions can seriously decrease quality of life; a system able to monitor the kitchen activities and eating behaviour of patients could provide clinicians with important indicators for improving a patient's condition. To achieve this, the system has to reason about the person's actions and goals. To address this challenge, we present a behaviour recognition approach that relies on symbolic behaviour representation and probabilistic reasoning to recognise the person's actions, the type of meal being prepared, and its potential impact on a patient's health. We test our approach on a cooking dataset containing unscripted kitchen activities recorded with various sensors in a real kitchen. The results show that the approach is able to recognise the sequence of executed actions and the prepared meal, to determine whether it is healthy, and to reason about the possibility of depression based on the type of meal.

Analysing Cooking Behaviour in Home Settings: Towards Health Monitoring †

2019

Wellbeing is often affected by health-related conditions. Among them are nutrition-related health conditions, which can significantly decrease the quality of life. We envision a system that monitors the kitchen activities of patients and that, based on the detected eating behaviour, could provide clinicians with indicators for improving a patient's health. To be successful, such a system has to reason about the person's actions and goals. To address this problem, we introduce a symbolic behaviour recognition approach, called Computational Causal Behaviour Models (CCBM). CCBM combines a symbolic representation of a person's behaviour with probabilistic inference to reason about one's actions, the type of meal being prepared, and its potential health impact. To evaluate the approach, we use a cooking dataset of unscripted kitchen activities, which contains data from various sensors in a real kitchen. The results show that the approach is able to reason about the person's cooking actions. It i...
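CCBM compiles such a symbolic behaviour model into a probabilistic state-space model and infers the ongoing action from sensor observations. The snippet below is only a minimal illustrative sketch of that general idea: a discrete Bayesian filter over a handful of hypothetical kitchen actions, with an assumed transition model and given observation likelihoods. It is not the actual CCBM implementation.

```python
import numpy as np

# Hypothetical symbolic action states for a cooking task.
ACTIONS = ["fetch_ingredients", "chop", "cook", "serve", "eat", "clean_up"]

# Assumed transition model: mostly stay in the current action,
# with some probability of moving on to the next one.
T = np.full((len(ACTIONS), len(ACTIONS)), 0.02)
np.fill_diagonal(T, 0.8)
for i in range(len(ACTIONS) - 1):
    T[i, i + 1] = 0.1
T /= T.sum(axis=1, keepdims=True)

def filter_actions(obs_likelihoods):
    """Discrete Bayesian filtering: p(action_t | observations up to t).

    obs_likelihoods: array of shape (timesteps, n_actions) with the
    likelihood p(sensor reading | action) at every step (assumed given).
    Returns the most probable action at each time step.
    """
    belief = np.full(len(ACTIONS), 1.0 / len(ACTIONS))  # uniform prior
    decoded = []
    for likelihood in obs_likelihoods:
        belief = likelihood * (T.T @ belief)  # predict, then correct
        belief /= belief.sum()
        decoded.append(ACTIONS[int(belief.argmax())])
    return decoded
```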

Recognition of cooking activities through air quality sensor data for supporting food journaling

Human-centric Computing and Information Sciences, 2020

Unhealthy behaviors regarding nutrition are a global risk for health. Therefore, the healthiness of an individual's nutrition should be monitored in the medium and long term. A powerful tool for monitoring nutrition is a food diary, i.e., a daily list of the food taken by the individual, together with portion information. Unfortunately, frail people such as the elderly have a hard time filling in food diaries on a continuous basis due to forgetfulness or physical issues. Existing solutions based on mobile apps also require the user's effort and are rarely used in the long term, especially by elderly people. For these reasons, in this paper we propose a novel architecture to automatically recognize the preparation of food at home in a privacy-preserving and unobtrusive way, by means of air quality data acquired from a commercial sensor. In particular, we devised statistical features to represent the trend of several air parameters, and a deep neural network for recognizing cooking activities ba...
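The paper's exact feature set and network architecture are not reproduced here; as a rough sketch of the general idea, the snippet below summarises the trend of a single air quality channel (e.g. a hypothetical PM2.5 or VOC reading) over sliding windows with simple statistics that a downstream classifier could consume.

```python
import numpy as np

def window_features(series, window=60, step=30):
    """Summarise one air quality channel with simple trend statistics.

    series: 1-D array of readings sampled at a fixed rate.
    Returns an array of shape (n_windows, 4): mean, std, slope, range.
    """
    feats = []
    for start in range(0, len(series) - window + 1, step):
        w = np.asarray(series[start:start + window], dtype=float)
        slope = np.polyfit(np.arange(window), w, 1)[0]  # linear trend
        feats.append([w.mean(), w.std(), slope, w.max() - w.min()])
    return np.array(feats)

# Per-window features from several channels (e.g. PM2.5, VOC, humidity)
# would be concatenated and fed to a classifier -- a deep neural network
# in the paper, or any standard model for a quick baseline.
```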

A dataset for complex activity recognition with micro and macro activities in a cooking scenario

ArXiv, 2020

Recognition of complex activities can benefit from understanding the steps that compose them. Current datasets, however, are annotated with one label only, hindering research in this direction. In this paper, we describe a new dataset for sensor-based activity recognition featuring macro and micro activities in a cooking scenario. Three sensing systems recorded simultaneously: a motion capture system tracking 25 points on the body; two smartphone accelerometers, one on the hip and one on the forearm; and two smartwatches, one on each wrist. The dataset is labeled for both the recipes (macro activities) and the steps (micro activities). We summarize the results of a baseline classification using traditional activity recognition pipelines. The dataset is designed to be easily used to test and develop activity recognition approaches.
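A "traditional activity recognition pipeline", as used for the baseline above, typically means sliding-window segmentation, hand-crafted features, and a standard classifier. The sketch below illustrates that pattern for accelerometer data; the column names (`ax`, `ay`, `az`, `label`) and window sizes are assumptions for illustration, not the dataset's actual schema.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def segment(df, window=128, step=64):
    """Sliding-window segmentation of a pandas DataFrame with assumed
    columns ax, ay, az (acceleration) and label (micro activity)."""
    X, y = [], []
    for start in range(0, len(df) - window + 1, step):
        w = df.iloc[start:start + window]
        feats = []
        for axis in ("ax", "ay", "az"):
            feats += [w[axis].mean(), w[axis].std()]  # per-axis mean/std
        X.append(feats)
        y.append(w["label"].mode()[0])  # majority label in the window
    return np.array(X), np.array(y)

# X_train, y_train = segment(train_df)
# clf = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)
# X_test, y_test = segment(test_df)
# print(clf.score(X_test, y_test))
```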

Recognising the actions during cooking task (Cooking task dataset)

2011

The dataset contains data from acceleration sensors attached to a person during the execution of a kitchen task. It consists of 7 datasets that describe the execution of preparing and having a meal: preparing the ingredients, cooking, serving the meal, having the meal, cleaning the table, and washing the dishes. The aim of the experiment is to investigate the ability of activity recognition approaches to recognise fine-grained user activities based on acceleration data. The results from the dataset can be found in the PLOS ONE paper "Computational State Space Models for Activity and Intention Recognition. A Feasibility Study" by Krüger et al.
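How well a recogniser tracks such fine-grained activities over time is commonly reported as frame-wise (per time step) accuracy against the annotation; the minimal sketch below uses hypothetical labels and is unrelated to the published results.

```python
import numpy as np

def frame_accuracy(predicted, annotated):
    """Fraction of time steps where the predicted action matches the annotation."""
    predicted, annotated = np.asarray(predicted), np.asarray(annotated)
    return float((predicted == annotated).mean())

# Hypothetical example with six time steps of action labels.
pred = ["chop", "chop", "cook", "cook", "serve", "clean"]
true = ["chop", "chop", "chop", "cook", "serve", "clean"]
print(frame_accuracy(pred, true))  # 5 of 6 steps match -> 0.833...
```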
