Referring to documents is common when making things, but a gap exists between a written description and the actual context of the work. For example, when cooking from a recipe, people may lose their place in the recipe, misunderstand the required amount of an ingredient because of complicated measuring units, or skip steps by mistake. We address these problems in the cooking domain. Our proposed cooking support system, MimiCook, embeds a recipe into a real kitchen counter and directly guides the user. The system consists of a computer, a depth camera, a projector, and a scaling device. It projects step-by-step instructions directly onto the utensils and ingredients and updates the guidance display according to the user’s situation. The integrated scaling device also helps users avoid mistakes with measuring units. Results of our user study show that participants found it easier to cook with the system, and even subjects who had never cooked the assigned recipe made no mistakes.
- Ayaka Sato, Keita Watanabe, and Jun Rekimoto. MimiCook: A Cooking Assistant System with Situated Guidance. In Proceedings of the 8th International Conference on Tangible, Embedded and Embodied Interaction (TEI 2014). [to appear in Feb. 2014]