
Interaction Magazine
Interaction Design - Autumn 2011 - December 7th

Introducing GoodFood

The final concept


This issue is edited and published by

Meiken Hansen s050031
Martin Løkkegaard s072049
Maja Hyveled Jakobsen s071768
Michael Mansdal Larsen s082510
Kamilla Grove Sørensen s072013
Christopher Holm-Hansen s072023


Contents

Introduction
Understanding the target users
  Getting older
  Elderly and touchscreen technology
The process
Market research
Design specification
  Problem statement
Scope of the concept
  People
  Activity
  Context
  Technology
  Ergonomic/usage considerations
Development of the final concept
  First iteration
  Second iteration
  Scenario 1
  Scenario 2
  Third iteration
  User test
  Outcome
GoodFood
  First time use
  Profile
  Home sweet home
  Planning ahead
  Shopping list
  Cooking with GoodFood
  The remote
Prototype
Data flow
Considerations
  Purchasing GoodFood
  Features
Future work
Conclusion & References
Appendix


Introduction

This magazine is part of the final delivery in the course 42072 Design for Interaction, DTU Management, 2011 autumn semester. It documents the project work of developing a product for interaction for the ageing society. The project includes conceptualization, cognitive-physical considerations, mock-ups/a prototype of the final product and testing of it.

Products specially designed for the elderly are a growing market and an area in which countless challenges and possibilities exist for creating good interaction design [Lecture week 6: Health Engineering]. The design task that has set the basis for the project is defined as:

“To design a product that allows for interaction and has the purpose of supporting elderly people in maintaining a healthy and varied diet”.

This December Issue elaborates and finalizes the development, description and testing of the chosen concept: GoodFood – consisting of an application for tablets and a washable remote control. The GoodFood concept’s primary focus is intuitive interaction for the user, and less so saving the world :-)


Understanding the target users


Understanding the target users

The intention of this section is to give an understanding of the target group, which considerations must be taken into account and which challenges must be met when designing the final concept.

Getting older

It is a well-known and documented fact that the human body is exposed to physical deterioration concurrently with getting older. The pace at which this happens may differ between individuals and is often affected by lifestyle as well as diseases. Muscle power and skeletal mass decrease, and some may suffer from rheumatoid arthritis. Also, the central nervous system slowly begins disintegrating, leading to reduced short-term memory, a reduced ability to perform complex tasks and a reduced effect of the five senses: hearing, touch, sight, taste and smell. [Lecture week 6: Health Engineering]

The ability to perform cognitively demanding tasks gets worse as people age. “Worked examples” can be described as step-by-step demonstrations of how to solve a problem or complete a specific task, while “conventional problems” may be described as learning by doing. Studying worked examples has proven to be more efficient for the elderly than solving conventional problems. Any design for the elderly could thereby benefit from a guide/tutorial option, which could serve as a worked example. [Gerven]

Elderly people tend to live by a number of habits established throughout their lives. In continuation, many have built up a more or less monotonous diet of well-known dishes that are easy to prepare and that they like. Often less consideration has been given to the nutritional values of the meals, which can possibly lead to malnutrition and, in the worst case, diseases [DR]. Furthermore, 55% of the elderly find it difficult to simply distinguish healthy food from unhealthy food [Ældresagen]. Malnutrition can also be a result of reduced digestion or a reduced ability to chew, which may limit the number of dishes they know how to prepare. Correct nutrition is also of utmost importance when dealing with a variety of diseases, e.g. high blood pressure, obesity or diabetes. Another dimension of getting older is that the appetite regulation becomes less sensitive. Thereby food consumption may in some cases not correspond with the body’s energy demand. Eating less than the body needs results in weight loss and an absence of important vitamins and proteins. Other causes related to reduced appetite can be depression, social isolation, bad teeth as well as effects on the sense of taste and smell [meraadet].

These developments and dimensions are important to be aware of and keep in mind when designing for the elderly generation, as they have a great influence on the possibilities and limitations of the final concept. By making sure that the elderly maintain a healthy and varied diet, it is possible to increase their quality of life and mitigate the risk of different diseases. With a healthy and varied diet there are far greater chances of avoiding diseases and increasing their independence from other people and authorities.

Elderly and touchscreen technology

As this project focuses on the elderly and the development of a new product that requires their interaction, it is a must to look into how they understand interfaces and what kind of feedback they need.

Studies show that there is not much difference in performance between elderly and younger users when performing conventional tasks (e.g. pressing buttons and understanding icons) on PDAs (personal digital assistants) [Siek]. Since the similarities between PDAs and tablets are many, it can be assumed that the elderly will be able to perform similar tasks on a tablet with a touchscreen. This assumption is further backed by the general spread of tablets and smartphones, both touchscreen devices, in the population. Even though most of these products are found in the hands of younger users (under 65), more people, including the elderly, use either tablets or smartphones on a regular basis.

Another study shows that elderly people without any experience with touchscreens can perform gesture-based operations reasonably well, although hitting smaller targets (30x30 pixels) proved difficult. The study also showed that the elderly experience great joy in using tablets: “I want to do nothing but use this”, a test person stated on the use of a touchscreen. The joy of using touchscreen devices may be of great value for the elderly and encourage them to be active in areas that serve their personal interests. [Kobayashi]


The Process


The process

The following section presents the activities and methodology that were used during the development process. The actual research and development will be presented later.

Brainstorm

A brainstorming session on different topics was done to specify and focus the design task. The session was based on a mind map where four overall topics were presented: Independent Living, Health and Care, Occupation and Recreation. [Lecture week 7: Final Project]

Combination

A lot of problem areas were identified through the brainstorm sessions. Most of the areas could be combined into broad and overall problem areas of interest. By doing so the problem areas were cut down to five: social networking, mobility, social awareness, pets and practical reminders.

Selection

One problem area was selected through discussion and voting. This selection was based on engagement, interest and knowledge of the elderly and their needs. The problem area with the biggest general interest was found to be ‘practical reminders’.


Design Task

The area of ‘practical reminders’ was unfolded through further brainstorming on different situations where reminders are needed. Based on the session, the focus was narrowed down to concern correct nutrition for elderly people.

Validation

In order to validate the focus area, the market for existing nutrition assistance and guidance was investigated and interviews with possible users within the target group were conducted. [Appendix 2]

Demands and Criteria

Based on the validation and user interviews, demands and criteria for the concept were defined.


Concept Development

In order to determine the concept a morphology chart was used [Appendix 4]. Seven parameters based on the earlier research were found. A brainstorm over the parameters was used to develop different solutions to the same parameter. The idea for the final concept was based on a combination of elements from the morphology. The concept was named “GoodFood”.

Iterative detailing

The concept was built up, tested, detailed and evaluated through three iterations before leading up to the final version of the concept.


GoodFood


Market research


Market research

There are several products on the market that in an interactive way inspire people to eat healthily and keep a varied diet. GoodFood will be a new actor in this market, and it is therefore necessary to be aware of the existing solutions in order to be able to stand out and position itself.

An example of one of these concepts is Epicurious, a website for food lovers where recipes can be found, shopping lists can be made and advice on how to cook is given. Furthermore, the recipes are divided into levels of complexity such as “Quick and easy”. [Epi]

Another concept is the Danish alternative to Epicurious, the webpage www.opskrifter.dk, which contains a collection of recipes with some additional functions such as “empty the refrigerator”, which allows users to type in the ingredients in their refrigerator, whereupon the system suggests relevant recipes. [opskrifter.dk]

Another concept that inspires people to eat varied meals, even though it cannot be considered interactive, is found on the back of the free weekly newspaper Søndagsavisen. On the back there are recipes for 7 days - the upcoming week. This gives the readers inspiration for meal planning. The recipes often use some of the same ingredients as the recipes for the other days, which gives the reader the opportunity to ‘share’ ingredients across several days. In this way less food and fewer ingredients are thrown away and the user saves money. Furthermore, the ingredients are arranged in a shopping list that comes in handy when buying groceries. [Søndag]

The solutions described here only represent a fraction of what is available. Many alternatives exist, all with more or less similar functionality. However, a concept that can accommodate special nutritional needs and demands and tailor menus to fit elderly people has not been found. This is why GoodFood has its raison d’être.


Design specification


Design specification

With the information on the target group and the current market, requirements and criteria [Andreassen] for the concept were set up.

Problem statement

The concept shall assist the elderly in maintaining a varied and healthy diet and in all of the activities related to this: give inspiration for and help plan the menu for the following days, plan the menu so it fits the user’s nutritional profile, plan the shopping in accordance with the specified menu, provide guidance when cooking the dishes, provide the nutritional information for each course and the individual ingredients, and always give the user the possibility to quickly access help/assistance. The concept must afford hygienic use in the kitchen environment. The layout must incorporate considerations on physical/visual standards that make it suitable for the elderly.


Subject: Varied diet
Requirement: Must advise the user to eat a varied diet.
Criteria: A varied diet should preferably be achieved through inspiration and motivation.

Subject: Customization
Requirement: Must be able to tailor the menu to personal needs.
Criteria: It would be preferred if the application could advise users to eat according to their personal condition (e.g. diabetes or arthritis).

Subject: Help function
Requirement: The possibility to get help if needed.
Criteria: It would be nice if the help function were superfluous due to intuitive design.

Subject: Hygienic standard
Requirement: The solution must afford hygienic use.
Comment: A washable remote control for use during cooking in the kitchen.

Subject: Weight
Requirement: Should weigh as little as possible without compromising the usage.
Comment: Relating to the remote control.

Subject: Flexibility
Requirement: Must contain the possibility to choose not to eat the suggested food and get something else.
Criteria: It would be preferred if the application gives the possibility to plan meals ahead in time.
Comment: Planning meals ahead in time might make the user feel in control.

Subject: Patronising behaviour
Requirement: The layout must not be patronising towards the users.
Comment: The elderly users should not feel stupid when using the application.

Subject: Navigation
Requirement: The navigation must be consistent throughout the application.
Criteria: Should try to use mappings the user already knows.
Comment: The navigation should be elderly friendly, with an effort towards having neither too many buttons nor too few.

Subject: Feedback
Requirement: Must give visual or auditory feedback to the user.
Comment: Feedback can be customized by the user.


Scope of the concept


Scope of the concept

The PACT analysis, together with ergonomic considerations, is used to scope the design of the chosen concept, GoodFood. The PACT framework [Benyon] consists of four areas: people, activity, context and technology. It is used to understand the people who use the GoodFood application and how the activity of use proceeds in a defined context.

People

GoodFood is designed/intended for elderly people who are able to cook for themselves but do not necessarily need to be in shape for shopping, as some might use online methods of purchasing groceries. It is assumed that the upcoming generation of elders is far more confident in using technologies like tablets and computers than the current elders. To get an overview of the target group, personas were constructed. The personas are mainly based on the knowledge gained from talking with elderly people about nutrition, cooking and technology usage, as well as on research on the internet and in articles. The personas give the possibility to immerse oneself in the concept and understand which properties matter to the user and which aspects give the users the motivation to use the concept.


James Johnson
• 67 years old
• Curious about different food sources
• Physically able to cook and buy ingredients
• Currently eating healthily
• Likes to cook and enjoys new taste sensations
• Owns a smartphone

Morgan McKinsey
• 72 years old
• Is lactose intolerant and therefore has special nutritional needs
• Physically able to cook and buy ingredients
• Has four favorite dishes
• Capable of using a computer and cell phone
• Has never used a touchscreen device

Loretta Lowell
• 75 years old
• Suffers from arthritis and has trouble using the oven
• Able to cook for herself
• Has poor sight and prefers audiobooks and audio instead of reading
• Orders her groceries online through a smartphone application

Sheila Stanton
• 84 years old
• Normally uses cookbooks to discover new recipes
• Physically able to cook and buy ingredients
• Enjoys watching the cooking channel and cooking along with the TV chefs
• Uses her computer every now and then and her cell phone on a daily basis


Activity

There are several activities that the user will be exposed to when using the GoodFood application.

The first activity related to the application is the purchase. The purchase requires internet access and an acceptance from the user that he/she wants to buy and download the application. This activity must take place on the tablet on which the user intends to use GoodFood.

The regular interaction with the application is accomplished with a tablet. The tablet allows for use of a tangible interface and lets the user interact with the application with their fingers.

When using the application in the kitchen, the tablet touchscreen is no longer the tangible interface. Instead the washable remote is used. The remote has buttons that afford pressing and are consistent with the buttons and necessary steps of action in the cooking scenario.

Context

It is assumed that the elderly will use the application primarily at home in their living room and kitchen. The application has different characteristics that afford different use contexts. When planning the weekly menu, creating the shopping list or editing/viewing their profile, it is expected that users will be sitting in their living room in a comfortable chair with the tablet.

Another dimension is when the application is used in the kitchen during the cooking session. In this case the physical interaction with the application becomes a challenge, as kitchen hygiene is an unavoidable and important issue. In the kitchen environment, the tablet is exposed to hazardous elements such as water and bacteria when the user has to interact with it. But also when turning the scenario around, other issues appear. Bearing in mind that the tablet can be used for many things, e.g. gaming while on the go or in the bathroom, the problem of the tablet bringing outside bacteria into the kitchen must also be kept in mind when designing the physical interaction. The solution to these hygienic challenges is realized as a washable remote control that can interact with the application.

Technology

The medium of the interactive system GoodFood is a tablet. The tablet receives input from the user through either the touchscreen or the washable remote control. The user touches the tablet with a finger to give input to it. When the user is cooking, the remote is used to give input to the device instead. The user navigates by pushing buttons on the remote where he/she otherwise would have tapped directly on the tablet screen. The remote is powered by a battery and uses Bluetooth technology for transferring the data.

The outputs are formed as text, audio, video and graphic visualization.

The GoodFood concept requires connection to the internet for application updates and when creating new menus.
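The split between touch input and remote input can be illustrated with a small sketch. The TypeScript below is not part of the project deliverables; it is a hypothetical model (button ids and remote codes are invented) showing one way both input sources could be mapped onto the same set of navigation actions, so the rest of the application does not care whether a tap or a remote button produced the event.

```typescript
// Hypothetical sketch: both input sources emit the same navigation actions.
type NavAction = "next" | "previous" | "accept" | "home";

type ActionHandler = (action: NavAction) => void;

// The touchscreen maps taps on on-screen buttons directly to actions.
class TouchInput {
  constructor(private handler: ActionHandler) {}
  onTap(buttonId: string): void {
    const mapping: Record<string, NavAction> = {
      "btn-forward": "next",
      "btn-back": "previous",
      "btn-accept": "accept",
      "btn-home": "home",
    };
    const action = mapping[buttonId];
    if (action) this.handler(action);
  }
}

// The washable remote sends button codes over Bluetooth; they are translated
// into the very same actions before they reach the application.
class RemoteInput {
  constructor(private handler: ActionHandler) {}
  onRemoteButton(code: number): void {
    const mapping: Record<number, NavAction> = { 1: "previous", 2: "accept", 3: "next" };
    const action = mapping[code];
    if (action) this.handler(action);
  }
}

// The application reacts to actions without knowing their source.
const handle: ActionHandler = (action) => console.log(`navigate: ${action}`);
new TouchInput(handle).onTap("btn-accept");
new RemoteInput(handle).onRemoteButton(2);
```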


Ergonomic/usage considerations

When designing an application for use on a tablet there are certain patterns that should be considered. Patterns are a combination of a gesture, for example the touch of a button, and a system response. The combination of the two can be used in several situations in a variety of devices. Using patterns that are similar to successful patterns currently used in products may make the product seem familiar and more intuitive to use [Saffer].

One pattern that has been used in a vast number of interactive devices, and has been implemented in GoodFood, is the tap-to-open/activate pattern. When an object is to be opened or activated, the tap is used. The tap is a natural replacement for the click of a mouse and should therefore be a familiar gesture when using a touchscreen [Saffer]. Furthermore, the tap-to-select pattern is used in the GoodFood concept. The two types, tap-to-open/activate and tap-to-select, are used in an identical manner, both having buttons similar in construction. In order to give the user the possibility to change his/her mind after the finger is placed on a button, he/she can simply slide the finger off the button, as it is only activated on release. This gives the user the possibility to avoid unintended actions. [Saffer]

Gestural interfaces, such as touchscreens, have some limitations. Compared to a keyboard with actual buttons (and/or a mouse), the touchscreen does not contain actual buttons. The physical feeling of a button being pressed is on a touchscreen sometimes replaced by a visual display only, which gives little feedback to the user. Especially for the visually impaired this might be a problem [Saffer]. Since many elderly are visually impaired, clear feedback should be given when a gesture has been made. Simulating a movement of the button and adding a clicking sound could be means of achieving such feedback.
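As an illustration of these two points, the release-activated tap and the simulated button feedback, the sketch below shows how such a button could be wired up in an HTML5 prototype. It is a minimal, hypothetical example: the element id, the CSS class name and the sound file are assumptions, not part of the actual prototype.

```typescript
// Hypothetical sketch: a touch button that gives press feedback and only
// fires on release inside its bounds, so sliding the finger off cancels it.
function makeTapButton(el: HTMLElement, onActivate: () => void): void {
  const click = new Audio("click.wav"); // assumed sound asset
  let pressed = false;

  el.addEventListener("pointerdown", () => {
    pressed = true;
    el.classList.add("pressed"); // simulate the button moving down
  });

  el.addEventListener("pointerleave", () => {
    pressed = false;             // finger slid off: cancel the action
    el.classList.remove("pressed");
  });

  el.addEventListener("pointerup", () => {
    el.classList.remove("pressed");
    if (pressed) {               // only activate on release inside the button
      click.play();              // audible feedback in addition to the visual one
      onActivate();
    }
    pressed = false;
  });
}

// Example: a hypothetical Cook button on the Home screen.
const cookButton = document.getElementById("btn-cook");
if (cookButton) makeTapButton(cookButton, () => console.log("open cooking guide"));
```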

Another physical consideration concerning the usage of the touchscreen, which is of interest to the target group, is the keyboard. Using a keyboard for typing may be a challenge for the elderly, especially if the keys are small. When typing, some use the fingertips while others may use the finger pads. Finger pads are larger than fingertips, and the average size of a finger pad is 10-14 mm [Saffer]. To make writing on the keyboard as easy as possible for the elderly, the size of the buttons (letters) should be large enough for use with the finger pads. When the tablet is held in landscape orientation, the keys of the tablet keyboard are larger than 14 mm, and the keyboard can therefore be considered elder-friendly.
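To make the 14 mm figure concrete, the physical size of a key can be converted to pixels from the screen density. The short sketch below assumes a density of 132 ppi, roughly that of a 9.7-inch 1024x768 tablet of that generation; the exact value depends on the device.

```typescript
// Hypothetical sketch: minimum key size in pixels for a given screen density.
const MM_PER_INCH = 25.4;

function mmToPixels(mm: number, pixelsPerInch: number): number {
  return Math.round((mm / MM_PER_INCH) * pixelsPerInch);
}

// Assuming a 132 ppi tablet screen, a 14 mm finger-pad-sized key needs:
console.log(mmToPixels(14, 132)); // about 73 px per side
// and a 20 mm icon would need:
console.log(mmToPixels(20, 132)); // about 104 px per side
```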

When designing an interactive application for a touchscreen, one should consider the placement of buttons and other interactive elements. The layout should make sure that no important information is covered when interacting with the application. Therefore, placing these elements in the same layout as on a computer screen could most likely induce complications. [Saffer]

A study has shown that elderly users prefer icons of a larger size (20 mm), whereas younger users preferred icons of a smaller size (5 mm or 10 mm). Furthermore, the study showed that elderly users had problems viewing the screen because of glare and had a tendency to tilt the screen for a better view [Siek]. The icons used in the GoodFood concept are all of an appropriate size for the elderly according to the previously mentioned studies.

Touchscreen devices have proven to be less cognitively demanding than other interactive devices when the interface is designed appropriately. Therefore, when designing an application to be used by the elderly, one should consider what level of cognitive demand it should require [Hippler]. An interactive design that is natural to operate would not be cognitively demanding and would thus be in favor of the target group of this project. When cognitive tasks are not complex, the performance of young and elderly users is similar. [Siek]


Development of the final concept


Development of the final concept

First iteration: Creating an outline

The first mock-up of GoodFood was built superficially, only giving an overview and an idea of the main functions and layout. The mock-up was constructed as static images and was only built to show the initial thoughts and considerations for GoodFood. The concept was evaluated through feedback from supervisors.

Second iteration: Keep it simple

For the next iteration, functions and operations were defined using function diagrams [Appendix 3]. This gave a good basis for discussing important functions and prioritizing the design effort. The second iteration was built as an interactive interface, enabling trial and error testing. The concept was evaluated based on heuristics, personas and user input. Simplicity was in focus, while still ensuring that the concept would not be too limited.

Test persons from the target group were contacted to evaluate the concept and its content. The general feedback was positive. They liked the possibility to plan meals for a whole week. One user expressed that she often found it hard to get inspiration for new meals and would really like an assistant proposing meals [Appendix 2]. Another positive point was the feature for avoiding certain ingredients, e.g. in relation to medicine use. One user was taking anticoagulant medication, which did not allow him to eat food with a high content of vitamin K. His wife, who primarily does the cooking in the household, said it would be nice to get inspiration on how to make new recipes. Especially when it came to vegetables, she was interested in assistance. They used to eat a lot of spinach, but with its high level of vitamin K, this was suddenly no longer an option. [Appendix 2]

To help define and evaluate the main functions of GoodFood, scenarios were created based on the seven steps of action [Benyon]. These scenarios helped to investigate the gulf of execution, describing how a user of GoodFood translates goals and intentions into actions.

Two scenarios were set up concerning generation of a week menu and using GoodFood for everyday cooking. The scenarios were not based on user tests, but were intended to give an overview and understanding of how interaction with GoodFood is imagined.

Scenario 1: Generate a week menu

1. Forming a goal: The user wants to plan the menu of the week.

2. Forming the intention: The user wants to plan the week menu by using GoodFood on her tablet.

3. Specifying an action sequence: She has to buy and open GoodFood.

4. Executing the action: She buys GoodFood and opens it. She adjusts the user settings in GoodFood and presses the week menu button. She follows the steps shown on the screen.

5. Perceiving the state of the world: When the week plan comes up, she can see pictures of the planned meals. If something does not look nice, she can change the dish to something else.

6. Interpreting the state of the world: By using GoodFood she now has an exact plan for what to eat the whole week without finding the recipes herself.

7. Evaluating the outcome: Her goal of planning a week menu is satisfied. She did achieve her main goal, and she could change the suggested menu’s recipes if she wanted. So at some point she did have to decide whether she wanted to make a specific meal or choose another – but without coming up with the solutions herself.


Scenario 2: Cooking today’s meal

1. Forming a goal: The user wants a nice, healthy home-cooked meal.

2. Forming the intention: She wants to use the planned recipe from GoodFood to cook the meal.

3. Specifying an action sequence: She has to open GoodFood and find today’s recipe. After that, she has to go into the kitchen and find the ingredients, which she (hopefully) has bought earlier in the week.

4. Executing the action: She opens GoodFood and chooses the feature ‘cook’. Today’s recipe opens and she finds the ingredients. She prepares them and cooks them into a meal.

5. Perceiving the state of the world: While using the recipe, she can hear it read aloud. She can also see how the meal is prepared by viewing a video. Her senses such as eyesight, smell and taste can be used during the preparation to make sure the meal tastes nice and to figure out when it is finished.

6. Interpreting the state of the world: GoodFood helps her to choose a healthy recipe and gives her help with how to cook the meal.

7. Evaluating the outcome: Hopefully she has been able to follow the recipe with success, so the meal is a nice meal. But maybe she did not follow the recipe completely, or believed she knew what a certain procedure was but in reality did not and did not view the video. She might also have been interrupted during the cooking and have forgotten to check the food. Then there is a possibility that the meal did not become a success, but there is a good chance that it did.

The scenarios were used to anticipate and optimize interaction with GoodFood.


Third iteration: Testing

The scenarios, constraints and consistency, together with knowledge of natural mappings, created the basis for the third iteration. This iteration was created with a focus on user testing, and effort was put into making it look appealing and complete. A user test was conducted and, together with continuous heuristic evaluation, led to the final concept, the fourth iteration of GoodFood.

User test

The user test was performed with focus on the user’s ability to navigate through the GoodFood application. The evaluation of the application is built upon the intuitiveness of use and the level of ease with which the user navigated through the application for a given scenario.

It should be noted that the test user had never used a touchscreen device. This choice of user was made in order to get the least experienced of the intended users to navigate the system, which would catch most of the breakdowns. The chosen user had previously expressed interest in learning the technology of touchscreen devices and in getting to know the system, even though he had no related skills whatsoever. The user lacked English skills, which might have been a barrier at some points. The evaluator tried to compensate for this by translating the text.

It should be clear that the application tested was a prototype which had some differences compared to the final application. Examples of this are non-functioning buttons and certain information, which the user is supposed to type in, already being filled in. This might have confused the user at some points.

Outcome

The full user test and all findings can be found in Appendix 1. The following contains some of the important findings and the changes based on the user test.

From the user test we became aware of the need to communicate what happens when you tap forward and backwards. In the profile settings the user tried to navigate by tapping the numbers instead of the arrows. The arrows have now been changed to a back arrow with the text “back” and a square with a tick with the text “accept”.

Not all users know the expressions “vegetarian” and “vegan”. Instead of explaining the words, the “None of the two” option was changed to “I eat meat”. This button was furthermore moved to be the first button from the left.


The user found it difficult to change his profile settings. He tried to tap directly on the information on the profile overview. After this we made it possible to see all the profile settings on the profile overview: tapping Edit makes each setting appear as a button, and tapping a specific button takes the user directly to that setting, e.g. Allergy. In this way the user does not need to go through all the screens to change one setting.

As the user did not know that the info button would give information on the specific screen, it was changed to a Help button.

In the cooking guide the user had difficulties understanding that it contained more than one step and how to go to the next steps. We have changed the screens to show the total number of steps and the current step.

As it did not seem obvious to the user how to finish the cooking guide, it should be clarified when the guide is over and how to return to the Home screen.

The user misunderstood the Shop button and tapped it when he wanted to create a new menu, as “he had to do shopping for more meals”. After this session the “Shop” button on the Home screen was changed to “Shop List”.

The user did not realize that he could press the Cook button from the My Menu screen. He tried to tap the specific meal to see the recipe. As nothing happened he went to the Home screen and tapped the Cook button there instead. It could be made possible to tap a specific meal to see or check the recipe for any of the chosen meals in the menu. This has not been implemented in the prototype yet.

When using the remote control, the user once moved the cursor too far to the right, pressed OK and went to the Home screen. In this way he left the cooking instructions, which was not the intention. This breakdown should be considered, and it should be clear to the user where the cursor is when using the remote control.

Based on the user test we agreed on having an introduction video for the application. The video should introduce the user to the main functions and show how the general navigation is done. This will make sure that the user is informed about the application’s different functions and can benefit from all of them.

The user test, together with the previous iterations of GoodFood, created the basis for the final concept.


GoodFood


GoodFood helps the elderly to a healthy and varied diet. The application is a source of inspiration, providing alternatives and help to cook new dishes. GoodFood allows the user to clearly define preferences and special needs and, based on this information, creates a varied 7-day menu covering all nutritional needs of the user. The application offers the elderly a helping hand to break food habits and will improve overall health and quality of life.

GoodFood is an application for a tablet device, including a specially designed remote control for use in the kitchen. The application is intended for a family household with one or more members. Each member creates his/her own profile, and GoodFood will take all personal needs into account when planning the menu.

GoodFood

The following is a run-through of the most essential features of GoodFood. The interfaces presented are created from the prototype of GoodFood.


First time use

The very first time GoodFood is opened, the user has to set up his/her own personal profile. The application will guide the user through seven interfaces with the possibility to define special needs and preferences. It is not possible for the user to skip this step, as it is essential for the usefulness of GoodFood.

The Welcome screen is the first to appear. It allows the user to select gender by tapping the male or female icon and to type in a profile name. At the bottom of the screen seven circles are visible, clearly stating how far in the setup process the user is and how far there is to go. The circle indicating the current step is highlighted with a thin line around it and is 100 percent visible. As the setup process continues, completed steps stay 100 percent visible, whereas steps not yet defined are only 80 percent visible. When the user has selected a gender and entered a name, he/she needs to press accept.
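A minimal sketch of how the seven progress circles could be rendered, assuming the “80 percent visible” wording maps to opacity; the container id and style values are illustrative only, not taken from the prototype.

```typescript
// Hypothetical sketch: seven setup-step indicators where completed steps and
// the current step are fully visible and remaining steps are dimmed to 80 %.
function renderStepIndicators(container: HTMLElement, totalSteps: number, currentStep: number): void {
  container.innerHTML = "";
  for (let step = 1; step <= totalSteps; step++) {
    const circle = document.createElement("span");
    circle.className = "step-circle";
    circle.style.opacity = step <= currentStep ? "1.0" : "0.8";
    if (step === currentStep) {
      circle.style.outline = "1px solid #444"; // thin line around the current step
    }
    container.appendChild(circle);
  }
}

// Example: the Welcome screen is step 1 of 7.
const bar = document.getElementById("setup-progress");
if (bar) renderStepIndicators(bar, 7, 1);
```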

 


The next six steps let the user define: date of birth, height, weight, food preferences (e.g. vegetarian), allergies, special conditions (e.g. arthritis) and physical limitations.

In the case of e.g. allergies, the user will be asked: “Do you have any food allergies?” If the answer is no, the application will skip ahead and let the user define special conditions. However, if the user taps yes, he/she will be guided to an interface allowing him/her to specify the type of allergy. There is no accept button in this interface; as the user taps either yes or no, GoodFood will instantly continue the setup sequence. In the case of a mistake, it is always possible for the user to tap the “back” button, which is always located in the lower left corner.

 

The setup sequence is simple and straightforward and does not require the user to perform difficult tasks. Each of the seven interfaces is presented in a similar manner to minimize confusion, and it is always possible to recover from mistakes by either using the back button or simply picking another choice (this is however not possible in the case of allergies, where the back button must be used).
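A compact way to express the branching in the setup sequence is as an ordered list of steps where a step can declare that it should be skipped. The sketch below is an assumption about how this could be modelled; the step names follow the screens described above but are otherwise hypothetical.

```typescript
// Hypothetical sketch: the setup sequence, where the allergy-type screen is
// skipped when the user answers "no" to the allergy question.
interface SetupAnswers {
  hasAllergies?: boolean;
}

interface SetupStep {
  name: string;
  skip?: (answers: SetupAnswers) => boolean;
}

const setupSteps: SetupStep[] = [
  { name: "Welcome (gender and name)" },
  { name: "Date of birth" },
  { name: "Height and weight" },
  { name: "Food preferences" },
  { name: "Food allergies (yes/no)" },
  { name: "Allergy types", skip: (a) => a.hasAllergies === false },
  { name: "Special conditions and physical limitations" },
];

function nextStep(current: number, answers: SetupAnswers): number {
  let next = current + 1;
  while (next < setupSteps.length && setupSteps[next].skip?.(answers)) {
    next++; // jump over screens that do not apply to this user
  }
  return next;
}

// Example: answering "no" on the allergy screen (index 4) skips straight to
// the special-conditions screen.
console.log(setupSteps[nextStep(4, { hasAllergies: false })].name);
```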


Profile

When all seven steps of the profile setup are completed, the user will be presented with a profile overview. Here it is possible to see all data entered in the previous steps and, by tapping Edit, to correct mistakes. The profile overview also presents nutritional information; here it is possible for the user to see how the personal settings affect the choices GoodFood makes when generating a personal menu (e.g. avoided ingredients or daily calorie intake).

In this interface the user is also presented with the Help button for the first time. This option will follow on every interface from here on, providing guidance in case of need. When the user is satisfied with the profile settings, he/she taps accept and is guided to the Home interface.

 

GoodFood gives the possibility to take several profiles into consideration when later generating a menu. When the first profile is finished, it is possible to start over and enter specifications for the next. (This is however not implemented in the prototype)


Home sweet home

When the user(s) have finished setting up the profile(s), the Home menu will be presented. From Home it is possible to go either to Cook, Menu, Profile or Shop List.

• The Profile option allows going back to the profile settings, either for a redefinition or just to view them.

• Tapping Menu, GoodFood will generate a menu specially adjusted to the user profile(s).

• Cook will guide the user to the cooking interface (in case a menu has been created), showing the menu of the day and providing guidance and cooking assistance.

• Shop List will show which groceries are needed for the defined menu (also provided that a menu has been created).

As this is the first time GoodFood is opened, tapping Cook or Shop List will guide the user to the Menu planner.

 


If the user needs help with the different buttons in the Home menu, he/she can simply tap the Help icon, as usual centered at the bottom of the interface. Tapping Help gives a pop-up window with a description of the different elements on the original interface. It is possible to tap the Audio or Video buttons in order to get further help. If Audio is tapped, GoodFood will read the help aloud. If Video is pressed, visual help will appear, explaining the different elements. Tapping anywhere outside the help box or tapping the cross in the upper right corner will at any time stop the help functions.

 


Planning ahead

As defined, when using GoodFood for the first time, the user will be guided to Menu. As this is the very first time of use and no menu has been created yet, only Create new menu will be available. At any other time it will be possible to choose between the two options: either generate next week’s menu or show the menu for the current week (My Menu).

 


When tapping Create new menu, GoodFood will, based on the profile settings, create a new varied menu specially designed for the user. The profile settings will ensure that special demands (e.g. allergies) and nutritional needs of the user will be taken into account when generating the menu.

The user will be presented with a menu showing different suggestions. It is possible to change dishes if something pops up that the user does not want. GoodFood will then come up with alternatives based on the same nutritional content, which the user can choose from. (Not implemented in the prototype.)

Information on each dish will also be available to the user, showing detailed information on ingredients and nutritional content. (Not implemented in the prototype.)
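The menu generation described above can be sketched as a simple filter-and-pick over a recipe database: exclude recipes that conflict with the profile, pick seven dishes, and offer replacements with comparable nutritional content. This is a hypothetical simplification of the behaviour described above, not the project's actual algorithm; all type and field names are invented.

```typescript
// Hypothetical sketch of the weekly menu generation described above.
interface Recipe {
  name: string;
  ingredients: string[];
  calories: number;
}

interface Profile {
  avoidedIngredients: string[]; // from allergies and special conditions
  dailyCalorieTarget: number;
}

function suitableFor(recipe: Recipe, profile: Profile): boolean {
  return recipe.ingredients.every((i) => !profile.avoidedIngredients.includes(i));
}

// Pick the first seven suitable recipes; a real implementation would also
// balance variety and nutrition across the week.
function generateWeekMenu(recipes: Recipe[], profile: Profile): Recipe[] {
  const candidates = recipes.filter((r) => suitableFor(r, profile));
  return candidates.slice(0, 7);
}

// Offer alternatives with roughly the same nutritional content as a dish the
// user does not want (here: within 10 % of its calories).
function alternativesFor(dish: Recipe, recipes: Recipe[], profile: Profile): Recipe[] {
  return recipes.filter(
    (r) =>
      r.name !== dish.name &&
      suitableFor(r, profile) &&
      Math.abs(r.calories - dish.calories) / dish.calories <= 0.1
  );
}
```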

From My Menu it is possible to go directly to Cook, Home or Shop List.

 


Shopping list

Tapping Shop List presents an overview of the ingredients needed to cook the dishes from the menu. The shopping list categorizes ingredients in a simple and clear way for easy understanding. The two buttons, SMS and Print, centered at the bottom of the interface, give the user the possibility to send the list to his/her mobile phone or to a printer connected to the tablet device on which GoodFood is used. This allows the user to easily transfer the shopping list to something that can be brought to the local grocery store.
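The shopping list behaviour, collecting the ingredients of the planned dishes and grouping them by category, can be sketched as below. The category names and the plain-text output format are assumptions made for illustration.

```typescript
// Hypothetical sketch: build a categorized shopping list from the week menu.
interface Ingredient {
  name: string;
  category: string; // e.g. "Vegetables", "Dairy", "Meat"
}

interface PlannedDish {
  name: string;
  ingredients: Ingredient[];
}

function buildShoppingList(menu: PlannedDish[]): Map<string, string[]> {
  const byCategory = new Map<string, string[]>();
  for (const dish of menu) {
    for (const item of dish.ingredients) {
      const list = byCategory.get(item.category) ?? [];
      if (!list.includes(item.name)) list.push(item.name); // no duplicates
      byCategory.set(item.category, list);
    }
  }
  return byCategory;
}

// Render as plain text, suitable for sending by SMS or to a printer.
function formatShoppingList(list: Map<string, string[]>): string {
  return [...list.entries()]
    .map(([category, items]) => `${category}:\n  ${items.join("\n  ")}`)
    .join("\n");
}
```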

 


Cooking with GoodFood

When the menu is created and the groceries are bought, it is time to cook. When the user taps Cook in the Home menu, the dish of the day will appear. As shown in the menu interface, Chicken heaven is up for Monday.

The dish will be presented with ingredients, date and expected preparation time.

 


It is also possible to get detailed directions for the cooking process. It is clearly stated how many steps the current recipe contains and how far in the process the user is. Each time the user is ready to continue to the next step, the arrow is tapped.

Cook also presents audio and video possibilities. Tapping Audio will read the recipe aloud, and when going through the directions, it will be done step by step. The Video option will visually guide the user through the cooking process via small video sequences, e.g. showing how to prepare the chicken.

 


When the user is done cooking and reaches the last step of the directions, a Done button will appear. This has the exact same function as the Home button and guides the user to the Home menu. However, the extra button is added to give the cooking process a natural flow.

 


The remote

The intended use of the GoodFood remote control is to be able to interact with GoodFood while cooking without having to touch the tablet device. This will ensure that hygiene is not compromised when using the application.

The remote control has three buttons with an appearance and icons similar to the ones used in the GoodFood application, making it intuitive for the user to see the connection between the application and the physical remote control.

The remote is created with big buttons giving tangible feedback when pressed. The remote can be put directly in the dishwasher or be cleaned with soap and water after use.

 


The remote control navigates around the GoodFood application by using a circle. The circle jumps from icon to icon, only positioning itself where a function can be triggered. When the circle has reached the desired location, the accept button is pressed. (Not included in the prototype.)

While cooking it is still possible to interact with GoodFood using the touchscreen.
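The circle-based navigation can be thought of as a focus ring over the tappable elements of the current screen: the remote's buttons move the focus, and the accept button triggers the focused element. The sketch below is a hypothetical illustration of that idea; the target names are invented.

```typescript
// Hypothetical sketch: remote-control focus navigation over tappable targets.
interface FocusTarget {
  id: string;
  activate: () => void;
}

class FocusRing {
  private index = 0;
  constructor(private targets: FocusTarget[]) {}

  private highlight(): void {
    // Making the circle's position obvious addresses the breakdown seen in the
    // user test, where the user lost track of where the cursor was.
    console.log(`circle is on: ${this.targets[this.index].id}`);
  }

  moveNext(): void {
    this.index = (this.index + 1) % this.targets.length; // jump to the next icon
    this.highlight();
  }

  movePrevious(): void {
    this.index = (this.index - 1 + this.targets.length) % this.targets.length;
    this.highlight();
  }

  accept(): void {
    this.targets[this.index].activate(); // the remote's accept button
  }
}

// Example: the cooking screen exposes three targets the circle can land on.
const ring = new FocusRing([
  { id: "previous-step", activate: () => console.log("previous step") },
  { id: "next-step", activate: () => console.log("next step") },
  { id: "home", activate: () => console.log("back to Home") },
]);
ring.moveNext();
ring.accept(); // triggers "next step"
```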

 


Prototype


A prototype of GoodFood is available for test and review. The prototype is created in Tumult Hype and exported as HTML5 to Dropbox. It is possible to access the application using all platforms and browsers; however, the prototype has shown the best results using Google Chrome or Safari.

The prototype is split into two different sections. Section 1 concerns the settings function of GoodFood and gives the viewer the possibility to experience how a potential user would define their personal settings in order to create a profile. Opening Section 1 in a browser will present the GoodFood interface as if it were the very first time the application was opened. This section ends with the Home menu.

Section 2 includes the rest of the main functions of GoodFood and starts with the Home menu. The section will present the viewer with the everyday interface of GoodFood. It is possible to explore Cooking, Menu, Profile and Shop List.

It must be noted that the prototype has some limitations. Tumult Hype does not support the creation of a fully functional application. No audio and only limited video functions are available. Further, it is not possible to enter text in text boxes. The prototype has been created to give the viewer an idea of how to interact with GoodFood.

Prototype

To view the two sections of GoodFood, follow the links below, then download and unzip the two .zip files. Make sure that the HTML5 files and the resource folders are placed together. Open the HTML5 file and GoodFood will pop up in the standard browser window, ready for interaction.

Section 1:

http://dl.dropbox.com/u/3103478/GoodFood%20Settings.zip

Section 2:

http://dl.dropbox.com/u/3103478/GoodFood%20Main.zip

Enjoy!


Data flow


Looking behind GoodFood, a data flow diagram is a good tool to show how the functions of the application are connected. Four elements are illustrated: data stores, process boxes, data flows and external entities. A data store is a database supporting the system, e.g. a recipe database. A process box represents an event or process within the system carried out in order to perform an interaction, e.g. changing a recipe. A data flow, symbolized by an arrow, shows how data flows between the different processes. The external entity is, in the case of GoodFood, the user.

Data flow
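Since the diagram itself is not reproduced in this transcript, the sketch below restates its four element types in code form and wires up one example flow mentioned in the text (the user changing a recipe against the recipe database). The concrete node names and flow labels are illustrative assumptions, not a copy of the original diagram.

```typescript
// Hypothetical sketch: the four data-flow-diagram element types used above.
type NodeKind = "external entity" | "process" | "data store";

interface DfdNode {
  kind: NodeKind;
  name: string;
}

interface DataFlow {
  from: DfdNode;
  to: DfdNode;
  data: string; // what travels along the arrow
}

const user: DfdNode = { kind: "external entity", name: "User" };
const changeRecipe: DfdNode = { kind: "process", name: "Change a recipe" };
const recipeDb: DfdNode = { kind: "data store", name: "Recipe database" };

const flows: DataFlow[] = [
  { from: user, to: changeRecipe, data: "requested change" },
  { from: changeRecipe, to: recipeDb, data: "recipe query" },
  { from: recipeDb, to: changeRecipe, data: "alternative recipes" },
  { from: changeRecipe, to: user, data: "updated menu" },
];

for (const f of flows) {
  console.log(`${f.from.name} -> ${f.to.name}: ${f.data}`);
}
```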

 


Considerations


Purchasing GoodFood

Since this concept consists of an application for a tablet and a remote control, the payment method should be considered. Traditional applications can only be bought on the internet through an account that is established when buying the tablet. This account includes personal credit card information. The GoodFood system could be purchased in a similar manner to traditional applications, with the remote sent to the purchaser by mail. Another possibility for purchasing the GoodFood system could be through various stores where the remote control and a code for installing the software can be bought. This last solution might appeal to elderly people who are not comfortable entering credit card and personal information online.

Features

One feature that was initially planned to be included in the concept was the possibility for the users to add personal settings regarding prescription medicines. This would allow GoodFood to take special precautions regarding the medication of the user. Research in this area, however, showed that it was difficult to standardize guidelines for medicines and for which food ingredients might have an influence upon their effect. Information was found showing that certain types of fruit juice could neutralize the effect of certain drugs. Since the effect of fruit juice was different from user to user, predicting which users would experience neutralization of the medicine effect was impossible. Research also showed that there is no existing database available with guidelines regarding medication and food recommendations. The feature was therefore not considered further.

An “empty the fridge” feature was also discussed at one point. This feature would give the user the possibility to tap in the ingredients in the fridge to be used in a meal. This feature would assure that the user could use all the food from within his/her fridge and thereby save money. However, after internal discussion and evaluation the feature was dismissed, mainly because it would interfere with the scope of this project. When a user plans the next week’s menu, the program has a certain set of nutrition levels to fulfill and runs through a number of recipes to find a fit. This approach is, however, not very compatible with the idea that the user should be able to use random ingredients from the fridge.


Future work


To further validate and finally realize the GoodFood concept, several areas could be interesting to look into. First of all, more user testing should be conducted to ensure the optimal layout of the program for the target group. Some alternative features could also be considered to improve the functionality of the program. Furthermore, there are some practical issues that should be decided before the launch of the concept.

One area of the concept that needs further detailing is the shopping list feature. There are today good possibilities for shopping for groceries online, and since the elderly are a group of people who certainly could benefit from this, it should be included in the GoodFood system. Shopping for groceries within the GoodFood system would make good sense, since it is a device for cooking and meal planning.

One alternative feature that could be considered is the possibility to define a food budget, which might serve as an add-on feature in the program. This feature might find the appropriate meals for the users according to a specific weekly or monthly budget.

Another alternative feature could be the possibility to add additional recipes from celebrity chefs such as René Redzepi, Claus Meyer, Nigella Lawson or the Price brothers. These recipes could further have video instructions attached, performed by the celebrity chef him-/herself. A subscription could in this case be set up, e.g. to one of the mentioned chefs. This would add value to both the user and the GoodFood concept.


Conclusion & References


Conclusion

Several iterations, including user testing and evaluation, have created the basis for the final concept of GoodFood. In the development process good interaction design has been the key focus area, and concepts and ideas have continuously been evaluated against theory. GoodFood has been developed specially to fit the demands and criteria of the elderly, striving to map and constrain functions so as to have minimum uncertainty.

GoodFood generally touches on some interesting aspects, looking to help the elderly to a healthy and varied diet through motivation and inspiration. GoodFood stands out in a market with many similar alternatives by allowing personalized nutritional settings aimed at adjusting the diet to the user.

Economic validation through a thorough business plan could be an interesting aspect for the next iteration of the concept. However, it is concluded that it would be a realistic scenario that GoodFood, using known and available technology, would have a competitive advantage and theoretically would be able to position itself on the market. The process of developing GoodFood has been educational and has proven rather difficult. The final concept is still considered a prototype, but it holds interesting potential and is generally considered successful in providing user-friendly interaction design accessible to elderly people.


References

Andreassen, M.M. & Hein, L. (1985). Integreret Produktudvikling. 2. oplag. Instituttet for Produktudvikling, Sektionen for Konstruktionsteknik.

Benyon, D. (2010). Designing Interactive Systems: A Comprehensive Guide to HCI and Interaction Design. Second edition. Pearson.

Cross, N. (2000). Engineering Design Methods: Strategies for Product Design. 3rd edition. Wiley.

Gerven, P.W.M. van et al. (2002). “Cognitive load theory and aging: effects of worked examples on training efficiency.” Department of Psychology, Maastricht University, P.O. Box 616, 6200 MD Maastricht, The Netherlands.

Hippler, Rachelle Kristof et al. “More than Speed? An Empirical Study of Touchscreens and Body Awareness on an Object Manipulation Task.” Bowling Green State University, Bowling Green, OH 43403.

Kobayashi, Masatomo et al. “Elderly User Evaluation of Mobile Touchscreen Interactions.” Hongo, Bunkyo-ku, Tokyo, 113-8656, Japan.

Saffer, Dan (2008). Designing Gestural Interfaces. O’Reilly Media, Inc.

Siek, Katie A. “Fat Finger Worries: How Older and Younger Users Physically Interact with PDAs.” Indiana University, Bloomington, IN 47405, USA.

[Lecture week 6: Health Engineering]

[Lecture week 7: Final Project]


Appendix


User testThe user test was performed with focus on the user’s ability to navigate through the Good-Food application. The evaluation of the ap-plication is built upon the intuitiveness of use and at what level of ease the user navigates through the application for a given scenario.

It should be noticed that the user of the test has never used a touch-screen device. This choice of user was done to get the least expe-rienced user of the intended users to navigate the system, which would catch most of the breakdowns. The chosen user has previously expressed interest in learning the technology of touch-screen devices and to get to know the system, even though he has no related skills what so ever. The user had lacking Eng-lish skills which might have been a barrier at some points. The evaluator tried to compen-sate for this by translating the text.

It should be clear that the application tested was a prototype which has some differences compared to the final application. Examples of this are non-functioning buttons and cer-tain information, that the user is supposed to type in, which is already there. This might have confused the user at some points.

From the user test we became aware of differ-ent aspects which now have been changed in the final prototype. The test setupThe actual test of the system was conducted

on a regular computer display due to the ab-sence of a touch-pad. To compensate for the lack of a touch-screen, the user was asked to use the screen as a touch-screen. The evaluator then moved the cursor around the screen with a mouse according to the users tapping of the screen. As the user does not speak or read flu-ent English the language barrier is something that should be taken into account. To make up for the limited English skills the evaluator translated the text on the display that the user did not understand.The test scenario was as close to the actual use scenario as the test was carried out in the home of the user.To set off the test the application was started in advance. This was done due to the fact that the test was done on a prototype and the start-up does not show any type of information ex-cept the name of the application.Profile settingsThe following sections explains how the user conducted the profile settings. The welcome screen (screen 1)First step was to write the name of the pro-file owner. Since the user never used a touch device prior to the test he did not know how to get his name in the text box. The evalua-tor explained how an ordinary keyboard pops up when tapping the text box used to write the name in which seemed obvious to the user when confronted with the information. We assume that if the user has a tablet, the user would already know this information.



To continue setting up the profile, the user had to navigate to the next screen. This proved difficult at first. The user tried to tap the "1" button, but it did not react. He then tried the "2" button, as he realised he had to go there, but that button did not react either. After realising that the buttons did not work as he thought, he indicated that he did not know what to do. The evaluator explained how to use the arrow, and the user completed the welcome screen and continued to the next one. This made us aware of the need to communicate what happens when tapping forwards and backwards. The arrows were changed to an arrow with the text "back" and a square with a tick and the text "accept". We also decided that navigating the steps should only be done by tapping the arrows, not the numbers. This constraint makes sure that the user completes all steps when setting up the profile the first time (a small sketch of this navigation model follows after the screen walkthrough below).

Age settings (screen 2)
The prototype only worked when setting the year, not the day and month. The user quickly tapped the "day" square, which did not respond in the prototype. This showed that he immediately knew how to set the birth options and had become familiar with navigating a touchscreen device. He understood the function right away and quickly moved to the next screen by tapping the forward arrow.

Height and weight settings (screen 3)
As on the age screen, the user quickly understood how the drop-down menu worked and tapped it right away. He found the numbers corresponding to his weight and height and continued to the next screen using the arrow.

Food preferences (screen 4)
The user now knew the functions but did not know the meaning of vegetarian or vegan food. The evaluator explained the meaning, and the user tapped "none of the two" and continued to the next screen.
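The navigation constraint introduced above, moving only with the "back" and "accept" arrows while the step numbers are disabled during first-time setup, can be expressed as a small state model. The step names and the ProfileWizard class below are our own illustration, not part of the implemented prototype:

```typescript
// Minimal sketch of the first-time profile wizard: the user can only move
// with "back" and "accept", so every step is visited before the profile exists.
const PROFILE_STEPS = [
  "Welcome", "Age", "Height and weight", "Food preferences",
  "Food allergy", "Health", "Physical ability",
] as const;

class ProfileWizard {
  private index = 0;

  get currentStep(): string {
    return PROFILE_STEPS[this.index];
  }

  // "Accept" arrow: advance one step; returns true when the last step is confirmed.
  accept(): boolean {
    if (this.index < PROFILE_STEPS.length - 1) {
      this.index += 1;
      return false;
    }
    return true; // profile completed
  }

  // "Back" arrow: go one step back, never past the welcome screen.
  back(): void {
    this.index = Math.max(0, this.index - 1);
  }

  // Tapping a step number does nothing during first-time setup.
  tapStepNumber(_step: number): void {
    /* intentionally ignored */
  }
}

// Example: the user taps "accept" seven times to finish the profile.
const wizard = new ProfileWizard();
let done = false;
while (!done) {
  console.log("Showing:", wizard.currentStep);
  done = wizard.accept();
}
```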

This made us aware that not all users know the expressions "vegetarian" and "vegan". Instead of explaining the words, the "None of the two" button was changed to "I eat meat" and moved to be the first button.

Food allergy (screen 5)
No problems at all. The user answered "no" and continued to the next screen.

Health settings (screen 6)
The user was presented with different options regarding his health status and tapped several of them without hesitating. When finished, he navigated to the next screen.

Physical ability settings (screen 7) and completing the profile settings
This 7th screen had not been made for the test, but the evaluator explained its content and asked the user to continue the process by pretending he had already filled in the forms as he did on the previous screens. This proved difficult.

After finishing the settings, the user did not know where to tap and tried the "1" icon at the bottom of the screen. He then tapped the backward arrow in the lower left corner, which took him one step back to the "Health settings" screen. The user realised what had happened and tapped the forward button, returning to screen 7. It did not seem clear to him that he had to continue by pressing the forward button, so the evaluator explained that he had to tap the forward arrow to continue.

As we have changed the back and forward arrows, it should now be clear how to complete the settings.

Editing the profile settings
Later, the user was asked to change his settings following news from the doctor that a specific ingredient no longer agreed with him. The user had difficulties knowing what to do from the home menu: "Profile... [thinking for a while], no, I tap the info button." After entering the profile he tried to tap the text about allergy, but nothing happened. As the user was not familiar with English, the evaluator read the different buttons aloud. The Edit button was tapped, and the user arrived at the start of the profile settings. Initially he had difficulties getting to the right screen, as all the screens before Allergy had to be gone through. After tapping forward, back and forward again, he quickly ran through all the screens until the allergy screen, changed the setting and finished the profile settings.

After this, we made it possible to see all the profile settings in the profile overview. Tapping Edit shows each setting as a button, and tapping a specific button jumps straight to that setting, e.g. Allergy. In this way the user does not need to go through all the screens to change one setting.

Navigating the system from the Home screen
Entering the Home screen, the user did not intuitively know what the different screens meant. The evaluator started explaining and translating the different functions of the screen, trying to give the same information as if the user had tapped the info button.
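Jumping straight from the profile overview to a single setting, as described above for the Allergy change, could be modelled as a simple lookup into the ordered list of profile steps. This is only a sketch with assumed step names, not the code of the prototype:

```typescript
// Sketch: in edit mode the profile overview shows one button per setting,
// and tapping it opens that step directly instead of replaying the whole wizard.
type StepName =
  | "Welcome" | "Age" | "Height and weight" | "Food preferences"
  | "Food allergy" | "Health" | "Physical ability";

const STEP_ORDER: StepName[] = [
  "Welcome", "Age", "Height and weight", "Food preferences",
  "Food allergy", "Health", "Physical ability",
];

function openSettingForEdit(setting: StepName): number {
  const index = STEP_ORDER.indexOf(setting);
  console.log(`Opening step ${index + 1} (${setting}) directly for editing`);
  return index;
}

// Example from the test: the doctor's news means the allergy setting must change.
openSettingForEdit("Food allergy"); // jumps straight to screen 5
```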

The user was interested in the first button, the one to the left, even though he did not know what to use it for. Afterwards we discussed the order of the buttons: should they follow the order of first-time use, (Profile), Meal, Shop, Cook, (Profile), or the order of how often they are used? We agreed on ordering them by how often they are used. After the first use, it makes sense to place the most used buttons first, to the left, following the reading direction.

As the user did not know that the info button would give information about the specific screen, it was changed to a Help button.

We agreed on having some kind of introduction video in the final application. The video should introduce the user to the application's main functions and show how the general navigation is done. This would also make sure that the user is informed about the application's different functions and benefits from all of them.

Cook
The user tapped the Cook button on the Home screen, which redirected him to the Cook screen. On the Cook screen he read the ingredient list and stopped. After a while he tapped the "info" button, and the evaluator explained what the screen could do. After that the user tapped the forward arrow and the first instruction was shown on the screen. He quickly ran through all the instruction screens, translated by the evaluator, imagining that he was cooking. Having finished the instructions, he stopped and wondered what to do. The evaluator asked him what he thought he should do; the user considered pressing the backward arrow but decided to tap the info button. He was then informed that this was the end of the cooking guide. He did not know how to act, and the evaluator told him he could continue by tapping the "home button" in the lower right corner of the screen.

As the meals were put together to give a varied diet, the user said he would cook the whole meal even if there was a part he knew he would not like: "I would eat a little bit of it and try to taste it!"

From the user test it became clear to us that the next button should communicate more clearly that the cooking guide contains more than one step. This was done partly by changing the button and partly by showing the number of steps and the current step. As it did not seem obvious to the user how to finish the cooking guide, it should also be clarified that the guide is over and how to get back to Home; a small sketch of such a step indicator is shown after the Menu section below.

Shop
The user understood the different options for getting the list onto another device after tapping the info button. Afterwards he understood the different possibilities by himself. He found the ability to get the shopping list sent as a text message to his mobile phone the smartest option. Back on the Home screen, the user quickly read the text on the buttons. He wanted to make another dish, so he tapped the Shop button. From that screen he realised that he could not choose a new meal, so he went back to Home and tapped the Meal button. After this session the "Shop" button on the Home screen was changed to "Shop List".

Menu
On the Menu screen two buttons appeared: Create new menu and My menu. As the user was testing a prototype, My menu was already filled in, instead of being blank as it would be in the real application if no menu had been planned yet. The user correctly tapped Create new menu, as he had not generated a menu yet. He accepted the menu without realising that it could be changed. The reason for this could partly be that he was pleased with the chosen meals, and partly that it was not communicated well enough that the menu could be edited. He found the 4-day period suitable, considering the size of the refrigerator and the ingredients' best-before dates.
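As noted above, the cooking guide was changed to always show how far the user has come and to end with an explicit way back to Home. A minimal sketch of such a step indicator is given below; the CookScreen structure, the buildCookGuide function and the instruction texts are invented for the illustration:

```typescript
// Sketch of the cooking guide: every screen shows its position in the sequence,
// and the last screen replaces "next" with an explicit way back to Home.
interface CookScreen {
  label: string;     // e.g. "Step 2 of 3"
  text: string;      // instruction shown to the user
  nextAction: "next" | "finish and go to Home";
}

function buildCookGuide(instructions: string[]): CookScreen[] {
  return instructions.map((text, i) => ({
    label: `Step ${i + 1} of ${instructions.length}`,
    text,
    nextAction: i === instructions.length - 1 ? "finish and go to Home" : "next",
  }));
}

// Example with placeholder instructions.
const guide = buildCookGuide([
  "Peel and chop the potatoes",
  "Boil the potatoes for 20 minutes",
  "Fry the fish on medium heat",
]);
guide.forEach(s => console.log(`${s.label}: ${s.text} [${s.nextAction}]`));
```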

The user did not realise that he could press the Cook button from the My menu screen. He tried to tap the specific meal to see the recipe, and as nothing happened he went to the Home screen and tapped the Cook button there instead. It could be made possible to tap a specific meal to check the recipe for any of the chosen meals in the menu, even if it is not the right day. This has not been implemented in the prototype yet, but you can tap Cook directly from the My menu screen.

Editing the meals in the menu
The evaluator explained to the user that the meals in the menu could be changed. The user and the evaluator discussed whether it should be possible to change the ingredients in a menu. The user was a bit concerned that alternative ingredients would spoil the whole of a meal; if he did not like a specific ingredient, such as garlic, he would just skip it himself when following the recipe.

Later the user proposed being able to change a whole meal or change the order of the different meals. This confirmed that it should be possible to do this, which we had earlier just assumed the user would like. This feature has not been implemented in the prototype yet.

The user was aware of getting a varied diet and himself proposed having a day with fish instead of all meat.

The remote control
In the beginning the user found it hard to use the remote control instead of the touchscreen. The arrows did not mean back and next but moved the cursor to the left or right. Once he understood this, the user managed the navigation well and pressed OK when he wanted to activate the button under the cursor. At one point the user moved the cursor too far to the right and pressed the Home button, leaving the cooking instructions unintentionally. This breakdown should be considered, and it should be clear to the user where the cursor is when using the remote control.
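The remote-control behaviour observed here, arrows moving a visible cursor between the on-screen buttons and OK activating the highlighted one, can be sketched roughly as below. The button list and the RemoteFocus class are assumptions made for the illustration, not the actual implementation:

```typescript
// Sketch of remote-control navigation: left/right move a highlighted cursor
// across the buttons of the current screen, and OK activates the highlighted one.
// Keeping the highlight clearly visible is what should prevent the breakdown
// where the user drifts onto "Home" and leaves the cooking guide by accident.
class RemoteFocus {
  private focus = 0;

  constructor(private buttons: string[]) {}

  left(): void {
    this.focus = Math.max(0, this.focus - 1);
  }

  right(): void {
    this.focus = Math.min(this.buttons.length - 1, this.focus + 1);
  }

  // OK press: return the action the screen should perform.
  ok(): string {
    return this.buttons[this.focus];
  }

  highlighted(): string {
    return this.buttons[this.focus];
  }
}

// Example: the cooking screen might expose these buttons (assumed order).
const remote = new RemoteFocus(["Back", "Help", "Next", "Home"]);
remote.right();
remote.right();
console.log(remote.highlighted()); // "Next" - pressing OK now advances the guide
```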

The user found the remote control good, but he had difficulties remembering to use it all the way through and several times tapped the touchscreen 'with sticky fingers'.

The icons
The user found the edit, info and accept icons good and mentioned that it was good that there was consistency in which icons and choices were available on the different screens. He found the icons on the Home screen good and thought he decoded them well. He did not decode the calendar, meal and ingredient icons on the Menu screens; instead he read the text and decoded the system that way.

The language was from time to time a barrier. For example, the user misunderstood the meaning of the Edit button, which caused difficulties on the different screens when he should edit a setting. He thought that Edit meant Finish or Stop.

He liked that the icons were often supplemented with text; then he would not be in doubt about the meaning of a button, he said.

The user thought the size of the buttons was good and that they could even be smaller. If he were to tap a button, it would be good if it had the same size as the keys on his laptop's keyboard.

Finishing a sequence
The user found it difficult to see when he had finished all the steps of a sequence and should go back to Home. By changing the text and icons at the bottom of the screen, we hope to help the user realise his possibilities on the last screen of a sequence.

Conclusion
The user test gave us a lot of feedback on the program and made us realise less intuitive mappings that the elderly user had difficulties understanding. The language was from time to time a barrier; for example, the user misunderstood the meaning of the Edit button, which caused difficulties on the different screens when he should edit a setting. The user found the four main functions, Cook, Menu, Profile and Shop, well chosen, and they covered all his requests for the application: "These functions are what I need!"

Preparation for the user test
Questions to be answered during the test:
● Is the user familiar with touchscreens/devices?
● Navigating the system with the remote control
● Recovery during use
● Changing personal settings (applying an intolerance to a food source)
● The user's understanding of buttons and icons
● Is a 7-day period appropriate?
● What about the personal settings - are they understandable, too long/short? Is it okay that the user MUST complete all questions?
● How does the user navigate between different pages? Can they go back to the last page?
● Can they go in and change the user settings?
● Do the icons make sense?
● Are the buttons too small or too big?
● Can they make a week menu without problems? Is it obvious that they can change a dish to another?
● Is the remote control easy to use?
● Would they prefer the stand remote control, so the tablet does not lie on the table, or do they prefer the other solution?
● Would they prefer to shop only once a week or more often?
● How long before dinner have they planned what to make, and do they plan for more than one day?


Appendix 2 - User interviews

User couple 1:
Profile: a couple, Tove (78 years) and Ove (82 years), were asked about their cooking.
It is mostly Tove who does the cooking. Her husband helps with things like stirring the sauce or taking the food out of the oven. They both shop - sometimes separately, but mostly together.
Ove takes anticoagulant medication, which means he should not eat food with a high content of vitamin K (because it reduces the medication's effect). His wife finds this annoying. They used to grow a lot of spinach in the kitchen garden, but now Ove cannot eat it. There are many green vegetables he cannot eat, and that limits the cooking. Many of the recipes she usually cooks contain vegetables with a high content of vitamin K. She therefore thinks it would be a good idea if she could get inspiration for other dishes than the ones she usually makes, to replace those she can no longer make. When they buy things (not just food), they are very aware of the price; they like to buy food cheaply, and she hates throwing away food that has gotten too old.
Tove does not believe that she could use a tablet, but Ove thinks it would be easy to learn. They can both use a computer, but Ove is better at it than Tove.

User 2

Profile: a woman, Anni, 59 years old. Anni will be part of the intended user group within some years. She does not have a tablet, "but I would probably have one very soon".
• Makes a week menu
o It is cheaper
o I don't have to think about it during the week
o I only have to shop once (at least one big shopping trip, and then possibly smaller ones for milk and so on)
• I use the meal proposals on the back page of Søndagsavisen [a free Sunday newspaper]
o The ingredients are shared among the different meals - this makes it cheaper
• I love to have alternatives for the dishes - to try new food and get inspired, not always having the same 5 different salads
• I would love to get information about the nutrition of the food. This is knowledge I do not have myself...
• A week menu should contain fish twice a week
o It should be different types of fish
• It would be nice also to get proposals for lunch. Right now I am at work at lunch, but in a couple of years I will be at home and would like to have exciting food at lunch as well
• I would like to have the shopping list on my phone. I would not bring my iPad - it is too impractical. Printing is a waste of paper...
• It should be an application - not a tablet bundled with the programme. If the user would like to use the application on a tablet, he would most likely have a tablet already... If the concept were a tablet, it would be annoying that it could only be used for this application.
o The application should be for the kind of elderly who have the energy to have a tablet
• The menus should be customised
o To fit my health conditions
o Not eating the same meal as my neighbour every day!!
• During cooking I am not worried about the tablet becoming dirty/sticky. A remote control could be ok, but it does not have to be there!
• About payment, I think I would like a monthly subscription - I would pay 20-25 DKK a month. Or I could just buy it once; then I would pay a maximum of 450 DKK, as for a cookbook. But I would not know whether I would be fond of the application, so to start with I would prefer the subscription. Then there could be a lunch add-on at a price of 5 DKK...
• I would not pay more if I had a special disease/condition. It should be the same price for all.
• It should be specially made for elderly people - so the meals give the right nutrition.
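Several of the wishes collected in these interviews (no restricted ingredients, such as the vitamin-K-rich vegetables mentioned by user couple 1; fish on a fixed number of days; a plan covering a few days at a time) translate fairly directly into a filtering step when a week menu is generated. The sketch below only illustrates that idea; the recipes, field names and the planWeek function are invented, and the ingredient sharing between meals mentioned by Anni is left out:

```typescript
// Sketch: pick a week menu that respects the restrictions gathered in the
// interviews. Recipes and restriction lists here are invented examples.
interface Recipe {
  name: string;
  ingredients: string[];
  containsFish: boolean;
}

interface Profile {
  restrictedIngredients: string[]; // e.g. vitamin-K-rich vegetables, allergens
  fishDaysPerWeek: number;
}

function planWeek(recipes: Recipe[], profile: Profile, days: number): Recipe[] {
  const allowed = recipes.filter(r =>
    r.ingredients.every(i => !profile.restrictedIngredients.includes(i)));

  // Take the requested number of fish meals first, then fill up with the rest.
  const fish = allowed.filter(r => r.containsFish).slice(0, profile.fishDaysPerWeek);
  const other = allowed.filter(r => !r.containsFish);
  return fish.concat(other).slice(0, days);
}

// Example with invented recipes.
const menu = planWeek(
  [
    { name: "Baked salmon", ingredients: ["salmon", "potatoes"], containsFish: true },
    { name: "Spinach pie", ingredients: ["spinach", "eggs"], containsFish: false },
    { name: "Meatballs", ingredients: ["pork", "potatoes"], containsFish: false },
  ],
  { restrictedIngredients: ["spinach"], fishDaysPerWeek: 2 },
  4,
);
console.log(menu.map(m => m.name)); // [ "Baked salmon", "Meatballs" ]
```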


Appendix 3 - Sketches

To decide what possibilities GoodFood should offer the user, the group made various suggestions that were merged into one solution. Some of these suggestions are shown below:

[Sketches of the suggested solutions]


Appendix 4 - Conceptualization

Through discussion, the group found several parameters that may be relevant to a concept that solves the problem statement. Based on these parameters, a morphology chart was constructed [Cross]. The parameters are stated in the left column. This was followed by a brainstorming session where ideas for each parameter were entered in the chart along the x-direction. The number of ideas was deliberately limited in order to avoid an immense chart leading to a vast number of concepts.

Some of the parameters in the chart were chosen due to the scope of the course 42072 Design for Interaction. The parameter "how to operate", for instance, was chosen because the device is to be interactive with the user.


Based on a combination of elements from the morphology chart, GoodFood was decided on as the final concept. The combination of solutions forming the basis for GoodFood was found superior on several criteria, one being customization, in which GoodFood allows customization according to the health and limitations of the user, and another being hygienic standard.

Morphology chart (parameters in the left column, ideas 1-3 for each parameter):

Varied diet
1. A week plan maker that suggests a week plan according to the user's needs
2. A cook club where elderly people join a virtual community with whom they cook in company
3. A device on the refrigerator that keeps an eye on which groceries are put in; if there is not enough variety, it complains and posts recipes

Help to cook
1. Press a button and a film appears
2. Call a phone number
3. Press a button that links to a webpage with cooking guidelines

Hygienic standard
1. Cover the product with a thin layer of film
2. Make the product washable
3. A washable remote control

How to operate
1. Physical buttons linked to the screen
2. Touchscreen
3. Voice

Physical components/layout
1. A screen mounted on the fridge/in the kitchen
2. A mobile device, with a screen, to be carried around
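A morphological chart like this can also be treated as data: a concept is one choice per parameter, and the full concept space is the cross product of the option columns. The short sketch below only illustrates the method on the chart above, with abbreviated option texts; the chart object and the combinations helper are not part of the project work:

```typescript
// Sketch: enumerate concept candidates from the morphology chart by taking
// one option per parameter (the cross product of all rows).
const chart: Record<string, string[]> = {
  "Varied diet": ["Week plan maker", "Cook club", "Refrigerator device"],
  "Help to cook": ["Button shows a film", "Call a phone number", "Link to a webpage"],
  "Hygienic standard": ["Thin film cover", "Washable product", "Washable remote control"],
  "How to operate": ["Physical buttons", "Touchscreen", "Voice"],
  "Physical components/layout": ["Screen in the kitchen", "Mobile device"],
};

function combinations(rows: string[][]): string[][] {
  return rows.reduce<string[][]>(
    (acc, options) => acc.flatMap(prefix => options.map(o => [...prefix, o])),
    [[]],
  );
}

const all = combinations(Object.values(chart));
console.log(all.length); // 3 * 3 * 3 * 3 * 2 = 162 possible concepts

// GoodFood was chosen as one such combination: week plan maker, washable
// remote control, touchscreen and mobile device.
```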


Appendix 5 - Division of work

All group members have contributed equally to the project, and the report/prototyping work has been divided equally. The main responsibilities of the group members have been:

All members have read the report, rewritten and edited parts.

Introduction - written by Meiken and Kamilla
Process - text written by Maja, illustration made by Martin
Understanding the problem - written by Michael
Market research - written by Michael
Elderly people and their limitations - written by Meiken
Design specification for a new design - made by all the group members
Problem statement - made by all the group members
PACT framework - written by Maja
Data flow chart - written and illustrated by Maja
Intended use of GoodFood (seven steps of action) - written by Maja and edited by Kamilla
Purchasing GoodFood - written by Meiken
Ergonomic considerations/user considerations - written by Meiken
User test - written by Kamilla
Description of the final application - written by Martin and Christopher
Various features - written by Meiken
Further work - written by Meiken

Mock-ups were made by:
Remote control (physical) - Meiken
Remote control (CAD) - Michael
GoodFood application - Martin and Christopher

User test – performed and videotaped by Michael

Report layout – Christopher and Martin