
POLITECNICO DI MILANO
Corso di Laurea Magistrale in Ingegneria Informatica
Dipartimento di Elettronica e Informazione

CASTLE: a video game for lateral thinking training

Supervisor: Prof. Pier Luca Lanzi

Co-supervisor: Prof. Manuela Eliane Anna Cantoia

Thesis by:

Davide Jones, student ID 771517

Roberto Casanovi, student ID 771465

Academic Year 2011-2012

Always, to Martina

Acknowledgements

We would like to thank Professor Pier Luca Lanzi for the help and availability shown during the writing of this thesis. We would also like to thank Professor Manuela Cantoia and Alessandro for the help and assistance provided over these months. A special thanks goes to Fava and Lucio for all the wonderful time spent together over the years. We thank Efrem for all the help and advice provided during the development of the game.

Contents

1 Introduction
  1.1 Lateral thinking
  1.2 CASTLE
  1.3 Data mining
  1.4 Summary

2 Lateral thinking
  2.1 Practicing lateral thinking
  2.2 Summary

3 CASTLE
  3.1 Game structure
  3.2 Puzzles
  3.3 Lateral thinking in CASTLE
  3.4 Tracing
  3.5 Summary

4 Tools and implementation
  4.1 Cocos2d-x
    4.1.1 Game structure in Cocos2d-x
    4.1.2 Memory management
    4.1.3 Actions and Callbacks
  4.2 PugiXML
    4.2.1 XML parsing
  4.3 Procedural map generator
  4.4 Player movement
  4.5 Summary

5 Data analysis
  5.1 Methods
    5.1.1 Participants
    5.1.2 Materials
    5.1.3 The questionnaire
    5.1.4 Procedures
    5.1.5 Data analysis
    5.1.6 Performance
  5.2 Results
  5.3 Summary

6 Conclusions and further work

Bibliography

List of Figures

2.1 The nine dots problem (a) and its solution (b)

3.1 Menu
3.2 World map
3.3 A room
3.4 A room with no lights
3.5 Game menu
3.6 Inventory
3.7 Castle map
3.8 Riddle example
3.9 Association example
3.10 Sequence example
3.11 Matchsticks example
3.12 Solution of the matchsticks example
3.13 Details example
3.14 Ambiguous image example
3.15 Find-it example

4.1 Scenes flow example
4.2 Example of an XML tree structure in PugiXML
4.3 Player movement with standard A*
4.4 Player movement with our implementation of A*

5.1 Map C
5.2 Map A

List of Tables

5.1 Test subjects distribution


Abstract

Video games have emerged as the most frequently used interactive media among children and as an important tool for brain training and education [1]. In collaboration with the Università Cattolica del Sacro Cuore, we developed a video game for lateral thinking training. We had 72 students play the game on different platforms. Through data collection and analysis, we investigated how subjects with different backgrounds react to the problems proposed in the game and whether the platform affects the player's cognitive performance. Our game appears to be a useful tool to study various aspects and correlations of the cognitive skills related to lateral thinking. Findings indicated that the digital versions of the game stimulate the lateral thinking skills of subjects more than the non-digital version. Findings also suggested that gamers perform better on lateral thinking problems. Our video game is currently used at the Università Cattolica del Sacro Cuore to investigate the use of lateral thinking in middle and high-school students.


Sommario

Video games have become one of the media most frequently used by children and an important tool for learning and for the development of cognitive skills [1]. In collaboration with the Università Cattolica del Sacro Cuore, we developed a video game for improving lateral thinking. The game was played by 72 students on different platforms. Through data collection and analysis, we investigated how subjects from different backgrounds react to the problems proposed in the game and whether the platform used influences the subjects' cognitive performance. The results suggest that our video game is a useful tool to study various aspects of, and correlations among, the cognitive skills related to lateral thinking. The results suggest that the digital versions of the game stimulate the subjects' lateral thinking skills more than non-digital versions. There is also evidence that habitual gamers obtain better results on lateral thinking problems. Our video game is currently used at the Università Cattolica del Sacro Cuore for research on lateral thinking in middle and high schools.


Chapter 1

Introduction

Educators increasingly recognize the impact of entertainment software and use games as a teaching device in a growing number of classrooms and business settings. In doing so, they are embracing the cultural and technological shifts of the 21st century and expanding the use of a favorite leisure activity, computer and video games, into a critical and still-emerging educational resource. More than just play, entertainment software helps impart knowledge, develop life skills and reinforce positive habits in students of all ages.

In addition to being a great way to keep students engaged, researchers have found that video games have real potential as next-generation learning tools. Games use new technologies to incorporate principles that are crucial to human cognitive learning. University of Wisconsin education professor Dr. James Paul Gee concluded that video games intermix instruction and demonstration, a more effective learning technique than the style currently found in most classrooms [2]. According to a recent study conducted by researchers at the University of Michigan [3], puzzle games that exercise children's working memories can enhance their abstract reasoning and problem-solving skills, which can have a direct impact on future educational and occupational success. In addition, a study conducted by scientists at the University of Rochester found that video games can improve players' vision, attention and certain cognitive skills [4]. Study participants also performed better than non-gamers on certain tests of speed, accuracy and multitasking.

1.1 Lateral thinking

Among cognitive skills, lateral thinking is the ability to solve problems through an indirect and creative approach. Lateral thinking thus distances itself from "vertical" logic (the classic method for problem solving: working out the solution step by step from the given data) and from "horizontal" imagination (having many ideas but being unconcerned with their detailed implementation). Traditional thinking is concerned with analysis, judgment and argument. In a stable world this is sufficient, because it is enough to identify standard situations and to apply standard solutions. This may not apply in a changing world, where the standard solutions may not work [5].

The term 'lateral thinking' was coined in 1967 by Edward de Bono, who stated that lateral thinking can be taught and learnt through exercise. The term is now part of our society and its tools are used by individuals and organisations around the world [5]: lateral thinking has been adopted in many fields, from business [6] to information technology [7].

1.2 CASTLE

CASTLE is a 2D multi-platform, dungeon-crawling video game we developed in collaboration with the Università Cattolica del Sacro Cuore di Milano. CASTLE was created to support research on lateral thinking in schools. To the best of our knowledge, there is no publicly available video game specifically designed to study lateral thinking. CASTLE was developed on the basis of the expertise of the research unit of the Università Cattolica del Sacro Cuore. The application was validated by having 72 students play the game on different platforms (personal computer, tablet and a paper version of the game). We used data mining techniques to analyse players' behaviour with respect to cognitive skills and lateral thinking. In our research, we investigated whether a subject who frequently plays video games performs better at lateral thinking problems and how the game platform (e.g. personal computer, tablet) affects the cognitive performance of the subjects.

1.3 Data mining

Data mining is the process of analyzing data from different perspectives and summarizing it into useful information. It is widely used in computer science to analyse the behaviour of information systems' users, and it is often used to analyse players' behaviour in video games. For example, data mining was used to analyse huge amounts of data to predict players' strategies in Starcraft [8]. In Tomb Raider: Underworld, data mining was used to predict when a player would stop playing the game [9]. Thurau and Bauckhage used data mining to retrieve relevant information about social interactions in Massively Multiplayer Online Role-Playing Games [10].

1.4 Summary

This thesis is structured as follows. In the next chapter we give an overview of lateral thinking and video games. In Chapter 3 we describe CASTLE. In Chapter 4 we describe the tools and the implementation of CASTLE. In Chapters 5 and 6 we describe how we collected and analysed the data about player behaviour and comment on the results.

Chapter 2

Lateral thinking

Lateral thinking is defined as the solving of problems through an indirect and creative approach, using reasoning that is not immediately obvious and involving ideas that may not be obtained by traditional logic. The term was coined in 1967 by Edward de Bono [11], who also argued that lateral thinking can be taught and improved. Most of the games we are accustomed to are based on traditional logic; for example, Sudoku and Nim require nothing but mathematical logic [12][13]. In this chapter, we present examples of both classical games and video games involving lateral thinking.

2.1 Practicing lateral thinking

Lateral thinking puzzles [14] are multiplayer games in which a host describes an odd situation and the other players are asked to come up with an explanation for what happened. Players are stimulated to overcome preconceptions to solve the puzzle. A famous example of a lateral thinking puzzle is: "A man rode into town on Monday. He stayed for three nights and then left on Monday. How come?". The described situation is impossible only if we think that Monday can only be the name of a day of the week; the solution is that Monday is the name of the man's horse.

In the nine-dots puzzle a three-by-three matrix of dots is given and the player has to connect all the dots using four straight lines without lifting the pen. The solution requires the player to draw the lines outside the boundaries of the square area defined by the nine dots (Figure 2.1).


Figure 2.1: The nine dots problem (a) and its solution (b)

The nine-dots puzzle popularized the phrase "thinking outside the box". It first appeared in Sam Loyd's 1914 Cyclopedia of Puzzles [15], where Loyd's original formulation was titled "Christopher Columbus's egg puzzle".

Video games often involve aspects related to lateral thinking and have become one of the most popular leisure-time activities for children and young adults. The computer game industry is a billion-dollar business and its products have become a major part of today's media [16]. Several studies show that video games can improve various skills, such as motor, spatial and cognitive skills. Games like Electronic Arts's Medal of Honor or Activision's Call of Duty can improve visual acuity [17]; Tetris stimulates players at a cognitive level, engaging our visual system in low-level pattern recognition [18]. The advantage video games have with respect to traditional games is that they constantly give the player the prospect of being gratified for problem solving. Video games follow the principle of "low cost of failure and high reward success" [2], so that failure becomes as motivating as reward, because every failure intrinsically teaches something useful for achieving success.

There are several examples of video games involving aspects related to lateral thinking. Some games, like the horror puzzle Anna (www.dreampainterssoft.com/wp) or the Monkey Island series (www.lucasarts.com/games/monkeyisland), require the player to interact with the environment, finding correlations between items to solve puzzles and advance; sometimes the solutions to the puzzles involve non-traditional logic.

Other games give the player the possibility to solve puzzles in different ways using a creative approach. Examples of these games are Crayon Physics Deluxe and Portal. In Crayon Physics Deluxe (www.crayonphysics.com) the player has to guide a ball from a starting point, collecting every star placed on the level. The player cannot control the ball directly, but rather must influence the ball's movement by drawing physical objects on the screen. Portal, developed by Valve Corporation in 2007 (www.valvesoftware.com/games/portal), is the combination of a first-person shooter with a puzzle game. The game comprises a series of puzzles that must be solved by teleporting the player's character and simple objects using the "Aperture Science Handheld Portal Device", a device that can create inter-spatial portals between two flat planes. In 2011 Valve Corporation published a sequel (www.valvesoftware.com/games/portal2). Similar to Crayon Physics Deluxe, Fantastic Contraption (www.fantasticcontraption.com) requires the player to build a contraption and to use it to move an object to a specific location. The game allows almost infinite ways to solve each level.

To our knowledge, there are no video games aimed specifically at studying lateral thinking. A study on Nintendo's brain-fitness video game Brain Age showed that the effects of the brain training game transferred to executive functions and to processing speed, but there was no transfer effect on global cognitive status or attention [19]. On the other hand, a study supported by the National Science Foundation's Human and Social Dynamics program, conducted on a sample of 490 parents and their 490 middle-school children, suggests that children who play video games more are more creative than children who play video games less, while parent behavior on the dimensions of warmth and control is unrelated to their child's creativity [20].

In a study conducted by the Department of Psychology of California State University, riddles and video games were used to develop lateral thinking. Preliminary results from their analysis provide promising evidence that these methods are effective in enhancing creative and critical thought in college students [21]. Another study shows that emotion generated by video games significantly affects creativity through the interaction of arousal and valence [22].

CASTLE is a video game created to support research on lateral thinking in schools. It was developed on the basis of the expertise of the research unit of the Università Cattolica del Sacro Cuore. The application was validated by having 72 students play the game on different platforms (personal computer, tablet and a paper version of the game). We used data mining techniques to analyse players' behaviour with respect to cognitive skills and lateral thinking. In our research, we investigated whether a subject who frequently plays video games performs better at lateral thinking problems and how the game platform (e.g. personal computer, tablet) affects the cognitive performance of the subjects.

2.2 Summary

In this chapter, we introduced the term 'lateral thinking' and discussed how games and video games are related to it.


In the next chapter we describe our video game CASTLE.

Chapter 3

CASTLE

Creative Activities Strengthening Thoughtful Learning Experiences (CASTLE) is a 2D video game we developed for the training of lateral thinking capabilities. The player, in the role of a student, is called to free a realm from an evil wizard who has cast a curse that blocks the inhabitants' creativity. The player has to explore the realm's castles, solving puzzles to regain the creativity needed to defeat the evil wizard. On the journey through the CASTLE world, the player encounters puzzles designed to train lateral thinking in different ways.

3.1 Game structure

When the game is launched, it shows a menu from which it is possible to start a new game, load a saved game or open the options screen (Figure 3.1).


Figure 3.1: Menu

At the beginning of a new game the player has to choose a name and sex for the character. Then the world map opens up and the player can go to one of the four castles available (Figure 3.2).

Figure 3.2: World map

Three of the castles have twenty-five rooms; the fourth has five rooms.


Each room is viewed from a top-down perspective. When the player clicks a position on the screen, the character moves to it. Rooms are connected through doors. Some doors are closed, preventing access to a room unless the player finds the key to open it by solving a puzzle. A room can contain various items, such as tables, paintings and torches, and non-player characters, such as knights and ladies (Figure 3.3).

Figure 3.3: A room

Some rooms have no source of light and the player can only see in a small area surrounding the character (Figure 3.4).


Figure 3.4: A room with no lights

To access puzzles the player has to select items or non-player characters on screen. By solving a puzzle, the player receives a certain amount of creativity points and may receive an item, which is stored in the inventory. To open the inventory the player has to access the game menu by pressing the buttons located in the corners of the screen (Figure 3.5).


Figure 3.5: Game menu

Figure 3.6: Inventory

The menu also shows the map of the visited rooms and reports information about positions, connections, light sources and closed doors (Figure 3.7).


Figure 3.7: Castle map

3.2 Puzzles

In CASTLE there are three puzzle types: verbal, spatial and visual. The player receives a certain amount of creativity points by solving a puzzle. In some puzzles, the player can ask for hints; if hints are used, the player gains fewer creativity points for completing the puzzle. It is possible to check the current creativity level by opening the game menu. In addition to creativity points, puzzles may reward the player with an item. Items work as keys to open the castle's doors.

Verbal puzzles

In CASTLE there are three types of verbal puzzles: riddles, associations and sequences.

Riddles are statements or questions with a double or veiled meaning, requiring thought to answer or understand. The player can ask for one to three hints; the longer the solution to the riddle, the more hints the player can ask for. Each hint reveals a letter of the solution. If the player fails three times, the puzzle is no longer accessible.

An example of a riddle is: "Al suo passaggio tutti si tolgono il cappello. Ha i denti ma non morde." ("When it passes, everyone takes off their hat. It has teeth but does not bite.") The correct answer is "Pettine" (comb) (Figure 3.8).

Figure 3.8: Riddle example

Associations are puzzles in which three words are given and the player is challenged to find a fourth one that connects them all. The player can ask for one to three hints and has three tries, as in riddles.

An example of an association is: "Cosa lega queste tre parole? Aria - Scrivere - Continuo" ("What links these three words? Air - Write - Continuous"). The correct answer is "Getto" (jet) (Figure 3.9).


Figure 3.9: Association example

Sequences are puzzles in which the player has to find the correct answer from a short definition. If the answer given by the player is wrong, another part of the definition is shown and the creativity gained in case of success decreases. The player has five tries.

An example of a sequence definition is: "E' giallo" ("It is yellow"). If the player gives a wrong answer, the next definition appears: "Vive in Africa" ("It lives in Africa"). The correct answer is "Ghepardo" (cheetah) (Figure 3.10).


Figure 3.10: Sequence example

Spatial puzzles

In CASTLE there is one type of spatial puzzle: matchsticks.

In matchsticks puzzles the player receives a number of matchsticks arranged as squares, rectangles or triangles. The player has to rearrange the matchsticks to form a target figure. There are no hints and the player has an unlimited number of tries. The player can also reset the position of all the matchsticks by pressing the "Reset" button in the lower right corner.

An example of a matchsticks puzzle is "Sposta tre bastoni per ottenere tre quadrati congruenti" ("Move three matchsticks to obtain three congruent squares"), shown in Figure 3.11. The solution is shown in Figure 3.12.


Figure 3.11: Matchsticks example

Figure 3.12: Solution of the matchsticks example

Visual puzzles

In CASTLE there are three types of visual puzzles: details, ambiguous images and find-it.

Details are puzzles in which a detail of a picture is shown and the player has to understand which object it represents. If the player fails, another detail appears and the creativity gained in case of success decreases. If the player fails three times, the puzzle is no longer accessible. It is also possible to ask for the next detail without trying to give a solution.

An example of a details puzzle is shown in Figure 3.13. The correct answer is "Anello" (ring).

Figure 3.13: Details example

Ambiguous images are images that exploit graphical similarities and other properties of visual system interpretation between two or more distinct image forms to provide multiple, although stable, perceptions. Classic examples are the rabbit/duck illusion and the Rubin vase [23]. Given an ambiguous image, the player has to identify the two forms. There are no hints and the player has an unlimited number of tries.

An example of an ambiguous image is shown in Figure 3.14. The correct answers are "Anatra" (duck) and "Coniglio" (rabbit).

Figure 3.14: Ambiguous image example

In find-it puzzles the player has to find every occurrence of a specific object within an image. The images are full of different objects in order to create confusion. There are no hints and the player has an unlimited number of tries, but after a certain number of wrong tries the player is forced to exit the puzzle and start over.

An example of a find-it puzzle is shown in Figure 3.15. In this example the player has to find every heart in the figure.


Figure 3.15: Find-it example

3.3 Lateral thinking in CASTLE

CASTLE's puzzles are developed to offer the player the possibility to train and challenge lateral thinking.

In riddles and sequences the player is asked to answer a question given a definition. However, the definition is often vague and confusing: the words used in definitions often have a double meaning and the player has to analyze the sentence in multiple ways. An example is: "What falls, but does not break, and what breaks but does not fall?". This puzzle, whose answer is "night and day" (night falls and day breaks), is based on the double meaning of the words "fall" and "break". Similarly, in associations the solution word has completely different meanings when referred to each of the three given words.

Matchsticks puzzles are designed to challenge spatial visualization, the ability to mentally manipulate two- and three-dimensional figures. They are the digital version of a classic spatial-visualization game. Moreover, research at the University of Toronto has found that differences between men and women on some tasks that require spatial skills are largely eliminated after both play a video game for only a few hours [24]. According to a 2009 study in Current Directions in Psychological Science, playing video games enhances performance on mental rotation skills, visual and spatial memory, and tasks requiring divided attention [25].

Ambiguous images are well-known tools used in psychology experiments in which the player has to recognize two items drawn in an ambiguous figure. This process trains middle-level vision, the stage in visual processing that combines all the basic features in a scene into distinct, recognizable object groups. This stage of vision comes before high-level vision (understanding the scene) and after early-level vision (determining the basic features of an image). When perceiving and recognizing images, middle-level vision comes into use when we need to classify the object we are seeing. High-level vision is used when the classified object must be recognized as a specific member of its group: for example, through middle-level vision we perceive a face, then through high-level vision we recognize the face of a familiar person.

In details puzzles the small visible portion of the image is not a key part of the image itself, and to give the correct answer the player has to use creativity and imagination. Find-it puzzles work the same way: the items to find are often placed so that only a detail is visible. To encourage the use of lateral thinking, in both find-it and details puzzles some items do not have their natural color (e.g. hearts are white, not red).

3.4 Tracing

CASTLE implements a mechanism for data collection that creates a log file each time the game is played. A log file contains the following information (a sketch of a possible record structure is shown after the list):

• The seed used to generate the game contents

• Name and sex of the character

• The time spent in each room. If the player exits a room and comes back later, two different values are registered

• The time spent trying to solve each puzzle. If the player exits the puzzle screen and comes back later, two different values are registered

• The total time passed in each castle

• The exploration time (i.e. the total time minus the time spent in puzzles) for each castle

• The exploration paths for each castle


• The number of hints used

• The number of puzzles solved

• The number of puzzles failed

• The number of puzzles left behind

• The number of touches of the screen for each room. If the player exits the puzzle view and comes back later, two different values are registered

• The number of times the player opens the inventory

• The number of times the player opens the map
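
As a concrete illustration, the per-session information listed above could be grouped into a record like the following C++ sketch. The field names and types are assumptions made for illustration; the thesis does not specify the actual log format.

```cpp
// Hypothetical shape of one CASTLE trace record; names and types are
// illustrative assumptions, not the real log format.
#include <string>
#include <vector>

struct PuzzleTrace {
    std::string puzzleId;
    std::vector<double> secondsPerVisit;   // one value per visit to the puzzle screen
    int hintsUsed;
    bool solved, failed, leftBehind;
};

struct RoomTrace {
    std::string roomId;
    std::vector<double> secondsPerVisit;   // one value per visit to the room
    int screenTouches;
};

struct CastleTrace {
    std::string castleId;
    double totalSeconds;
    double explorationSeconds;             // total time minus time spent in puzzles
    std::vector<std::string> explorationPath;
    std::vector<RoomTrace> rooms;
    std::vector<PuzzleTrace> puzzles;
};

struct SessionLog {
    std::string seed;                      // seed used to generate the game contents
    std::string playerName;
    char playerSex;
    int inventoryOpened, mapOpened;
    std::vector<CastleTrace> castles;
};
```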

3.5 Summary

In this chapter, we have described the structure of the game CASTLE and why it can be used to train lateral thinking. In the next chapter we describe the tools used for the development and some implementation algorithms.

Chapter 4

Tools and implementation

CASTLE was developed using the C++ programming language and the Cocos2d-x game engine. Game content is stored in XML format and parsed using the PugiXML library. In this chapter, we describe these tools. We also describe the algorithm used to generate maps using Procedural Content Generation and our implementation of the A* algorithm used to move the player's character.

4.1 Cocos2d-x

Cocos2d-x [26] is a free cross-platform open-source engine for 2D game development that we used to create CASTLE. It includes a set of C++ libraries for handling game graphics, physics, audio, scripting and user input.


4.1.1 Game structure in Cocos2d-x

In Cocos2d-x, a game is structured as a set of scenes. A scene is a tree structure that stores the elements to draw on the screen and represents a single section (e.g. a level or a menu) of the game (Figure 4.1). A scene can contain all the game elements, including sprites, buttons, particle effects, etc. When rendered, the scene first draws the last items added, providing automatic culling. In Cocos2d-x a scene is represented by the CCScene class (every Cocos2d-x class uses the 'CC-' prefix). Elements can be added to the scene using the addChild method of CCScene. This feature allows the creation of a deeply hierarchical game structure: at any moment, the engine has one and only one active CCScene reference working as the root of the current scene. Every item added to the active scene will thus be rendered on the screen.

Figure 4.1: Scenes flow example

In the game there may be several stored CCScene instances, and we can switch from one scene to another. Additionally, we can specify the transition effect (e.g. fade out, fade in) and its duration. When switching scenes, the engine automatically releases the source scene, freeing up memory. CASTLE is organized in four main scenes: a scene for the game menu (MenuScene), a scene for the world map (WorldMapScene), a scene representing a room (RoomScene) and a scene representing a puzzle (QuizScene); the engine changes the rendered scene according to the user input.
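
A minimal sketch of this structure, using the Cocos2d-x 2.x API with the CC prefix, is shown below. The sprite file name and the helper function names are illustrative assumptions; only the CCScene, CCLayer, CCSprite, addChild, CCDirector and CCTransitionFade calls come from the engine itself.

```cpp
// Minimal sketch: build a scene tree and switch to it with a fade transition
// (Cocos2d-x 2.x API). File and function names are illustrative assumptions.
#include "cocos2d.h"
USING_NS_CC;

CCScene* createRoomScene() {
    CCScene* scene = CCScene::create();              // root of the scene tree
    CCLayer* layer = CCLayer::create();
    CCSprite* knight = CCSprite::create("knight.png");
    layer->addChild(knight);                         // attach the sprite to the layer
    scene->addChild(layer);                          // attach the layer to the scene
    return scene;
}

void enterRoom() {
    // Replace the active scene; the engine releases the previous one.
    CCScene* room = createRoomScene();
    CCDirector::sharedDirector()->replaceScene(CCTransitionFade::create(0.5f, room));
}
```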

4.1.2 Memory management

Cocos2d-x features automatic memory management: it implements an autorelease pool similar to the one used in Objective-C [27]. Whenever a Cocos2d-x object is created, it is automatically added to the autorelease pool. At the end of the current code section, every item in the autorelease pool is automatically released, unless it has been added to a scene or retained using the retain method. When an item has been retained this way, the programmer has to release it later on using the release method, to avoid memory leaks. This is useful to avoid the automatic release of an object that we may need later: we may choose to retain it to save the creation time. For example, when a transition from the RoomScene to the QuizScene occurs, we retain the whole RoomScene in order to avoid creating it from scratch when we go back to it.
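
A short sketch of this retain/release pattern is shown below; caching the RoomScene in a global pointer is our own simplification for illustration, not the actual CASTLE code.

```cpp
// Sketch of keeping a scene alive across a transition with retain/release
// (Cocos2d-x 2.x). The global cache is an illustrative simplification.
#include "cocos2d.h"
USING_NS_CC;

static CCScene* g_cachedRoomScene = NULL;

void enterQuiz(CCScene* quizScene, CCScene* roomScene) {
    g_cachedRoomScene = roomScene;
    g_cachedRoomScene->retain();          // keep it alive outside the autorelease pool
    CCDirector::sharedDirector()->replaceScene(quizScene);
}

void backToRoom() {
    CCDirector::sharedDirector()->replaceScene(g_cachedRoomScene);
    g_cachedRoomScene->release();         // balance the retain to avoid a memory leak
    g_cachedRoomScene = NULL;
}
```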


4.1.3 Actions and Callbacks

Cocos2d-x uses CCAction objects to move, rotate or manipulate sprites. When a CCAction is created, it is associated with a target (e.g. the player's sprite) and launched with the target's runAction method. There are several kinds of CCAction, such as CCMoveTo, which moves the target from its current position to a new point on the screen. It is also possible to organize multiple CCAction objects in a CCSequence and then run the entire sequence; during the execution of a sequence, actions are executed sequentially, one at a time. CCSequence is particularly interesting because it can execute not only CCAction objects but also functions. These functions, called callbacks, are treated as CCAction objects and invoked in the same way. Similarly, CCMenuItem objects, which represent menu buttons, can be associated with a callback function so that when the user presses a button, the associated method is called.
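
The snippet below sketches this pattern for the player's sprite: a CCMoveTo followed by a callback, run as a CCSequence. The class name, member variable and callback body are assumptions for illustration.

```cpp
// Sketch: move the player's sprite, then invoke a callback when it arrives
// (Cocos2d-x 2.x action API). Class and member names are illustrative.
#include "cocos2d.h"
USING_NS_CC;

class RoomScene : public CCLayer {
public:
    void movePlayerTo(const CCPoint& tileCenter) {
        CCMoveTo* walk = CCMoveTo::create(0.4f, tileCenter);            // slide to the tile
        CCCallFunc* arrived = CCCallFunc::create(
            this, callfunc_selector(RoomScene::onPlayerArrived));       // callback as an action
        m_player->runAction(CCSequence::create(walk, arrived, NULL));   // execute in order
    }

    void onPlayerArrived() {
        // e.g. open the puzzle attached to the item the player touched
    }

private:
    CCSprite* m_player;
};
```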

4.2 PugiXML

Game puzzle content is stored in XML files. We used a C++ XML processing library named PugiXML [28] to parse the content of the XML files and load puzzle descriptions.

4.2.1 XML parsing

PugiXML loads an XML document as a tree data structure that has one or more child nodes. Nodes may have different types; depending on the type, a node can have a collection of child nodes, a collection of attributes, and some additional data (e.g. a name) (Figure 4.2).

Figure 4.2: Example of an XML tree structure in PugiXML

Once loaded, we can use the library methods to explore the tree from the root and retrieve attributes and data from nodes.
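
The sketch below shows how such a tree can be loaded and traversed with pugixml. The file name and the element/attribute names (puzzles, riddle, text, answer) are assumptions made up for illustration; only the pugixml calls themselves are the library's real API.

```cpp
// Sketch: load an XML file with pugixml and walk its tree.
// The XML schema (puzzles/riddle/text/answer) is an illustrative assumption.
#include <cstdio>
#include "pugixml.hpp"

int main() {
    pugi::xml_document doc;
    pugi::xml_parse_result result = doc.load_file("puzzles.xml");
    if (!result) {
        std::printf("XML parse error: %s\n", result.description());
        return 1;
    }
    pugi::xml_node root = doc.child("puzzles");
    // Iterate over the child nodes and read an attribute and a child's text.
    for (pugi::xml_node riddle = root.child("riddle"); riddle;
         riddle = riddle.next_sibling("riddle")) {
        const char* text = riddle.attribute("text").as_string();
        const char* answer = riddle.child("answer").child_value();
        std::printf("riddle: %s -> %s\n", text, answer);
    }
    return 0;
}
```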

4.3 Procedural map generator

In CASTLE, we used Procedural Content Generation to create the castles' maps and to place items and puzzles inside rooms. Procedural Content Generation (PCG) [29] is a term used to identify algorithms for the automatic creation of random content. This approach is often used in video games to increase the number of levels or challenges the player may encounter. Given an NxN boolean matrix describing a map, where true means the presence of a room and false means its absence, we designed maps with three constraints:

• every map had to contain a specific number of rooms (that is, a specific number of true values in the matrix)

• two adjacent rooms had to be linked (that is, the player must be able to move from one to the other)

• no room could be placed outside the matrix boundaries

At the beginning of a new game, we randomly generate a string. This string is used as a seed for the random calls of the algorithm. From here, we start with a single instance of the Room class, representing a room. This class contains four empty references to other rooms that may be placed on the north, east, west and south sides. The algorithm places this room in the middle of the matrix, then it tries to place a new room on each side of it, sequentially. The placement is successful if:

• there is not already a room on that side

• the new room is inside the boundaries of the matrix

• there is no room adjacent to the new one

• given the seed, a pseudo-random boolean function returns true


If the placement is successful, the algorithm is applied recursively to the new room and a true value is set in the matrix; otherwise it tries to place a new room on another side. The algorithm stops when a certain number of rooms have been placed.
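
The following is a minimal sketch of this recursive placement scheme. The class layout, the way the seed string is turned into a pseudo-random stream, and the interpretation of the adjacency rule (a new room may touch only its parent) are our assumptions for illustration, not the CASTLE implementation; the recursion may also need to be restarted if it dead-ends before reaching the target room count.

```cpp
// Sketch of the recursive room-placement scheme described above.
// Names, the seeding strategy and the adjacency check are illustrative assumptions.
#include <cstdlib>
#include <functional>
#include <string>
#include <vector>

struct Room {
    int row, col;
    Room *north = nullptr, *east = nullptr, *south = nullptr, *west = nullptr;
};

class MapGenerator {
public:
    MapGenerator(int size, int targetRooms, const std::string& seed)
        : size_(size), targetRooms_(targetRooms), placed_(0),
          occupied_(size, std::vector<bool>(size, false)) {
        std::srand(static_cast<unsigned>(std::hash<std::string>{}(seed)));  // seed the random calls
    }

    Room* generate() {
        Room* first = new Room{size_ / 2, size_ / 2};   // first room in the middle of the matrix
        occupied_[first->row][first->col] = true;
        placed_ = 1;
        expand(first);
        return first;
    }

private:
    void expand(Room* room) {                           // try each side sequentially, then recurse
        const int dr[4] = {-1, 0, 1, 0}, dc[4] = {0, 1, 0, -1};   // N, E, S, W
        for (int side = 0; side < 4 && placed_ < targetRooms_; ++side) {
            int nr = room->row + dr[side], nc = room->col + dc[side];
            if (nr < 0 || nr >= size_ || nc < 0 || nc >= size_) continue;  // stay inside the matrix
            if (occupied_[nr][nc]) continue;                               // side must be free
            if (countNeighbours(nr, nc) > 1) continue;   // adjacent only to the current room
            if (std::rand() % 2 == 0) continue;          // seeded pseudo-random boolean
            Room* next = new Room{nr, nc};
            occupied_[nr][nc] = true;
            ++placed_;
            link(room, next, side);
            expand(next);
        }
    }

    int countNeighbours(int r, int c) const {
        int n = 0;
        if (r > 0 && occupied_[r - 1][c]) ++n;
        if (r + 1 < size_ && occupied_[r + 1][c]) ++n;
        if (c > 0 && occupied_[r][c - 1]) ++n;
        if (c + 1 < size_ && occupied_[r][c + 1]) ++n;
        return n;
    }

    void link(Room* from, Room* to, int side) {          // set the mutual side references
        switch (side) {
            case 0: from->north = to; to->south = from; break;
            case 1: from->east = to;  to->west = from;  break;
            case 2: from->south = to; to->north = from; break;
            case 3: from->west = to;  to->east = from;  break;
        }
    }

    int size_, targetRooms_, placed_;
    std::vector<std::vector<bool>> occupied_;
};
```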

Once we have this data structure for the map, we populate every room except the first one with three puzzles. For each puzzle type, puzzles are randomly selected using the same seed used for the map generation. The number of selected puzzles of each type is fixed.

4.4 Player movement

In the room scenes, in order to explore, change rooms and reach puzzles, the player moves the character by touching any point on the screen. We divided the screen into a 6-by-8 tile matrix, so the character has 48 accessible locations: it moves in a discrete way from the center of one tile to the center of another. To compute the optimal path from the position of the character to the location clicked by the player, we used the A* algorithm [30] with the Manhattan distance heuristic. Whenever the Manhattan distance between A and B is greater than half the screen width, we first compute a path from A to the midpoint of the line segment connecting A and B and then a second path from the midpoint to B. This way the A* algorithm computes smaller paths, which require less computational time. As shown in Figures 4.3 and 4.4, this approach also helps to avoid long paths along the borders of the screen.

Figure 4.3: Player movement with standard A*

Figure 4.4: Player movement with our implementation of A*
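
Below is a compact sketch of this path computation: a plain grid A* with the Manhattan heuristic, wrapped by the midpoint split used when the two cells are far apart. The grid representation, the walkability input and the threshold (half the screen width, expressed here in tiles) are assumptions for illustration and not the CASTLE code; if either partial path fails, the sketch falls back to a single A* call.

```cpp
// Sketch: grid A* with a Manhattan heuristic plus the midpoint split described
// above. Grid layout, function names and the threshold are illustrative assumptions.
#include <algorithm>
#include <climits>
#include <cmath>
#include <queue>
#include <vector>

struct Cell { int row, col; };

static const int ROWS = 6, COLS = 8;     // the 6-by-8 tile matrix from the text

static int manhattan(const Cell& a, const Cell& b) {
    return std::abs(a.row - b.row) + std::abs(a.col - b.col);
}

// Plain A* between two cells; walkable[r][c] == true means the tile is free.
static std::vector<Cell> aStar(const Cell& start, const Cell& goal,
                               const std::vector<std::vector<bool>>& walkable) {
    auto idx = [](int r, int c) { return r * COLS + c; };
    std::vector<int> gScore(ROWS * COLS, INT_MAX), cameFrom(ROWS * COLS, -1);
    using Node = std::pair<int, int>;                    // (f = g + h, cell index)
    std::priority_queue<Node, std::vector<Node>, std::greater<Node>> open;
    gScore[idx(start.row, start.col)] = 0;
    open.push({manhattan(start, goal), idx(start.row, start.col)});
    const int dr[4] = {-1, 1, 0, 0}, dc[4] = {0, 0, -1, 1};
    while (!open.empty()) {
        int current = open.top().second; open.pop();
        int r = current / COLS, c = current % COLS;
        if (r == goal.row && c == goal.col) {            // goal reached: rebuild the path
            std::vector<Cell> path;
            for (int n = current; n != -1; n = cameFrom[n]) path.push_back({n / COLS, n % COLS});
            std::reverse(path.begin(), path.end());
            return path;
        }
        for (int k = 0; k < 4; ++k) {                    // relax the four neighbours
            int nr = r + dr[k], nc = c + dc[k];
            if (nr < 0 || nr >= ROWS || nc < 0 || nc >= COLS || !walkable[nr][nc]) continue;
            int tentative = gScore[current] + 1;
            if (tentative < gScore[idx(nr, nc)]) {
                gScore[idx(nr, nc)] = tentative;
                cameFrom[idx(nr, nc)] = current;
                open.push({tentative + manhattan({nr, nc}, goal), idx(nr, nc)});
            }
        }
    }
    return {};                                           // no path found
}

// Split long requests at the midpoint so A* works on two shorter problems.
std::vector<Cell> computePlayerPath(const Cell& start, const Cell& goal,
                                    const std::vector<std::vector<bool>>& walkable) {
    if (manhattan(start, goal) > COLS / 2) {
        Cell mid = {(start.row + goal.row) / 2, (start.col + goal.col) / 2};
        std::vector<Cell> first = aStar(start, mid, walkable);
        std::vector<Cell> second = aStar(mid, goal, walkable);
        if (!first.empty() && !second.empty()) {
            first.insert(first.end(), second.begin() + 1, second.end());
            return first;
        }
    }
    return aStar(start, goal, walkable);
}
```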


4.5 Summary

In this chapter we described the tools used during the development of CASTLE. In particular, we described the main features of Cocos2d-x and PugiXML. We then detailed the algorithm used to randomly generate maps using PCG and the algorithm used to move the player along the shortest path. In the next chapter we describe the methods and results of our research.

Chapter 5

Data analysis

In this chapter we present our empirical study, which tests whether a subject who frequently plays video games performs better at lateral thinking problems and how the game platform affects the cognitive performance of the subjects. We describe the methods we used for collecting the data from the subjects, the statistical analysis and our findings. Through data collection and analysis, we studied how people with different backgrounds (e.g. students from different universities) react to the lateral thinking problems proposed in CASTLE. We also tested whether the device used to play the game influences the players' use of lateral thinking.

5.1 Methods

In collaboration with the Università Cattolica del Sacro Cuore, we validated CASTLE with 72 students playing the game on different platforms (personal computer, tablet and paper). We applied data mining techniques to analyse players' behaviour with respect to cognitive skills and lateral thinking.

5.1.1 Participants

We collected data from 72 subjects, divided into two groups of 36 players each: the former group was composed of engineering students, the latter of humanities students. All the subjects were volunteers and gained no credits for performing the test. The subjects were between 19 and 32 years old; the average age was 23.35 years. One of the subjects was discarded from the analysis because his age was outside the range of interest. Gender was not balanced, because it was not considered an important dimension of analysis.

5.1.2 Materials

Each student had to complete a questionnaire and to play the game. We provided three versions of the game: a tablet version, a personal computer version and a paper version. The paper version consisted of a short game-book that allows the player to explore various narrative paths through the use of numbered pages, and proposed the same puzzles as the tablet and personal computer versions.


5.1.3 The questionnaire

The questionnaire had four sections. The first section contained questions about video game habits, such as how much time the subjects dedicate to electronic devices, video games and puzzle games in general, and what they think about various aspects of video games. The second section contained the Object-Spatial Imagery Questionnaire (OSIQ) [31], a self-report questionnaire designed to distinguish between two different types of imagers: object imagers, who prefer to construct vivid, concrete and detailed images of individual objects (e.g. visual artists), and spatial imagers, who prefer to use imagery to schematically represent spatial relations among objects and to perform complex spatial transformations (e.g. scientists). The third section contained the Barratt Impulsiveness Scale (BIS) [32], a questionnaire designed to assess the personality/behavioral construct of impulsiveness. The BIS is scored to yield three factors: attentional impulsiveness, motor impulsiveness and non-planning impulsiveness. The last section contained the Torrance Tests of Creative Thinking (TTCT) [33]. The subjects had to draw six images given a number of starting lines; the sequence of images had to tell a story, and the subjects also had to give a title to each image. The TTCT scores the results on four scales: fluency, the total number of interpretable, meaningful and relevant ideas generated in response to the stimulus; flexibility, the number of different categories of relevant responses; originality, the statistical rarity of the responses; and elaboration, the amount of detail in the responses.


5.1.4 Procedures

Subjects played CASTLE for 12 minutes after completing the questionnaire. We divided each 36-person group into three subgroups, so that each subgroup played on a different platform. The game also provides the possibility to choose one of three predefined maps instead of a randomly generated one. These maps are identified as map A, map B and map C. For each platform we tested every predefined map equally. Table 5.1 sums up the distribution of test subjects.

                     Map A   Map B   Map C   Total
Engineering  iPad      4       4       4      12
             PC        4       4       4      12
             Paper     4       4       4      12
Humanities   iPad      4       4       4      12
             PC        4       4       4      12
             Paper     4       4       4      12
Total                 24      24      24      72

Table 5.1: Test subjects distribution

5.1.5 Data analysis

The collected data, consisting of the completed questionnaires and the game logs, were organised into a spreadsheet file and analysed using SPSS 20, a software package for statistical and data mining analysis. Initially we preprocessed some of the attributes as follows:

• We used players' paths through rooms in CASTLE to divide players into three categories: stationary, players who tend to stay close to a specific room; explorers, players who tend to explore every room in the map; straightforward, players who tend to follow one direction while exploring the rooms

• Answers to the open question "Why do you think people should play video games?" were categorised into "to improve cognitive skills", "to gain control of emotions" and "fun"

For the analysis we used the chi-squared (χ²) test, a test of independence that assesses whether paired observations on two variables, expressed in a contingency table, are independent of each other. It tests the null hypothesis that the frequency distribution of certain events observed in a sample is consistent with a particular theoretical distribution.
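For reference, for a contingency table with observed counts O_ij, the test statistic compares them with the counts E_ij expected under independence:

\[
\chi^2 = \sum_{i}\sum_{j} \frac{(O_{ij} - E_{ij})^2}{E_{ij}}, \qquad
E_{ij} = \frac{(\text{row } i \text{ total}) \times (\text{column } j \text{ total})}{N},
\]

where N is the total number of observations; large values of the statistic indicate that the two variables are unlikely to be independent.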

5.1.6 Performance

We scored players’ performance using the following indexes:

• For puzzle performance we considered the total number of examined puzzles and the relations between the number of solved and examined puzzles, failed and examined puzzles, and left and examined puzzles (a possible formalization is sketched after this list)

• For exploration performance we considered the total number of explored rooms and how much the subject was oriented towards exploration
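
A minimal formalization of the puzzle-performance indexes, assuming that the "relations" above are simple ratios over the number of examined puzzles (our assumption; the thesis does not state the exact formula):

\[
r_{\text{solved}} = \frac{N_{\text{solved}}}{N_{\text{examined}}}, \qquad
r_{\text{failed}} = \frac{N_{\text{failed}}}{N_{\text{examined}}}, \qquad
r_{\text{left}} = \frac{N_{\text{left}}}{N_{\text{examined}}}.
\]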


5.2 Results

Data analysis showed no evidence that university of origin or the subjects' gender affects the subjects' performance. Our analysis also shows no evidence that the impulsiveness and creativity dimensions are related.

It does show that impulsiveness is significantly related to the time spent playing video games (χ² (1, 54) = 6.629, p < 0.05): 77.1% of the subjects who spend less than 1.5 hours a day playing video games obtained low motor impulsiveness scores, while 57.9% of the subjects who spend more than 1.5 hours a day playing video games obtained high motor impulsiveness scores.

Scores on the Object-Spatial Imagery Questionnaire section of the questionnaire differ significantly with respect to the field of study: humanities students obtained higher scores in object visualization (χ² (2, 71) = 8.507, p < 0.05), while engineering students obtained higher scores in spatial visualization (χ² (2, 71) = 8.573, p < 0.05). Scores on the TTCT section differ with respect to the university of origin: humanities students show more originality, obtaining significantly higher scores (χ² (2, 67) = 8.011, p < 0.05).

The analysis reports that the device significantly influenced the performance of the subjects: the subjects who played the tablet version or the personal computer version completed more puzzles (χ² (3, 71) = 16.314, p < 0.001), used more hints (χ² (3, 71) = 16.903, p < 0.001) and failed fewer puzzles (χ² (2, 71) = 12.756, p < 0.01). There are no significant differences between players who played the personal computer version and players who played the tablet version. This suggested re-encoding the device variable with only two values: digital and paper.

The analysis shows that the total number of examined puzzles (a variable considered only for the digital versions) is significantly related to the originality dimension measured by the TTCT: original subjects read more puzzles (χ² (4, 44) = 12.300, p < 0.05) and left more puzzles (χ² (4, 67) = 10.007, p < 0.05).

The number of errors is significantly related to the object visualization scale of the OSIQ: subjects with higher object visualization scores tend to fail more puzzles (χ² (4, 71) = 10.161, p < 0.05). It is also related to motor impulsiveness: the higher the motor impulsiveness score, the more errors the player made (χ² (2, 71) = 8.346, p < 0.05). Errors are also related to the time spent playing video games: the more time spent, the lower the number of errors (χ² (2, 54) = 7.958, p < 0.05).

Map C (Figure 5.1) seems to encourage exploration: 86.7% of the subjects who played that map explored at least four rooms. Map A (Figure 5.2) seems to discourage exploration: 75% of the subjects who played that map explored fewer than four rooms.


Figure 5.1: Map C

Figure 5.2: Map A

The object visualization scores are significantly related to exploration: higher scores correspond to a higher exploration time (χ² (2, 47) = 7.289, p < 0.05) and a higher tendency to explore every room in the castle (χ² (4, 47) = 13.459, p < 0.01).


5.3 Summary

In this chapter, we presented the methods and results of our analysis. The analysis suggested that the digital versions of the game stimulate subjects more than the paper version and that there are no significant differences between players who played the personal computer version and players who played the tablet version. It also suggested that the number of errors made by the subjects is related to the time spent playing video games: gamers commit fewer errors.

Chapter 6

Conclusions and further work

CASTLE appears to be a useful tool to study various aspects and correlations of the cognitive skills related to lateral thinking. In particular, the analysis suggested that a digital version of a game stimulates the subjects' lateral thinking processes more than a paper version. We did not find significant differences between the performance of the players who played the personal computer version and that of the players who played the tablet version. This may be due to the fact that, by design, CASTLE did not take advantage of the full potential of touch devices. In addition, we noticed that the map influenced the behaviour of the players; this should be taken into account in further analyses.

The analysis also shows that the number of errors made by the subjects is related to the time spent playing video games: gamers commit fewer errors. This may suggest that gamers perform better at lateral thinking problems. Even if the results of the analysis may suggest these conclusions, further analyses with a higher number of subjects are required. With a long-term analysis it may be possible to study whether the game actually improves lateral thinking skills.

Finally, we plan to submit a free modified version of the game, with more castles and puzzles and without data collection, to the App Store.

Bibliography

[1] John Kirriemuir; Angela McFarlane. Literature review in games and learning, 2004.

[2] Paul Gee. What video games have to teach us about learning and literacy, 2007.

[3] www.medicalnewstoday.com/articles/228449.php. Brain exercise video games can improve kids' school performance, 2011.

[4] D. Bavelier. Action video games improve vision; www.rochester.edu/news/show.php?id=3342, 2009.

[5] www.edwdebono.com.

[6] P. Sloane. The leader's guide to lateral thinking skills: Powerful problem-solving techniques to ignite your team's potential, 2003.

[7] Yann Landrin-Schweitzer; Pierre Collet; Evelyne Lutton. Introducing lateral thinking in search engines, 2006.

[8] B. G. Weber; M. Mateas. A data mining approach to strategy prediction, 2009.


[9] T. Mahlmann; A. Drachen; J. Togelius; A. Canossa; G. N. Yannakakis. Predicting player behavior in Tomb Raider: Underworld, 2010.

[10] C. Thurau; C. Bauckhage. Analyzing the evolution of social groups in World of Warcraft, 2010.

[11] www.edwdebono.com/lateral.

[12] Catherine Greenhill. Sudoku, logic and proof, 2011.

[13] C. L. Bouton. Nim, a game with a complete mathematical theory, 1901.

[14] Paul Sloane; Des MacHale. Lateral thinking puzzlers, 1992.

[15] www.mathpuzzle.com/loyd.

[16] Peter Vorderer; Jennings Bryant. Playing video games: Motives, responses, and consequences, 2006.

[17] R. L. Achtman; C. S. Green; D. Bavelier. Video games as a tool to train visual skills, 2008.

[18] Nick Tannahill; Patrick Tissington; Carl Senior. Video games and higher education: What can Call of Duty teach our students?, 2012.

[19] Rui Nouchi; Yasuyuki Taki; Hikaru Takeuchi; Hiroshi Hashizume; Yuko Akitsuki; Yayoi Shigemune; Atsushi Sekiguchi; Yuka Kotozaki; Takashi Tsukiura. Brain training game improves executive functions and processing speed in the elderly: a randomized controlled trial, 2012.


[20] Linda A. Jackson; Edward A. Witt; Hiram E. Fitzgerald; Alexander von Eye; Yong Zhao. Parent behavior, children's technology use and creativity: Videogames count but parents don't!, 2011.

[21] John H. Doolittle. Using riddles and interactive computer games to teach problem-solving skills, 1995.

[22] Elizabeth Hutton; S. Shyam Sundar. Can video games enhance creativity? Effects of emotion generated by Dance Dance Revolution, 2010.

[23] Edgar Rubin. Synsoplevede figurer, 1915.

[24] Jing Feng; Ian Spence; Jay Pratt. Playing an action video game reduces gender differences in spatial cognition, 2007.

[25] Matthew Dye; Shawn Green; Daphne Bavelier. Racing, shooting, and zapping your way to better visual skills, 2009.

[26] www.cocos2d-x.org.

[27] www.developer.apple.com/library/mac/#documentation/Cocoa/Conceptual/MemoryMgmt/Articles/MemoryMgmt.

[28] www.pugixml.org.

[29] www.pcg.wikidot.com.

[30] S. Russell; P. Norvig. Artificial Intelligence - A Modern Approach. Prentice-Hall, third edition, 2010.


[31] Olessia Blajenkova; Maria Kozhevnikov; Michael A. Motes. Object-spatial imagery: a new self-report imagery questionnaire, 2006.

[32] Jim H. Patton; Matthew S. Stanford; Ernest S. Barratt. Factor structure of the Barratt Impulsiveness Scale, 2006.

[33] E. P. Torrance. Torrance tests of creative thinking, 1974.