Courseware Design of Project Research Based on Interactive Game Design

YULUNG WU

Department of Information and Communication, Kun Shan University, No. 949, DaWan Rd., Yung-Kung City, Tainan Hsien, 71003, Taiwan, R.O.C.

    Received 29 September 2009; accepted 18 January 2010

ABSTRACT: Interactive game design spans two knowledge domains: games and interactive technologies. These two very different domains are difficult for students to learn at the same time. In this research, the course design of interactive game design is based on motion-sensing technology and presents a multiple input interface development platform. Finally, three case studies are presented to verify that the proposed framework is useful for the teaching of Project Research. © 2010 Wiley Periodicals, Inc. Comput Appl Eng Educ; published online in Wiley InterScience (www.interscience.wiley.com); DOI 10.1002/cae.20419

Keywords: interactive game design; project research; Virtools; motion sensing

    INTRODUCTION

Digital content is growing increasingly diverse with the convergence of multimedia and the Internet. Computer games play important roles in the digital content industry and deeply affect home entertainment, e-learning, and audio/video media. With the performance improvement of computer hardware, computer games provide more realistic game environments and operation. In this saturated computer market, home entertainment is the next market to be developed in the post-PC era. However, this market has been monopolized by Japanese manufacturers since its inception; Nintendo, SEGA, and Sony have been market leaders at different times.

At the end of 2006, Nintendo developed a new generation of home entertainment console, called the Wii. The revolution of this console lies not in stronger computing performance or visual effects, but in a major advancement in the human-machine interactive interface. The human-machine interface of a traditional PC comprises the keyboard and mouse. Considering the operating capabilities of computer users, home entertainment consoles further simplify the human-machine interface into a gamepad and remote control, so that the machine can be controlled by a few buttons. However, these operating styles simply send out commands through finger presses, and are not intuitive human-machine interaction styles. The novel feature of the Wii is that its gamepad can detect motion and rotation in 3D space. Additionally, the gamepad can be adopted as a bat, baton, fishing rod, or sword in computer games. The game player can perform motions such as waving, cutting, flinging, and chopping, thus considerably increasing human-machine interactivity and amusement. Hence, the Wii has sold more than the Sony PS3 since its launch, and has revolutionized the home entertainment market.

To train and develop interactive game designers, this research proposes a game design platform based on motion-sensing technology. The platform helps students develop interactive games without having to understand the detailed specifications of the hardware interface, so that they can concentrate on creative design and game content.

This article is organized as follows. The second section reviews the literature on game design and interactive design. The third section then describes the main part of this research, the course design of Project Research. Subsequently, the fourth section presents the framework of the proposed system. Next, the fifth section performs evaluations to gather learning outcomes and demonstrations. Finally, conclusions and future works are presented in the sixth section.

    RELATED WORK

Reigeluth [1] defines interactivity as mutual activities between two organisms. In computer-aided learning activities, interactivity means the interactive relationship between learners and computers. An interactive system provides various ways to send commands to the system and to obtain feedback from it.

Virtual reality (VR) provides various interactive methods that let users immerse themselves in a virtual environment. Using VR and its interactivity features, users can observe or play with objects in the virtual world, or discuss and communicate with other people. The rich situational experience of VR promotes focus and interest in users. Besides entertainment, VR can be adopted in teaching assistance to increase the learning motivation of users [2,3].

Correspondence to Y. Wu (ylw@mail.ylw.idv.tw). Contract grant sponsor: National Science Council of the Republic of China, Taiwan; contract grant numbers: NSC 96-2815-C-426-005-E, NSC 97-2815-C-426-002-E.

The greatest advantage of VR is that it can portray scenes close to reality. Some dangerous or hidden aspects of nature or real life may not be freely accessible to most people. This property can be exploited to create virtual scenes that allow users to browse and experience these aspects at any time. Although documentary filming can depict such natural scenes and features, VR offers higher interactivity and freedom of perspective. Due to improvements in 3D technologies and reductions in video card prices, VR is no longer confined to high-end servers and expensive devices, and can now be implemented on general-purpose PCs. Hence, VR-related applications are now increasingly extensive and widespread.

Game Maker [http://www.yoyogames.com] is game design software with a visual interface that does not require writing a single line of code. Users design games by dragging and dropping sprites, resources, and events with the mouse, and then constructing the required functions. Game Maker focuses on 2D game design and provides many animation effects that can be integrated into a game project. Game Maker can simulate a 3D screen with pre-rendered 2D images, but it is not a real 3D game design platform.

Song and Lee [4] adopted 3D technology in teaching geometry. Taking polyhedron teaching as an example, they taught students the relationships among points, lines, surfaces, and the polyhedron. Plane graphics could not fully express the features of polyhedrons, while too few physical teaching objects were available to allow all students to spend enough time observing them. In contrast, various polyhedrons were easy to build using 3D technologies, and the students were able to turn the polyhedrons around to learn their features. Therefore, this system significantly helped the students in their learning.

Jong [5] adopted 3D technology to create a multi-user interactive learning environment and to teach elementary science courses. The learning activities were performed in groups. To understand the interpersonal interactions between students, the range of activities of the students in the virtual learning environment was analyzed.

Terrell and Rendulic [6] also adopted computer-game learning software to teach elementary school students. Their results indicated that computer-game-style learning improved the intrinsic motivation and learning achievements of students. The method and process of presenting the computer games were consistent with the suggestions of Gagne [7]: (1) provide sensory stimulation; (2) carefully guide the learners' activities; (3) provide the way to reach the goal; (4) provide external driving forces; (5) guide the direction of thinking; (6) stimulate the transfer of knowledge; (7) assess the learning results; and (8) obtain feedback. The computer game approach, if guided correctly, plays an educational role in accomplishing teaching tasks.

Additionally, 3D can be adopted to teach spatial sense. The Round Earth project [8] grouped learners into pairs. One learner acted as the driver controlling the virtual space shuttle, and the other acted as the navigator guiding the direction of the space shuttle. The scenes helped learners understand relative spatial positions, and taught them to describe directions and positions. Additionally, classroom explanations are inadequate in experiment-related courses; students have to verify what they have learnt in class by experiment. Hence, operating experimental materials in experiment-related courses can give students deep impressions [9]. Raymond and Nathan [10] adopted VR technologies to teach remote control and operation of motors.

The virtual reality peripheral network (VRPN) [11] is an integrated library that supports many devices used in VR. VRPN provides an abstraction layer that makes all devices of the same base class look the same. However, the information that VRPN obtains from trackers is raw data. That means that although VRPN provides the same interface for controlling devices, a motion-sensing programmer must still develop methods for transforming the raw data into motion-sensing values. Our research provides both the common interface and the transforming method, yielding a highly extensible and flexible system in which every device is guaranteed to supply the information necessary for motion-sensing design. Besides, Virtools is more suitable than VRPN with the C language for designing games and 3D systems.

    THE COURSE DESIGN OF PROJECT RESEARCH

Project Research is a compulsory course for third-year undergraduates of the Department of Computer Science and Information Engineering, Leader University, Taiwan, and it lasts three semesters. Project Research trains students in the ability to analyze and solve problems. Students have to master and integrate their professional skills and work together to finish a large-scale project. All students are grouped into teams of 3-5 for teamwork. Since many students in the course have little experience in interactive game design, the following course design is provided for them to follow. The course design is shown in Figure 1.

In the first semester, students have to review the related literature to understand the theories, trends, and applications of interactive game design. At the same time, students have to plan their project topic and system framework; studying the development tools, 3DMax and Virtools, is also required.

In the following semester, students design their project with Virtools and the proposed framework, the multiple input interface (MII). Because the course proceeds as teamwork, the students in a group have to finish different sub-systems by themselves. In this phase, students design the interfaces and protocols among the sub-systems and integrate the finished sub-systems.

In the last semester, the prototype of the project is finished, including the title animation, background music, and sound effects.

    Figure 1 The course design of Project Research.


The next step is testing and revising the system; system documentation and a user manual are also prepared. At the end of this semester, a project achievement show is held to demonstrate the finished project.

To design an interactive game, a motion-sensing device is required to detect user motion and provide feedback. Managing a motion-sensing device through low-level control is not easy for students: they must devote much attention to writing hardware-control code and then do not have enough time to finish their system. This research proposes the MII framework to reduce the development process and hide hardware characteristics, so that students can focus on system functions and creative design. The following introduces the teaching materials of the first semester, which teach the students the hardware and software used in the course.

    Hardware Specifications of Interactive Devices

Many hardware devices supporting motion sensing are currently available on the market, with various operating principles and properties. The positioning of motion-sensing devices falls mainly into two broad categories, namely relative displacement and absolute coordinates. Relative displacement means that the motion-sensing device can only sense the direction of the current movement, which is the displacement direction relative to the starting position, and cannot know the exact position at any moment. Such positioning adopts a gyroscope to detect movement. A motion-sensing device based on absolute coordinates can acquire its exact position in space, generally represented by 3D coordinates. A positioning device based on absolute coordinates can judge precise movements more accurately than a relative positioning device, since it can acquire the movement tracks. Such positioning generally adopts infrared or ultrasound to detect movement.

The positioning and operating device of the immersive VR system Cave [12] comprises two components, the Wanda and the sonistrip. The Wanda is a device that looks like a gamepad, as shown in Figure 5. Its structure includes a direction stick and a few buttons, and it has functions similar to general computer gamepads that send out control commands by finger. The sonistrip comprises a few metal strips placed on top of the Cave to send out ultrasound waves. The Wanda calculates its position in space from the time delay between ultrasound transmission and reception, and then locates and identifies the operations of the user.

The MX Air is a 3D wireless mouse produced by Logitech, and is connected to the PC by a 2.4 GHz RF signal. It is similar in appearance to a general wireless mouse, and has a built-in gyroscope. When the MX Air is lifted off the desktop, the gyroscope accumulates displacement information, which is transformed into mouse cursor coordinates by the built-in chip. The MX Air can be held in the air to control the mouse cursor when operating the computer or giving a presentation, and is very easy to apply to motion sensing. Since the movements of the user are transformed inside the mouse into displacements of the desktop mouse cursor, the X and Y coordinates of the mouse cursor are ultimately what is transmitted to the computer. Therefore, general Windows programs can easily acquire the mouse cursor data without the need for any specific platform or operating system.
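As a concrete illustration of how easily a Windows program can read this data, the short C++ sketch below samples the cursor twice with the Win32 call GetCursorPos and reports the displacement in between; it is only a sketch of the idea, not code from the original courseware.

#include <windows.h>
#include <iostream>

// Samples the mouse cursor twice and reports the displacement between the
// two samples. Because the MX Air maps in-air motion to cursor coordinates,
// a plain Windows program can obtain its movement this way, without any
// device-specific SDK. (Illustrative only; error handling kept minimal.)
int main() {
    POINT start{}, end{};
    GetCursorPos(&start);        // first sample
    Sleep(100);                  // wait 100 ms while the user moves the device
    GetCursorPos(&end);          // second sample

    long dx = end.x - start.x;   // horizontal displacement in pixels
    long dy = end.y - start.y;   // vertical displacement in pixels
    std::cout << "dx=" << dx << " dy=" << dy << " over 100 ms\n";
    return 0;
}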

The Wii remote is the new generation of gamepad introduced by Nintendo, and is connected to the console by Bluetooth signals, with a built-in gyroscope and infrared receptor. The gyroscope detects the movements of the user to obtain the displacement information; therefore, the displacement information is also a relative displacement. Unlike the MX Air, the Wii remote does not process the displacement information, but instead transmits the raw data directly to the main console, which performs the computations to obtain the 3D displacement information.

Software Specifications Used in the Course

The development platforms adopted in this research are Virtools and the PC. Virtools is a 3D interactive construction software application developed by 3DVIA [http://www.3dvia.com], and provides a WYSIWYG development environment. The interactive functionality design method of Virtools does not require writing program code as in general programming languages. Instead, the program design is represented as a flowchart, providing a more intuitive and more easily accessible development environment than traditional programming languages. This type of development interface is called a schematic. In Virtools, each function is called a building block (BB). The development process involves selecting suitable BBs and assembling them in a serial link. Each BB has variable parameters that provide the information necessary for execution. A simple example is given below to illustrate how a keyboard is used to control the movements of 3D objects. The schematic illustrated in Figure 2 adopts three BBs. Switch On Key waits for a keyboard button to be pressed; depending on which key is pressed, the following BBs are executed. Translate moves the 3D object, and Rotate changes the angle of the 3D object in the virtual scene. The three BBs connected by lines represent pressing a button and then moving or rotating the 3D object Sphere01.
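For readers more familiar with conventional code, the following C++ sketch expresses the same logic in plain code; the types and function names are hypothetical stand-ins for the Switch On Key, Translate, and Rotate BBs, not the actual Virtools API.

// Hypothetical stand-ins for the BBs in Figure 2; NOT the real Virtools API.
struct Object3D {
    float x = 0, y = 0, z = 0;  // position in the scene
    float yaw = 0;              // rotation about the vertical axis (radians)

    void translate(float dx, float dy, float dz) { x += dx; y += dy; z += dz; }
    void rotate(float dyaw) { yaw += dyaw; }
};

enum class Key { Up, Down, Left, Right, None };

// Equivalent of "Switch On Key" -> "Translate"/"Rotate" acting on Sphere01:
// each pressed key either moves or rotates the object.
void processKey(Object3D& sphere01, Key pressed) {
    switch (pressed) {
        case Key::Up:    sphere01.translate(0, 0,  0.1f); break;  // move forward
        case Key::Down:  sphere01.translate(0, 0, -0.1f); break;  // move backward
        case Key::Left:  sphere01.rotate(+0.05f); break;          // turn left
        case Key::Right: sphere01.rotate(-0.05f); break;          // turn right
        default: break;                                            // no key pressed
    }
}

int main() {
    Object3D sphere01;
    processKey(sphere01, Key::Up);    // press "up": the object moves forward
    processKey(sphere01, Key::Left);  // press "left": the object rotates
    return 0;
}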

Virtools offers various BBs, organized into categories according to their functions. Some important categories, namely Network, AI, Physics, and VR, are described below.

    Figure 2 The schematic of controlling the movements of 3D objects.


The network library contains BBs for online connections and database connections, which are adopted in the design of multi-user online games or game progress storage. The AI library provides non-player character (NPC) BBs; NPCs have intelligent behaviors, such as a hostile monster approaching the player, or running, hiding, or a party member automatically following the player. The physics library mainly provides realistic physical properties for the objects in the scenes, to improve the realism of the game and reduce the design difficulty. It simulates various common physical properties such as gravity, mass, friction, bouncing force, collision, and buoyancy. These BBs can be applied directly to reduce game production time considerably. The VR library is adopted along with immersive VR devices, supporting hardware such as the access pack, head-mounted display, 3D glove, and force-feedback jacket, so that the designer can focus on the design of the interactivity mechanism and the game scenes. The communication with the hardware devices is handled by the BBs provided by the VR library.

The basic and most common development method in Virtools is to build the required functions from the built-in BBs, as shown in Figure 2. However, implementing complex functions results in a huge and complicated schematic that is difficult to read and maintain. Therefore, Virtools can pack parts of the schematic into a single BB as a new self-made BB, known as a behavior graph. Similar to BBs in appearance, a behavior graph can be reused repeatedly and integrated with other BBs. If the process needs to be amended, the behavior graph can be expanded back into the original schematic, thus considerably reducing the complexity of the schematic. In practice, commonly adopted or independently operating schematics, such as a scoring function or a file storage function, can be packed into behavior graphs. User-created behavior graphs can be shared with other game projects if required.

Both BBs and behavior graphs are established graphically in a schematic. However, program code is easier than schematics to adopt for designs involving complex logic judgments and data storage, because a single line of program code may accomplish the task of 2-3 BBs. Hundreds of lines of code are common in general programming, while a schematic of more than 100 BBs is hard to maintain. Therefore, Virtools also provides programming-language-based design methods. Virtools scripting language (VSL) is a scripting language based on the C language and provided by Virtools. The system operation is controlled by program code written in VSL, which produces the same result as BBs in both appearance and application method; however, a VSL block comprises program code when expanded. Nevertheless, BBs, behavior graphs, and VSL, despite their different formats, are all designed in the development environment and are ultimately converted into BBs for execution. Restated, the available functions are limited to the BBs provided by Virtools. Behavior graphs and VSL are simply different presentations of BBs, and they cannot perform functions that are not provided by BBs.

To enhance the extensibility of Virtools and escape the limitations of the existing BBs, Virtools provides an SDK development environment that adopts Visual C++ to develop DLLs to be loaded into Virtools for execution. Since the SDK development is performed in VC++, and Virtools is simply an execution environment, any function that can run in VC++ can be adopted by Virtools, meaning that Virtools has almost no restriction on functionality. Functions accomplished through the SDK are similar in appearance and application method to BBs in the Virtools interface, but must be converted back to VC++ for reprogramming if they need to be amended. The Wii remote is not a Virtools-supported hardware device, so the SDK was adopted for its development in this research (Fig. 3).

    SYSTEM FRAMEWORK OF COURSEWARE

In interactive games that support motion sensing, the sensing mode detects movements such as hand waving or head swinging by the player. The fundamentals of these movements are the direction, displacement, and time parameters. According to Figure 4, the parameters of a physical movement by the player are the starting position (coordinates sc), the ending position (coordinates ec), and the movement time t. As described in the previous section, an absolute-coordinate motion-sensing device can obtain the spatial coordinates of the movement, that is, the starting position coordinates sc and the ending position coordinates ec, whereas a relative-displacement device can only obtain the direction of movement d. The movement direction d and the movement distance l are calculated from the coordinates of the two positions.

These data are received by the sensors, sent to the computer for computation, and then transformed into interactivity or control in the VR scene. The movement direction d of the player is transformed into the direction of the force applied to the virtual object. The movement distance l is used to determine whether the player's movement makes contact with the virtual object. The movement velocity v is calculated by dividing the movement distance l by the movement time t (v = l / t), and represents the speed of the player's swing as well as the force applied to the virtual object. Therefore, regardless of the type of motion-sensing device, if the movement data can be transformed into motion-sensing parameters such as the starting and ending coordinates (or the moving direction) and the movement time, then they can be transformed into the scene interaction parameters required by the interactive games.
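To make this transformation concrete, the C++ sketch below derives the movement direction d, distance l, and velocity v = l / t from a pair of sampled coordinates and the movement time; the structure names are illustrative only and are not part of the MII.

#include <array>
#include <cmath>

// A sampled player movement: starting coordinates sc, ending coordinates ec,
// and the movement time t (seconds). Names follow the text, not any real API.
struct Movement {
    std::array<float, 3> sc;  // starting position
    std::array<float, 3> ec;  // ending position
    float t;                  // movement time in seconds
};

// Scene-interaction parameters derived from a movement.
struct SceneParams {
    std::array<float, 3> d;   // unit direction of the swing
    float l;                  // movement distance
    float v;                  // movement velocity = l / t (maps to applied force)
};

SceneParams toSceneParams(const Movement& m) {
    std::array<float, 3> delta = { m.ec[0] - m.sc[0],
                                   m.ec[1] - m.sc[1],
                                   m.ec[2] - m.sc[2] };
    float l = std::sqrt(delta[0] * delta[0] + delta[1] * delta[1] + delta[2] * delta[2]);
    std::array<float, 3> d = { 0.0f, 0.0f, 0.0f };
    if (l > 0.0f)
        d = { delta[0] / l, delta[1] / l, delta[2] / l };  // normalize the direction
    float v = (m.t > 0.0f) ? l / m.t : 0.0f;               // v = l / t
    return { d, l, v };
}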

    The Framework of Multiple Input Interface

Since currently available hardware devices supporting motion sensing vary in operating principles, characteristics, access methods, and data formats, games have to be developed individually to support the different sensing devices, resulting in repeated investment and long development processes. This research presents an MII development platform that allows program developers to focus on developing game content without worrying about the different hardware properties.

Figure 3 Development methods of Virtools.

Figure 5 shows the MII framework. The main aim of the MII is to isolate the game core from the underlying hardware device. The MII obtains the displacement information from the motion-sensing device and transforms it into motion-sensing values for the back-end game core to use. Switching between different motion-sensing devices involves only setting MII parameters, and does not affect the operation of the back-end game core. Additionally, supporting a new motion-sensing device involves only adding a new device plug-in to the MII, and does not require adjusting the game core.

The MII framework adopts a modular design. A new motion-sensing device can be incorporated into the MII by developing a device plug-in according to the specification and completing its registration for MII management; the MII can then access the newly added motion-sensing device. Each device plug-in works independently. If the game core does not need a particular motion-sensing device, the unnecessary device plug-in can be removed without affecting other settings, thus reducing the file size.
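The paper does not publish the plug-in interface itself, so the following C++ sketch merely illustrates what such a registration-based modular design could look like; every name in it (DevicePlugin, MotionSample, MII::registerPlugin, and so on) is invented for the illustration.

#include <memory>
#include <string>
#include <vector>

// Positioning style reported by a device plug-in (see the hardware section).
enum class Positioning { RelativeDisplacement, AbsoluteCoordinates };

// One reading delivered to the game core. For relative devices only the
// direction and time fields are meaningful.
struct MotionSample {
    Positioning mode;
    float sc[3];      // starting coordinates (absolute devices only)
    float ec[3];      // ending coordinates   (absolute devices only)
    float dir[3];     // movement direction
    float time;       // movement time in seconds
};

// Interface that every device plug-in implements; it hides the hardware details.
class DevicePlugin {
public:
    virtual ~DevicePlugin() = default;
    virtual std::string name() const = 0;
    virtual bool poll(MotionSample& out) = 0;  // true if a new movement is available
};

// The MII keeps registered plug-ins and hands samples to the game core by device ID.
class MII {
public:
    int registerPlugin(std::unique_ptr<DevicePlugin> plugin) {
        plugins_.push_back(std::move(plugin));
        return static_cast<int>(plugins_.size()) - 1;   // device ID assigned in sequence
    }
    bool read(int deviceId, MotionSample& out) {
        if (deviceId < 0 || deviceId >= static_cast<int>(plugins_.size())) return false;
        return plugins_[deviceId]->poll(out);
    }
private:
    std::vector<std::unique_ptr<DevicePlugin>> plugins_;
};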

Plug-ins for three motion-sensing devices, the Wanda, MX Air, and Wii remote, were completed on the MII platform in this research, and three interactive games were implemented and tested. The results were consistent with the expected goals of functionality and stability. The development platform, system structure, and specifications are introduced as follows.

    Development Method

The specifications and implementation of the MII are now introduced. The MII mainly serves as the access interface between the motion-sensing equipment and the back-end programming engine. The MII acts as a loader; it does not involve complex logic computation or special hardware control, and is therefore based on a behavior graph. Figure 6 shows the behavior graph of the MII. At the top of the BB is the input parameter that determines the motion-sensing device ID, which is assigned by the MII in sequence when loading the device plug-ins. After displacement information is obtained from the given motion-sensing device, the type of displacement information, namely relative or absolute coordinates, is determined according to the device category. The values below the MII behavior graph are the output parameters: the positioning method (relative displacement/absolute coordinates), starting coordinates, ending coordinates, movement direction, and movement time. If the location is obtained by relative displacement, then only two parameters are output, namely the movement direction and movement time. If the location is determined by absolute coordinates, then all parameters carry data. Adding a motion-sensing device involves only adding one output connection point to the Switch On Parameter BB, connecting the corresponding device plug-in to that output connection point, and finally routing the device plug-in parameters to the MII parameter outputs.

A corresponding plug-in is required for the MII to support a motion-sensing device. This research adopts the Wanda, MX Air, and Wii remote as examples to explain the development of device plug-ins. The Wanda is a built-in device in Virtools, belonging to the VR library. The MX Air is also a built-in device supported by Virtools, belonging to the controller library, and has an access method similar to that of a mouse. Device plug-ins for the Wanda and MX Air are relatively easy to develop; this research adopts behavior graphs for packaging, and VSL to calculate and transform the displacement information. Development for devices not supported by Virtools, such as the Wii remote, is performed with the SDK. This research adopts the function set of cWiiMote [http://thekrf.com/projects/wii] to connect with the Wii remote. Although the SDK is developed under the VC++ environment, Virtools has its own development framework; therefore, cWiiMote was rewritten to satisfy the requirements of the Virtools SDK.

Figure 4 The relationship between motion-sensing parameters and scene interactive parameters.

Figure 5 MII framework.

    COURSE EVALUATIONS

To examine the usability and acceptability of the proposed system, this research applies the MII to the Project Research course. The following are three case studies designed by three teams. The three teams had never used Virtools before. They finished their interactive game projects in three semesters, and the National Science Council, Taiwan, approved two of them as college student research projects. The three case studies show that the MII framework is effective for developing motion-sensing games.

In addition to developing the MII development platform, this work considers two game types as examples to explain the development of motion-sensing games on the MII. The first game is a sports game (SPT), as shown in Figure 7. The game was developed by two students in 2007. The game, named Struck Out, requires the player to bat the incoming balls to hit the nine number boards in front. The player who knocks out all the number boards with the fewest balls obtains the highest score. The control of the bat is connected with the motion-sensing device. The game mainly tests the timing and direction of the player's bat swings.

Many aspects of the game involve the detection of object collisions, such as collisions between the bat and the ball, and between the ball and the number boards. The Physicalize BB of the physics library in Virtools is adopted for the collision detection. The Physicalize BB is very simple to use: it mainly assigns physical features such as weight and elasticity to the 3D models in the game scenes, and detects collisions between two virtual objects. Figure 8 shows the schematic for processing the ball when it is hit by the bat. The left part represents the MII obtaining the motion-sensing parameters and calculating the swing force and direction of the bat. The right part transmits the force data to the ball to strike it out, and contains the Physicalize BB, which provides the ball's physical features.
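The concrete Physicalize settings are not reproduced in the paper; purely as an illustration of the left-to-right flow in Figure 8, the sketch below turns a swing direction and velocity from the MII into an impulse applied to the ball, using invented types rather than the Virtools physics API.

// Minimal stand-ins for a physicalized ball; not the Virtools physics API.
struct Vec3 { float x, y, z; };

struct Ball {
    Vec3 velocity{0, 0, 0};
    float mass = 0.15f;                       // kg, roughly a baseball

    // Apply an impulse (force integrated over the contact time).
    void applyImpulse(const Vec3& impulse) {
        velocity.x += impulse.x / mass;
        velocity.y += impulse.y / mass;
        velocity.z += impulse.z / mass;
    }
};

// Turn the MII outputs for a bat swing into an impulse on the ball: the swing
// direction gives the impulse direction, and the swing velocity v scales its
// magnitude through an arbitrary gain chosen here only for the illustration.
void strikeBall(Ball& ball, const Vec3& swingDir, float swingVelocity) {
    const float gain = 0.02f;                 // tuning constant, set by playtesting
    Vec3 impulse{ swingDir.x * swingVelocity * gain,
                  swingDir.y * swingVelocity * gain,
                  swingDir.z * swingVelocity * gain };
    ball.applyImpulse(impulse);
}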

The second game is a first-person shooting game (FPS), as shown in Figure 9. The game was developed by three students in 2008. In the game, named Dungeon Keeper, the player has to use weapons to knock down incoming monsters; knocking down more monsters leads to a higher score. The weapon control is connected with the motion-sensing devices, and different gestures trigger different types of attacks. The game mainly tests the timing and reaction of the player in choosing different attacks.

To increase the fun of the game, it is designed to trigger different attacks according to the different directions and tracks in which the user swings the motion-sensing device. Some attacks are forceful but can be used against only one monster; other attacks can be used against a group of enemies, but require preparation time to gather power.
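The exact gesture rules of Dungeon Keeper are not given in the paper; the C++ sketch below is only an invented example of the kind of mapping described, classifying a swing by its dominant direction into one of two attack types.

#include <cmath>

struct Vec3 { float x, y, z; };

enum class Attack { SingleTargetStrike, AreaBlast };

// Classify a swing by its dominant axis: a mostly vertical chop triggers a
// forceful single-target strike, while a mostly horizontal sweep triggers a
// weaker area attack. The thresholds and attack names are invented for this
// illustration; a real game would tune its own gesture rules.
Attack classifySwing(const Vec3& dir) {
    if (std::fabs(dir.y) > std::fabs(dir.x) && std::fabs(dir.y) > std::fabs(dir.z))
        return Attack::SingleTargetStrike;   // vertical chop
    return Attack::AreaBlast;                // horizontal sweep
}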

The third game is also a sports game (SPT), as shown in Figure 10. The game was developed by three students in 2009. The game, named Boxing.Net, requires players to wield the Wii remote to punch their opponent. In this game, networking is a new function that allows two players to play against each other. The network protocol uses TCP/IP, which can transmit data over the Internet, thus providing a game environment that is not limited by distance.

To transmit data through the network, this research designs a BB named TCP_Socket. Virtools provides a multi-user library supporting many high-level BBs, such as distributed objects, message management, session management, and user management. Using these BBs, developing a multi-user game is very easy, but the library is expensive. With Visual Studio 2003, this research uses the SDK and sockets to design the client BB, and the game server is written in Java. The server and clients connect using the TCP/IP protocol. In Figure 11, the client schematic manages the network connection and transmission. The BB TCP_Socket has three behavior inputs (On/SendIn/Off), four behavior outputs (Out/SendOut/DataIn/Error), four parameter inputs (IP/Port/SendProtocol/SendMessage), and four parameter outputs (SenderID/ReceiveProtocol/ReceiveMessage/ErrorMessage).

Figure 6 Expanded behavior graph of MII.

Figure 7 The screen of Struck Out.

First, the parameter inputs IP and Port are assigned in order to connect to the game server; the behavior inputs then activate or deactivate the BB. Through the behavior input SendIn, the BB sends the data in the parameter inputs SendProtocol and SendMessage to the other player via the game server. When the BB receives data from the game server, the behavior output DataIn activates the following BB, Switch On Parameter, which then reads the parameter outputs SenderID, ReceiveProtocol, and ReceiveMessage to handle the received network data.
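The internals of TCP_Socket are not listed in the paper. The C++ sketch below uses POSIX-style socket calls (the original BB was built with the Winsock equivalent under Visual Studio 2003) to show the kind of client logic such a BB wraps: connect to the server by IP and port, send a protocol tag and message, and read whatever the server forwards. The newline-delimited wire format is an assumption made only for this illustration.

#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <string>

// Connect to the game server; corresponds to the IP/Port parameter inputs.
int connectToServer(const std::string& ip, unsigned short port) {
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    if (fd < 0) return -1;
    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_port = htons(port);
    inet_pton(AF_INET, ip.c_str(), &addr.sin_addr);
    if (connect(fd, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) < 0) {
        close(fd);
        return -1;                 // would drive the Error behavior output
    }
    return fd;
}

// "SendIn": send a protocol tag and message to the other player via the server.
bool sendMessage(int fd, const std::string& protocol, const std::string& message) {
    std::string line = protocol + " " + message + "\n";   // assumed wire format
    return send(fd, line.c_str(), line.size(), 0) == static_cast<ssize_t>(line.size());
}

// "DataIn": read forwarded data; a real BB would split it into the
// SenderID / ReceiveProtocol / ReceiveMessage parameter outputs.
std::string receiveMessage(int fd) {
    char buf[512];
    ssize_t n = recv(fd, buf, sizeof(buf) - 1, 0);
    if (n <= 0) return "";
    buf[n] = '\0';
    return std::string(buf);
}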

The proposed system provides a straightforward and simple method of designing interactive games. After using the proposed system, eight participants were interviewed. The interview was designed to collect the opinions of the students. From the results of the interview, the main comments on disadvantages are discussed below.

The design method of Virtools is different from other familiar programming languages, such as C and Java, because it is represented as flowcharts. The participating students had to spend additional time learning the new design method. However, the students also mentioned that flowcharts are familiar tools in programming and are easier to understand than program code.

The purpose of the MII framework is to reduce development processes and hide hardware characteristics, so that students can focus on system function and creative design. The output parameters of the MII are the starting coordinates, ending coordinates, movement direction, and movement time. When designing an interactive game, some advanced functions are required, such as hand gestures or full-body motion; the students have to design these functions by themselves. In future work, the MII will provide some predefined hand gestures to solve this problem.

    CONCLUSIONS AND FUTURE WORKS

PC-based home entertainment is the future trend. Powerful PCs support and integrate many multimedia devices, and home theater personal computers (HTPCs) have become popular devices in the modern living room. Family gaming consoles, such as the XBOX 360, PS3, and Wii, require certification fees to design and sell gaming software. For departments of information technology in universities, it is necessary to teach and train students to understand and use these technologies. This research proposed adding extra functions to the HTPC with lower cost and less difficulty than dedicated family gaming consoles. After the training of the proposed course design, students have the ability to use simple tools and a PC to design interactive games.

The feasibility and stability have been verified using three examples of two game types. After testing, all the games executed correctly and met their requirements, confirming that the proposed framework helps game designers develop motion-sensing games quickly.

Figure 8 The schematic of the ball being hit by the bat.

Figure 9 The screen of Dungeon Keeper.

Figure 10 The screen of Boxing.Net.

Future work lies in two directions. The first direction is to support additional motion-sensing devices and functions. This research has developed three device plug-ins covering the different types of location methods. The next phase will be to support new motion-sensing devices such as the EEE Stick newly introduced by ASUS [http://www.asus.com] and CyWee newly introduced by the Industrial Technology Research Institute of Taiwan [http://www.itri.org.tw]. In addition, the advanced function of hand gestures will be included to provide more interactive methods. The second direction is to widen the application areas of the MII. This research adopts the MII to develop motion-sensing games, controlling the game by hand movements. Future work will track hand movements to enable users to control games with many hand gestures. This feature could also be adopted in other applications, such as controlling the human-machine interfaces of household appliances.

    ACKNOWLEDGMENTS

The author would like to thank the National Science Council of the Republic of China, Taiwan, for financially supporting this research under Contract Nos. NSC 96-2815-C-426-005-E and NSC 97-2815-C-426-002-E.

    REFERENCES

[1] M. C. Reigeluth, Instructional-design theories and models: A new paradigm of instructional theory, Lawrence Erlbaum Associates, Mahwah, NJ, 1999.

[2] B. S. Jong, T. W. Lin, Y. L. Wu, and T. Y. Chan, A web dual mode virtual laboratory supporting cooperative learning, The 18th International Conference on Advanced Information Networking and Applications, Fukuoka, Japan, 2004, pp 642-647.

[3] J. P. Gerval, D. M. Popovici, and J. Tisseau, Educative distributed virtual environments for children, Proceedings of the IEEE International Conference on Cyberworlds, 2003, pp 382-387.

[4] K. S. Song and W. Y. Lee, A virtual reality application for geometry classes, J Comput Assist Learn 18 (2002), 149-156.

[5] B. S. Jong, Y. L. Wu, T. Y. Chan, and T. W. Lin, Applying the adaptive learning material producing strategy to group learning, The 2006 International Conference on E-learning and Games, Lecture Notes in Computer Science (LNCS), 3942, 2006, pp 54-63.

[6] S. Terrell and P. Rendulic, Using computer-managed instructional software to increase motivation and achievement in elementary school children, J Res Comput Educ 26 (1996), 403-414.

[7] E. D. Gagne, The cognitive psychology of school learning, Little, Brown and Company, Boston, MA, 1985.

[8] A. Johnson, T. Moher, S. Ohlsson, and M. Gillingham, The Round Earth project: collaborative VR for conceptual learning, IEEE Comput Graph Appl 19 (1999), 60-69.

[9] L. Benetazzo, M. Bertocco, F. Ferraris, A. Ferrero, M. Parvis, and V. Piuri, A web-based distributed virtual educational laboratory, IEEE Trans Instrum Meas 49 (2000), 349-356.

[10] B. Raymond and S. Nathan, Web-based virtual engineering laboratory for collaborative experimentation on a hybrid electric vehicle starter/alternator, IEEE Trans Ind Appl 36 (2000), 1143-1150.

[11] R. M. Taylor, T. C. Hudson, A. Seeger, H. Weber, J. Juliano, and A. T. Helser, VRPN: A device-independent, network-transparent VR peripheral system, Proceedings of the ACM Symposium on Virtual Reality Software & Technology, Banff Centre, Canada, 2001, pp 55-61.

[12] J. Jacobson and M. Lewis, Game engine virtual reality with CaveUT, IEEE Comput 38 (2005), 79-82.

    BIOGRAPHY

YuLung Wu received the PhD degree in computer engineering from Chung-Yuan Christian University in 2005. He is an assistant professor in the Department of Information and Communication, Kun Shan University, Taiwan. His research interests primarily lie in learning technologies and interactive design.

    Figure 11 The schematic of network connection.

