


0018-9162/02/$17.00 © 2002 IEEE · 44 Computer

Developing a Generic Augmented-Reality Interface

The Tiles system seamlessly blends virtual and physical objects to create a work space that combines the power and flexibility of computing environments with the comfort and familiarity of the traditional workplace.

Ivan Poupyrev, Sony CSL
Desney S. Tan, Carnegie Mellon University
Mark Billinghurst, University of Washington
Hirokazu Kato, Hiroshima City University
Holger Regenbrecht, DaimlerChrysler AG
Nobuji Tetsutani, ATR MIS Labs

COVER FEATURE

An augmented-reality (AR) interface dynamically superimposes interactive computer graphics images onto objects in the real world.1 Although the technology has come a long way from rendering simple wireframes in the 1960s,2 AR interface design and interaction space development remain largely unexplored. Researchers and developers have made great advances in display and tracking technologies, but interaction with AR environments has been largely limited to passive viewing or simple browsing of virtual information registered to the real world.3

To overcome these limitations, we seek to design an AR interface that provides users with interactivity so rich it would merge the physical space in which we live and work with the virtual space in which we store and interact with digital information. In this single augmented space, computer-generated entities would become first-class citizens of the physical environment. We would use these entities just as we use physical objects, selecting and manipulating them with our hands instead of with a special-purpose device such as a mouse or joystick. Interaction would then be intuitive and seamless because we would use the same tools to work with digital and real objects.

Tiles is an AR interface that moves one step closer to this vision. It allows effective spatial composition, layout, and arrangement of digital objects in the physical environment. The system facilitates seamless two-handed, three-dimensional interaction with both virtual and physical objects, without requiring any special-purpose input devices. Unlike some popular AR interfaces that constrain virtual objects to a 2D tabletop surface,4 Tiles allows full 3D spatial interaction with virtual objects anywhere in the physical workspace. A tangible interface, it lets users work with combinations of digital and conventional tools so that, for example, they can use sticky notes to annotate both physical and virtual objects. Because our approach combines tangible interaction with a 3D AR display, we refer to it as tangible augmented reality.

We do not suggest that tangible AR interfaces are perfect for all conceivable applications. Therefore, although our interface techniques can be applied broadly, we ground our design in a real-world application: the rapid prototyping and collaborative evaluation of aircraft instrument panels.

INTERFACE DESIGN SPACE DICHOTOMY

As the "Short History of Augmented Reality Interface Design" sidebar indicates, the AR interface design space can be divided along two orthogonal approaches.

A 3D AR interface provides spatially seamless augmented space where the user, wearing a head-mounted display, can effectively interact with both 2D and 3D virtual objects anywhere in the working environment. To interact with virtual content, however, the user must rely on special-purpose input devices that would not normally be used in real-world interactions. Thus, interaction discontinuities break the workflow, forcing the user to switch between virtual and real interaction modes.

Tangible interfaces, on the other hand, use multiple physical objects tracked on the augmented surface as physical handles or containers for interacting with virtual objects projected onto the surface. Tangible interfaces do not require any special-purpose input devices, and thus they provide intuitive and seamless interaction with digital and physical objects. However, spatial discontinuities do break the interaction flow because the interface is localized on augmented surfaces and cannot be extended beyond them. Further, tangible interfaces offer limited support for interacting with 3D virtual objects.

March 2002 45

Short History of Augmented Reality Interface Design

In 1965, Ivan Sutherland built the first see-through head-mounted display and used it to show a simple wireframe cube overlaid on the real world, creating the first augmented-reality interface. Developers of early AR interfaces, who followed Sutherland, similarly designed them mostly to view 3D virtual models in real-world contexts for applications such as medicine,1 machine maintenance,2 or personal information systems.3 Although these interfaces provided an intuitive method for viewing 3D data, they typically offered little support for high-level interaction, such as creating or modifying augmented-reality content.

3D AR Interfaces

Recently, researchers have begun to address this deficiency. Kiyoshi Kiyokawa4 uses a magnetic tracker, while the Studierstube5 project uses a tracked pen and tablet to select and modify augmented-reality objects. In a 3D AR interface, a head-mounted display lets the user interact with virtual content through different input devices. More traditional input techniques, such as handheld mice6 and intelligent agents,7 have also been investigated.

However, in all these cases the user must work with special-purpose input devices, separate from tools used to interact with the real world. This limitation effectively results in two different interfaces: one for the physical workspace and another for the virtual one. Consequently, interaction discontinuities, or seams, disrupt the natural workflow, forcing the user to switch between virtual and real operation modes.

Tangible Interfaces

For more than a decade, researchers have been investigating an alternative approach: computer interfaces based on physical objects. The Digital Desk project8 used computer-vision techniques to track the position of paper documents and the user's hands on an augmented table. The user could seamlessly arrange and annotate both real paper and virtual documents using the same physical tools: a pen and a finger. Graspable9 and tangible user interfaces further explore the connection between virtual objects and the physical properties of input devices, using simple wooden blocks to manipulate virtual objects projected on a table's surface, for example.

Most importantly, tangible interfaces allow for seamless interaction, because a single physical device represents each interface function or object. Therefore, users can access interface functions and use traditional tools in the same manner, through manipulation of physical objects.

Information display in tangible interfaces can be a challenge, however. Changing an object's physical properties dynamically is difficult, and these interfaces usually project virtual objects onto 2D surfaces.10 The users, therefore, cannot pick virtual objects off the augmented surface and manipulate them in 3D space as they would a real object; the system localizes the interaction to an augmented surface and cannot extend beyond it. Because the tangible interface user cannot seamlessly interact with virtual objects anywhere in space (for example, by moving a virtual object between augmented and nonaugmented workspaces), the tangible interface introduces a spatial seam into the interaction.

References

1. M. Bajura, H. Fuchs, and R. Ohbuchi, "Merging Virtual Objects with the Real World: Seeing Ultrasound Imagery within the Patient," Proc. Siggraph 92, ACM Press, New York, 1992, pp. 203-210.
2. S. Feiner, B. MacIntyre, and D. Seligmann, "Knowledge-Based Augmented Reality," Comm. ACM, vol. 36, no. 7, 1993, pp. 53-62.
3. J. Rekimoto and K. Nagao, "The World through the Computer: Computer Augmented Interaction with Real World Environments," Proc. UIST 95, ACM Press, New York, 1995, pp. 29-36.
4. K. Kiyokawa, H. Takemura, and N. Yokoya, "Seamless Design for 3D Object Creation," IEEE MultiMedia, vol. 7, no. 1, 2000, pp. 22-33.
5. D. Schmalstieg, A. Fuhrmann, and G. Hesina, "Bridging Multiple User Interface Dimensions with Augmented Reality Systems," Proc. ISAR 2000, IEEE CS Press, Los Alamitos, Calif., 2000, pp. 20-29.
6. T. Hollerer et al., "Exploring MARS: Developing Indoor and Outdoor User Interfaces to a Mobile Augmented Reality System," Computers & Graphics, vol. 23, 1999, pp. 779-785.
7. M. Anabuki et al., "Welbo: An Embodied Conversational Agent Living in Mixed Reality Spaces," Proc. CHI 2000, ACM Press, New York, 2000, pp. 10-11.
8. P. Wellner, "Interaction with Paper on the Digital Desk," Comm. ACM, vol. 36, no. 7, 1993, pp. 87-96.
9. G. Fitzmaurice and W. Buxton, "An Empirical Evaluation of Graspable User Interfaces: Towards Specialized, Space-Multiplexed Input," Proc. CHI 97, ACM Press, New York, 1997, pp. 43-50.
10. J. Rekimoto and M. Saitoh, "Augmented Surfaces: A Spatially Continuous Work Space for Hybrid Computing Environments," Proc. CHI 99, ACM Press, New York, 1999, pp. 378-385.



We believe that these opposing approaches also complement each other. In Tiles, we design interfaces that bridge 3D AR and tangible interactions and thus produce a spatially and interactively seamless augmented workspace.

DESIGNING TILES

In Tiles, the user wears a lightweight head-mounted display with a small camera attached, both of which are connected to a computer. The video capture subsystem uses the camera's output to overlay virtual images onto the video in real time as described in the "Tracking and Registration in Tiles" sidebar. The system then shows the resulting augmented view of the real world on the head-mounted display so that the user sees virtual objects embedded in the physical workspace, as Figures 1 and 2 show. Computer-vision tracking techniques determine the 3D position and orientation of marked real objects so that virtual models can be exactly overlaid on them.5 By manipulating these objects, the user can control the virtual objects associated with them without using any additional input devices.
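The overlay step can be pictured as applying the marker's tracked pose to the virtual model's points and projecting the result into the camera image. The following is a minimal pure-Python sketch of that idea; the 4 × 4 pose matrix, focal length, and helper names are our illustration, not ARToolKit's actual API:

```python
import math

def mat_vec(m, v):
    """Multiply a 4x4 row-major matrix by a 4-vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def project(point, pose, focal):
    """Transform a model-space point by the marker's camera-space pose,
    then project it with a simple pinhole model."""
    x, y, z, _ = mat_vec(pose, [point[0], point[1], point[2], 1.0])
    return (focal * x / z, focal * y / z)

# Hypothetical pose: a marker 0.5 m in front of the camera,
# rotated 90 degrees about the camera's vertical (y) axis.
c, s = math.cos(math.pi / 2), math.sin(math.pi / 2)
pose = [[  c, 0.0,   s, 0.0],
        [0.0, 1.0, 0.0, 0.0],
        [ -s, 0.0,   c, 0.5],
        [0.0, 0.0, 0.0, 1.0]]
```

Rendering the virtual model at each frame with the freshly estimated pose is what keeps the graphics registered to the moving card.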

Design requirements

Although we developed the Tiles system specifically for rapid prototyping and evaluation of aircraft instrument panels, we believe that this task's requirements have broad application to many common AR interface designs.

Tracking and Registration in Tiles

An augmented-reality system's fundamental elements include techniques for tracking user position and viewpoint direction, registering virtual objects relative to the physical environment, then rendering and presenting them to the user. We implemented the Tiles system with the ARToolKit software, an open source library for developing computer-vision-based AR applications.

To create the physical tiles, we mark paper cards measuring 15 cm × 15 cm with simple square patterns consisting of a thick black border and unique symbols in the middle. We can use any symbol for identification as long as each symbol is asymmetrical enough to distinguish between the square border's four possible orientations.

The user wears a Sony Glasstron PLMS700 headset. Lightweight and comfortable, the headset provides an 800 × 600-pixel VGA image. A miniature NTSC Toshiba camera with a wide-angle lens attaches to the headset. The system captures the camera's video stream at 640 × 240-pixel resolution to avoid interlacing problems. The image is then scaled back to 640 × 480 pixels using a line-doubling technique.

By tracking rectangular markers of known size, the system can find the relative camera position and orientation in real time and can then correctly render virtual objects on the physical cards, as Figure A shows. Although the wide-angle lens distorts the video image, our tracking techniques are robust enough to correctly track patterns without losing performance. We implemented our current system on a 700-MHz Pentium III PC running Linux, which allows updates at 30 frames per second. For more information on the ARToolKit library, see http://www.hitl.washington.edu/artoolkit/.

Figure A. The three-step process of mapping virtual objects onto physical tiles so that the user can view them with a head-mounted display (HMD).
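The sidebar's requirement that each symbol be "asymmetrical enough" can be checked mechanically: a pattern identifies the card's orientation only if its four 90-degree rotations are all distinct. A small sketch of that test on a binary pattern grid (the grid representation and function names are ours, not ARToolKit's):

```python
def rotations(grid):
    """Yield the four 90-degree rotations of a square binary pattern."""
    g = [list(row) for row in grid]
    for _ in range(4):
        yield tuple(tuple(row) for row in g)
        g = [list(row) for row in zip(*g[::-1])]  # rotate clockwise

def is_usable_marker(grid):
    """A symbol disambiguates the border's four orientations only if
    no two of its rotations look the same."""
    return len(set(rotations(grid))) == 4

# A lopsided pattern works; a 180-degree-symmetric one does not.
asymmetric = [[1, 0],
              [0, 0]]
symmetric  = [[1, 0],
              [0, 1]]
```

Real marker symbols are finer-grained than a 2 × 2 grid, but the uniqueness test is the same at any resolution.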

Aircraft instrument panel design requires the collaborative efforts of engineers, human factors specialists, electronics designers, and pilots. Designers and engineers always look for new technologies that can reduce the cost of developing the instrument panels without compromising design quality. Ideally, they would like to evaluate instrument prototypes relative to existing instrument panels, without physically rebuilding them. This inherently collaborative design activity involves heavy use of existing physical plans, documents, and tools.

Using observations of current instrument panel design, DASA/EADS Airbus and DaimlerChrysler engineers produced a set of interface requirements.6 They envisioned an AR interface that would let developers collaboratively outline and lay out a set of virtual aircraft instruments on a board that simulates an airplane cockpit. Designers could easily add and remove virtual instruments from the board using an instrument catalog. After placing the instruments on the board, they could evaluate and rearrange the instruments' position. The interface should also allow use of conventional tools, such as whiteboard markers, with physical schemes and documents so that participants could document problems and solutions.

Basic concepts

We designed the Tiles interface around a set of simple interface principles that produce a generic and consistent AR interface.

A tile is a small cardboard card with a marker.It serves as a physical handle for interacting withvirtual objects. Conceptually similar to icons in agraphical user interface, a tile acts as a tangibleinterface control. Users physically manipulate thecorresponding tiles to interact with virtual objectsjust as they would with real objects. The resultingseamless interface requires no additional inputdevices to interact with virtual objects.

Although the tiles resemble physical icons, or phicons,3 they exhibit important differences. Phicons propose a close coupling between physical and virtual properties so that their shape and appearance mirror their corresponding virtual object or functionality. The Tiles interface decouples the physical properties of interface controls from the data to create universal and generic data containers that can hold any digital data or no data at all: The tile is a blank object that has no associated data until the user creates an association at runtime. Hence, techniques for performing basic operations, such as attaching data to tiles, remain the same for all tiles, resulting in a consistent and streamlined user interface.

We use two separate tile classes. Operation tiles define the interface's functionality and provide tools to perform operations on data tiles. The Tiles system always attaches animated 3D icons to operation tiles so that the user can identify them. Data tiles act as generic data containers. The user can associate or disassociate any virtual objects with data tiles at any time using operator tiles. The user physically manipulates tiles to invoke operations between them, such as controlling the proximity between tiles, their distance from the user, or their orientation. The augmented working space is spatially seamless: aside from cable length, the user has no restriction on interacting with the tiles.
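The two tile classes above might be modeled as follows. This is our own sketch of the described behavior (the paper does not publish its code), with the callable-based operation model and all names being our assumptions:

```python
class DataTile:
    """Generic container: a blank physical card that holds any virtual
    object (or none), associated at runtime."""
    def __init__(self, marker_id):
        self.marker_id = marker_id
        self.content = None  # no associated data until the user attaches some

class OperatorTile:
    """Defines one interface operation, applied when brought near a data tile."""
    def __init__(self, marker_id, operation):
        self.marker_id = marker_id
        self.operation = operation  # callable acting on a DataTile

    def apply_to(self, data_tile):
        self.operation(data_tile)

# Hypothetical wiring: a trash-can tile that empties whatever it touches.
def delete(tile):
    tile.content = None

trash = OperatorTile("trash", delete)
altimeter_tile = DataTile("data-01")
altimeter_tile.content = "altimeter-model"  # user copies an instrument onto it
trash.apply_to(altimeter_tile)              # user brings the trash tile near it
```

Because every operation is a function of a data tile, adding a new operator card means adding one callable, which matches the article's claim that the interaction techniques stay uniform across tiles.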


Figure 1. Users collaboratively arrange tangible data containers (tiles) on the whiteboard and use traditional tools to add notes and annotations.

Figure 2. The user, wearing a lightweight head-mounted display with an attached camera, can see real objects and the virtual images registered on tiles.



Tiles interface

The Tiles interface consists of a metal whiteboard, a book, and two stacks of magnetic tiles that each measure approximately 15 cm × 15 cm. Sitting in front of the whiteboard, the user wears a lightweight, high-resolution Sony Glasstron head-mounted display with a video camera attached, as Figure 1 shows.

The various tangible interface elements serve different purposes. The whiteboard provides the workspace where users can lay out virtual aircraft instruments. The book serves as a catalog, or menu object, that shows a different virtual instrument model on each page. One tile stack stores blank data containers, which show no content until users copy virtual objects onto them. The remaining tiles function as operator tiles that perform basic operations on the data tiles. Each operation has a unique tile associated with it. Currently supported operations include deletion, copying, and a help function. Each operation tile bears a different 3D virtual icon that shows its function and differentiates it from the data tiles, as Figure 3 shows.

Invoking operations. All tiles can be freely manipulated in space and arranged on the whiteboard. The user simply picks up a tile, examines its contents, and places it on the whiteboard. The user can invoke operations between two tiles by moving them next to each other.

For example, to copy an instrument to a data tile, the user first finds the desired virtual instrument in the menu book, then places any empty data tile next to it. After a one-second delay, a copy of the instrument smoothly slides from the menu page to the tile and can be arranged on the whiteboard. Similarly, the user can remove data from a tile by moving the trash can tile close to the data tile, which removes the instrument from it, as Figure 4 shows.
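The proximity rule can be sketched as a small per-frame state machine: an operation fires once two tracked tiles have stayed within a threshold distance for the one-second delay. The threshold value and class name here are our assumptions, not values from the paper:

```python
class ProximityTrigger:
    """Fires once two tiles stay within `radius` of each other for
    `dwell` seconds (the one-second delay described above)."""
    def __init__(self, radius=0.05, dwell=1.0):
        self.radius = radius      # metres; an assumed threshold
        self.dwell = dwell        # seconds
        self._near_since = None

    def update(self, pos_a, pos_b, now):
        """Call once per tracked frame with both tile positions (x, y, z)."""
        dist = sum((a - b) ** 2 for a, b in zip(pos_a, pos_b)) ** 0.5
        if dist > self.radius:
            self._near_since = None  # tiles separated; reset the timer
            return False
        if self._near_since is None:
            self._near_since = now   # tiles just came together
        return now - self._near_since >= self.dwell
```

The dwell delay is what prevents accidental copies or deletions while the user merely carries tiles past one another.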

Using the same technique, we can implement copy and paste operations using a copy operation tile. The user can copy data from any data tile to the clipboard, then from the clipboard to any number of empty data tiles by moving empty tiles next to the virtual clipboard that has data in it, as Figure 5 shows. The clipboard's current contents can always be seen on the virtual clipboard icon. Users can display as many clipboards as they need; the current implementation has two independent clipboards.
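The clipboard behaves as an operator tile whose own content is visible and copyable. A toy model of that behavior (the class names and the copy-versus-paste rule on contact are our reading of the description above):

```python
class DataTile:
    """Blank card that can hold one virtual instrument, or nothing."""
    def __init__(self):
        self.content = None

class ClipboardTile:
    """Holds a copy of whatever full data tile touches it, and pastes
    onto any empty data tile brought near it."""
    def __init__(self):
        self.content = None  # always rendered on the virtual clipboard icon

    def touch(self, data_tile):
        """Invoked when a data tile dwells next to the clipboard."""
        if data_tile.content is None and self.content is not None:
            data_tile.content = self.content   # paste to an empty tile
        elif data_tile.content is not None:
            self.content = data_tile.content   # copy from a full tile
```

Because pasting never clears the clipboard, one copied instrument can be stamped onto any number of empty tiles, matching the behavior Figure 5 illustrates.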

Getting help. The Tiles interface provides a help system that lets the user request assistance without shifting focus from the main task. This approach is more suitable for AR interfaces than traditional desktop help systems, which either distract users with a constant barrage of help messages or interrupt their work by making them search explicitly for help.

The Tiles system provides two help techniques, shown in Figure 6. With Tangible Bubble Help, simply placing the help tile beside the tile the user requires help with brings up a text bubble next to the help icon, as Figure 6a shows. In some cases, however, users only need short reminders, or tips, about a particular tile's functionality. Alternatively, the Tangible ToolTips technique triggers the display of a short text description associated with a tile when the user moves the tile within arm's reach and tilts it more than 30 degrees away from the body, as Figure 6b shows.
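The Tangible ToolTips trigger combines a distance test with a tilt test. A hedged sketch: the article specifies only the 30-degree tilt, so the arm's-reach distance and all names below are our assumptions:

```python
import math

ARM_REACH_M = 0.6          # assumed arm's-reach distance from the display
TILT_THRESHOLD_DEG = 30.0  # tilt away from the body, per the article

def show_tooltip(tile_distance_m, tilt_deg):
    """Tangible ToolTips rule: the tile is within arm's reach AND
    tilted more than 30 degrees."""
    return tile_distance_m <= ARM_REACH_M and tilt_deg > TILT_THRESHOLD_DEG

def tilt_from_normal(normal):
    """Angle, in degrees, between the tile's surface normal (x, y, z) and
    the viewing axis (0, 0, 1); a stand-in for the real head-pose math."""
    nx, ny, nz = normal
    mag = math.sqrt(nx * nx + ny * ny + nz * nz)
    return math.degrees(math.acos(nz / mag))
```

Requiring both conditions keeps tooltips from popping up while tiles merely sit on the whiteboard; the tilt is a deliberate "show me" gesture.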

Figure 4. The user cleans a data tile by moving the trash can operator tile next to it.

Figure 3. Operation tiles. (a) The printed design of the physical tiles that represent the delete, copy, and help operations, and (b) the virtual icons that represent the same three operations in the augmented-reality interface.

Figure 5. Copying data from the clipboard to an empty data tile. The user moves the tile close to the virtual clipboard and, after a one-second delay, the virtual instrument slides smoothly onto the data tile.


Mixing physical and virtual tools. The Tiles interface lets users seamlessly combine conventional physical and virtual tools. For example, the user can physically annotate a virtual aircraft instrument using a standard whiteboard pen or sticky note, as Figure 7 shows.

Multiple users. We designed Tiles with collaboration in mind. Thus, the system lets several users interact in the same augmented workspace. All users can be equipped with head-mounted displays and can directly interact with virtual objects. Alternatively, users who do not wear head-mounted displays can collaborate with immersed users via an additional monitor that presents the augmented-reality view. Because all interface components consist of simple physical objects, both the nonimmersed and immersed user can perform the same authoring tasks.

TILES IN OTHER APPLICATIONS

Our initial user observations showed that developers of tangible AR interfaces must focus on both the interface's physical design and the virtual icons' computer graphics design. Physical component designs can convey additional interface semantics. For example, the physical cards can be designed to snap together like pieces in a jigsaw puzzle, resulting in different functionality profiles depending on their physical configuration.

The interface model and interaction techniques introduced in Tiles can be extended easily to other applications that require AR interfaces. Object modification techniques, for example, can be introduced into Tiles by developing additional operator cards that would let the user dynamically modify objects through scaling, color changes, hand gestures, and so on. The interaction techniques we present here will remain applicable, however.

We found that the tight coupling of 3D input and display in a single interface component, the tile, lets users perform complex functions through essentially simple spatial manipulation and physical arrangement of these tangible interface components. Thus, Tiles provides an application-independent interface that could lead to the development of generic AR interface models based on tangible augmented-reality concepts.

Although developing additional interaction techniques would let users apply Tiles to many different application scenarios, in AR environments the user can already switch easily between the AR workspace and a traditional environment. Some tools and techniques better suit augmented reality, while others work best in traditional form. Therefore, we believe that development of AR interfaces should not focus on bringing every possible interaction tool and technique into the AR workspace. Instead, it should focus on balancing and distributing features between the AR interface and other interactive media so that they all can be used within a single seamless augmented workspace.

AR interfaces also offer an ad hoc, highly reconfigurable environment. Unlike traditional GUI and 3D VR interfaces, in which the designer determines most of the interface layout in advance, in Tiles users can freely place interface elements anywhere they want: on tables or whiteboards, in boxes and folders, arranged in stacks, or grouped together. The interface configuration and layout of its elements emerge spontaneously as the result of users' work activity, and evolve together with it. How the interface components should be designed for such environments, and whether these systems should be aware of the dynamic changes in their configuration, are important research questions.


Figure 6. Tangible help in Tiles. (a) To access Tangible Bubble Help, users place the help tile next to the tile they need assistance with, which causes textual annotations to appear within a bubble next to the tile. (b) For less detailed explanations, Tangible ToolTips displays an associated short text description when the user moves the tile closer and tilts it.

Figure 7. Physically annotating virtual objects in Tiles. Because the printed tiles offer a physical anchor for virtual objects, users can make notes adjacent to them using marking pens and sticky notes.



Acknowledgments

This work represents a joint research initiative carried out with support from DASA/EADS Airbus, DaimlerChrysler AG, and ATR International. We thank Keiko Nakao for designing the 3D models and animations used in Tiles.

References

1. R. Azuma, "A Survey of Augmented Reality," Presence: Teleoperators and Virtual Environments, vol. 6, no. 4, 1997, pp. 355-385.
2. I. Sutherland, "The Ultimate Display," Proc. Int'l Federation of Information Processing, Spartan Books, Washington, D.C., 1965, pp. 506-508.
3. H. Ishii and B. Ullmer, "Tangible Bits: Towards Seamless Interfaces between People, Bits and Atoms," Proc. Computer-Human Interaction (CHI 97), ACM Press, New York, 1997, pp. 234-241.
4. B. Ullmer and H. Ishii, "The MetaDesk: Models and Prototypes for Tangible User Interfaces," Proc. User Interface Software and Technology (UIST 97), ACM Press, New York, 1997, pp. 223-232.
5. H. Kato and M. Billinghurst, "Marker Tracking and HMD Calibration for a Video-Based Augmented Reality Conferencing System," Proc. 2nd Int'l Workshop Augmented Reality, IEEE CS Press, Los Alamitos, Calif., 1999, pp. 85-94.
6. I. Poupyrev et al., "Tiles: A Mixed Reality Authoring Interface," Proc. Interact 2001, IOS Press, Netherlands, 2001, pp. 334-341.

Ivan Poupyrev is an associate researcher in Sony Computer Science Labs' Interaction Lab, Tokyo, where he investigates the interaction implications of computer-augmented environments. Poupyrev received a PhD in computer science from Hiroshima University. Contact him at poup@csl.sony.co.jp.

Desney S. Tan is a PhD candidate at Carnegie Mellon University. His research interests include designing human-computer interfaces that augment human cognition, specifically to leverage existing psychology principles on human memory and spatial cognition in exploring multimodal, multidisplay information systems. He received a BS in computer engineering from the University of Notre Dame. Contact him at [email protected] or see http://www.cs.cmu.edu/~desney.

Mark Billinghurst is a researcher in the Human Interface Technology Laboratory at the University of Washington. His research interests include augmented and virtual reality, conversational computer interfaces, and wearable computers, with his most recent work centering around using augmented reality to enhance face-to-face and remote conferencing. Billinghurst received a PhD in electrical engineering from the University of Washington. Contact him at [email protected].

Hirokazu Kato is an associate professor at the Faculty of Information Sciences, Hiroshima City University. His research interests include augmented reality, computer vision, pattern recognition, and human-computer interaction. Kato received a PhD in engineering from Osaka University. Contact him at [email protected].

Holger Regenbrecht is a scientist at the DaimlerChrysler Research Center in Ulm, Germany. His research interests include interfaces for virtual and augmented environments, virtual-reality-aided design, perception of virtual reality, and AR/VR in the automotive and aerospace industry. Regenbrecht received a doctoral degree from the Bauhaus University Weimar, Germany. Contact him at [email protected].

Nobuji Tetsutani is the head of Department 3 in ATR Media Information Science Laboratories. His research interests include application development for high-speed networks. Tetsutani received a PhD in electrical engineering from Hokkaido University, Hokkaido, Japan. Contact him at [email protected].
