
Proceedings of the TMCE 2006, April 18–22, 2006, Ljubljana, Slovenia, Edited by Horváth and Duhovnik, 2006.

TREATISE OF TECHNOLOGIES FOR INTERACTIVE AUGMENTED PROTOTYPING

Jouke Verlinden Faculty of Industrial Design Engineering

Delft University of Technology The Netherlands

[email protected]

Imre Horváth Edwin Edelenbos

Faculty of Industrial Design Engineering Delft University of Technology

The Netherlands [email protected], [email protected]

ABSTRACT

The concept of augmented prototyping offers potential means to support the design process. Although a collection of such systems has been presented in the literature, an exhaustive discussion of the support scenarios and related design domains is lacking. This literature survey compares a number of existing augmented prototyping systems and discusses their applications. Most systems are limited to presentation or to modeling only a single aspect of an artifact. This reduces the impact of augmented prototyping on the design process. Second, this article presents an analysis of enabling technologies, including physical model making, input techniques, and output techniques. None of these is ideal; many restrict interactivity by obstructing the direct manipulation of the physical object and by requiring considerable set-up times. From this collection, a subset is selected that requires further study, specifically to support three industrial design engineering domains: information appliances, automotive, and furniture design.

KEYWORDS

Augmented Reality, prototyping, product modeling, industrial design engineering

1. INTRODUCTION

Although the majority of designers have proficient spatial reasoning capacities, we still recognize a need for physical modeling in the early phases of design. Augmented Reality technology provides an appealing solution that combines physical and virtual reality. This combination might eliminate some of the problems associated with an entirely virtual or an entirely physical approach. Several augmentation techniques can be positioned on Milgram and Kishino's reality-virtuality continuum [1994] to extend physical prototypes with digital imagery. On one extreme of this continuum tangible user interfaces can be found, while on the other there are immersive Virtual Reality systems such as head mounted displays. In between, a vast array of technologies exists, ranging from video-mixing AR to see-through systems. The application of such technologies to design has been coined Augmented Prototyping (AP) [Verlinden et al., 2003]. This paper focuses on existing AP systems and technologies.

In the literature, a number of Augmented Prototyping systems have been presented, but an overview of the application domains and possible support scenarios is lacking. Among others, our survey included the inspection of conference and journal articles of the ACM (Association for Computing Machinery) from the year 2000 to the present, specifically Computer-Human Interaction (CHI), Designing Augmented Reality Environments (DARE), User Interface Software and Technology (UIST), Intelligent User Interfaces (IUI), Designing Interactive Systems (DIS), and other related journal issues. Subsequent references were backtracked and included when relevant.

The implementation of AR systems is far more complex than that of traditional screen-based virtual approaches. The craftsmanship required to establish a well-working AP system is considerable; different underlying technologies can be selected for input and output, and each influences the overall quality. As the concept also incorporates physical models, several alternatives for model creation will be analyzed.

Specific focus of our treatise is the level of interactivity that augmented prototyping systems

Page 2: TREATISE OF TECHNOLOGIES FOR INTERACTIVE AUGMENTED PROTOTYPING › 22c9 › e1bbe235c223... · extreme of this continuum, tangible user interfaces can be found, while on the other

Jouke Verlinden, Imre Horváth, Edwin Edelenbos 2

provide. This relates to two subtopics: 1) the directness or human friendliness of the user interface, and 2) the capability to alter and evaluate a design in situ.

The findings of this study are reported in the following sections. After the specification of the objectives and problems of augmented prototyping, the collection of existing systems is presented and characterized. Next, technologies for input and output of AR systems are presented, and we briefly summarize the possibilities of physical model making, including Rapid Prototyping. All technologies are revisited to explore viable technology combinations in the discussion section. Finally, we end with some conclusions and recommendations for future work.

2. AUGMENTED PROTOTYPING SYSTEMS

2.1. Augmented Prototyping systems

The concept of Augmented Prototyping employs Augmented Reality technologies to support a design process. Many AP systems have been presented in the literature; they are briefly discussed below and summarized in Table 1. We will focus on the design application, objectives, and evaluation methods.

Geometric modeling

Some geometric modeling tools that originated from VR were adapted to augmented reality. For example, Cheok et al. [2002] present a see-through AR system that tracks the index finger with the ARToolkit [Kato and Billinghurst, 1999]. The user can generate curves and surfaces which float in the air. As opposed to regular VR, this preserves awareness of the real world, an aspect that is of importance for most design activities. However, the system provides no tactile feedback because no physical objects are included, and the interaction is difficult to scale up to multiple users at a single location, as the movement envelopes of the tracking system used are small. A similar system is presented by Fiorentino et al. [2002], who adapted their free-form VR modeling system to work with see-through display technologies and infrared 3D tracking. Again, the interaction takes place in mid-air, and although physical objects can be included, they are only used to project texture maps.

Interactive Painting

Interactive painting systems such as the dynamic shader lamps indicate the advantages of digital drawing on physical objects [Bandyopadhyay et al., 2001]. Based on Raskar's shader lamps technique, a white object is illuminated by a collection of video projectors from different angles. A tracked wand acts as a drawing tool. When in contact with the object's surface, strokes are captured and rendered with an airbrush effect. As it mimics natural drawing on objects, this establishes an easy-to-use interface that has been evaluated positively by children and graphic artists. A restriction of such interactive painting systems is that the shape of the physical object cannot differ from the virtual one – the haptic and visual display of the virtual would then be misaligned with the physical object.

Layout Design

Layout design systems like URP [Underkoffer and Ishii, 1999] and Built-it [Rauterberg et al., 1998] offer a number of fixed components that can be reconfigured on a planar surface. In the first, the components represent buildings, and the augmentation focuses on simulation of light reflection, shadows, and wind, which are projected in 2D on the components themselves. Similarly, the Built-it system supports the layout of assembly lines; simple data is projected on top of the blocks, while a large view on the wall displays the resulting factory in 3D perspective. Both showcase the potential of using physical design components as a user interface – the parts are managed by direct manipulation, and a design can be reconfigured by multiple hands/users simultaneously. Furthermore, the simulation modules for light reflection and other phenomena, in combination with this tangible interface, show how physical spatial reasoning and computational simulation can be combined.

Augmented Engineering

Augmented Engineering [Bimber et al., 2001] and the Workbench for Augmented Rapid Prototyping (WARP) [Verlinden, 2003] are system approaches to apply AP to industrial design engineering. The first is based on a see-through half-transparent mirror system, called the transflective board. This plate can be tilted and also used as a whiteboard (allowing sketching superimposed on the 3D scene). The article mentions five application scenarios for augmented reality systems in engineering, namely augmented design review, hybrid assembling, hybrid modeling/sketching, visual inspection of molded parts, and hybrid ergonomic design. However, these scenarios are not presented in full detail. The second system uses projector-based AR. We have documented a number of explorations with AP, spanning from the design of cars [Verlinden et al., 2003a] to interiors [Verlinden et al., 2004] and handheld devices [Verlinden et al., 2003b]. Both employ Rapid Prototyping (RP, see section 4.2) techniques to create physical models.

Information Appliances

Information Appliances are consumer products that include electronics, e.g. mobile phones, mp3 players, etc. In this case, augmentation can be used to add screens or other types of visual feedback to a physical model. As a part of the Japanese IT Carrozzeria research program, Kanai et al. [2005] have developed a usability assessment tool to rapidly test interaction with mockups by means of RFID technology (see section 5.2) and video projection. Much emphasis is put on measuring button interaction, to assess the usability of a design by capturing and time-stamping click events. Nam and Lee [2003] also report on the design evaluation of a handheld mobile device. They use the ARToolkit to track the position and orientation of the object; the optical marker is attached to the back of the object. Both a video-based AR HMD and a projector-based AR setup are used; the second reportedly performed more accurately in the interaction tests. Nam [2005] expands this simulation to the interaction modeling of both screen and buttons for a handheld tour guide by the use of state transition graphs. Although the aspect of button interaction is well elaborated, the modeling of shape and its features is limited (e.g. the location of the buttons). Support to track layout modifications is lacking; this has to be performed manually.
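To make the role of such state transition graphs concrete, the sketch below shows one minimal way a screen/button dialogue could be encoded; it is an illustration with hypothetical screen and button names, not the representation used by Nam [2005].

```python
# Illustrative sketch of a state-transition dialogue model for screen/button
# interaction, in the spirit of Nam [2005]; all screen and button names are
# hypothetical, and this is not the representation used in the cited work.
from dataclasses import dataclass, field
from typing import Dict, Tuple

@dataclass
class InteractionModel:
    start: str
    # (current screen, pressed button) -> next screen
    transitions: Dict[Tuple[str, str], str] = field(default_factory=dict)

    def next_screen(self, screen: str, button: str) -> str:
        # Unknown (screen, button) events leave the dialogue unchanged.
        return self.transitions.get((screen, button), screen)

# Hypothetical dialogue for a handheld tour guide mockup.
model = InteractionModel(
    start="home",
    transitions={
        ("home", "ok"): "map",
        ("map", "ok"): "poi_details",
        ("map", "back"): "home",
        ("poi_details", "back"): "map",
    },
)

screen = model.start
for button in ["ok", "ok", "back"]:   # clicks captured from the mockup's switches
    screen = model.next_screen(screen, button)
    print(f"button '{button}' -> project screen '{screen}' onto the mockup")
```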

Automotive Design

Automotive design, an application of industrial design engineering, received particular attention, for example by Klinker et al. [2002], who investigate the presentation of (virtual) concept cars in a typical show room by observing the behavior of designers and by presenting some proof-of-concept examples. The main objective of these augmented prototyping systems is the presentation to other stakeholders in the design process, either higher management or potential users. As such, the interaction is passive, merely to inspect designs that are modeled in separate applications. The conclusion of an explorative study with styling clay models and video projection indicated that augmentation has more potential as a modeling means than as a presentation means [Verlinden et al., 2005].

Table 1: Topical overview of Augmented Prototyping research.

Publication | Domain | Objective | Interactivity | Evaluation
Cheok et al. [2002] | Geometric modeling | Generating curves and surfaces | Index finger is tracked, creation of control points in air | None
Fiorentino et al. [2002] | Geometric modeling "Spacedesign" | Surface modeling | Free-form surface modeling, inclusion of physical models | Informal
Bandyopadhyay et al. [2001] | Interactive painting "Dynamic shader lamps" | Painting on a physical object | Moving object and paintbrush, selecting color from virtual palette | Informal
Underkoffer and Ishii [1999] | Layout design "URP" | Interactive simulation for urban architecture (reflections/shadows/wind) | Moving objects in 2D plane | Informal
Rauterberg et al. [1998] | Layout design "Built-it" | Factory planning: layout check, collaborative reviews | Moving objects in 2D plane | Informal
Bimber et al. [2001] | Augmented Engineering "Augmented Engineering" | Several scenarios | Supporting sketching on mirror, grasping physical objects | None
Verlinden et al. [2003] | Augmented Engineering "WARP" | Several scenarios | Operation of turntable/menus | Informal
Klinker et al. [2002] | Automotive design "Fata Morgana" | Presentation of concept cars (BMW) | Virtual object on turntable, user moves around | Informal
Fründ et al. [2003] | Automotive design | Support automotive modeling (Volkswagen) | Moving components in 3D space (rotation, translation, scaling) | Informal
Nam and Lee [2003] | Information Appliances | Usability assessment of a digital tour guide | Operating the switches while grasping the object | Interaction test (n=6)
Nam [2005] | Information Appliances | Dialogue definition and evaluation | Sketching screens and screen transitions, operating the switches while grasping the object | Informal
Kanai [2005] | Information Appliances | Usability assessment of a remote control | Operating switches | Interaction test (n=8)


First, the augmentation offers a direct means to interact with the complex geometry of a car body, which is easier to use than traditional tape drawing or painting, and design alternatives are easily captured. Second, traditional presentation models surpass the presentation level that can be reached with augmented reality with respect to resolution and material properties. A modeling system developed at Volkswagen supported modeling of both car interior and exterior [Fründ et al., 2003]. However, the modeling operations were limited to component placement (translation, orientation, scaling), while the task of automotive styling requires a more subtle interaction.

Discussion

This growing collection of experimental systems indicates the potential of design support by augmented prototyping. Specifically, it showcases the ability to combine 1) natural physical interaction, also specified as embodied interaction [Dourish, 2001], and 2) computational tools like modeling and virtual prototyping. This yields opportunities for collaboration and haptic feedback. The level of interactivity varies among these systems, from merely passive presentation means to combinations of modeling and simulation. However, when modeling is supported, typically only a single aspect is covered, like the user interaction for information appliances. We believe this limited support restricts the applicability of such systems.

Most evaluations of AP systems were informal, primarily aimed at the usability of a particular design created with such systems (see Table 1). The role of augmented prototyping as a design means, competing with traditional and virtual technologies, has not been given much attention. More investigation is required on the evaluation of these systems and their influence on the design process.

2.2. Technology considerations

As has been documented in many of the articles discussed above, setting up AP systems is a tedious process. All involved technologies require extensive tuning in order to get an adequate 'experience', while few or no tools exist to merge the physical and virtual models.

In looking for enabling technologies to establish an augmented prototype, a number of requirements can be formulated to filter and select a relevant subset.

From a generic perspective, the most important considerations are a) a low threshold to create a prototype, b) the level of interactivity it offers, and c) the insight it provides into the design process. Of course, each design field also requires specific features, as will be presented in the Discussion section of this article.

The low threshold concerns the amount of effort and resources required to establish and use an augmented prototype. These depend on the physical model making and on the overall complexity of the hardware/software combination. Specific issues include a fast setup (minutes) and an untethered interaction (a minimum of obstructive technologies). With respect to the second consideration (level of interactivity), Verlinden et al. [2004] present the following guidelines: 1) support multiple users and multi-handed interaction, 2) support spatial interaction by using an appropriate scale for the prototype (which differs strongly per design domain), and 3) consider pen- and speech-based interfaces to support the natural expressiveness of the task. The third consideration (providing insight) strongly depends on the design domain. This will be revisited in section 6.

The following sections present an analysis of enabling technologies for augmented prototyping, which we group into display technologies, input technologies, and model creation. The key principles and implementations for each of these categories will be analyzed.

3. DISPLAY TECHNOLOGIES

AR surveys describe a large variety of augmented reality displays [Azuma, 1997][Azuma et al., 2001]. Of these, four principles have been documented to be used for AP: i) see-through head mounted display, ii) see-through boards, iii) projection-based and iv) embedded displays.

For design support purposes, head mounted displays (HMDs) are considered insufficient (including see-through and video-mixing HMD setups). They introduce a large barrier between user and object, and have an insufficient resolution for AP – at most 800x600 pixels for the complete field of view [Klinker et al., 2002]. Nam and Lee [2003] compared a video-mixing HMD setup with a projector-based system for augmented prototyping. The purpose was to evaluate the usability of a handheld electronic guide; the screen was augmented on a physical mockup. The projector-based system was preferred by the subjects, due to the latency and resolution problems of the video-mixed head mounted display.

The other three have different capabilities to augment physical models, which are summarized in Table 2. Details are discussed in the subsections below.

Projection-based technologies have our general preference for AP. They have a minimal influence on the physical model (compared to embedded displays) and do not establish a barrier between viewer and object (compared to see-through technologies). However, the other two are valuable alternatives.

3.1. See-through technologies

The augmented engineering system [Bimber et al., 2001] presents examples of see-through technologies, which use a half-silvered mirror to mix the virtual models into the physical world, see Figure 1. Standard stereoscopic VR workbench systems like the Barco Baron are used to display the virtual information. In addition to the need to wear shutter glasses to view stereoscopic graphics, head tracking is required to align the virtual image between object and viewer. An advantage of this equipment is that digital images cannot be occluded by the users and that graphics can be displayed outside the perimeter of the physical object (i.e. to display the environment, annotations, tools). However, see-through displays obstruct the interaction of the user with the physical object. Multiple viewers cannot share one device, although a limited solution to this has been presented in the Virtual Showcase by establishing a curved setup [Bimber, 2002].
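As an illustration of the head-tracking requirement, a common way to register the reflected image with the physical object is to mirror the tracked eye point (or, equivalently, the scene) across the plane of the transflective board. The plane reflection below is the standard textbook form, given here as a sketch and not as the exact formulation used in the cited system; it assumes the board lies in the plane n·x = d with unit normal n:

\[
\mathbf{M}_{\mathrm{reflect}} =
\begin{bmatrix}
\mathbf{I} - 2\,\mathbf{n}\mathbf{n}^{\mathsf{T}} & 2d\,\mathbf{n}\\
\mathbf{0}^{\mathsf{T}} & 1
\end{bmatrix},
\qquad
\begin{bmatrix}\mathbf{e}'\\ 1\end{bmatrix}
=
\mathbf{M}_{\mathrm{reflect}}
\begin{bmatrix}\mathbf{e}\\ 1\end{bmatrix},
\]

where e is the tracked eye position and e' is the mirrored eye point from which the workbench image is rendered, so that its reflection lines up with the physical object behind the board.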

3.2. Projection-based technologies

In this case, digital images are projected directly onto physical objects. This technique has been coined Shader Lamps by Raskar et al. [2001]. The basic light model and illumination issues are presented in Raskar and Low [2001]. Casting an image on a physical object is complementary to constructing a perspective image of a virtual object with a pinhole camera. If the physical shape has the same geometry as the virtual one, no special algorithms are required to pre-distort the computer image: a simple 3D perspective transformation (represented by a 4 by 4 matrix) is sufficient. This display technique offers in essence all visual and haptic depth cues without shutter glasses or head mounted displays. Furthermore, the display can be shared and is fit for multiple users.
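To make the rendering principle explicit, the sketch below projects a surface vertex through a pinhole model of the projector; the intrinsics and pose are hypothetical placeholder values, not calibration data from any of the cited systems (the 4 by 4 form mentioned above is the graphics-pipeline equivalent of this 3x4 pinhole matrix).

```python
# Sketch of the shader-lamps rendering principle: when the physical and virtual
# geometry coincide, a vertex on the surface maps to projector pixels through a
# plain pinhole projection of the calibrated projector. All numbers below are
# hypothetical placeholders, not calibration data from the cited systems.
import numpy as np

def projector_matrix(K: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Compose the 3x4 projection from projector intrinsics K and pose [R | t]."""
    return K @ np.hstack([R, t.reshape(3, 1)])

def to_pixel(P: np.ndarray, x_world: np.ndarray) -> np.ndarray:
    """Map a 3D point on the (physical = virtual) surface to projector pixels."""
    xh = P @ np.append(x_world, 1.0)   # homogeneous projection
    return xh[:2] / xh[2]              # perspective divide

# Hypothetical 1024x768 projector, one metre in front of the object's origin.
K = np.array([[1400, 0, 512], [0, 1400, 384], [0, 0, 1]], dtype=float)
R, t = np.eye(3), np.array([0.0, 0.0, 1.0])
P = projector_matrix(K, R, t)
print(to_pixel(P, np.array([0.05, 0.02, 0.0])))   # pixel that lights this vertex
```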

However, projector-based AR has certain disadvantages. First, there is only a limited field of view and focus depth. To reduce these problems, multiple video projectors can be employed. Another solution is to make the projector portable, as in the iLamps concept [Raskar et al., 2003]. Another issue is the occlusion and shadows that are cast on the surface by the user or other parts of the system. Non-convex geometries strongly depend on the granularity and orientation of the projector. The perceived quality is in some cases susceptible to projection errors (also known as registration errors), especially projection overshoot [Verlinden et al., 2003b]. A solution for this problem is either to include a generic offset (dilation) of the physical model or to introduce pixel masking in the rendering pipeline. A number of alternative algorithms exist to support the latter, including one that uses a probabilistic function to gradually fade out the virtual object around the edges [Fuhrmann et al., 1999].
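The following sketch illustrates one way such pixel masking can be realized: the projected image is multiplied by an alpha mask that falls off over a few pixels inside the silhouette of the virtual object. This is in the spirit of the fade-out of Fuhrmann et al. [1999] but is not their exact function, and the fade width is an arbitrary placeholder.

```python
# Sketch of pixel masking against projection overshoot: the rendered image is
# multiplied by an alpha mask that fades out over a few pixels inside the
# silhouette of the virtual object (in the spirit of the probabilistic fade-out
# of Fuhrmann et al. [1999], though not their exact function).
import numpy as np
from scipy import ndimage

def soft_mask(coverage: np.ndarray, fade_px: float = 4.0) -> np.ndarray:
    """coverage: boolean projector-space footprint of the virtual object.
    Returns a per-pixel alpha in [0, 1] that falls off towards the silhouette."""
    dist_inside = ndimage.distance_transform_edt(coverage)  # pixels to background
    return np.clip(dist_inside / fade_px, 0.0, 1.0)

# Toy example: a 9x9 footprint; border pixels are projected at reduced intensity.
footprint = np.zeros((9, 9), dtype=bool)
footprint[2:7, 2:7] = True
alpha = soft_mask(footprint, fade_px=2.0)
rendered = np.full((9, 9), 255.0)        # stand-in for the rendered image
projected = alpha * rendered             # image actually sent to the projector
```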

Table 2: Summary of display technologies.

Criterion | See-through | Projection-based | Embedded displays
Untethered interface | Barrier between viewer and object | Untethered, might suffer from occlusion of user | Object is connected to workstation
Graphics capabilities | Differing physical and virtual geometries | Bound to object's surface | 2D
Coverage of display | Object is covered for one viewpoint | Can extend to whole object | Only a small part is augmented
Resolution | Max. 1600x1250 | XGA, can be extended | Typically Quarter VGA

Figure 1: The augmented engineering see-through display (www.igd.fraunhofer.de).



If the physical and the virtual shapes differ, the projection will be viewpoint-dependent, and head tracking is necessary. Holographic projections could be used to establish an autostereoscopic effect. Unlike other autostereoscopic displays (re-imaging displays, volumetric displays, parallax displays), physical models do not interfere with the optical phenomena of holographic projections. As holograms employ optical interference patterns that correspond to all viewpoints, many concurrent spectators can use such a system without the need to wear glasses or head tracking. However, such projection technologies are at an early stage of development, and allow only monochromatic illumination on planar surfaces [Huebschman et al., 2003]. Some initiatives try to employ LED lasers for holographic projection, which also decreases power consumption compared to traditional video projectors (www.lightblueoptics.com).

3.3. Embedded displays

Another display option is to include a number of small LCD screens in order to display the virtual elements directly on the physical object. This practice is found in the later stages of prototyping Information Appliances such as mobile phones and the like. Such screens typically have a resolution similar to that of PDAs and mobile phones (QVGA: 320x240 pixels) and have a dedicated monitor connection to a workstation. The connection can be omitted when self-supporting components are used – typically a PDA with a wireless network connection. With the advent of new, flexible digital paper technologies and organic LED (OLED) technologies, it could become possible to cover part of a physical mockup with such screens. To our knowledge, no such systems have been devised or implemented to date. The Luminex material (www.luminex.it) approximates this as an LED/fiberglass-based fabric, although it does not yet support changing light effects, see Figure 2.

4. PHYSICAL MODEL CREATION

A multitude of physical prototyping methods is employed in design practice. Both the traditional manual method of model making and computerized techniques like Rapid Prototyping are used simultaneously by design offices [Broek et al., 2000]. Typically, the manufacturing of physical models takes a considerable amount of time, at least a few hours. This is one of the factors that influence the usefulness of AP, as each design iteration might require its own physical instance. New interactive fabrication technologies that could radically decrease this lead time are discussed in section 4.3.

For augmented prototyping, the physical models should preferably have neutral surfaces to support the graphics overlay. In particular, when used in projection-based displays, the models should have opaque white, non-reflective surfaces. There are solutions to deal with multi-colored objects by compensating the graphics rendering, but these are limited to non-reflective materials [Bimber et al., 2005]. This concern might influence the material and manufacturing selection, as additional time and effort might be required to coat the model.

4.1. Manual model making

In the early stages of design, models of low-cost materials are typically used, which can easily provide volumes (e.g. foam, styling clay), surfaces (e.g. paper/cardboard), or structures (e.g. wires). In later stages, high-quality materials are used, for example automotive styling clay. Each material has its own set of tools. For example, automotive styling clay is sculpted with a wide variety of cutters, measuring devices, and mirrors [Hoadley, 2002].

When incorporated in an AP system, the physical object requires digitization. Scanning devices and reverse engineering techniques can be used to speed up this process, although there are many limitations and human intervention is typically required.

Figure 2: Impression of the Luminex material (www.luminex.it).


Table 3: Key characteristics of a current selection of RP concept modellers (benchmark data from [Grimm, 2003]).

Machine | Principle | Materials | Max dimensions (mm) | Benchmark time (hours) | Benchmark cost ($)
Z810 (Z Corp.) | 3D printing (jetting binder onto powder) | Starch-based or plaster-based powder | 400x500x600 | 3.2 | 75
MDX-650 (Roland) | CNC milling of a range of materials | Foams, wax, solid plastics and metals | 650x450x115 | 4.1 | 70
Eden 500v (Objet) | Polyjet printing of photopolymer | Acrylate photopolymer resin | 500x400x200 | 4.6 | 130
Thermojet (3D Systems) | Multijet printing of wax | Wax | 250x190x200 | 4.8 | 80
Dimension (Stratasys) | FDM, extrusion of thermoplastic filaments | ABS, polycarbonate | 600x500x600 | 5.6 | 60

For example, the scanning of automotive styling clay proved to be difficult, as the brown and reflective surface could not be scanned directly by a laser scanner; applying a white latex cover solved this problem but significantly increased the handling time [Verlinden et al., 2005].

Models are usually created in such a way that they can easily be adjusted or disassembled. This malleability is a natural feature of manual model making – it is up to the designer when the shaping process is finished. This type of interaction is difficult to track while using AP, especially for typical industrial design engineering products. There are examples of interactive digitization, in which the modifications of the shape are constantly monitored and directly fed to an augmented prototyping system. One example is the Illuminating Clay system by Piper et al. [2002]. In this system, a slab of Plasticine acts as an interactive surface – the user influences a 3D simulation by sculpting the clay, while simulation results are projected on the surface. A laser-based Minolta Vivid 3D scanner is employed to continuously scan the clay surface. In the article, this principle was applied to geodesic analysis, yet it can be adapted to design applications, e.g. the sculpting of car bodies. Other malleable materials that have been used in interactive digitization setups include sand and glass beads [Ratti et al., 2004]. However, such real-time 3D scanning scenarios can only be executed on large and convex geometries without undercuts.
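A schematic of such a scan-simulate-project cycle is sketched below; the scanner and projector interfaces are hypothetical placeholders (the original system drives a modified Minolta Vivid scanner), and the slope computation merely stands in for the analyses reported by Piper et al. [2002].

```python
# Schematic scan-simulate-project loop in the spirit of Illuminating Clay
# [Piper et al., 2002]. The scanner and projector interfaces are hypothetical
# placeholders, and the slope analysis stands in for the reported analyses.
import time
import numpy as np

def read_heightfield() -> np.ndarray:
    """Placeholder for one range scan of the clay surface (rows x cols, in mm)."""
    return np.zeros((120, 160))

def slope_analysis(height: np.ndarray) -> np.ndarray:
    """A simple surface analysis: local slope magnitude of the scanned surface."""
    gy, gx = np.gradient(height)
    return np.hypot(gx, gy)

def project(image: np.ndarray) -> None:
    """Placeholder: send the false-colour result to the calibrated projector."""
    pass

for _ in range(10):          # ~1 Hz cycle, as reported for the modified scanner
    surface = read_heightfield()
    project(slope_analysis(surface))
    time.sleep(1.0)
```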

4.2. Automated creation

Automated manufacturing of physical models is often called Rapid Prototyping (RP) [Gebhardt, 2003].

For our objective to support fast prototyping, so-called “concept modelers” should be investigated. A wide variety of RP methods exist, which differ in build time, manufacturing costs, and build materials, see Table 3. Furthermore, some principles have restrictions in geometry (in particular CNC) and the maximum dimensions of the object.

Benchmark data of representative models ranged between 3 and 6 hours [Grimm, 2003]. Additional time for finishing should be considered (sanding, painting, and the like). The fastest and most versatile is Zcorp's 3D printer, although the resulting starch- or plaster-based models are heavier than those of other methods, while the surface is very brittle and has to be handled delicately. Typically, they are infiltrated with glue to reinforce the material. Although both Polyjet and FDM take considerably longer, these require less elaborate finishing and result in pieces that more accurately represent the design's final surface and weight.

Of the presented techniques, the best surfaces for augmentation are produced by FDM and 3D printing processes. However, FDM parts in polycarbonate are too reflective for projection-based displays. Photopolymers that are used in stereolithography are translucent and reflective, although these can be painted. Wax-based models have a typical translucent wax look, yet are difficult to coat.

To conclude, the selection of RP method will depend on cost and time constraints, while surface quality and durability of the models might be of importance in specific situations. The Zcorp 3D printing technology seems to perform best on the issues discussed above.


4.3. Actuated surfaces

Actuated surfaces represent a specific automatic manufacturing technology; by some kinematical structure, a surface is three-dimensionally distorted. Within certain boundaries, this manufacturing technique – sometimes labeled as "active deformable sheets" – offers a direct coupling between physical and virtual geometry. At present, this technology is still evolving. The challenges include i) spatial resolution, ii) actuation speed, and iii) displacement [Fletcher, 1996]. One example is the Feelex apparatus [Iwata et al., 2001], which consists of a grid of small linear actuators and sensors, see Figure 3. It establishes a 50x50x18 mm surface that serves as both input and output. The horizontal resolution is approximately 3 DPI (8 mm per actuator), while the vertical displacement envelope is 18 mm with a maximum stroke rate of 7 Hz. A video projector is used to illuminate the top with computer graphics. A similar system was devised by Lind et al. [2000]; in this case bimetal alloys were used for actuation.

In comparison to manual or Rapid Prototyping methods, actuated surfaces have the potential to speed up physical prototyping to minutes or even seconds. Existing solutions are limited to a small, single deformable surface which cannot represent sharp angles and undercuts.
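As an illustration of the direct physical-virtual coupling, the sketch below maps a target heightfield onto a Feelex-like actuator grid; the 8 mm pitch and 18 mm stroke follow the figures quoted above, while the grid size, the surface function, and the actuator interface are hypothetical.

```python
# Sketch of driving a Feelex-like actuated surface from a target heightfield:
# sample the virtual surface at the actuator pitch and clamp to the stroke.
# The 8 mm pitch and 18 mm stroke follow the figures quoted in the text; the
# grid size and the example surface are hypothetical.
import numpy as np

PITCH_MM = 8.0     # horizontal actuator spacing (~3 DPI)
STROKE_MM = 18.0   # maximum vertical displacement

def actuator_targets(height_mm, grid_shape=(6, 6)) -> np.ndarray:
    """height_mm(x_mm, y_mm) -> desired height; returns clamped actuator targets."""
    rows, cols = grid_shape
    targets = np.empty(grid_shape)
    for i in range(rows):
        for j in range(cols):
            targets[i, j] = height_mm(j * PITCH_MM, i * PITCH_MM)
    # Geometry outside the displacement envelope is simply lost.
    return np.clip(targets, 0.0, STROKE_MM)

# Example: a shallow dome; sharp edges and undercuts cannot be represented.
dome = lambda x, y: 18.0 - ((x - 20.0) ** 2 + (y - 20.0) ** 2) / 40.0
print(actuator_targets(dome))
```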

5. INPUT TECHNOLOGIES

5.1. Object tracking

In order to merge the digital and the physical, position and orientation tracking of the physical components is required. Welch and Foxlin [2002] present a comprehensive overview of tracking principles that are currently available. They conclude that there is currently no ideal solution ('silver bullet') for position tracking in general, but some respectable alternatives are available. The same is true for object tracking in augmented prototyping. For simple tracking scenarios, either magnetic or passive optical technologies are often used.

In the case of augmented prototyping, the measurement should be as unobtrusive and invisible as possible, while still offering accurate and rapid data. As the physical model might consist of a number of parts or a global shape and some additional components (e.g. buttons), the number of items to be tracked is also of importance. It might be useful to combine one global measuring system with a different local tracking technology to increase the number of trackers.

As the physical models are created specifically for AP, these can be adapted to optimally embed the tracking components. When RP methods are used, the 3D model can be automatically adapted for this purpose.

Global tracking

Known position tracking solutions for interior purposes are 1) active AC/DC magnetic wave tracking, 2) optical tracking with passive markers, 3) optical tracking with active markers, 4) ultrasound-based systems, 5) mechanical linkage-based tracking, and 6) laser scanning. Table 4 summarizes the most important characteristics of these tracking methods for augmented prototyping purposes. The data have been gathered from commercially available equipment (respectively the Ascension Flock of Birds, ARToolkit, Optotrack, Logitech 3D Tracker, Microscribe, and Minolta VI-900). Below, a qualitative analysis of these technologies is presented. Each of these could be employed in augmented prototyping scenarios; there are significant differences in tracker/marker size, action radius, and accuracy.

Magnetic wave tracking systems such as the Polhemus Fastrack and the Ascension Flock of Birds offer off-the-shelf solutions for many tracking applications. Depending on the transmitter power, the range can extend to a few meters per transmitter (arrays can be used to cover a wider range). The receivers are small and can be hidden inside the object (as opposed to optical/ultrasound-based tracking). Unfortunately, the receivers have to be connected to the measuring device, generating clutter if more are used simultaneously. Furthermore, they are susceptible to noise from electromagnetic sources (e.g. video projectors) and metal constructions.

Figure 3: Feelex apparatus with top-projection [Iwata et al., 2001].



Optical tracking uses normal or infrared light to determine the spatial position of a marker. Passive markers are simple paper or fabric dots or icons that can be attached easily to an object. For example, the ARToolkit employs complex patterns and a regular web camera to determine the position, orientation, and identity of the marker. This is done by measuring the size, 2D position, and perspective distortion of a known rectangular marker [Kato and Billinghurst, 1999]. Another known system is the dtrack system by Advanced Realtime Tracking GmbH, which uses spatial configurations of small reflective balls on wire frames. Passive markers enable a relatively untethered system, as no wiring is necessary. The optical markers are obtrusive when they remain visible to the user while handling the object.
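The sketch below illustrates the underlying pose-from-square-marker principle: once the four corners of a marker of known size are detected, a perspective-n-point solve recovers the full 6-DOF pose. ARToolkit implements its own detector and estimator; OpenCV's solvePnP is used here purely for illustration, with hypothetical corner pixels and camera intrinsics.

```python
# Sketch of the pose-from-square-marker principle behind ARToolkit-style
# tracking; all numeric values are hypothetical placeholders.
import numpy as np
import cv2

MARKER_SIZE = 0.08                       # 80 mm marker, cf. Table 4
half = MARKER_SIZE / 2.0
object_corners = np.array([[-half,  half, 0.0], [ half,  half, 0.0],
                           [ half, -half, 0.0], [-half, -half, 0.0]],
                          dtype=np.float32)
image_corners = np.array([[312, 198], [401, 205], [395, 291], [305, 283]],
                         dtype=np.float32)      # detected corners (pixels)
K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], dtype=np.float64)
dist = np.zeros(5)                              # assume no lens distortion

ok, rvec, tvec = cv2.solvePnP(object_corners, image_corners, K, dist)
if ok:
    R, _ = cv2.Rodrigues(rvec)                  # marker orientation (3x3)
    print("marker position in camera frame (m):", tvec.ravel())
```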

Active markers are typically infrared (IR) light emitting diodes that are fired sequentially to identify the marker positions. This principle is commercially developed for motion capture, e.g. the Optotrack (Northern Digital Inc.) and Visualeyez systems. High-resolution, fast-scanning cameras are used for this purpose. As these systems are typically used for motion capture of the human body, the number of trackers is extendible. The active markers require a wired connection to a local multiplexing device.

The accuracy of these systems is lower than that of the magnetic ones, and optical tracking in general suffers from occlusion, while the field of view of the camera is limited. Of course, multiple cameras can be used to reduce the line-of-sight problems.

Ultrasound based tracking uses a number of microphones at known locations to determine the position of an ultrasound source by triangulation. As with the magnetic trackers, it does not influence the visual field, yet it suffers from the same line of sight problems as the optical systems. The accuracy is typically low and the source in some cases emits an audible ticking sound. A current implementation of this principle is the Logitech 3D tracker.
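A sketch of the underlying computation: each microphone's time of flight yields a range, and with at least four microphones at known positions the source position follows from a linear least-squares solve (trilateration). The microphone layout and ranges below are made-up illustration values.

```python
# Sketch of trilateration as used in ultrasonic tracking; values are illustrative.
import numpy as np

def locate(mics: np.ndarray, ranges: np.ndarray) -> np.ndarray:
    """mics: (N, 3) microphone positions, ranges: (N,) distances, N >= 4."""
    # Subtracting the first sphere equation from the others linearizes the problem.
    A = 2.0 * (mics[1:] - mics[0])
    b = (ranges[0] ** 2 - ranges[1:] ** 2
         + np.sum(mics[1:] ** 2, axis=1) - np.sum(mics[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

mics = np.array([[0, 0, 0], [0.5, 0, 0], [0, 0.5, 0], [0, 0, 0.5]], dtype=float)
ranges = np.array([0.40, 0.38, 0.45, 0.36])     # metres, from time of flight
print(locate(mics, ranges))
```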

Another type of tracking is offered by mechanical linkage-based systems, in which precision encoders are used to measure the angles of the joints. For example, the Immersion Microscribe has 6 joints and is available in a number of sizes. This measurement method offers an extremely fast and accurate measurement of position and orientation. It can also be easily combined with force feedback, like the Immersion Phantom. One of its disadvantages is that each object of interest requires its own device. Its movement envelope is limited to a maximum of 1.67 meters, and collisions can easily happen if multiple linkage-based systems are used concurrently. Furthermore, the device's mass yields an inertia that can be very distracting during use.

3D laser scanning is capable of keeping track of positions and complete surface details, as used in the Illuminating Clay system discussed in section 4.1. This method poses a number of challenges when used as a real-time tracking means, including the recognition of objects and their posture. Furthermore, these devices are designed to perform single scans; Piper et al. [2002] put considerable effort into speeding up this scanning cycle to real-time rates (1 Hz) by modifying the embedded software.

Local tracking

We can envisage the use of a single tracking system for tracking the global shape with one of the previously described techniques, while supportive technologies are employed to locate smaller components – for example buttons – on the object's surface.

Table 4: Summary of tracking technologies.

Tracking type | Size of tracker (mm) | Typical number of trackers | Action radius (accuracy) | DOF | Issues
Magnetic | 16x16x16 | 2 | 1.5 m (1 mm) | 6 | Ferro-magnetic interference
Optical passive | 80x80x0.01 | >10 | 3 m (1 mm) | 6 | Line of sight
Optical active | 10x10x5 | >10 | 3 m (0.5 mm) | 3 | Line of sight, wired connections
Ultrasound | 20x20x10 | 1 | 1 m (3 mm) | 6 | Line of sight
Mechanical linkage | Defined by working envelope | 1 | 0.7 m (0.1 mm) | 5 | Limited degrees of freedom, inertia
Laser scanning | None | Infinite | 2 m (0.2 mm) | 6 | Line of sight, frequency, object recognition


These types of local positioning systems might have less demanding technical requirements; for example, the updates might be discontinuous, with a frequency of only once a minute.

An appealing magnetic resonance-based 3D tracking system has been developed at Interval Research. Its principle worked similarly to a Radio Frequency ID (RFID) reading system, and it was capable of tracking the 3D position of up to 128 passive tags simultaneously [Paradiso et al., 2000]. The typical action radius was approximately 20-30 centimeters. Although this technology could be eligible for AP, it is proprietary and no similar sensing technologies have been found.

The same RFID phenomenon is commonly used in drawing tablets, and its capabilities can be quite elaborate, as demonstrated by the Sensetable [Patten et al., 2001] – an altered commercial digital drawing tablet with custom-made wireless interaction devices. The Senseboard [Jacob et al., 2002] has similar functions and uses an intricate grid of RFID receivers to determine the (2D) location of an RFID tag on a board. In practice, these systems rely on a rigid tracking table, but it is possible to extend this to a flexible sensing grid.

5.2. User Input

Apart from location and orientation tracking, augmented prototyping applications require interaction with parts of the physical object, for example to mimic the interaction with the design. This interaction differs per augmented prototyping scenario, for example the usability analysis of an information appliance (equipped with buttons, screens etc.) versus color exploration for interior design, see Figure 4. A variety of events should be supported to cater for these applications. Such user events can be monitored by tracking physical components, fingertips, or by Wizard of Oz techniques.

Physical components

The employment of traditional sensors has been studied extensively in the CHI community under the label of Phidgets (physical widgets). Greenberg and Boyle [2002] introduce a simple electronics hardware and software library to interface PCs with sensors (and actuators) that can be used to discern user interaction. The sensors include switches, sliders, rotation knobs, and sensors to measure force, touch, and light. More elaborate components like a mini joystick, IR motion sensor, air pressure sensor, or temperature sensor are commercially available. A similar initiative is iStuff [Ballagas et al., 2003], which also hosts a number of wireless connections to sensors. Some systems cater for embedding switches with short-range wireless connections, for example the Switcheroo and Calder systems [Avrahami and Hudson, 2002][Lee et al., 2004]. This allows a greater freedom in modifying the location of the interactive components while prototyping. The Switcheroo system uses custom-made RFID tags. A receiver antenna has to be located close by (within a 10 cm distance), so the movement envelope is rather small, and the physical model is attached to the PC by a wire. The Calder toolkit [Lee et al., 2004] uses a capacitive coupling technique which has a smaller range (6 cm with small antennae) but is able to receive and transmit for long periods on a small 12 mm coin cell. Other active wireless technologies would draw more power, leading to a system that would only function for a few hours. Although the costs for this system have not been specified, only standard electronics components are required to build such a receiver.
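To illustrate how such sensor toolkits feed an AP application, the sketch below turns polled switch states into time-stamped click events of the kind used for usability assessment (cf. the click logging of Kanai et al. [2005]); read_switches() is a hypothetical stand-in for whatever toolkit (Phidgets, iStuff, Calder) actually delivers the hardware state.

```python
# Sketch: turn phidget-like switches on a mockup into time-stamped click events.
# read_switches() is a hypothetical placeholder for the actual sensor toolkit.
import time

def read_switches() -> dict:
    """Placeholder: return the current state of each named switch on the mockup."""
    return {"power": False, "menu": False}

def log_clicks(duration_s: float = 10.0) -> list:
    log = []
    previous = read_switches()
    end = time.time() + duration_s
    while time.time() < end:
        time.sleep(0.01)                        # ~100 Hz polling
        current = read_switches()
        for name, pressed in current.items():
            if pressed and not previous[name]:  # rising edge = click
                log.append((time.time(), name))
        previous = current
    return log                                   # [(timestamp, switch name), ...]
```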

Hand tracking

Instead of attaching sensors to the model, fingertip and hand tracking can also be used to generate user events. Embedded skins represent a type of interactive surface technology that allows accurate measurement of touch on the object's surface [Paradiso et al., 2000].

Figure 4: Mockup equipped with wireless switches that can be relocated to explore usability [Lee et al., 2004].


For example, the Smartskin by Rekimoto [2002] consists of a flexible grid of antennae. The proximity or touch of human fingers locally changes the capacitance in the grid, establishing a multi-handed interaction cloth which can be wrapped around an object. Direct electric contact can also be used to track user interaction, for example in the paper buttons concept [Sokoler and Nelson, 2000], which embeds electronics on the objects and equips the finger with a 2-wire plug that supplies power and allows bidirectional communication with the embedded components when touched. Magic Touch [Pederson, 2000] uses a similar wireless system; the user wears an RFID reader on his or her finger and can interact by touching the components, which contain hidden RFID tags. This method has been adapted to augmented prototyping by Kanai et al. [2005], as described in section 2.1.

Optical tracking can be used for fingertip and hand tracking as well. A simple example is the light widgets system [Fails and Olsen, 2002] which traces skin color and determines finger/hand position by 2D blobs. A more elaborate example is the virtual drawing tablet by Ukita and Kohode [2004]; fingertips are recognized on a rectangular sheet by a head-mounted infrared camera. Traditional VR gloves can be used as well for this type of tracking [Schafer et al., 1997].
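As an illustration of this kind of optical finger/hand tracking, the sketch below thresholds a camera frame on rough, placeholder skin-colour bounds and takes the centroid of the largest blob; the LightWidgets system works along these lines, but this is not its actual implementation.

```python
# Sketch of skin-colour blob tracking in the spirit of LightWidgets-style
# systems; threshold bounds are rough placeholders that need per-setup tuning.
import numpy as np
import cv2

SKIN_LO = np.array([0, 40, 60], dtype=np.uint8)      # hypothetical HSV bounds
SKIN_HI = np.array([25, 180, 255], dtype=np.uint8)

def hand_position(frame_bgr: np.ndarray):
    """Return the (x, y) pixel centroid of the largest skin-coloured blob, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, SKIN_LO, SKIN_HI)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    blob = max(contours, key=cv2.contourArea)
    m = cv2.moments(blob)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```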

Wizard of Oz

In low-fidelity prototyping, the use of Wizard of Oz interpretation is commonplace; a human operator monitors the user and propagates the actions to the computer. This technique is sufficient for initial experiments. For example, Verlinden et al. [2004] employ Wizard of Oz interpretation in an AR interior design setting for position and event tracking. However, when fast or accurate user input is required, this approach is very limited.

5.3. Other Modalities

As stated in section 2.2, speech and gesture recognition also need consideration in augmented prototyping. Oviatt et al. [2000] offer an excellent overview of so-called Recognition-based User Interfaces (RUI), including issues and human factors aspects of these modalities. In particular, pen-based interaction would be a natural extension of the expressiveness of today's designer skills. Furthermore, speech-based interaction can be useful to activate modeling operations while the hands are used for selection or parameter setting.

6. DISCUSSION

Generic requirements of AP technologies were formulated as a) a low threshold to create a prototype, b) the level of interactivity it offers, and c) the insight it provides into the design process. However, the selection of technologies strongly depends on the design domain – each has its own needs, conventions, and product realization phases. Based on various discussions with experts and evaluations of pilot implementations that involved over 90 industrial design students, we elaborate the design support scenarios, focusing on three subdomains of industrial design engineering: 1) information appliances, 2) automotive, and 3) furniture. We believe these may benefit from AP. The design of information appliances considers usability on both cognitive and physical levels, appearance, tactility, and the embodiment of electronic components. A wide variety of physical models is used in this field – often full-scale – from mockups to working prototypes and beyond.

Table 5: Support scenarios of AP for specific IDE domains.

Scenario | Information Appliances | Automotive | Furniture
Exploration | Material/color; Component layout; Principles of interaction | Material/color; Principal curves/styling lines; (External) component layout | Material/color/graphical decorations; Product family
Verification | Aesthetics; Cognitive and physical ergonomics | Aesthetics; Shape-based engineering (FEA, aerodynamics); Surface continuity; Customization | Aesthetics; Physical ergonomics; Context of use; Manufacturing (mould flow etc.)
Communication | Concept; Usage context | Idea/concept presentation; Annotations on design | Concept
Specification | Shape (geometry); Component layout | Shape; Component layout | Color/decoration


Automotive design has to consider engineering and manufacturing aspects from early on, as well as surface quality, style, and aesthetics. Typically, clay models are made during the initial model presentation (scale 1:4 or 3:8), and in full scale at the end of the concept phase [Tovey, 1997]. Furniture design typically puts less emphasis on engineering aspects but encompasses ergonomics and aesthetics. There is no typical prototyping tradition in this field, although handmade paper/cardboard-based models are often used.

As mentioned by Geuer [1996], four main objectives drive physical prototyping: a) exploration, b) verification, c) communication, and d) specification for the downstream process. These objectives reflect the insight that is required in the design process. Possible support scenarios for the three domains are presented in Table 5. At present, we assume that the conceptualization phase can gain the most. The speed and flexibility of AP might be more important than an accurate or complete design representation.

For these domains we propose a subset of enabling technologies, as depicted in Table 6. For information appliances, the most logical combination would be to employ RP or manually made models, while optical object tracking and wireless phidgets would work best for supporting interactivity. The output could be either embedded or projector-based (or a combination of these). For automotive design, physical models could be made either way, depending on scale and time considerations. Styling purposes could well use actuated surfaces, as the automobile's geometry is often convex and has a fixed bottom. For input, a combination of 3D laser scanning (of manually made models) and mechanical tracking could be used. Existing automotive AP systems described in section 2.1 employed optical tracking, yet these were limited to component-level interaction (not surface editing or line drawing). Considering the scale of these artifacts, projector-based output is preferred, as see-through options are very limited. Existing tests with see-through head mounted displays indicated their limitations in resolution and field of view, and the fact that users felt tethered/isolated. For furniture design, no AP scenarios were found in the literature, yet we foresee possible uses as described in Table 5. The traditional physical model making practice can easily be extended to augmented prototyping by including optical object tracking, together with 3D tracking for monitoring user input. Depending on the type of furniture, either see-through or projector-based output can be preferred.

7. CONCLUSIONS AND FUTURE RESEARCH

This literature study presents an overview of existing augmented prototyping systems and discusses a large variety of enabling techniques. We surveyed existing applications that cover geometric modeling, interactive painting, layout design, augmented engineering, information appliances, and automotive design. This collection indicates the usefulness of augmented prototyping, although a robust evaluation and comparison with other prototyping means is lacking. The presented augmented prototyping systems also showcase the power of tangible computing as natural, embodied interaction. The design support is limited to presentation or to modeling single aspects of an artifact.

Table 6: Morphological map of AP enabling technologies.

Model creation | Automatic: RP; Actuated surfaces | Manual
Input technologies – Object tracking | Magnetic | Optical: passive markers; active markers | 3D laser scanning | Ultrasound | Mechanical
Input technologies – User input | Phidgets: wired connection; wireless | Virtual | Surface tracking | 3D tracking
Output | Embedded | Projector-based | See-through


This reduces the impact of augmented prototyping on the design process.

Then, an analysis of output, input, and physical prototyping was presented. As output means, our first preference is projection-based display. On input and physical model making, a wide variety of options can be chosen, depending on the situation at hand.

Three sub-domains of design were selected: information appliances, automotive, and furniture design. Possibilities of AP and enabling technology combinations for these respective groups have been discussed in the previous section. More research is required to develop and test these technology combinations, based on extensive field studies in these respective design domains.

REFERENCES

Azuma, R. (1997) "A Survey of Augmented Reality" Presence: Teleoperators and Virtual Environments, vol. 6, no. 4, pp. 355-385.

Azuma, R., Baillot, Y., Behringer, R., Feiner, S., Julier, S., MacIntyre, B. (2001) “Recent advances in augmented reality” IEEE Computer Graphics and Applications, vol. 21, no. 6, pp. 34–47.

Ballagas, R., Ringel, M., Stone, M., and Borchers, J. (2003) “iStuff: A Physical User Interface Toolkit for Ubiquitous Computing Environments” proceedings of CHI 2003, pp. 537-544.

Bandyopadhyay, D.; Raskar, R.; Fuchs, H. (2001) "Dynamic Shader Lamps: Painting on Movable Objects" International Symposium on Augmented Reality (ISMAR), pp. 207-216.

Bimber, O. (2002) “Interactive Rendering For Projection-Based Augmented Reality Displays” Ph.D. Dissertation., Darmstadt University of Technology.

Bimber, O., Stork, A., Branco, P. (2001) “Projection-based Augmented Engineering” proceedings of International Conference on Human-Computer Interaction (HCI’2001), volume 1, pp. 787-791.

Bimber, O., Emmerling, A., and Klemmer, T. (2005) “Embedded Entertainment with Smart Projectors” IEEE Computer, January issue 2005, pp. 56-63.

Broek, J.J., Sleijffers, W., Horváth, I., Lennings, A.F. (2000) “Using Physical Models in Design” proceedings of CAID/CD’2000 conference, pp. 155-163.

Cheok, A.D., Edmund, N.W.C., Eng, A.W. (2002) "Inexpensive Non-Sensor Based Augmented Reality Modeling of Curves and Surfaces in Physical Space" proceedings of ISMAR'02, pp. 273-274.

Dourish, P. (2001) “Where The Action Is: The Foundations of Embodied Interaction” MIT Press, ISBN 0262041960.

Fails, J.A., Olsen, D.R. (2002) "LightWidgets: Interacting in Everyday Spaces" proceedings of IUI '02, pp. 63-69.

Fiorentino, M., de Amicis, R., Monno, G, Stork, A. (2002) “Spacedesign: a Mixed Reality Workplace for Aesthetic Industrial Design” proceedings of ISMAR’02, pp. 86-96.

Fletcher, R. (1996) “Force transduction materials for human-technology interfaces” IBM Systems Journal, Volume 35, nos. 3 & 4, pp. 630-638.

Fründ, J., Gausemeier, J., Matysczok, C., Radovski, R. (2003) “Cooperative Design Support within Automobile Advance Development Using Augmented Reality Technology” proceedings of CSCW in design, pp. 492-497.

Fuhrmann, A., Hesina, G., Faure, F., Gervautz, M. (1999) "Occlusion in collaborative augmented environments" Computers & Graphics, vol. 23, no. 6, pp. 809-819.

Gebhardt, A., (2003) “Rapid Prototyping”, Hanser Gardner Publications, ISBN: 3446212590.

Geuer, A. (1996) “Einsatzpotential des Rapid Prototyping in der Produktentwickelung” Springer Verlag, Berlin, ISBN 3540614958.

Grimm, T.A. (2003) “Rapid Prototyping Benchmark: 3D printers”, T.A.Grimm&Associates, Inc. Edgewood, Kentucky.

Hoadley, F.E. (2002) “Automobile design techniques and design modeling : the men, the methods, and the materials” 2nd ed. TAH Productions, Dearborn, Michigan.

Huebschman, M., Munjuluri, B., Garner, H.R. (2003) "Dynamic holographic 3-D image projection" Optics Express, vol. 11, no. 5, pp. 437 – 445.

Iwata, H., Yano, H., Nakaizumi, F., Kawamura R. (2001) ”Project FEELEX: Adding Haptic Surface to Graphics” proceedings of SIGGRAPH2001, pp.469-475.

Kato, H., Billinghurst, M. (1999) ”Marker Tracking and HMD Calibration for a video-based Augmented Reality Conferencing System” proceedings of International Workshop on Augmented Reality (IWAR 99), pp. 85-94.

Klinker, G., Dutoit, A.H., Bauer, M., Bayer, J., Novak, V., Matzke, D. (2002) “Fata Morgana – A Presentation System for Product Design”, proceedings of ISMAR ’02, pp. 76-85.

Lind, R.J., Johnson, N., Doumanidis, C.C. (2000) "Active deformable sheets: prototype implementation, modeling, and control" in Proc. SPIE Vol. 3985, pp. 572-582.

Milgram, P., Kishino, F. (1994) "A Taxonomy of Mixed Reality Visual Displays", IEICE Trans. on Information and Systems (Special Issue on Networked Reality), vol. E77-D, no. 12, pp. 1321-1329.

Nam, T-J, Lee, W. (2003) “Integrating Hardware and Software: Augmented Reality Based Prototyping Method for Digital Products”, proceedings of CHI’03, pp. 956–957.

Nam, T-J (2005) “Sketch-Based Rapid Prototyping Platform for Hardware-Software Integrated Interactive Products” proceedings of CHI’05, pp. 1689-1692.

Oviatt, S.L., Cohen, P.R., Wu, L.,Vergo, J., Duncan, L., Suhm, B., Bers, J., Holzman, T., Winograd, T., Landay, J., Larson, J. & Ferro, D. (2000) "Designing the user interface for multimodal speech and gesture applications: State-of-the-art systems and research directions" Human Computer Interaction, vol. 15, no. 4, pp. 263-322.

Paradiso, J. A., Hsiao, K., Strickon, J., Lifton, J, Adler, A. (2000) “Sensor systems for interactive surfaces” IBM Systems Journal, Vol. 39, nos. 3&4, pp. 892-914.

Raskar, R.; Welch, G.; Low, K-L.; Bandyopadhyay, D. (2001) "Shader Lamps: Animating Real Objects with Image Based Illumination" proceedings of Eurographics Workshop on Rendering, pp. 89-102.

Raskar, R., Low, K-L (2001) “Interacting with Spatially Augmented Reality”, ACM international conference on Virtual Reality, Computer Graphics and Visualization in Africa (AFRIGRAPH). pp. 101-108.

Raskar, R., van Baar, J., Beardsley, P., Willwacher, T., Rao, S., Forlines, C. (2003) “iLamps: Geometrically Aware and Self-Configuring Projectors”, SIGGRAPH, pp. 809–818.

Rauterberg, M., Fjeld, M., Krueger, H., Bichsel, M., Leonhardt, U., Meier, M. (1998) ”BUILD-IT: A Planning Tool for Construction and Design” video program of CHI'98, pp. 177-178.

Underkoffer, J., Ishii, H. (1999) "Urp: A Luminous-Tangible Workbench for Urban Planning and Design", proceedings of CHI'99, pp. 386-393.

Verlinden, J.C., de Smit, A., Peeters, A.W.J., van Gelderen, M.H. (2003) “Development of a Flexible Augmented Prototyping System” Journal of WSCG, vol 11(3), pp. 496-503.

Verlinden, J.C., de Smit, A., Horváth, I., Epema, E., de Jong, M. (2003) "Time compression characteristics of the Augmented Prototyping Pipeline" proceedings of Euro-uRapid'03, pp. A/1.

Verlinden, J., de Smit, A., Horváth, I (2004) “Case-based exploration of the Augmented Prototyping dialogue to support design” proceedings of TMCE 2004, pp. 245-254.

Verlinden, J., van den Esker, W., Wind, L., Horváth, I. (2004) "Qualitative Comparison of Virtual and Augmented Prototyping of Handheld Products" proceedings of Design 2004, pp. 533-538.

Verlinden, J., Kooijman, A., Edelenbos, E., Go, C. (2005) "Investigation on the Use of Illuminated Clay in Automotive Styling" proceedings of CAID/CD’05 conference, pp. 514-519.

Welch, G., Foxlin, E. (2002) “Motion Tracking: No Silver Bullet, but a Respectable Arsenal” IEEE Computer Graphics and Applications, vol. 22, no. 6, pp. 24–38.