Transparent Tabletop Interface for Multiple Users on Lumisight Table
Yasuaki Kakehi¹, Takero Hosomi¹, Makoto Iida¹, Takeshi Naemura¹, Mitsunori Matsushita²
¹ The University of Tokyo  ² NTT Communication Science Labs., NTT Corp.
{kakehi, hosomi, iida, naemura}@hc.ic.i.u-tokyo.ac.jp, [email protected]
Abstract
This paper presents a new type of tabletop interface on Lumisight Table. Putting physical objects on a tabletop display is one of the typical methods for intuitive tangible input. To date, various interactive systems that can identify and track the tabletop objects by using a camera, have been proposed. However, in these systems, the existence of objects with special devices or markers can disturb users’ natural interaction by hiding displayed information. To solve this problem, the authors propose a transparent tabletop interface that is transparent from users but visible from a camera installed inside the system. This paper describes our research motivation, design and implementation of this interface, and examples of interaction.
1. Introduction
In the fields of CHI (Computer Human Interaction)
and CSCW (Computer-Supported Cooperative Work),
tabletop displays have been attracting much attention,
since an electronic display screen embedded on a
horizontal tabletop is useful for supporting users’ work
or group discussion. One special feature of a tabletop
display is that physical objects can be placed on the
screen. Because such actions are understood intuitively
in the same way as work done at an ordinary desk,
several systems in which the placement of physical
objects serves as input methods have been proposed.
The output corresponding to the placement of
physical objects can appear in the content of the screen
image. However, this method may result in
inconveniences such as the following.
- Physical objects equipped with electronic devices, markers, etc. restrict users' natural interactions.
- Physical objects placed on the surface of the table obstruct the view of the image on the display.
To solve these problems, this paper proposes a
transparent tabletop interface. So far, the authors have
developed and reported an interactive view-dependent
tabletop display system[1]. It can present different
images to multiple users sitting around the table, and is
called the Lumisight Table. Based on this technology,
this paper presents a transparent tabletop interface for
multiple users with the features listed below (See
Figure 1).
- The interface appears transparent to the users around the table, so the image on the tabletop screen is not obstructed.
- The camera installed inside the table can recognize the positions of the objects so that their placement can be used for input.
First, the authors briefly describe the Lumisight
Table, and then explain the design and implementation
of the transparent tabletop interface and present an
example of interaction using it.
2. Related Work
Tabletop displays can become an effective
workspace for the users (e.g. users can put physical
Figure 1. Transparent tabletop interfaces.
Proceedings of the First IEEE International Workshop on Horizontal Interactive Human-Computer Systems (TABLETOP ’06) 0-7695-2494-X/05 $20.00 © 2006 IEEE
objects on top of it). To maintain users' nonverbal
modalities on the tabletop, the method of controlling
the display should be more natural and intuitive.
Therefore, surfaces and tables that are capable of sensing the positions of placed objects and recognizing hand gestures on top of them have been explored
widely. The DigitalDesk[2] is one of the pioneering
works of the interactive tabletop display, which
supports augmented interaction with physical paper
documents on the physical tabletop by the computer
vision based approach. As for interaction with physical
objects, the metaDESK[3] is one of the earlier works
that introduced the concept, which allows users to
control the system by putting physical objects
(Phicons) on it.
The methods for recognizing physical objects on a
table can roughly be classified into two categories. One
involves the incorporation of an electronic device into
the physical object itself, typified by the Sensetable[4]
and Smart Table[5]. The other category of methods
involves attaching a marker to the physical object that
can be recognized by a camera. Compared to the above
approach, it may be somewhat less robust, but is easy
to manufacture and has superior expansibility because
it does not involve the incorporation of electronic
devices. In this category, various methods (e.g.
recognition by using color of markers[6], recognition
with IR light[7]) have been proposed.
A precursor to the transparent tabletop interface that
is the topic of this paper is the DataTiles[8], in which
RFID tags are incorporated into transparent tiles to
allow recognition of their positions and identities. In
addition to that system, there has been work in the field
of augmented reality on camera-based methods that
employ invisible markers[9][10], but application to a
tabletop display had not been considered. In this paper,
the authors propose transparent tabletop interfaces that
can be used with the Lumisight Table and camera-
based recognition.
3. Lumisight Table
The requirements for a system that supports face-to-face collaboration include sharing of nonverbal modalities, such as pointing and eye contact, and equal accessibility to shared information, such as visibility when browsing information that has a particular orientation (e.g. texts, images). What is important for satisfying those
constraints is the position and orientation of
information displayed on the screen.
Lumisight Table is an interactive view-dependent
tabletop display. The following two characteristics
frame the essence of Lumisight Table.
- Physically single, but visually multiple: Lumisight Table displays four different images, one for each user's view (see Figure 2).
- Capturing user gestures and physical objects on the tabletop: Users can control the system by showing hand gestures or putting objects on the tabletop (see Figure 3).

Figure 2. Lumisight Table.
Figure 3. Interaction on Lumisight Table.
Figure 4. System overview of Lumisight Table.
Figure 4 shows the overview of Lumisight Table. A
camera and multiple projectors are installed under the
table. Our special screen system allows multiple images
to be projected at once.
In order to share one screen without uneven
visibility, the system should provide different images to
every user around the tabletop screen. It is important to realize this function without any particular device that users must wear or hold. The tabletop screen should therefore filter images selectively according to each user's viewing direction.
As a screen material that satisfies this requirement, the system utilizes Lumisty film, which becomes transparent or opaque depending on the viewing direction and is usually used as a building material, for the tabletop projection screen. Figure 5 illustrates the optical
property of Lumisty film. In our system, the opaque
direction is utilized as a back-projection screen. When
an image is projected onto the screen from the opaque
direction, the user sees only the image from the
projector in front of him/her. Several layers of Lumisty
film thus allow us to present different images in
different directions. Concretely, when the system is
intended to be used by four users, two Lumisty films
are laid, one film orthogonal to the other. In addition, a
Fresnel lens is installed below the Lumisty films in this
screen system. Since the range of angles over which Lumisty film becomes opaque is limited, the projected light should approximate parallel rays in order to improve image quality; the Fresnel lens realizes this collimation.
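As a rough illustration of this view-dependent behavior, the following toy model (our own sketch; the angular sectors are illustrative assumptions, not measured properties of the actual film) maps a viewer's direction around the table to the projector image that viewer sees:

```python
def visible_image(viewer_azimuth_deg, num_users=4):
    """Toy model of the two-layer Lumisty screen: each projection
    direction serves the sector of viewers facing it. The real film
    scatters light only within a limited angular band, which is why
    a Fresnel lens is used to collimate the projected rays."""
    sector = 360 / num_users  # one sector per user, centered on 0, 90, 180, 270 deg
    return int(((viewer_azimuth_deg + sector / 2) % 360) // sector)
```

A viewer at azimuth 0 degrees sees image 0, a viewer at 90 degrees sees image 1, and so on; in the real system this selection is performed optically by the film itself, not in software.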
As mentioned before, the camera inside the system
captures manual interactions with real physical objects.
Since the Lumisty film is transparent in a vertical
direction, the camera inside of the system can capture
the images of the tabletop, while projectors present
images onto the screen. Since all devices, such as the projectors and the camera, are installed inside the table, the system design can be kept compact, and users' hands and physical objects do not disturb image projection or marker tracking.
The authors have implemented several interactive applications on the Lumisight Table for facilitating collaborative face-to-face work by multiple users[11] and for entertainment[12]. While the
Lumisight Table can display entirely different images
to the respective users, it can also present the same
information to all of the participants like an ordinary
display by projecting the same image from all the
projectors. That means that it features the ability to
display both information to be shared by all users and
information specific to each user on the same screen.
As an example of what has been implemented so far, a
card game for multiple players who gather around a
Figure 5. Optical property of Lumisty film.
Figure 6. Card game on Lumisight Table.
Figure 7. Collaborative application.
single screen table is illustrated in Figure 6. In this
application, players can see the numbers that represent
their own cards, but the cards of the other users appear
as turned face down and cannot be seen. Figure 7
shows a map simulation application in which the map
information is presented to all of the users with the
same places appearing in the same orientations. On the
other hand, text, icons, and additional information are presented rotated so that each user can read and recognize them easily.
4. Transparent Tabletop Interface for
Multiple Users
4.1. Concept of Transparent Tabletop
Interface
The system the authors describe here uses the
Lumisight Table to display different information in
different positions and with different orientation for
each user. When displaying information according to
the placement of physical objects, the system can also move the information to a position where it is not occluded by the placed object for each user. With this approach, however, displaying information in
different places for each user may disrupt the sharing of
non-verbal modalities: if a user designates the
information by his/her finger, another user cannot
recognize what the designated information is.
To solve the problem, this paper proposes a
transparent tabletop interface for use with the
Lumisight Table. The interface is transparent to the
users around the table and does not obstruct their view
of the displayed image even if the interface is placed on
the tabletop. Using this transparent interface increases
the freedom of image layout, since the system no longer needs to avoid occlusion by the physical objects. This
approach preserves the equal accessibility to shared
information, facilitates the sharing of non-verbal
modalities and realizes interaction that is natural to the
user.
The transparent interfaces described here are recognized with a camera installed inside the table and image processing, rather than by installing an electronic device in each object itself.
4.2. System Design
To implement objects that are invisible to the user
but detectable by the camera, this system uses a
transparent heat insulating film that passes visible light
but absorbs light in the infrared region. Therefore it can
be recognized by an infrared camera. The configuration
of this system is illustrated in Figure 9. First, this
implementation assumes the ceiling above the table to
be evenly illuminated with infrared light. Inside the
table, an infrared camera is placed facing directly
upwards. When markers made of transparent heat
insulating film in known shapes are placed on the table,
the markers block the infrared light that is reflected
from the ceiling. The markers thus appear as dark
shapes to the infrared camera and can be recognized in
real time by image processing.
In the current implementation, we use Reftel (type ZC-05G)[13], a highly transparent heat-insulating film produced in sheet form by TEIJIN, since markers are easy to make from it.
Figure 9. System design of a transparent
tabletop interface.
Figure 8. System overview.
4.3. Implementation
An overall view of the prototype system is shown in
Figure 8. The transparent tabletop interface is made of a transparent material such as acrylic plastic, with Reftel markers of known shapes attached. Inside the Lumisight
Table (80 cm high) are mounted four projectors (PLUS
V-1100WZ) and one camera. In addition, an infrared
LED lamp was placed near the Lumisight Table to
illuminate the ceiling (white, 2.4m high) with infrared
light.
Figure 10 shows the system in operation when transparent tabletop interface objects are placed on the table screen. Figure 10 (a) shows an input
camera image, in which the Reftel markers can be seen
as dark shapes because of their low transparency to
infrared light. These objects can be detected automatically by background subtraction, as shown in Figure 10 (b).
Figure 10 (c) shows the situation with the image
projected on the screen. The areas where the markers
are placed are somewhat darkened, but the
transparency is sufficient for the image projected onto
the table screen to be fully visible.
In a dark room with one of the projectors inside the
table projecting a white image, the luminance at the
center of the screen was measured with a colorimeter
(Minolta CS-100A) that was placed facing the
projector one meter horizontally away from the center
of the table screen at a height that produced an angle of
40 degrees. The result was a luminance of 3,080 cd/m² when no physical object was placed on the screen and 1,960 cd/m² with a physical object on the screen.
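From these two measurements, the fraction of projected luminance that survives the transparent interface can be computed directly (a simple check using the figures above):

```python
# Luminance at the screen center, from the colorimeter measurements above.
luminance_bare = 3080.0     # cd/m^2, no object on the screen
luminance_covered = 1960.0  # cd/m^2, transparent interface on the screen

# Relative transmittance of the Reftel/acrylic stack at this viewing angle.
transmittance = luminance_covered / luminance_bare
print(f"relative transmittance: {transmittance:.1%}")
```

Roughly 64% of the luminance is preserved, which is consistent with the observation that the projected image remains fully visible, if somewhat darkened.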
4.4. Interaction with Transparent Tabletop
Interface for Multiple Users
The transparent tabletop interface can be used on any tabletop display that adopts a transparent or semitransparent screen material. It is especially effective on the Lumisight Table, since different information can be displayed at the same position for each user. In this section, we describe the
interaction on Lumisight Table using the transparent
tabletop interface and show a simple example of
application for multiple users.
4.4.1. Marker recognition and interactive
information display. The image from the camera is
processed by computer to allow display of different
information for the different physical objects. Using
this method, disks that differ in color and marking can
be projected at the positions where objects of different
shapes have been placed, as shown in Figure 11. In the
present implementation, shape recognition was
accomplished in a simple manner with the algorithm
listed below.
(1) Camera image input.
(2) Background subtraction.
(3) Labeling.
(4) Feature parameter extraction.
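The four steps above can be sketched as follows (a minimal, numpy-only illustration of the idea; the paper does not give the actual image-processing code, and names such as `detect_markers` are our own):

```python
import numpy as np

def detect_markers(frame, background, thresh=40, min_area=20):
    """Sketch of the paper's pipeline: background subtraction,
    connected-component labeling, then per-blob feature extraction.
    Frames are 2-D uint8 grayscale arrays; markers appear DARKER than
    the IR-lit background, so we look for a positive (bg - frame) difference."""
    diff = background.astype(int) - frame.astype(int)  # dark blobs -> positive
    mask = diff > thresh                               # (2) background subtraction

    # (3) simple 4-connected flood-fill labeling
    labels = np.zeros(mask.shape, dtype=int)
    next_label = 0
    for sy, sx in zip(*np.nonzero(mask)):
        if labels[sy, sx]:
            continue
        next_label += 1
        stack = [(sy, sx)]
        while stack:
            y, x = stack.pop()
            if not (0 <= y < mask.shape[0] and 0 <= x < mask.shape[1]):
                continue
            if not mask[y, x] or labels[y, x]:
                continue
            labels[y, x] = next_label
            stack += [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]

    # (4) feature parameters: area and centroid of each blob
    blobs = []
    for lbl in range(1, next_label + 1):
        ys, xs = np.nonzero(labels == lbl)
        if len(ys) >= min_area:
            blobs.append({"area": len(ys),
                          "centroid": (float(ys.mean()), float(xs.mean()))})
    return blobs
```

In the real system the extracted features would be matched against the known marker shapes; a production implementation would typically use a vision library's connected-component routines rather than this explicit flood fill.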
For the future, the authors are considering use of a
more sophisticated image processing tool such as ARToolKit[14] to implement various kinds of feedback beyond simple shape information, such as object orientation and position.

Figure 10. Implementation: (a) original camera input image; (b) background subtraction image; (c) appearance of tabletop and transparent interface.
4.4.2. Use with face-to-face collaboration
applications. We have implemented applications that present geographic information to each user around the table and show related textual or visual information when a real object is placed on the tabletop. We tested three combinations of displays and physical objects:
- using an opaque physical object on an ordinary tabletop display (back-projection type),
- using an opaque physical object on the Lumisight Table,
- using a transparent interface on the Lumisight Table.
In all three cases, the display is 40 cm square, and the marker placed on the physical object is 4 cm across.
Figure 12 shows the view from two directions when
physical objects are placed on an ordinary tabletop
display and corresponding photographic or textual
information is displayed on the screen. In this case, all
of the information is displayed at the same position and
with the same orientation, so while non-verbal
modalities such as pointing are shared by the users, the
information orientation problems and the occlusion of
physical objects occur.
Figure 13 shows the case in which opaque physical
objects are used on the Lumisight Table. The
photographic and textual information is oriented for
ease of viewing and is also displayed in a position that
is easily seen, avoiding the physical objects.
Information access symmetry is preserved in this case,
but the projection of information at different positions
may prevent the sharing of non-verbal modalities.
Figure 14 shows the case in which the Lumisight
Table is used with the proposed transparent tabletop
interface as seen from two directions. The same
information is displayed at the indicated position, but is
oriented for ease of reading by each user and the
objects do not obstruct the screen.
In this work, we compared the transparent interface with opaque objects only on this simple application. However, various other situations are conceivable, depending on the type of application and the function of the interface. We therefore plan to examine the effectiveness of the transparent interface through user experiments in each situation.
5. Conclusion and Future Work
We have proposed a transparent tabletop interface
for use with the Lumisight Table. The interface is
implemented with a material that is invisible to the user
but blocks infrared light so that it can be recognized by
the camera. We described an example of user
interaction with the interface. By using the transparent
tabletop interface together with the Lumisight Table,
multiple users around the table can share non-verbal
modalities yet maintain symmetry of information
access, thus effectively satisfying two requirements that
have previously been mutually conflicting.
In future work, we plan to improve and evaluate the
transparent markers, including comparison with other
materials, and to improve the hardware as well. We are
also considering the development of a system that works with opaque materials as well as with a completely transparent interface. Additionally, we will
continue to propose and implement new types of
interaction and specific applications and to test and
verify their effectiveness.
Finally, we would like to thank Prof. Hiroshi Harashima, Daisuke Akatsuka, and Kouji Takashima for their helpful advice.
6. References
[1] Y. Kakehi, M. Iida, T. Naemura, Y. Shirai, M.
Matsushita and T. Ohguro: ``Lumisight Table:
Interactive View-Dependent Tabletop Display,’’ IEEE
Computer Graphics & Applications, vol. 25, no. 1, pp. 48-53, 2005.
[2] P. Wellner: ``The DigitalDesk calculator: Tangible
manipulation on a desk top display,’’ In Proceedings of
UIST'91, ACM Symposium on User Interface Software
Figure 11. An image on a tabletop screen
changes according to the position and the shape
of the markers.
and Technology, pp. 27-34, 1991.
[3] B. Ullmer and H. Ishii: ``The metaDESK: Models and
Prototypes for Tangible User Interfaces,’’ In
Symposium on User Interface Software and
Technology (UIST'97), pp. 223-232, 1997.
[4] J. Patten, H. Ishii, J. Hines and G. Pangaro:
``Sensetable: A Wireless Object Tracking Platform for
Tangible User Interfaces,’’ in Proceedings of the ACM
Conference on Human Factors in Computing Systems
(CHI'01), pp. 253-260, 2001.
[5] P. Steurer and M. B. Srivastava: ``System Design of
Smart Table,’’ First IEEE International Conference on
Pervasive Computing and Communications
(PerCom'03), pp. 473-480, 2003.
Figure 12. Case of using an opaque physical object on an ordinary tabletop display.
Figure 13. Case of using an opaque physical object on Lumisight Table.
Figure 14. Case of using the proposed transparent tabletop interface on Lumisight Table.
[6] J. Underkoffler and H. Ishii: ``Illuminating Light: An
Optical Design Tool with a Luminous-Tangible
Interface,’’ in Proceedings of the ACM CHI'98, pp.
542-549, 1998.
[7] J. Rekimoto and N. Matsushita: ``Perceptual Surfaces:
Towards a Human and Object Sensitive Interactive
Display,’’ in Workshop on Perceptual User Interfaces
(PUI’97), pp. 30-32, 1997.
[8] J. Rekimoto, B. Ullmer and H. Oba: ``DataTiles: A
Modular Platform for Mixed Physical and Graphical
Interactions,’’ Proceedings of CHI2001, pp.269-276,
2001.
[9] T. Uchida, T. Endo, N. Kawakami and S. Tachi:
``Research of the camouflage marker for the position
detection in Retro-reflective Projection Technology,’’
VRSJ the 9th Annual Conference, pp. 147-150, 2004.
(in Japanese)
[10] Y. Nakazato, M. Kanbara and N. Yokoya: ``Discreet
markers for user localization,’’ Proceedings of 8th
IEEE International Symposium on Wearable
Computers (ISWC’04), pp. 172-173, 2004.
[11] M. Matsushita, M. Iida, T. Ohguro, Y. Shirai, Y. Kakehi,
and T. Naemura: ``Lumisight Table: A face-to-face
collaboration support system that optimizes direction of
projected information to each stakeholder,’’ In
Proceedings of CSCW2004, pp. 274-283, 2004.
[12] Y. Kakehi, M. Iida, T. Naemura, Y. Shirai, M.
Matsushita, and T. Ohguro: ``Lumisight Table:
Interactive View-Dependent Display-Table Surrounded
by Multiple Users,’’ ACM SIGGRAPH 2004 Emerging
Technologies, etech_0016, 2004.
[13] TEIJIN:``Reftel,’’
http://www.teijin.co.jp/english/about/reftel/default.htm
[14] H. Kato and M. Billinghurst: ``Marker Tracking and
HMD Calibration for a video-based Augmented Reality
Conferencing System,’’ In Proceedings of the 2nd
International Workshop on Augmented Reality (IWAR
99), pp. 85-94, 1999.