Infoticles: Information Modeling in Immersive Environments
Andrew Vande Moere
Chair of Architecture and CAAD
Swiss Federal Institute of Technology Zurich
Abstract
This paper introduces an immersive virtual reality
application that allows users to browse and explore the
contents of database systems. We have implemented a
visualization metaphor that is based upon the intrinsic
characteristics of particles, coined 'infoticles', which are
used as representations of data objects. Users are able to
interact with the dynamic, three-dimensional visualization
by manipulating forces and surfaces. These tools,
representing respectively user interests and data filters,
influence the collection of infoticles according to the rules
of Newtonian mechanics. Informational values are
expressed through the presence of both dynamic and
static characteristics such as motion, directionality, and
form. We demonstrate these principles through a prototype
that uses our university’s financial budget data.
Keywords: information visualization, virtual reality,
database, exploratory data analysis
1. Introduction
Current database technology makes it possible to store
and manage real world data in a comprehensive manner.
However, the exploration of this data is still bound to
relatively rigid interfacing methods, such as table-based or
schematic queries. Intuitive interfaces are needed that
support the process of data browsing, which is
distinguished from traditional information retrieval and
data queries and instead focuses on rapid filtering
mechanisms.
The emergence of virtual reality hardware at affordable
prices offers opportunities for novel interaction and
visualization methods in many scientific applications [8].
The unique properties of presence, spatial awareness and
stereoscopic depth of such immersive systems offer
application designers a completely novel set of
possibilities.
Simultaneously, one can observe the increasing
importance of interaction design related to user-
experience, and the appearance of characteristics such as
entertainment, pleasant feelings, or coolness, in current
data-driven visualizations. Many examples of
contemporary multimedia interfaces [14], some created
with Macromedia Flash [9][10], show the use of 3D-
imitating and graphically intensive interfaces. In fact,
these applications are real-world examples of interactive
information browsing worlds that users find pleasing to
the eye and enjoyable to work with.
Taking these phenomena into consideration, we
have tried to tightly merge visualization and interaction
techniques into a single metaphor that helps users to
visually find data patterns and exceptions through a
continuous refinement and re-evaluation process. In our
application, a collection of evolving particle systems
represents the data sets that are retrieved from a remote
database. Tools such as forces and surfaces influence the
continuous flowing of atomic objects, so that data
relationships are emphasized by dynamic and spatial
characteristics such as directionality, motion, and form.
Our current visualization application deals with
financial data retrieved from our university, and explores
the relationships between several variables such as
budgets, departments and numbers of students over time.
2. Related Work
Scatter plot representations like Starfield Displays [1]
demonstrate the use of spatially distributed points
organized in static graphs, which can be directly
manipulated through dynamic user decisions. This
technique can handle massive amounts of data
efficiently, and is used in different variations to
visualize, for example, adaptive database query streams [6],
financial documents [5] and time-varying storm simulations
[7] in immersive systems. Most of these approaches use some kind of
Euclidean positioning mechanism that translates numeric
data values into static spatial coordinates.
Figure 1. Two data streams (circles: money, students)
are affected by two forces (squares: departments D-
ARCH, D-CHEM) and are filtered by a surface (D-
ARCH). D-ARCH infoticles cluster around the force, D-
CHEM infoticles bounce back from the filter, while all
other infoticles are unaffected by this specific spatial
setting.
Particle systems are a general technique within the
field of computer graphics for creating a wide range of
visually complex effects [13]. A group of particles can
even convey complex behavior [17] when combined with
phenomena like external forces or internal relationships.
The use of force placement and spring-embedded
algorithms is a widely investigated topic in the field of
information visualization. These techniques are capable of
generating so-called undirected graphs [4], point clouds,
3D landscapes [3], and blobby forms [15] and are used in
virtual reality applications, such as Q-SPACE [12] and
VR-VIBE [2].
Typically, these applications require a certain amount
of dedicated pre-computation of a static virtual world,
leaving users with a limited set of interaction possibilities.
Due to the performance constraints of the complex spatial
organization methods, time dependent changes and direct
user influences are limited, so that the resulting
procedurally generated structures are most often read in a
static state.
3. Visualization metaphor using particles
In our visualization environment, each emitter
corresponds to a unique database table, and each single
particle relates, conceptually as well as programmatically,
to a specific data object (e.g. a single student). This
translation technique treats every single data value (e.g.
a student, $10, etc.) equally and results in an
identical ‘particle’ representation.
Figure 2. Large circular regions surrounding the icons
enable easy selection and manipulation by users in
the immersive environment.
Notably, numeric data values retrieved from the
database rows are first expanded into a corresponding set of
particles (e.g. the value of 1000 students of department D-
ARCH is translated into 1000 unique infoticles of
department D-ARCH), so that the resulting visualization
enables a visual interpretation of e.g. proportionality.
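As an illustration, this row-to-particle expansion can be sketched as follows. The class and field names (Infoticle, "department", "students") are hypothetical and not the paper's actual schema or implementation, which was built on SGI OpenGL Performer:

```python
# Hypothetical sketch: expanding database rows into one particle per data unit,
# so cluster sizes stay visually proportional to the underlying values.
from dataclasses import dataclass
import itertools

_ids = itertools.count()  # unique id per infoticle

@dataclass
class Infoticle:
    uid: int
    attribute: str                       # e.g. department "D-ARCH"
    kind: str                            # e.g. "student" or "money"
    position: tuple = (0.0, 0.0, 0.0)    # spatial state, updated by the simulation

def rows_to_infoticles(rows):
    """Each numeric value becomes that many identical particles."""
    particles = []
    for row in rows:
        for _ in range(row["students"]):
            particles.append(Infoticle(next(_ids), row["department"], "student"))
    return particles

rows = [{"department": "D-ARCH", "students": 1000},
        {"department": "D-CHEM", "students": 250}]
particles = rows_to_infoticles(rows)
# 1000 D-ARCH infoticles vs. 250 D-CHEM infoticles: a 4:1 visual proportion
```

The key design point is that proportionality is encoded in particle counts rather than in any coordinate axis, matching the paper's axis-free mapping.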
We have coined these particles 'infoticles', as their
appearance and behavior are determined by the informational
values they contain. The main physical characteristics of
an infoticle, such as speed, direction and lifespan, are
intrinsically time-dependent. Seen as a group, infoticles
can evolve over time and may exhibit complex behavior.
Data sources for this metaphor can be not only
static databases, but also live streams of time-varying
informational values. As data values are mapped onto
infoticles with a limited lifespan, fresh data enters the
world while older infoticles fade off, and are removed
from the scene. Consequently, users can observe the
changes in the data stream and repeat the time sequences.
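A minimal sketch of such a lifespan mechanism follows; the linear fade-out and all parameter values are illustrative assumptions, as the paper does not specify the fading model:

```python
# Illustrative sketch: infoticles with finite lifespans, so streamed data
# enters the scene, fades out, and is finally removed.
class StreamedInfoticle:
    def __init__(self, value, lifespan=10.0):
        self.value = value
        self.age = 0.0
        self.lifespan = lifespan

    def tick(self, dt):
        self.age += dt

    @property
    def alive(self):
        return self.age < self.lifespan

    @property
    def opacity(self):
        # fade out linearly over the last 20% of the lifespan (assumption)
        remaining = max(0.0, self.lifespan - self.age)
        return min(1.0, remaining / (0.2 * self.lifespan))

def step(world, dt):
    """Advance all infoticles; expired ones leave the scene."""
    for p in world:
        p.tick(dt)
    return [p for p in world if p.alive]
```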
Next to the typical particle lines, which offer a rapid
and non-occluding visualization method, we are currently
implementing different visual representation methods
such as texture blending rendering [Fig. 3], and are
considering the use of implicit surface modeling [15][19].
3.1 Tools
We have paid special attention to developing an
intuitive user interface that neither breaks the three-
dimensional illusion nor occludes the visualization with
text-based menus, sliders or other widget elements. Users
control the dynamic information visualization solely
through a set of modeling tools that influence the
infoticles according to the laws of Newtonian mechanics.
Figure 3. A collection of infoticles exposing more
detailed information as users approach. Here, the
infoticles are rendered using a texture blending
method.
These tools are placed inside the three-dimensional
scene and thus determine the spatial layout. They consist
of attracting/repulsing forces and boundary surfaces,
which represent respectively user interests and data-
filtering queries. Each of these tools contains a specific
data attribute (e.g. department D-ARCH), so that it solely
affects those infoticles that possess an equal data value.
This means that infoticles (representing e.g. money,
students) carrying a certain value (e.g. department D-
ARCH) are attracted by forces or let through by surface
filters with the same data value. All infoticles with
different data values are unaffected by these forces, but
bounce back from those surfaces [Fig. 1].
In practice, forces cause specific subsets of particles to
cluster or move in specific directions of interest.
Properly positioned filters divide the workspace into user-
specified regions of data objects that contain a common
value. These features are implemented so that users can
enforce data filtering and clustering by combining filters
and forces in spatial constellations [Fig. 2, 4]. Ultimately,
users should be able to ‘model’ their personalized spatial
configuration inside the information environment.
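The attribute-matched behavior of forces and filters could be sketched as follows. The force model (normalized attraction toward a point) and the axis-aligned filter planes are illustrative assumptions, not the paper's implementation:

```python
# Hedged sketch of one Newtonian update step: forces only act on infoticles
# carrying a matching data attribute; filter surfaces let matching infoticles
# pass and bounce all others back (the Figure 1 behavior).
import math

def apply_tools(p, forces, filters, dt, mass=1.0):
    """p: dict with 'pos', 'vel' (3-vectors as lists) and 'attr' (data value)."""
    acc = [0.0, 0.0, 0.0]
    for f in forces:                       # f: {'pos', 'strength', 'attr'}
        if f["attr"] != p["attr"]:
            continue                       # forces ignore non-matching infoticles
        dx = [f["pos"][i] - p["pos"][i] for i in range(3)]
        r = math.sqrt(sum(d * d for d in dx)) or 1e-9
        for i in range(3):                 # unit direction scaled by strength/mass
            acc[i] += f["strength"] * dx[i] / (r * mass)
    p["vel"] = [p["vel"][i] + acc[i] * dt for i in range(3)]
    new_pos = [p["pos"][i] + p["vel"][i] * dt for i in range(3)]
    for s in filters:                      # s: axis-aligned plane {'axis', 'offset', 'attr'}
        a = s["axis"]
        crossed = (p["pos"][a] - s["offset"]) * (new_pos[a] - s["offset"]) < 0
        if crossed and s["attr"] != p["attr"]:
            p["vel"][a] = -p["vel"][a]     # non-matching infoticles bounce back
            new_pos = [p["pos"][i] + p["vel"][i] * dt for i in range(3)]
    p["pos"] = new_pos
```

Because both tools test the same data attribute, placing a force behind a filter with the same value reproduces the spatial sorting described above: matching infoticles pass through and cluster, all others are reflected.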
3.2 Interface
Abstract icons represent the forces and infoticle
sources, while filters are depicted by rectangular surfaces.
Large circular selectable regions surround the small icons,
so that they become easy to select and manipulate inside
the immersive environment. A cursor that is steered by
pointing a six-dimensional VR mouse currently serves as
the primary input mechanism. Furthermore, users are able
to navigate around as well as move and rotate all elements
inside the environment.
Figure 4. A user immersed in an infoticle application,
using a pair of six-dimensional mice as input devices.
Users are able to select and interact with a more
detailed subset of information [Fig. 3] through a user-
gaze LOD (Level of Detail) mechanism, which is
controlled by a direction sensor on the user’s head. In
practice, this means that infoticles that are both nearby
and in the visual range of a user become selectable and
reveal their detailed data attributes in the form of text
labels.
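One plausible form of such a gaze-and-distance test is sketched below; the threshold values and function names are illustrative assumptions, not taken from the paper:

```python
# Hypothetical gaze-LOD test: an infoticle reveals its text label only when it
# is both near the user and inside the gaze cone reported by the head sensor.
import math

def label_visible(ticle_pos, head_pos, gaze_dir,
                  max_dist=2.0, max_angle_deg=30.0):
    # vector from the user's head to the infoticle
    to_ticle = [t - h for t, h in zip(ticle_pos, head_pos)]
    dist = math.sqrt(sum(c * c for c in to_ticle))
    if dist > max_dist or dist == 0.0:
        return False                      # too far away (or coincident)
    # angle between the gaze direction and the direction to the infoticle
    norm_g = math.sqrt(sum(c * c for c in gaze_dir))
    cos_a = sum(t * g for t, g in zip(to_ticle, gaze_dir)) / (dist * norm_g)
    return cos_a >= math.cos(math.radians(max_angle_deg))
```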
Our virtual world is built at human scale,
with filters appearing as large as normal doors, providing
users with intuitive ways of orientation.
Moreover, infoticles are not mapped onto coordinate axes.
Instead, they expose meaning through characteristics such
as their relative distances and proximity to forces and
filters; the proportionality of clusters (e.g. in relation to
other clusters in size, amount and position, and also internally
as proportions of colors, data values, etc.); the natural,
spatial solution paths that the infoticles follow; and the
sudden changes of direction and speed when users adapt
the environment. We expect users to experience less
disorientation and fewer interpretation problems than with
ordinary three-dimensional Cartesian mapping
visualizations, as the need to recognize the visualization
context in relation to the center of the visualization, or to
detect the directions of the graph axes, disappears.
3.3 Analysis
The moving, animated infoticle systems show the
resulting tendencies of user queries: as users drag or
change forces, infoticles regroup or cluster in new constellations.
Simple hand movements act like weak wind blows, as
users try to ‘grasp’ inside the visualized data clouds or
wave away those infoticles that occlude their view. By
setting the forces and surfaces in a unique user-defined
constellation, different flocks of infoticles will
dynamically move, change direction or merge in several
groupings, unveiling the proportionality and amounts of
certain data attributes. Although the snapshots and figures
might look visually complex, research as well as our own
hands-on experience has shown that stereo viewing is a
powerful pre-attentive feature [11] that increases the size
of a graph that can be understood [18].
The aspect of motion, which is intrinsically connected
to the concept of particle systems, makes the application
react in real time. The continuous motion cue
generated by the infoticles or by the navigation of users is
also reinforced by the powerful stereoscopic aspects of the
virtual reality hardware, which expose the relative and
informational depth values of the infoticles and tools.
Additionally, as all infoticles are influenced by the same
set of tools, the motion also exposes the aspect of history
in the form of volumetric tentacles and directional paths.
Once ‘frozen’, the clusters of infoticles can be read as a
single object (data overview) with unique clusters or
bulges towards forces or distinctive direction changes
nearby filtering surfaces. When users approach these
infoticles, the contents of the atomic entities (database
content fields, attributes, etc.) are revealed.
Consequently, the data analysis can be made in many
meaningful ways: from different distances and moments in
time and with the system in a static or a dynamic state.
In our current application, we have decided that the
data sets will stream out in a time-ordered way, so that
users can also observe changes in relative position or size
of infoticle groups over a certain period of time. However,
no analysis can be done in an ‘absolute’ quantitative way:
there is always a reasonable possibility that some
infoticles flew out of view or were otherwise affected
during the dynamic modeling process itself. We are
assessing this problem by continuously repeating streams
of equal data sets, giving users the chance to learn from
their previous modeling actions, so the spatial
constellation of tools can be adapted accordingly.
It is also an interesting question whether users
are capable of reading and interpreting the visualization
representations as expected. Early experiences, however,
have shown that as our modeling tools have an immediate
effect upon the visual representation, their meaning and
goal, and thus the resulting interpretation of the
visualization, needs little explanation or reasoning.
4. Implementation
We have implemented, for testing and evaluation
purposes, a set of prototypes of our proposed infoticle
system. These applications were programmed on top of
the SGI OpenGL Performer programming interface.
Without considering performance optimization, we are
currently able to handle and render about 12,000
particles/data objects simultaneously at an acceptable
frame rate. Test runs are being made in the early prototype
of the blue-c [16] virtual reality theater, a tele-immersive
virtual reality environment. We currently use a pair of
Ascension ‘Flock of Birds’ six-dimensional mice as crude
but effective input devices. Simultaneously, we have
implemented an application middleware framework that
enables bi-directional data transfers between a MySQL
database, the virtual environment and several Internet
clients. The implementation of this system framework was
kept as generic as possible, so that different kinds of data
can be submitted and subsequently be visualized.
5. Future Work
We view these early experiments as very promising
and are exploring numerous applications for the infoticle
metaphor. We will also perform an evaluation of the used
visualization and interaction methods.
Currently, most individual infoticle characteristics, such
as speed, mass and lifespan, are defined globally so as to
keep the visualization easy to comprehend, and are not
individually mapped to the connected data
object. Consequently, we would like to develop further
conceptual models that map meaningful values onto these
infoticle variables.
We look forward to other scenarios, such as
visualizations of real-time stock exchange or news
agency data streams. We want to further develop the
infoticle metaphor and analyze how several forces can be
meaningfully combined or which kinds of conceptual data
queries can be made. Additional research will have to
show whether this technique can serve as a general way of
three-dimensional data browsing or data mining in
immersive systems, capable of dealing with different sorts
of data and dynamic user queries. Further development
will be needed to deal with larger amounts of real-world
time-varying data sets. Additionally, we will investigate
how user annotations that track the progress and
discoveries of the data exploration can be recorded back
into the database. We will also experiment with different
visual representations and interaction mechanisms of the
particles and modeling tools.
6. Conclusion
This paper presented an overview of the concept of
infoticles, an interaction and visualization metaphor that
supports the exploration of large amounts of data in
immersive virtual environments, and explained its
interfacing and data translation principles. We have shown
how this technique is highly interactive, as users are
provided with absolute creative power over the
visualization. By offering a limited set of powerful
manipulation tools such as forces and boundary surfaces,
users are able to filter and query data in a direct and visual
way. Information values are exposed through both static
and dynamic characteristics of the resulting infoticle
clusters and forms.
The infoticle metaphor uses some capabilities of
immersive virtual reality, by providing users with a
human-scale visualization environment, which has
intuitive, three-dimensional interfacing methods.
Furthermore, the visualization exploits stereoscopic
depth cues to help users perceive the occurrence and meaning
of the animated changes, relative distances and directional
moves of the infoticles.
7. Acknowledgements
We would like to thank Prof. Gerhard Schmitt and
Prof. Maia Engeli for their continuous support and useful
comments, and Martin Naef and Kuk-Hwan Mieusset for
their programming help.
8. References
[1] C. Ahlberg and B. Shneiderman, Visual Information
Seeking: Tight Coupling of Dynamic Query Filters with
Starfield Displays, Proc. CHI '94, April 1994, pp. 313-317.
[2] S. Benford, D. Snowdon, C. Greenhalgh, and R. Ingram,
VR-VIBE: A Virtual Environment for Co-operative
Information Retrieval, Eurographics'95, August 1995, pp.
349-360.
[3] M. Chalmers, Using A Landscape Metaphor to Represent a
Corpus of Documents, Proc. European Conference on
Spatial Information Theory, Elba, September 1993.
[4] P. Eades, A Heuristic for Graph Drawing, Congressus
Numerantium, No. 42, pp. 149-160.
[5] D.S. Ebert, C. Shaw, A. Zwa, E. L. Miller, and D. A.
Roberts, Minimally-immersive Interactive Volumetric
Information Visualization, Proc. of the IEEE Symposium
on Information Visualization, October 1996.
[6] J.M. Hellerstein, R. Avnur, A. Chou, C. Hidber, C. Olston,
V. Raman, T. Roth, and P.J. Haas, Interactive Data
Analysis: The Control Project, IEEE Computer, August
1999.
[7] V. Jaswal, CAVEvis: Distributed Real-Time Visualization
of Time-Varying Scalar and Vector Fields Using the
CAVE Virtual Reality Theater, Proc. IEEE Visualization
'97, 1997.
[8] J. Leigh, A. Johnson, T. DeFanti, S. Bailey, and R.
Grossman, A Tele-Immersive Environment for
Collaborative Exploratory Analysis of Massive Data Sets,
ASCI 99, June 1999, pp. 3-9.
[9] Macromedia: http://www.macromedia.com
[10] Moccu: http://www.moccu.com
[11] K. Nakayama and G. Silverman, Serial and Parallel
Processing of Visual Feature Conjunctions, Nature 320,
1986, pp. 264-265.
[12] S. Pettifer, J. Cook, and J. Mariani, Towards Real-Time
Interactive Visualisation in Virtual Environments: A Case
Study of Q-SPACE, Proc. International Conference on
Virtual Reality 2001, Laval, May 2001.
[13] W. T. Reeves, Particle Systems – A Technique for
Modeling a Class of Fuzzy Objects, Computer Graphics,
Vol. 17, No. 3, 1983, pp. 359-376.
[14] Thinkmap: http://www.thinkmap.com
[15] T. C. Sprenger, R. Brunella, and M. H. Gross, H-BLOB: A
Hierarchical Visual Clustering Method Using Implicit
Surfaces, Proc. IEEE Visualization 2000, October 2000.
[16] O. G. Staadt, A. Kunz, M. Meier, and H. Gross, The blue-
c: Integrating Real Humans into a Networked Immersive
Environment, Proc. of ACM Collaborative Environments,
September 2000, pp. 201-202.
[17] D. Tonnesen, Particle Systems for Artistic Expression,
Proc. of Subtle Technologies Conference, Toronto, May
2001, pp. 17-20.
[18] C. Ware and G. Franck, Evaluating Stereo and Motion
Cues for Visualising Information Nets in Three
Dimensions, ACM Transactions on Graphics, Vol. 15, No.
2, April 1996, pp. 121-140.
[19] A. P. Witkin and P.S. Heckbert, Using Particles to Sample
and Control Implicit Surfaces, Computer Graphics (Proc.
SIGGRAPH '94), Vol. 28, 1994.