TRANSCRIPT
An Information-Theoretic Framework for
Flow Visualization
Lijie Xu, Teng-Yok Lee, & Han-Wei Shen
The Ohio State University
Introduction: Visualization vs. Communication

A visualization algorithm can be viewed as a communication channel: the data is the input, and the visualization is the output.

The effectiveness of a visualization algorithm depends on the amount of information that can be transmitted.
Introduction: Information-theoretic Visualization Framework

The framework compares the information in the data against the information in the visualization. As long as more information can be shown, the visualization algorithm keeps refining its output; once no more information can be shown, it stops.

Visualization should be information-dependent: e.g., in flow visualization, the flow near the salient flow features shouldn't be missed.
Outline

1. Realization of this visualization framework for static flow
   a. Measurement of information in the data (vector field)
   b. Measurement of information in the visualization (streamlines)
   c. Information comparison between data and visualization
2. An information-aware streamline placement algorithm
   • Place streamlines to highlight salient flow features
   • Automatically stop when no more information can be shown
Review: Information and Entropy

Information theory quantitatively measures the amount of information contained in a data source. Model the data source as a random variable X with outcomes x1, x2, x3, x4, where p(xi) is the probability that X = xi.

• Minimal entropy: only one possible outcome — less information
• Maximal entropy: all outcomes equally probable — more information

Entropy of X: H(X) = -∑i p(xi) log2 p(xi)
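The entropy formula above can be computed directly from a histogram of outcome counts. A minimal sketch (the four-outcome example follows the slide; the function itself is generic):

```python
import numpy as np

def entropy(counts):
    """Shannon entropy H(X) = -sum_i p(x_i) log2 p(x_i), in bits."""
    p = np.asarray(counts, dtype=float)
    p = p / p.sum()              # normalize counts to a probability distribution
    nz = p[p > 0]                # by convention, 0 * log2(0) = 0
    return float(-np.sum(nz * np.log2(nz)))

# Minimal entropy: only one possible outcome.
print(entropy([1, 0, 0, 0]))   # 0.0
# Maximal entropy: four equally probable outcomes.
print(entropy([1, 1, 1, 1]))   # 2.0
```

Note how the entropy grows with the number of equally likely outcomes, matching the minimal/maximal cases on the slide.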
Information in Vector Fields

• Concept
  • Treat the vector field as a data source that generates vector orientations as outcomes
  • The more diverse the vector orientations, the more information is contained in the vector field
• Measurement
  • Estimate the distribution of the vector orientations (a polar histogram)
  • Compute the entropy of this distribution as the measurement
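The measurement above can be sketched as follows: bin the vector angles into a polar histogram and take its entropy. The bin count of 60 is an illustrative choice, not necessarily the paper's:

```python
import numpy as np

def orientation_entropy(vx, vy, bins=60):
    """Entropy (bits) of the vector-orientation distribution of a 2D field.

    Angles are binned into a polar histogram over [-pi, pi); the entropy of
    that histogram measures the information in the field.
    """
    theta = np.arctan2(vy, vx).ravel()
    hist, _ = np.histogram(theta, bins=bins, range=(-np.pi, np.pi))
    p = hist / hist.sum()
    nz = p[p > 0]
    return float(-np.sum(nz * np.log2(nz)))

# A uniform field (a single orientation) carries less information than a
# vortex, whose orientations span the full circle.
y, x = np.mgrid[-1:1:32j, -1:1:32j]
uniform = orientation_entropy(np.ones_like(x), np.zeros_like(y))
vortex = orientation_entropy(-y, x)
assert uniform < vortex
```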
Information in Vector Fields (cont'd)
Entropy Field and Seeding

Measure the entropy within each point's neighborhood. In the resulting entropy field, a higher value means more information in the corresponding region.

Entropy-based seeding: place streamlines in regions with high entropy.
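A per-point entropy field can be sketched by sliding a window over the field; the square window shape, radius, and bin count below are illustrative assumptions, not the paper's exact parameters:

```python
import numpy as np

def entropy_field(vx, vy, radius=4, bins=16):
    """Entropy of the vector orientations inside each point's neighborhood.

    Uses a square (2*radius+1)^2 window clipped at the field boundary.
    Higher output values mean more information in that region.
    """
    theta = np.arctan2(vy, vx)
    h, w = theta.shape
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            win = theta[max(0, i - radius):i + radius + 1,
                        max(0, j - radius):j + radius + 1]
            hist, _ = np.histogram(win, bins=bins, range=(-np.pi, np.pi))
            p = hist / hist.sum()
            nz = p[p > 0]
            out[i, j] = -np.sum(nz * np.log2(nz))
    return out
```

For a vortex, the field peaks near the critical point at the center, which is exactly where entropy-based seeding would place streamlines first.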
Information in Streamlines

What kind of information is represented by the streamlines?

• Given information: estimate the vectors along the streamlines from their trajectories
• Reconstructed information: synthesize the vectors in the empty regions, yielding a synthesized vector field
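One simple way to synthesize vectors in the empty regions is to copy the tangent of the nearest streamline sample; this nearest-neighbor fill-in is an illustrative stand-in, and the paper's actual reconstruction scheme may differ:

```python
import numpy as np

def reconstruct_field(streamline_pts, streamline_dirs, grid_pts):
    """Synthesize a vector at every query point from the nearest streamline
    sample (nearest-neighbor reconstruction, an illustrative choice).

    streamline_pts:  (n, 2) sample positions along all traced streamlines
    streamline_dirs: (n, 2) unit tangents at those samples
    grid_pts:        (m, 2) query positions covering the whole domain
    """
    pts = np.asarray(streamline_pts, dtype=float)
    dirs = np.asarray(streamline_dirs, dtype=float)
    out = np.empty((len(grid_pts), 2))
    for k, q in enumerate(np.asarray(grid_pts, dtype=float)):
        nearest = np.argmin(np.linalg.norm(pts - q, axis=1))
        out[k] = dirs[nearest]
    return out
```

The synthesized field Y is what gets compared against the original field X in the next step.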
Information Comparison between Data and Visualization

Let X be the vector field and Y the streamlines. The conditional entropy H(X|Y) measures the information in X not represented by Y.

An effective visualization should represent most of the information in the data, i.e., H(X|Y) should be small.
Conditional Entropy and Joint Entropy

H(X|Y) = H(X, Y) − H(Y)

where X is the input vector field and Y is the vector field reconstructed from the streamlines:

• H(Y): entropy of the distribution of only the vectors Y, over all points
• H(X, Y): entropy of the joint distribution of both vectors X and Y, over all points
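The identity H(X|Y) = H(X, Y) − H(Y) can be estimated directly from a joint histogram of the two fields' orientations. The bin count is an illustrative choice:

```python
import numpy as np

def _entropy(counts):
    """Entropy (bits) of a histogram of counts."""
    p = counts / counts.sum()
    nz = p[p > 0]
    return float(-np.sum(nz * np.log2(nz)))

def conditional_entropy(theta_x, theta_y, bins=16):
    """H(X|Y) = H(X, Y) - H(Y), estimated from binned angle pairs.

    theta_x: orientations of the original field X at each point
    theta_y: orientations of the reconstructed field Y at the same points
    """
    rng = (-np.pi, np.pi)
    joint, _, _ = np.histogram2d(theta_x.ravel(), theta_y.ravel(),
                                 bins=bins, range=(rng, rng))
    h_xy = _entropy(joint)              # joint entropy H(X, Y)
    h_y = _entropy(joint.sum(axis=0))   # marginal entropy H(Y)
    return h_xy - h_y
```

If the streamlines reconstruct the field perfectly (Y = X), H(X|Y) drops to zero; an uninformative Y leaves H(X|Y) near H(X).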
Conditional Entropy Field and Seeding

Measure the under-represented information in each region: given the current streamlines, compute a conditional entropy field.

Conditional-entropy-based seeding: place more seeds in regions with more under-represented information.
Removal of Redundant Streamlines

• Remove streamlines that cannot reduce the conditional entropy
• Computing the conditional entropy per streamline is slow, so use an approximation:
  • Discard a streamline if all of its points are too close to existing streamlines
  • Use a smaller distance threshold in regions with higher entropy

(Figure: the initially placed seeds, before and after the removal of redundant streamlines.)
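The approximation above can be sketched as a distance test with an entropy-modulated threshold. The scaling `base_dist / (1 + entropy)` is a hypothetical choice for illustration, not the paper's formula:

```python
import numpy as np

def is_redundant(candidate, existing_pts, entropy_at, base_dist=5.0):
    """Discard a candidate streamline if all of its points lie too close to
    already-accepted streamlines.

    candidate:    list of (x, y) sample points on the candidate streamline
    existing_pts: list of (x, y) sample points on all accepted streamlines
    entropy_at:   callable mapping a point to the local entropy there
    """
    if len(existing_pts) == 0:
        return False
    existing = np.asarray(existing_pts, dtype=float)
    for p in candidate:
        # Distance from this sample point to the nearest accepted point.
        d = np.min(np.linalg.norm(existing - np.asarray(p, float), axis=1))
        # Higher entropy shrinks the threshold, so high-information regions
        # tolerate denser streamlines.
        thresh = base_dist / (1.0 + entropy_at(p))
        if d > thresh:
            return False   # at least one point adds new coverage
    return True            # every point is redundant
```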
Information-theoretic Streamline Placement Algorithm

The framework instantiated for streamline placement:

1. Data: the original vector field X; its information is measured by the entropy field.
2. Iteration 1: entropy-based seeding places the initial streamlines.
3. Visualization: the streamlines, whose information is the synthesized vector field Y.
4. Iterations > 1: conditional-entropy-based seeding adds streamlines where information is still under-represented.
5. Stop when H(X|Y) cannot be further reduced — no more information can be shown.
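The iteration above can be sketched as a driver loop. The helper callables (`seed_from_entropy`, `seed_from_cond_entropy`, `trace`, `reconstruct`, `cond_entropy`) are hypothetical stand-ins for the components described on the previous slides, and the convergence tolerance is an assumption:

```python
def place_streamlines(field, seed_from_entropy, seed_from_cond_entropy,
                      trace, reconstruct, cond_entropy,
                      eps=1e-3, max_iter=20):
    """Skeleton of the iterative information-theoretic placement loop.

    Iteration 1 seeds from the entropy field; later iterations seed from the
    conditional-entropy field.  Stops when H(X|Y) no longer decreases by
    more than eps, i.e., no more information can be shown.
    """
    lines = []
    prev = float("inf")
    for it in range(max_iter):
        seeds = (seed_from_entropy(field) if it == 0
                 else seed_from_cond_entropy(field, lines))
        lines += [trace(field, s) for s in seeds]
        # Compare the data X against the synthesized field Y.
        h = cond_entropy(field, reconstruct(field, lines))
        if prev - h < eps:   # converged: stop adding streamlines
            break
        prev = h
    return lines
```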
Result: 2D Vector Fields

• 1st iteration: entropy-based seeding
• 2nd iteration: conditional-entropy-based seeding
• The algorithm terminates when the conditional entropy converges
Comparison with Distance-based Methods

• Evenly-spaced streamline placement (Jobard & Lefer, 1997) [1]
• Farthest-point streamline placement (Mebarki et al., 2005) [2]
• Information-theoretic streamline placement: emphasizes regions with more information

1. B. Jobard and W. Lefer. Creating evenly-spaced streamlines of arbitrary density. In Eurographics Workshop on Visualization in Scientific Computing, 1997.
2. A. Mebarki et al. Farthest point seeding for efficient placement of streamlines. In IEEE Visualization '05, 2005.
Comparison with Distance-based Methods (cont'd)

• The first 54 streamlines placed by each method: entropy-based, evenly-spaced (Jobard & Lefer, 1997) [1], and farthest-point (Mebarki et al., 2005) [2] placement
• Distance-based methods need more streamlines to capture the salient flow features
Result: 3D Vector Fields

• Entropy field
• 1st iteration: entropy-based seeding
• 2nd iteration: conditional-entropy-based seeding
• The conditional entropy converges over the iterations
Entropy-based Occlusion Reduction

• Reduce the occlusion of the salient features in a 3D flow field
• Use a transfer function to map the entropy of each streamline fragment to its opacity
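Such a transfer function can be sketched as a simple linear ramp; the ramp shape and opacity bounds below are illustrative choices, not the paper's exact transfer function:

```python
import numpy as np

def entropy_to_opacity(frag_entropy, e_min, e_max,
                       alpha_min=0.05, alpha_max=1.0):
    """Map a streamline fragment's entropy to its rendering opacity.

    Low-entropy fragments become nearly transparent so they do not occlude
    the salient (high-entropy) features behind them.
    """
    t = np.clip((frag_entropy - e_min) / (e_max - e_min), 0.0, 1.0)
    return alpha_min + t * (alpha_max - alpha_min)
```

In a renderer, each fragment's alpha would be set from its local entropy before compositing, letting high-information structures show through the surrounding flow.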
Conclusion

• Contributions
  • An information-theoretic visualization framework
  • Information measurement for flow field visualization
  • An information-aware streamline placement algorithm
• Future work
  • Time-varying flow fields
  • Other types of visualization: isosurfaces, direct volume rendering
Acknowledgement

• The authors would like to thank Torsten Möller, Carrie Stein, and the anonymous reviewers for their comments.
• This work was supported in part by:
  • NSF ITR Grant ACI-0325934
  • NSF RI Grant CNS-0403342
  • NSF Career Award CCF-0346883
  • DOE SciDAC grant DE-FC02-06ER25779
• Questions?