Figures

Creative Commons License

The original figures from the book are released here under the Creative Commons Attribution 4.0 International License (CC BY 4.0). When you use a figure in your own work, please cite the book appropriately, for example like this: Christian Tominski and Heidrun Schumann. "Interactive Visual Data Analysis". AK Peters Visualization Series, CRC Press, 2020.
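For convenience, the citation above corresponds to a BibTeX entry along these lines (the entry key is illustrative):

@book{TominskiSchumann2020,
  author    = {Christian Tominski and Heidrun Schumann},
  title     = {Interactive Visual Data Analysis},
  series    = {AK Peters Visualization Series},
  publisher = {CRC Press},
  year      = {2020}
}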

Some figures were created by colleagues from the visualization community and are used in the book under the CC BY 4.0 license. These figures are clearly marked below by naming the original author in the figure caption. When you use one of these figures, please include an appropriate attribution to the original author.

Figure Download

You can download an archive with all figures from the book.

Figure Search

Search for figures based on their captions, or browse, view, and download individual figures below.

Figure Details

Chapter 1

Figure 1.1a
Original book figure.
Node-link diagram visualizing a graph's structure and attributes. (a) Plain structure.
Figure 1.1b
Original book figure.
Node-link diagram visualizing a graph's structure and attributes. (b) Encoding degree via color.
Figure 1.1c
Original book figure.
Node-link diagram visualizing a graph's structure and attributes. (c) Encoding degree via color and size.
Figure 1.1d
Original book figure.
Node-link diagram visualizing a graph's structure and attributes. (d) Encoding weight via line width.
Figure 1.2a
Original book figure.
Dynamic filtering to focus on relevant parts of a climate network. (a) Full graph with 6,816 nodes and 116,470 edges.
Figure 1.2b
Original book figure.
Dynamic filtering to focus on relevant parts of a climate network. (b) Filtered graph with 938 nodes and 5,324 edges.
Figure 1.3
Original book figure.
Multiple-views visualization of a climate network.
Figure 1.4a
Original book figure.
Guidance provides assistance during data navigation. (a) Where should I go next?
Figure 1.4b
Original book figure.
Guidance provides assistance during data navigation. (b) Visual cues hint at candidates!
Figure 1.6
Original book figure. Icons by icons8.com.
Chapter structure of this book.

Chapter 2

Figure 2.1a
Original book figure.
Visualization of life satisfaction in Germany. (a) Failing visual representation.
Figure 2.1b
Original book figure.
Visualization of life satisfaction in Germany. (b) Succeeding visual representation.
Figure 2.2
Original book figure.
Functional dependency between the reference space and the attribute space. For a point in the reference space, there is exactly one point in the attribute space.
Figure 2.3
Original book figure.
Key terms for characterizing data.
Figure 2.4
Original book figure.
The scope defines to what extent an observation is valid. (a) Global scope. (b) Local scope. (c) Point scope.
Figure 2.5a
Original book figure.
Visualizing the local scope of measurements of water quality. (a) Data points only.
Figure 2.5b
Original book figure.
Visualizing the local scope of measurements of water quality. (b) Voronoi partitioning.
Figure 2.5c
Original book figure.
Visualizing the local scope of measurements of water quality. (c) Shepard interpolation.
Figure 2.6
Original book figure.
Meta-data to characterize the data to be analyzed.
Figure 2.7
Original book figure.
Four-set Venn diagram illustrating different classes of data.
Figure 2.8a
Original book figure.
Different visual encodings to support different tasks. (a) Coloring suited to identifying values.
Figure 2.8b
Original book figure.
Different visual encodings to support different tasks. (b) Coloring suited to locating extrema.
Figure 2.9
Original book figure.
A target as defined by projection and selection.
Figure 2.10
Original book figure.
Goals, questions, targets, and means characterize analysis tasks.
Figure 2.11a
Original book figure.
Meteorological measurements over the course of the year. (a) Hours of sunshine.
Figure 2.11b
Original book figure.
Meteorological measurements over the course of the year. (b) Air temperature.
Figure 2.11c
Original book figure.
Meteorological measurements over the course of the year. (c) Cloud cover.
Figure 2.12a
Original book figure.
Histograms of the distribution of cloud cover values. (a) Cloud cover value frequencies in Rostock.
Figure 2.12b
Original book figure.
Histograms of the distribution of cloud cover values. (b) Cloud cover value frequencies in Dresden.
Figure 2.13
Original book figure. Adapted from Munzner (2009).
Four nested levels outline how to design interactive visual data analysis solutions.
Figure 2.14
Original book figure. Adapted from Chi (2000).
Data-oriented and graphics-oriented stages and operators.
Figure 2.15
Original book figure.
A network of operators describes the data's transformation through several stages from data values to image data.
Figure 2.16
Original book figure. Adapted from Sacha et al. (2014).
Knowledge generation model.

Chapter 3

Figure 3.1a
Original book figure.
Illustration of the effect of different visual representations. (a) Line plot.
Figure 3.1b
Original book figure.
Illustration of the effect of different visual representations. (b) Spiral plot (cycle length 32 days).
Figure 3.1c
Original book figure.
Illustration of the effect of different visual representations. (c) Spiral plot (cycle length 28 days).
Figure 3.2
Original book figure.
Visual variables.
Figure 3.3
Original book figure. Adapted from Mackinlay and Winslow (2014).
Effectiveness ranking of visual variables.
Figure 3.4
Original book figure.
Color maps for identifying and locating values and classes.
Figure 3.5a
Original book figure. Adapted from bl.ocks.org/mbostock/4063318.
Applying the color maps from Figure 3.4 to temperature data. (a) Color coding for identification tasks.
Figure 3.5b
Original book figure. Adapted from bl.ocks.org/mbostock/4063318.
Applying the color maps from Figure 3.4 to temperature data. (b) Color coding for location tasks.
Figure 3.6
Original book figure.
Basic mapping of a data variable onto a visual variable.
Figure 3.7
Original book figure.
Enhanced data-dependent visual mapping. (a) Value range expansion. (b) Logarithmic mapping. (c) Box-Whisker mapping.
Figure 3.8
Original book figure.
Combined color map for comparing two data variables.
Figure 3.9
Original book figure. Adapted from John et al. (2008).
Two-tone coloring explained.
Figure 3.10
Original book figure.
Two-tone visualization of 20 years of daily temperatures.
Figure 3.11a
Original book figure.
Visual encoding via position, area, color, and shape. (a) Mapping to position.
Figure 3.11b
Original book figure.
Visual encoding via position, area, color, and shape. (b) Mapping to area.
Figure 3.11c
Original book figure.
Visual encoding via position, area, color, and shape. (c) Mapping to color.
Figure 3.11d
Original book figure.
Visual encoding via position, area, color, and shape. (d) Mapping to shape.
Figure 3.12
By Martin Röhlig. Licensed under CC BY.
Terrain visualization with overview+detail.
Figure 3.13a
Original book figure.
Illustration of focus+context for a table-based visualization of the Iris flower dataset. Focused rows are magnified to accommodate labels. (a) Regular visualization.
Figure 3.13b
Original book figure.
Illustration of focus+context for a table-based visualization of the Iris flower dataset. Focused rows are magnified to accommodate labels. (b) Focus+context distortion of rows.
Figure 3.14
Original book figure.
Multiple coordinated views for analyzing multivariate data.
Figure 3.15
Original book figure.
Two-tone colored table-based visualization of the Cars dataset.
Figure 3.16
Original book figure.
Table Lens with textual labels for focused data tuples.
Figure 3.17
Original book figure.
9 × 9 scatter plot matrix of meteorological data. Color is used to ease the recognition of data variables.
Figure 3.18
Original book figure.
Visualization with polylines across parallel and star-shaped axes. (a) Parallel coordinates plot. (b) Radar chart.
Figure 3.19
Original book figure.
Visual patterns between pairs of parallel axes.
Figure 3.20
Original book figure.
The same data as in Figure 3.19 visualized as scatter plots.
Figure 3.21
Original book figure.
Parallel coordinates with histograms showing demographic data.
Figure 3.22
Original book figure.
Examples of classic glyphs for visualization. (a) Autoglyph. (b) Stick figures. (c) Chernoff faces.
Figure 3.23
Original book figure.
Corn glyph for representing six ordinal data values.
Figure 3.25
By Thomas Nocke. Licensed under CC BY.
Pixel-based visualization of daily values of six meteorological attributes collected for more than a hundred years in the city of Potsdam.
Figure 3.27
Original book figure.
Mosaic plot visualizing the survival of passengers of the Titanic.
Figure 3.29
Original book figure. Adapted from Allen (1983).
Temporal relations for time instants and time intervals. (a) Instant relations. (b) Interval relations.
Figure 3.30
Original book figure.
Aspects of time to be considered when visualizing temporal data.
Figure 3.31
Original book figure. Adapted from Steiner (1998).
Types of data with references to time.
Figure 3.32
Original book figure.
Small multiples visualization of the number of people diagnosed with problems of the upper respiratory tract.
Figure 3.33
Original book figure.
Time Wheel visualization of human health data.
Figure 3.34
Original book figure. Adapted from Qiang et al. (2012).
Visual representation of intervals using the triangular model. (a) Standard interval representation. (b) Intervals in the triangular model.
Figure 3.35
Original book figure. Adapted from bl.ocks.org/mbostock/4060954.
Stream graph with randomly generated data.
Figure 3.36
Original book figure.
Spiral display with four years of daily temperatures in Rostock.
Figure 3.37
Original book figure.
Comparison of a regular line plot (top) and a cycle plot (bottom).
Figure 3.39
Original book figure. Adapted from Aigner et al. (2005).
Visualization of uncertain time intervals for planning purposes.
Figure 3.40
Original book figure.
The TimeViz Browser provides an illustrated overview of more than a hundred techniques for visualizing time and temporal data.
Figure 3.41
Original book figure.
Spatial regions at different scales: Federal state, districts, zip-code regions.
Figure 3.42
Original book figure.
Geo-spatial data can refer to different spatial units. (a) Points. (b) Lines. (c) Areas. (d) Volumes.
Figure 3.43a
Original book figure. Adapted from bl.ocks.org/mbostock/3757119.
Different map projections preserve different spatial properties. (a) Equirectangular.
Figure 3.43b
Original book figure. Adapted from bl.ocks.org/mbostock/3757132.
Different map projections preserve different spatial properties. (b) Mercator.
Figure 3.43c
Original book figure. Adapted from bl.ocks.org/mbostock/4479477.
Different map projections preserve different spatial properties. (c) Natural Earth.
Figure 3.44
Original book figure. Adapted with permission by Nicolas Belmonte from philogb.github.io/page/myriahedral/.
Myriahedral projections of the Earth.
Figure 3.45a
Original book figure. Generated with mapshaper.org.
Map representation at different resolutions. (a) Original data 100%.
Figure 3.45b
Original book figure. Generated with mapshaper.org.
Map representation at different resolutions. (b) Data reduced to 50%.
Figure 3.45c
Original book figure. Generated with mapshaper.org.
Map representation at different resolutions. (c) Data reduced to 10%.
Figure 3.46
By Steve Dübel. Licensed under CC BY.
Terrain rendering of the Puget Sound region.
Figure 3.47a
Original book figure.
Reducing overlap of stream graph glyphs on a map. (a) Straightforward placement.
Figure 3.47b
Original book figure.
Reducing overlap of stream graph glyphs on a map. (b) Overlap-optimized placement.
Figure 3.48a
Original book figure.
Indirect visualization of geo-spatial data. (a) Univariate choropleth map plus multivariate parallel coordinates plot.
Figure 3.48b
By Thomas Butkiewicz. Licensed under CC BY.
Indirect visualization of geo-spatial data. (b) Flexible visualization via probes.
Figure 3.49
Original book figure. Adapted from Dübel et al. (2014).
Systematic view of 2D and 3D representations of geo-spatial data and geographic space.
Figure 3.50
By Steve Dübel. Licensed under CC BY.
3D visualization of the trajectory of an aircraft approaching Sion airport.
Figure 3.51
By Martin Röhlig. Licensed under CC BY.
Visibility widgets help users identify obscured information in 3D geo-visualizations.
Figure 3.52
Original book figure.
Visualization of movement trajectories. (a) 2D map with 2D paths. (b) 2D map with stacked 3D bands.
Figure 3.53a
Original book figure.
Visualizing spatio-temporal data using 3D glyphs on a 2D map. (a) Pencil glyphs for linear trends.
Figure 3.53b
Original book figure.
Visualizing spatio-temporal data using 3D glyphs on a 2D map. (b) Helix glyphs for cyclic patterns.
Figure 3.54
Original book figure.
Creation of a non-planar 3D slice through space-time. (a) Topological path. (b) Geometrical path. (c) Extruded slice.
Figure 3.55
Original book figure.
Spatio-temporal visualization along a wall on a map.
Figure 3.56
Original book figure. Adapted from Hadlak et al. (2015).
Facets to be considered when visualizing graphs. (a) Structure. (b) Attributes. (c) Time. (d) Space. (e) Groups.
Figure 3.57
Original book figure. Created with gephi.org.
Node-link diagram of flights connecting US airports.
Figure 3.58
Original book figure. Adapted from van Ham et al. (2009).
Node-link diagram and corresponding matrix representation.
Figure 3.59
Original book figure. Adapted from Shen and Ma (2007).
Graph patterns represented as matrices and node-link diagrams.
Figure 3.60a
Original book figure. Adapted from bost.ocks.org/mike/miserables/.
Differently ordered matrix representations of the same data. (a) Ordered by name.
Figure 3.60b
Original book figure. Adapted from bost.ocks.org/mike/miserables/.
Differently ordered matrix representations of the same data. (b) Ordered by frequency.
Figure 3.60c
Original book figure. Adapted from bost.ocks.org/mike/miserables/.
Differently ordered matrix representations of the same data. (c) Ordered by community.
Figure 3.61
Original book figure. Adapted from Schulz et al. (2011).
Node-link representation compared to implicit representations. (a) Node-link. (b) Inclusion. (c) Overlap. (d) Adjacency.
Figure 3.62
Original book figure. Software courtesy of Steffen Hadlak.
Implicit visualizations of a classification hierarchy. (a) Squarified Treemap. (b) Information pyramids. (c) 3D sunburst.
Figure 3.63
Original book figure. Adapted with permission by Jean-Daniel Fekete from www.aviz.fr/Research/Nodetrix.
NodeTrix visualization of a co-author network.
Figure 3.64
Original book figure. Adapted from Hadlak et al. (2015).
Spatial composition of graph facets in a single representation. (a) Juxtaposition. (b) Superimposition. (c) Nesting.
Figure 3.65
Original book figure.
Map with tree layouts embedded into selected regions.
Figure 3.66
Original book figure.
Three map layers visualize the data of three consecutive time steps. Spikes and lines indicate differences between the layers.
Figure 3.67
Original book figure.
Multiple coordinated views for multi-faceted graph visualization.

Chapter 4

Figure 4.1
Original book figure. Adapted from Norman (2013).
Stages of action forming the action cycle.
Figure 4.2
Original book figure. Adapted from Cooper et al. (2007).
Conceptual separation across different models.
Figure 4.3
Original book figure.
Spatial separation between the graphical user interface (right) and the visual representation in the main view (center).
Figure 4.4
Original book figure. Adapted from Buxton (1990).
Three-state model of graphical input.
Figure 4.5
Original book figure.
Model-view-controller pattern.
Figure 4.6a
Original book figure.
Rubberband selection for marking multiple data elements. (a) Selection by inclusion.
Figure 4.6b
Original book figure.
Rubberband selection for marking multiple data elements. (b) Selection by intersection.
Figure 4.7
Original book figure.
Four steps of selecting multiple trajectories using modifier keys.
Figure 4.8a
Original book figure.
Selecting segments based on their color, which represents speed. (a) How to select slow-speed segments?
Figure 4.8b
Original book figure.
Selecting segments based on their color, which represents speed. (b) Select via an interactive legend!
Figure 4.9a
Original book figure.
Selecting nodes based on their data attributes. (a) How to select high-degree nodes?
Figure 4.9b
Original book figure.
Selecting nodes based on their data attributes. (b) Select via slider handles!
Figure 4.10
Original book figure.
Strategies for visual emphasis of relevant data and attenuation of less-relevant data.
Figure 4.11a
Original book figure.
Visual feedback for selections in visual representations of graphs. (a) Original visual representation.
Figure 4.11b
Original book figure.
Visual feedback for selections in visual representations of graphs. (b) Highlighting by encircling nodes.
Figure 4.11c
Original book figure.
Visual feedback for selections in visual representations of graphs. (c) Dimming nodes and edges.
Figure 4.11d
Original book figure.
Visual feedback for selections in visual representations of graphs. (d) Filtering nodes and edges.
Figure 4.12a
Original book figure.
Brushing a range (red) of an axis for binary and fuzzy selection. (a) Binary selection in the range.
Figure 4.12b
Original book figure.
Brushing a range (red) of an axis for binary and fuzzy selection. (b) Fuzzy selection beyond the range.
Figure 4.13
Original book figure.
Brushing & linking in a multiple-views graph visualization.
Figure 4.14
Original book figure.
Using V^= to scale relevant data to fit the display space.
Figure 4.15
Original book figure.
Illustration of the conceptual model of zoomable interfaces.
Figure 4.16
Original book figure.
Geometric zooming of a node-link visualization.
Figure 4.17
Original book figure.
Semantically enhanced zooming of a node-link visualization.
Figure 4.18
Original book figure.
A zoomable graph visualization and its controls.
Figure 4.19
Original book figure. Adapted from Gladisch et al. (2013).
Visual cues for pointing to off-screen data.
Figure 4.20
Original book figure.
Bring & go with radar view and proxy nodes.
Figure 4.21
Original book figure.
Viewports during an animated transition.
Figure 4.22
Original book figure.
Snapshots of the viewport animation outlined in Figure 4.21.
Figure 4.23
Original book figure.
A range slider controls the time period mapped to a spiral visualization of the daily average temperature for the city of Rostock.
Figure 4.24a
Original book figure.
Adjusting a time period at different input scales. (a) Regular range slider with global scale.
Figure 4.24b
Original book figure.
Adjusting a time period at different input scales. (b) A slider with increased precision is dynamically added to the range slider.
Figure 4.25a
Original book figure.
Integrated sliders for nD pan and zoom in the TimeWheel. (a) Plain non-interactive axes.
Figure 4.25b
Original book figure.
Integrated sliders for nD pan and zoom in the TimeWheel. (b) Axes with integrated sliders.
Figure 4.26
Original book figure.
Integrated range slider for per-axis pan and zoom.
Figure 4.27
Original book figure. Adapted from Tominski et al. (2017).
Schema of an interactive lens.
Figure 4.28
Original book figure. Adapted from Tominski et al. (2017).
Model of a lens pipeline attached to a standard visualization.
Figure 4.29a
Original book figure.
Fundamental effects of lens functions. (a) Alteration.
Figure 4.29b
Original book figure.
Fundamental effects of lens functions. (b) Suppression.
Figure 4.29c
Original book figure.
Fundamental effects of lens functions. (c) Enrichment.
Figure 4.30a
Original book figure.
Lenses with different shapes and orientation. (a) Circular.
Figure 4.30b
Original book figure.
Lenses with different shapes and orientation. (b) Rectangular orientable.
Figure 4.30c
Original book figure.
Lenses with different shapes and orientation. (c) Content-adaptive shape.
Figure 4.31a
Original book figure.
Direct manipulation of lenses. (a) Move and resize.
Figure 4.31b
Original book figure.
Direct manipulation of lenses. (b) Adjust parameters.
Figure 4.32a
Original book figure.
Magnifying details in a map visualization with a fish-eye lens. (a) Regular map visualization.
Figure 4.32b
Original book figure.
Magnifying details in a map visualization with a fish-eye lens. (b) Details magnified with a fish-eye lens.
Figure 4.33a
Original book figure.
Graph lenses for exploring structural relationships. (a) Node-link diagram without lens.
Figure 4.33b
Original book figure.
Graph lenses for exploring structural relationships. (b) Local-edge lens.
Figure 4.33c
Original book figure.
Graph lenses for exploring structural relationships. (c) Bring-neighbors lens.
Figure 4.33d
Original book figure.
Graph lenses for exploring structural relationships. (d) Composite lens.
Figure 4.34
Original book figure.
A lens to query temporal characteristics of movement data.
Figure 4.35
Original book figure.
Orthogonal node-link diagram of a biological network.
Figure 4.36a
Original book figure. Adapted from Gladisch et al. (2014).
Editing using the edit lens. (a) Place lens to insert.
Figure 4.36b
Original book figure. Adapted from Gladisch et al. (2014).
Editing using the edit lens. (b) Adjust lens to update.
Figure 4.36c
Original book figure. Adapted from Gladisch et al. (2014).
Editing using the edit lens. (c) Flick lens to delete.
Figure 4.37
Original book figure.
Visual designs for comparison tasks.
Figure 4.38a
Original book figure.
Natural behavior of people comparing information on paper. (a) Side-by-side.
Figure 4.38b
Original book figure.
Natural behavior of people comparing information on paper. (b) Shine-through.
Figure 4.38c
Original book figure.
Natural behavior of people comparing information on paper. (c) Folding.
Figure 4.39
Original book figure.
Creating sub-views for comparison. A red frame indicates where the left sub-view has been detached from the main view.
Figure 4.40
Original book figure. Adapted from Tominski et al. (2012).
Overview of natural interaction techniques for visual comparison.
Figure 4.41
Original book figure. Adapted from Tominski et al. (2012).
Folding geometry.
Figure 4.42
Original book figure. Adapted from Tominski et al. (2012).
Information-rich, natural, and occlusion-free folding styles.
Figure 4.43
Original book figure.
Relocating selected regions to form a ring for easier comparison. The map background has been desaturated for the purpose of illustration.
Figure 4.44
Original book figure.
Visualizing future appointments with a SpiraClock.
Figure 4.45
Original book figure. Adapted from Spindler et al. (2010).
Extended interaction with tangible views.
Figure 4.46
Original book figure.
A circular tangible lens for magnification purposes.
Figure 4.47
Original book figure.
Comparing matrix data with two tangible views.
Figure 4.48a
Original book figure.
Tangible views for different visualizations. (a) Parallel coordinates.
Figure 4.48b
Original book figure.
Tangible views for different visualizations. (b) Node-link diagram.
Figure 4.48c
Original book figure.
Tangible views for different visualizations. (c) Space-time cube.
Figure 4.50a
Original book figure. Adapted from Lehmann et al. (2011).
Interacting by physical movements. (a) Zones for global control.
Figure 4.50b
Original book figure. Adapted from Lehmann et al. (2011).
Interacting by physical movements. (b) Gaze plus lens for local control.

Chapter 5

Figure 5.2
Original book figure.
Procedure of determining visual density in parallel coordinates. (a) Binned axes. (b) Bin map. (c) Categorization. (d) Drawing.
Figure 5.4
By Helwig Hauser. Licensed under CC BY.
User-selected data in red compared against general trends in green.
Figure 5.5
Original book figure. Inspired by Lhuillier et al. (2017).
General procedure of bundling.
Figure 5.6a
Original book figure. Adapted from bl.ocks.org/mbostock/4341134.
Visualization of dependencies in a software class hierarchy. (a) Conventional representation.
Figure 5.6b
Original book figure. Adapted from bl.ocks.org/mbostock/4341134.
Visualization of dependencies in a software class hierarchy. (b) Hierarchical edge bundling (Holten, 2006).
Figure 5.7
Original book figure.
Illustration of Furnas' DoI function. (a) Distances to focus node. (b) Adding node levels. (c) Extracted subtrees.
Figure 5.8
By Steffen Hadlak. Licensed under CC BY.
Five years of a dynamic co-author network extracted from DBLP.
Figure 5.11
Original book figure. Adapted from Abello et al. (2014).
Glyph design for representing collapsed subgraphs.
Figure 5.12a
By Christian Eichner. Licensed under CC BY.
Feature specification with an interactive interface. (a) Specification of thresholds.
Figure 5.12b
By Christian Eichner. Licensed under CC BY.
Feature specification with an interactive interface. (b) Formal feature definition.
Figure 5.13a
By Christian Eichner. Licensed under CC BY.
Comparison of direct volume visualization of the particle concentration of one protein and ellipsoid-based visualization of features representing high concentrations of two different proteins. (a) Volume visualization.
Figure 5.13b
By Christian Eichner. Licensed under CC BY.
Comparison of direct volume visualization of the particle concentration of one protein and ellipsoid-based visualization of features representing high concentrations of two different proteins. (b) Ellipsoid visualization.
Figure 5.14a
By Christian Eichner. Licensed under CC BY.
Visualizing the temporal evolution of features. (a) Node-link diagram of the event graph.
Figure 5.14b
By Christian Eichner. Licensed under CC BY.
Visualizing the temporal evolution of features. (b) Two features at two time steps.
Figure 5.15
By Martin Röhlig. Licensed under CC BY.
Drawing thousands of trajectories of chaotic movements for multiple simulations leads to cluttered and indecipherable visual representations.
Figure 5.16
By Martin Röhlig. Licensed under CC BY.
From entities (dot marks) to density map (gray-scale image) to regions (colored image).
Figure 5.17
Original book figure.
2D movement reduced to 1D time series of feature values.
Figure 5.18
Original book figure.
Feature visualization with overview and detail views.
Figure 5.19
By Martin Röhlig. Licensed under CC BY.
Visualization of parameter settings, feature values, and detail information for selected parts of the data. (a) Parameter settings as gray-scale matrix; (b) Feature values over time as color-coded matrix; (c) Chart with selected time series; (d) Trajectory view with selected trajectory segments.
Figure 5.20
By Martin Röhlig. Licensed under CC BY.
Visualizing the parameter dependency of average group size.
Figure 5.21
By Martin Röhlig. Licensed under CC BY.
Visualization of the average distance to free proteins reveals the sweeping effect. (a) Density map. (b) Average protein-raft distance.
Figure 5.22
By Martin Luboschik. Licensed under CC BY.
Visualization of a time series with more than 1.7 million time points, where each black pixel represents about 1,000 data points.
Figure 5.23
Original book figure.
One and the same time series at two different scales.
Figure 5.24
Original book figure.
Unifying the sample points of two successive scales by mapping and interpolation.
Figure 5.25
Original book figure.
Computing the absolute value difference (AVD) and the slope sign difference (SSD) between two successive data scales.
Figure 5.26
Original book figure.
Aggregation of data differences with maximum aggregation for the absolute value difference (AVD) and average aggregation for the slope sign difference (SSD) function.
Figure 5.27
Original book figure.
Visualizing aggregated differences along with the actual data.
Figure 5.28
By Martin Luboschik. Licensed under CC BY.
Time series plot of the simulation outcome and corresponding multi-scale difference bands of SSD and AVD with local color mode.
Figure 5.29
By Martin Luboschik. Licensed under CC BY.
Studying the details of the middle peak-notch-spike pattern from Figure 5.28.
Figure 5.30
Original book figure.
A decision tree classifying enterprises according to their sales.
Figure 5.31
Original book figure. Adapted from Röhlig et al. (2015).
Illustration of activity recognition based on parameter-dependent algorithms that learn from some ground truth.
Figure 5.32
By Martin Röhlig. Licensed under CC BY.
Visualization of parameter-dependent classification outcomes for activity recognition. (a) Parameter configurations; (b) Recognized activities; (c) Color legend; (d) Ground truth; (e) and (f) Stacked histograms with aggregated information.
Figure 5.33
By Martin Röhlig. Licensed under CC BY.
Highlighting the incorrectly classified time steps in red.
Figure 5.34
Original book figure. Adapted from Glaßer (2014) with permission of Sylvia Saalfeld.
Illustration of clustering strategies. (a) Input data. (b) K-means. (c) Ward's method. (d) STING. (e) DBSCAN. (f) Dendrogram.
Figure 5.35
Original book figure. Software courtesy of Alexander Lex.
Comparison of clusters generated with hierarchical clustering, k-means, and affinity propagation. (a) Hierarchical clustering. (b) K-means clustering. (c) Affinity propagation.
Figure 5.36a
Original book figure.
Applying hybrid SOM-based clustering to sort rows in a table lens visualization. (a) Unordered rows before clustering.
Figure 5.36b
Original book figure.
Applying hybrid SOM-based clustering to sort rows in a table lens visualization. (b) Ordered rows after clustering.
Figure 5.37
Original book figure. Adapted from Hadlak (2014).
Two-step procedure of clustering nodes based on their attributes. First, nodes with similar attribute behavior are grouped. Second, groups are refined based on connected components. (a) Initial grouping. (b) Refined clusters.
Figure 5.39
Original book figure. Adapted from Hadlak (2014).
Structure-based clustering. (a) Initial set of states and transitions based on the sequence of graphs Gi in DG; (b) Hierarchical grouping of states and transitions based on similar structures.
Figure 5.40
By Steffen Hadlak. Licensed under CC BY.
Example of a state-transition graph characterizing an underlying dynamic graph.
Figure 5.41
By Steffen Hadlak. Licensed under CC BY.
Analyzing a wireless network supported by structure-based clustering. (a) State-transition graph; (b) Average link quality of selected state; (c) Representative graph structure of selected state.
Figure 5.42
Original book figure.
Reducing dimensionality with principal component analysis. (a) Original data space. (b) Principal component space. (c) Reduced space.
Figure 5.45
Original book figure.
Overview of automatic computational methods to support interactive visual data analysis by reducing the complexity of the data and their visual representations.

Chapter 6

Figure 6.2
By Christian Eichner. Licensed under CC BY.
Graphical interface for creating and controlling multi-display visual analysis presentations, including content pool (top), logical presentation structure (middle), and preview (bottom).
Figure 6.3
Original book figure.
Basic two-step procedure of the automatic view layout.
Figure 6.5
Original book figure.
Changing the content of views by launching visualization software.
Figure 6.6
By Christian Eichner. Licensed under CC BY.
Graphical interface for analysis coordination and meta-analysis, including filtering support (top), analysis history graph (middle), and timeline with undo and redo buttons (bottom).
Figure 6.7
Original book figure.
A knowledge gap exists when the target or the path is unknown.
Figure 6.8
Original book figure. Adapted from van Wijk (2006).
Adapted variant of van Wijk's model of visualization. Artifacts as boxes: data [D], specifications [S], visualization images [I], and user knowledge [K]. Functions as circles: analytic and visual transformation (T), perception and cognition (P), and interactive exploration (E).
Figure 6.9
Original book figure. Adapted from Ceneda et al. (2017).
Conceptual model of guided interactive visual data analysis. *Added artifacts and functions: domain conventions and models [D*], history and provenance [H*], visual cues [C*], options and alternatives [O*], and guidance generation (G*).
Figure 6.10
Original book figure.
Navigation recommendations for graph visualization.
Figure 6.11
Original book figure. Adapted from Streit et al. (2012).
Tailored domain model as the basis for user guidance.
Figure 6.13
Original book figure. Adapted from van Wijk (2006).
Incremental processes highlighted in van Wijk's model of visualization.
Figure 6.14
Original book figure. Adapted from Schulz et al. (2016).
Extended notation for operators and transitions for progressive visual data analysis.
Figure 6.15
Original book figure. Adapted from Schulz et al. (2016).
Simple example of a progressive transformation pipeline.
Figure 6.16
Original book figure.
A multi-threading architecture for progressive visual data analysis.
Figure 6.17
Original book figure.
Illustration of asynchronous processing threads operating on data chunks stored in priority queues.
Figure 6.18
Original book figure.
Comparison of single-thread and multi-thread solutions. (a) Regular single-thread solution. (b) Progressive multi-thread solution.
Figure 6.19
Original book figure.
The three typical scenarios of progressive visual data analysis: progressive data processing, progressive visualization, and progressive display.
Figure 6.20a
By Marco Angelini. Licensed under CC BY.
Visualization of progressively processed data chunks of car crashes from a database with more than 370,000 entries. (a) Regular progression of chunks.
Figure 6.20b
By Marco Angelini. Licensed under CC BY.
Visualization of progressively processed data chunks of car crashes from a database with more than 370,000 entries. (b) Prioritized progression of chunks.
Figure 6.21
Original book figure.
Progressive force-directed layout of a social network with 747 nodes and 60,050 edges.
Figure 6.22
Original book figure.
Progressive visualization of a climate network with 6,816 nodes and 232,940 edges.
Figure 6.23
Original book figure. Adapted from Rosenbaum et al. (2011).
Device-dependent progressive transmission of a treemap visualization image.
Figure 6.24
By Axel Radloff-Delosea. Licensed under CC BY.
Progressive display for a dynamically defined region of interest. (a) Global view. (b) Overview+detail view. (c) Focus+context view.