
Visual Analytics


Presentation Transcript


  1. Visual Analytics Lewis F. Jones III

  2. Top 10 Observations for VA Technologies and Systems
  • Whole-Part Relationship
    • Overall view of data
  • Relationship Discovery
  • Interaction and Explorative Techniques
    • Combined Exploratory and Confirmatory Interaction
  • Multiple Datatypes
  • Temporal Views and Interactions
    • Flowcharts, timelines, etc.
  • Groupings and Outlier Identification
  • Multiple Linked Views
    • Multiple views on one display
  • Labeling
  • Reporting
    • Explain analysis of data
  • Interdisciplinary Science

  3. Initial Conditions for VA Challenges
  • Untethered to Device/Network/Interaction
    • No dependence
  • Tethered to Data/Information
    • Use of multiple types and sources
  • Indefinite or Indeterminate Data
    • Tools will judge usefulness
  • Minimized Transaction Costs
    • Needs to be fast
  • Trust
    • Security

  4. Top 10 Challenges for VA
  • Human-Information Discourse
    • “Walk-up usable” interfaces
    • Multi-device / Cross-platform
  • Collaborative Analytics
    • Not only evidential and confirmatory analytics, but also exploratory, hypothesis-driven, and predictive and proactive thinking
  • Holistic Visual Representations
    • Complete story at a glance
    • Effective labeling
    • Multi-type, multi-source data
  • Scale Independence
    • Enable reasoning over large, diverse information spaces to facilitate analytics and uncertainty refinement
  • Information Representations
    • Information synthesis (model + sensor)
    • Mathematical and semantically rich, data-preserving representations

  5. Top 10 Challenges for VA (continued)
  • Information Sharing
    • Share information securely between VA components and people with privacy-aware technologies
  • Active Information Products
    • Modifiable, reusable analytic components
  • Lightweight Software Architectures
    • Support and standards to rapidly develop VA applications and tools
  • Utility Evaluation
    • Evaluations of the utility of VA science, technology, and systems
  • Sustaining Talent Base
    • Research, design, and development continues

  6. VA Stereotypes
  • VA is adopted primarily to see and understand…
    • Massive Data
    • Complex Data
    • New Visual Paradigms
    • Hidden Insights

  7. VA Realities
  • Massive Data
    • VA is equally useful with small and large data
    • People spend substantially more time working with small data sets than massive ones
    • VA should focus on data dimensionality rather than the number of observations
  • Complex Data
    • Most important questions are simple
    • Simple questions are answered much more quickly using VA
    • Even complex questions are often best answered using simple visualizations
  • New Visual Paradigms
    • Answering sophisticated questions does not require complex visual displays
    • A sequence of simple displays works just fine
  • Hidden Insights
    • VA should put more focus on saving users time rather than finding some hidden information

  8. Goals of Analyzing Data
  • Exploring
  • Cleaning
  • Gaining Confidence
  • Summarizing
  • Pursuing Inconclusive Paths
  • Confirming Facts
  • Presenting Findings

  9. Concepts to Avoid
  • Difficult user interfaces
  • Lack of visual intelligence
  • Analytical inflexibility
  • Complicated architectures

  10. Insight
  • Spontaneous Insight – a moment of enlightenment in cognitive science; “eureka!”
    • Occurs subconsciously and isn’t a process that can be directly controlled, manipulated, or repeated
    • An event that can be experienced or had
  • Knowledge-Building Insight – an advance in knowledge or a piece of information
    • A substance that can be discovered, gained, or provided

  11. Insight
  • Visualization should promote both types
  • Four processes that lead to knowledge-building insight:
    • Provide overview
    • Adjust
    • Detect patterns
    • Match mental model
  • Deep, complex knowledge increases the possibility of spontaneous insight
  • Spontaneous insights increase the possibility of new directions for knowledge-building
  • Human learning is thus allowed to be flexible and scalable

  12. Insight Management
  • VA approaches can be challenged by large numbers of insights
  • Insight Management becomes essential
    • Insight Recording
    • Insight Association
    • Insight Retrieval
    • Insight Exchange
  • Insight Characteristics:
    • Complex, deep, qualitative, unexpected, and relevant
  • Three Basic Components of Insights (see the sketch below):
    • A set of information items
    • A specification describing how the information was gathered
    • Descriptive annotations
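To make the three components concrete, here is a minimal sketch of how an insight record could be represented. The class and field names are hypothetical illustrations, not taken from the cited insight-management systems.

```python
from dataclasses import dataclass, field
from typing import Any

@dataclass
class Insight:
    """Hypothetical record capturing the three basic components of an insight."""
    items: list[Any]             # the set of information items the insight is about
    provenance: dict[str, Any]   # specification of how the information was gathered
                                 # (e.g. data source, filters, visualization state)
    annotations: list[str] = field(default_factory=list)  # descriptive annotations

# Example: recording an insight about an outlier sales figure
example = Insight(
    items=[{"region": "NW", "sales": 1.2e6}],
    provenance={"source": "sales.csv", "filter": "year == 2009", "view": "scatterplot"},
    annotations=["NW region is an outlier relative to all other regions"],
)
```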

  13. Common Insight Management Problems
  • Requiring manual annotation
    • Time-consuming and tedious
    • Can be incomplete, imprecise, and hard to understand
  • Requiring manual relationship detection
    • Does not scale to a sense-making process with large numbers of insights, long analysis times, and multiple analysts
  • Hard to search for and reuse recorded insights
    • Different users may use different descriptive terms in an unregulated annotation process
  • Unsupported insight exchange in collaborative VA
    • Relies heavily on users to manually search and understand collaborators’ insights

  14. Insight Description Model
  • Three Components:
    • A fact extracted from analyzed data
      • Examples include outliers, patterns, and relationships
    • A mental model for evaluating the fact
      • Objective and subjective evaluations of the fact
  • Mental models are difficult to formalize due to variations amongst data sets, applications, and analysts
  • Types of facts are predictable and independent from data sets, applications, and analysts
    • Examples include value/derived value, distribution, difference, extreme, rank, category, cluster, outlier, association, trend, compound fact, and meta fact (enumerated in the sketch below)
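Because the fact types listed above are predictable and independent of any particular data set, they can be enumerated directly in code. A minimal sketch follows; the enum and the attribute table are illustrative assumptions, not the published taxonomy itself.

```python
from enum import Enum, auto

class FactType(Enum):
    """Fact types as listed on the slide; enumerable independently of any data set."""
    VALUE = auto()          # value / derived value
    DISTRIBUTION = auto()
    DIFFERENCE = auto()
    EXTREME = auto()
    RANK = auto()
    CATEGORY = auto()
    CLUSTER = auto()
    OUTLIER = auto()
    ASSOCIATION = auto()
    TREND = auto()
    COMPOUND = auto()       # compound fact
    META = auto()           # meta fact

# Hypothetical example: once a fact's type is known, the taxonomy tells the system
# which attributes to extract from the data.
REQUIRED_ATTRIBUTES = {
    FactType.OUTLIER: ["data_item", "expected_range", "deviation"],
    FactType.TREND: ["variable", "time_range", "direction"],
}
```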

  15. Fact Management Framework
  • Effectively and efficiently detect, annotate, associate, retrieve, and exchange facts using automatic or semi-automatic approaches
  • A fact taxonomy is created for categorizing facts
  • Fact taxonomies have the following criteria:
    • Completeness – cover the majority of the facts that can be discovered using the visualization tools under different conditions
    • Unambiguity – accurately and clearly distinguish fact types
    • Independence – separate from the applications and visualizations used to discover the facts
    • Utility – feasible for use in fact and insight management

  16. Fact Management Framework (continued)
  • Semi-Automatic Fact Annotation
    • Once a fact’s category is determined, the system knows what needs to be extracted from the data according to the fact taxonomy’s attributes
  • Fact Organization, Indexing, Browsing, and Retrieval
    • Use keywords in the annotations, similar to YouTube tags (see the sketch below)
  • Fact Network
    • Constructed from annotation correlations and user modifications
  • Guided Fact Discovery
    • User notification upon fact discovery
  • Fact Exchange
    • A user fills in a form to retrieve information, leaving blank the attributes to be learned
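A minimal sketch of the keyword-based organization and retrieval idea, using a simple inverted index. The data layout and function names are assumptions for illustration, not the framework's actual interface.

```python
from collections import defaultdict

# Each fact is stored with its annotation keywords, much like tags on a video site.
facts = [
    {"id": 1, "type": "outlier", "keywords": {"sales", "northwest", "2009"}},
    {"id": 2, "type": "trend",   "keywords": {"sales", "quarterly", "increase"}},
]

def build_index(facts):
    """Inverted index: keyword -> set of fact ids."""
    index = defaultdict(set)
    for fact in facts:
        for kw in fact["keywords"]:
            index[kw].add(fact["id"])
    return index

def retrieve(index, *keywords):
    """Return ids of facts annotated with all of the given keywords."""
    sets = [index.get(kw, set()) for kw in keywords]
    return set.intersection(*sets) if sets else set()

index = build_index(facts)
print(retrieve(index, "sales"))          # {1, 2}
print(retrieve(index, "sales", "2009"))  # {1}
```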

  17. Analytical Discourse
  • Process of constructing a collaborative plan by two or more agents
  • Three Structures:
    • Mechanical – segmental structure of analytical steps
    • Intentional – the way that discourse purposes relate
    • Focus of Attention
  • “SharedPlan” Theory of Collaboration
    • Knowledge of actions and the set of mental states held towards a plan
    • Intention, belief, and commitment

  18. “SharedPlan” Theory of Collaboration
  • A subplan becomes a Full SharedPlan when… (see the structural sketch below)
    • Participating agents have shared beliefs that all intend, and are committed to, the whole plan
    • All actions on the leaf nodes are basic actions
    • Parameters are already instantiated, or another Full SharedPlan is ready for identifying the parameter
  • A Partial SharedPlan is an ongoing discourse
  • A Full SharedPlan is a completed discourse
  • The root plan is all-encompassing, with subplans stemming from it identifying parameters and basic actions
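As a toy illustration of the conditions above, a plan tree could be checked recursively for "fullness". This sketch only tests the tree-shaped conditions; the mental-state notions (shared belief, intention, commitment) are reduced to hypothetical boolean flags and are not a formalization of SharedPlan theory.

```python
from dataclasses import dataclass, field

@dataclass
class Plan:
    """Hypothetical plan-tree node for illustrating the Full SharedPlan test."""
    action: str
    is_basic: bool = False                # is this leaf a basic action?
    mutually_committed: bool = False      # agents share belief/intention/commitment
    parameters_instantiated: bool = True  # parameters resolved (or resolvable by a full subplan)
    subplans: list["Plan"] = field(default_factory=list)

def is_full_shared_plan(plan: Plan) -> bool:
    """A plan counts as 'full' only if commitment holds, parameters are instantiated,
    every leaf is a basic action, and every subplan is itself full."""
    if not (plan.mutually_committed and plan.parameters_instantiated):
        return False
    if not plan.subplans:          # leaf node: must be a basic action
        return plan.is_basic
    return all(is_full_shared_plan(sp) for sp in plan.subplans)
```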

  19. Sense-Making Process
  • Four primary stages:
    • Information Generation
      • Information and procedures for searching and analyzing the information are generated from a data source
      • Collaboration: Individual work for unhindered generation of comprehensive individual perspectives on the data
    • Schematization
      • Schemas created to guide iterations of categorization
      • Collaboration: Virtual sub-groups for comparison of findings
    • Argumentation and Shifting Schemas
      • Schemas are refined; outliers are discarded or form new schemas
      • Collaboration: Face-to-face meetings for physical analytical discourse
    • Decision Making
      • Schemas guide further analysis and generate questions and answers for the task being processed
      • Collaboration: Face-to-face meetings with simultaneous physical and virtual analytical discourse (using visualization tools together to answer questions)

  20. Collaborative VA Techniques
  • Collocated Collaboration
    • Large displays and shared workspaces
  • Synchronous Distance Work
    • Real-time networked displays
  • Asynchronous Collaboration
    • Partitioning work across time and space
    • Results in higher-quality outcomes (broader discussions, more complete reports, and longer solutions) than face-to-face collaboration, due to the greater division of labor

  21. Asynchronous Collaboration Design Considerations
  • Division and Allocation of Work
    • Modularity – how work is segmented into atomic units, parallelizing work into independent tasks (modules)
    • Granularity – a measure of the cost or effort involved in performing a module’s task
    • Cost of Integration – a measure of the cost or effort involved in synthesizing the contributions of each individual module
  • Common Ground and Awareness
    • Participants must be able to see the same visual environment in order to ground each other’s actions and comments
    • Bookmarking, or sharing specific states of the visualization, could be used (see the sketch below)
  • References and Deixis
    • General – a direction
    • Definite – named entities
    • Detailed – described by attributes
    • Deictic – pointing to a specific object, group, or region (indexing)
    • Pointing – some form of vectorial reference to direct attention
    • Placing – moving an object to a region that has shared, conventional meaning
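A minimal sketch of the bookmarking idea: serialize the current visualization state so collaborators can reload it later. The field names and the JSON layout are assumptions for illustration, not any tool's actual format.

```python
import json
import time

def make_bookmark(view_state: dict, author: str, comment: str = "") -> str:
    """Serialize the current visualization state so collaborators can reload it."""
    bookmark = {
        "author": author,
        "timestamp": time.time(),
        "comment": comment,
        "view_state": view_state,  # e.g. chart type, filters, selections, camera position
    }
    return json.dumps(bookmark)

def load_bookmark(serialized: str) -> dict:
    """Recover the shared view state; applying it is left to the visualization tool."""
    return json.loads(serialized)["view_state"]

bm = make_bookmark(
    {"display": "parallel_coordinates", "filter": "year == 2009", "selection": [3, 17]},
    author="analyst_a",
    comment="Look at the cluster in dimensions 4-6",
)
print(load_bookmark(bm))
```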

  22. Asynchronous Collaboration Design Considerations (continued)
  • Incentives and Engagement
    • Monetary – material compensation such as a salary or reward
    • Hedonic – well-being or engagement experienced intrinsically
    • Social-psychological – perceived benefits such as increased status or social capital
  • Identity, Trust, and Reputation
    • A hypothesis suggested by a more trusted or reputable person will have a higher probability of being accepted as part of the group consensus
  • Group Dynamics
    • Explicit mechanisms for assisting group formation may aid collaboration
  • Consensus and Decision-Making
    • Agreement on the data to collect
    • How to organize and interpret data
    • Making decisions based upon the data

  23. Visualization Tool Design Considerations
  • Some criticize tools for being too data-centric
    • They do not help users develop concepts and understanding from the results of visual exploration
  • Tools should be separated and coordinated into the following five areas:
    • Exploration
    • Analysis
    • Synthesis
    • Evaluation
    • Presentation
  • Ease of use
  • Keyword-searchable data
  • File and information sharing

  24. VA Project Ideas
  • Use RealXtend and/or Second Life to design and create a collaborative visual analytics tool set for data from any field of research
  • Use the ideas, considerations, and guidelines listed in this PowerPoint’s research
  • Focus on ease of use and speed rather than deep and extensive automatic analysis
  • Perform data calculations outside of the virtual reality environment, focusing only on displaying the information when within the VRE
  • Implement some data analysis techniques from COS 702
  • Use concepts learned during thesis work to better the design process and implementation

  25. VA Project Features
  • Display VA data in 3D rather than flat graphs
  • Give the users options
    • Simple displays such as line graphs, bar graphs, pie charts, etc. (easier)
    • Complex displays such as radial tree maps, parallel coordinates, etc. (the real focus of the work)
  • Allow for quickly switching between display types on the fly (a series of displays); a minimal sketch follows this slide
  • Utilize the Second Life environment’s features for visual understanding, such as colors and glow effects (do not go overboard!)
    • Make sure 3D is for better understanding, not for the sake of 3D
    • This would avoid the complex creation of images using external tools
  • Tools for users to create extensive group work and history logging, visualization customization, and a tutorial / tooltips system
  • Be able to alert users to new insights and analysis as they sift through and manipulate data
  • Try to allow for plug-in support in some way
    • This is far off, but keep it in mind when designing
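For on-the-fly switching between display types, one simple approach is a registry that maps display names to render functions, so the toolset can step through a series of displays. Everything below is a hypothetical Python sketch, independent of the Second Life / RealXtend scripting layer; real renderers would emit 3D primitives in the VRE.

```python
# Registry of display types; render functions are placeholders for illustration.
RENDERERS = {}

def register(name):
    """Decorator that adds a render function to the display registry."""
    def wrapper(func):
        RENDERERS[name] = func
        return func
    return wrapper

@register("bar")
def render_bar(data):
    print("bar chart of", data)

@register("parallel_coordinates")
def render_parallel_coordinates(data):
    print("parallel coordinates of", data)

def switch_display(name, data):
    """Switch to another display type on the fly; unknown names fall back to 'bar'."""
    RENDERERS.get(name, RENDERERS["bar"])(data)

switch_display("parallel_coordinates", [[1, 2, 3], [4, 5, 6]])
switch_display("bar", {"A": 3, "B": 7})
```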

  26. VA Project Abstract Designing and implementing an asynchronous collaborative visual analytics toolset within a virtual reality environment, utilizing various three-dimensional, serial displays for efficient and effective understanding, interest, insight and analysis.

  27. Primary Sources
  • Cai, G. “Formalizing Analytical Discourse in Visual Analytics.” In Proceedings of the 2007 IEEE Symposium on Visual Analytics Science and Technology (October 30 - November 01, 2007). VAST. IEEE Computer Society, Washington, DC, 217-218.
  • Chabot, C. “Demystifying Visual Analytics.” IEEE Computer Graphics and Applications, Vol. 29, No. 2, March 2009, 84-87.
  • Chang, R., Ziemkiewicz, C., Green, T. M., and Ribarsky, W. “Defining Insight for Visual Analytics.” IEEE Computer Graphics and Applications, Vol. 29, No. 2, March 2009, 14-17.
  • Chen, Y., Yang, J., and Ribarsky, W. “Toward Effective Insight Management in Visual Analytics Systems.” In Proceedings of the 2009 IEEE Pacific Visualization Symposium (April 20 - 23, 2009). PACIFICVIS. IEEE Computer Society, Washington, DC, 49-56.
  • Ha, D., Kim, M., Wade, A., Chao, W. O., Ho, K., Kaastra, L., Fisher, B., and Dill, J. “From Tasks to Tools: A Field Study in Collaborative Visual Analytics.” In Proceedings of the 2007 IEEE Symposium on Visual Analytics Science and Technology (October 30 - November 01, 2007). VAST. IEEE Computer Society, Washington, DC, 223-224.
  • Heer, J. and Agrawala, M. “Design Considerations for Collaborative Visual Analytics.” In Proceedings of the 2007 IEEE Symposium on Visual Analytics Science and Technology (October 30 - November 01, 2007). VAST. IEEE Computer Society, Washington, DC, 171-178.
  • Robinson, A. C. “Needs Assessment for the Design of Information Synthesis Visual Analytics Tools.” In Proceedings of the 2009 13th International Conference Information Visualization (July 15 - 17, 2009). IV. IEEE Computer Society, Washington, DC, 353-360.
  • Thomas, J. and Kielman, J. “Challenges for Visual Analytics.” Foundations and Frontiers of Visual Analytics, Vol. 8, No. 4, Winter 2009, 309-314.

  28. Secondary Sources
  • Aragon, C. R., Bailey, S. J., Poon, S., Runge, K. J., and Thomas, R. C. “Sunfall: A Collaborative Visual Analytics System for Astrophysics.” In Proceedings of the 2007 IEEE Symposium on Visual Analytics Science and Technology (October 30 - November 01, 2007). VAST. IEEE Computer Society, Washington, DC, 219-220.
  • Brand, M. v., Roubtsov, S., and Serebrenik, A. “SQuAVisiT: A Flexible Tool for Visual Software Analytics.” In Proceedings of the 2009 European Conference on Software Maintenance and Reengineering (March 24 - 27, 2009). CSMR. IEEE Computer Society, Washington, DC, 331-332.
  • Goodall, J. R. and Tesone, D. R. “Visual Analytics for Network Flow Analysis.” In Proceedings of the 2009 Cybersecurity Applications & Technology Conference for Homeland Security (March 03 - 04, 2009). CATCH. IEEE Computer Society, Washington, DC, 199-204.
  • Jern, M., Rogstadius, J., Åström, T., and Ynnerman, A. “Visual Analytics Presentation Tools Applied in HTML Documents.” In Proceedings of the 2008 12th International Conference Information Visualization (July 09 - 11, 2008). IV. IEEE Computer Society, Washington, DC, 200-207.
  • Riveiro, M., Falkman, G., and Ziemke, T. “Visual Analytics for the Detection of Anomalous Maritime Behavior.” In Proceedings of the 2008 12th International Conference Information Visualization (July 09 - 11, 2008). IV. IEEE Computer Society, Washington, DC, 273-279.
  • Steed, C. A., Swan II, J. E., Jankun-Kelly, T. J., and Fitzpatrick, P. J. “Guided Analysis of Hurricane Trends Using Statistical Processes Integrated with Interactive Parallel Coordinates.” In Proceedings of the IEEE Symposium on Visual Analytics Science and Technology (October 12 - 13, 2009). VAST. IEEE Computer Society, Washington, DC, 19-26.
  • Wong, P. C., Leung, L. R., Lu, N., Scott, M. J., Mackey, P., Foote, H., Correia Jr., J., Taylor, Z. T., Xu, J., Unwin, S. D., and Sanfilippo, A. “Designing a Collaborative Visual Analytics Tool for Social and Technological Change Prediction.” IEEE Computer Graphics and Applications, Vol. 29, No. 5, September 2009, 58-68.
  • Ziegler, H., Nietzschmann, T., and Keim, D. A. “Visual Analytics on the Financial Market: Pixel-based Analysis and Comparison of Long-Term Investments.” In Proceedings of the 2008 12th International Conference Information Visualization (July 09 - 11, 2008). IV. IEEE Computer Society, Washington, DC, 287-295.
