Domain knowledge-based visualization recommendation system

Studies have long advocated including domain knowledge when building effective visualization systems: the insights and reasoning artifacts such systems produce are closer to the knowledge of domain users and the context of the data. However, most existing knowledge-based visualization applications integrate domain knowledge tailored only to a specific analytic task. Visualization recommendation systems provide different insights into a dataset by automatically selecting different views or visualizations of it. Previous work on visualization recommendation systems suggests visualizations based on different criteria: visual mappings of data attributes, preselection of user tasks and corresponding mappings, deviation-based theory, machine-learned mappings from data to visual encoding schemes, ontology mapping, and so on. However, few studies have used domain knowledge as the visualization selection criterion. When the ultimate goal is not to answer a specific question but to explore a dataset's multidimensional insights, the inclusion of domain knowledge is uncommon. Thus, although domain knowledge can be a pivotal ingredient for improving visualization interpretation, how such knowledge can be incorporated into a visualization recommendation system has not yet been sufficiently explored.

In this thesis, we explore how domain knowledge can be integrated into various stages of a visualization recommendation system. As a result of that work, we have developed a novel domain knowledge-based visualization recommendation system, using biodiversity research as our application domain. The contributions of this thesis are 1) a domain knowledge-based visualization recommendation model and 2) a system for automatic runtime generation of visual goals.

Use and reproduction:
All rights reserved