We live in the age of data, constantly confronted with the challenge of analyzing massive amounts of scientific data generated across diverse disciplines, such as biomedical imaging and meteorology. Visualizations are indispensable for the analysis of large-scale, complex data because they enable us to efficiently detect interesting data features, assess situations, and make scientific decisions. One of the major challenges faced by visualization systems is the uncertainty inherent in the input data and computational processes. The limited precision of data acquisition instruments, reduced representations of large-scale data, and computational approximations in data processing algorithms are a few prime sources of uncertainty in the visualization pipeline. These data and computational uncertainties can adversely affect the accuracy of the final visualizations, leading to incorrect analyses of the data.

In my research, I investigate how uncertainties in the data fed to visualization systems propagate to the final renderings, with the goal of improving the credibility of visualizations. The topic of uncertainty visualization has received growing attention over the past two decades and has been recognized as one of the top research challenges in the field. The conventional approach to studying uncertainty in visualizations comprises Monte Carlo sampling of the uncertain input data followed by empirical analysis of the variations in the final visualizations. My work differs from this conventional approach in that I adapt existing visualization algorithms to process uncertain input data directly, and thereby quantify uncertainty in the final visualizations in closed form.
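To make the conventional Monte Carlo approach concrete, the following is a minimal sketch (not from the text; the Gaussian noise model, grid size, and feature test are illustrative assumptions): each noisy realization of a scalar field is passed through a simple feature-extraction step (here, thresholding against an isovalue), and the per-point probability of feature presence is estimated empirically from the samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical uncertain 2D scalar field: a per-point Gaussian model
# with known mean and standard deviation (illustrative values).
mean = np.linspace(0.0, 1.0, 5 * 5).reshape(5, 5)
std = np.full((5, 5), 0.1)
isovalue = 0.5

# Monte Carlo: draw noisy realizations of the field and, for each,
# mark where the field exceeds the isovalue (a stand-in for a full
# visualization algorithm).
n_samples = 1000
exceed_counts = np.zeros_like(mean)
for _ in range(n_samples):
    sample = rng.normal(mean, std)
    exceed_counts += (sample > isovalue)

# Empirical probability, per grid point, that the feature is present.
prob = exceed_counts / n_samples
```

The empirical map `prob` converges to the true feature probability only as the number of samples grows, which is exactly the cost that a closed-form treatment avoids.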

In the scientific visualization literature, the paper describing the marching cubes algorithm for isosurface rendering is by far the most cited, and the volume ray casting algorithm is widely used for gaining insight into scientific data. Both algorithms rest on the strong assumption that the input data are known with certainty, which is rarely the case in the real world. In my research, I adapt the steps of the marching cubes and volume ray casting algorithms to process uncertain input data and to statistically quantify the variations in the final visualizations. From a mathematical perspective, I model the uncertainty in the input data using statistical representations, such as means or probability density functions, and I study how these statistical representations integrate with the steps of the visualization algorithms.
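As one illustration of what a closed-form quantity might look like in this setting, consider the edge test at the heart of marching cubes: the isosurface crosses a cell edge when exactly one endpoint value falls below the isovalue. If the two vertex values are modeled as independent Gaussians (an assumption of this sketch, not a claim about the author's method, whose names and model here are hypothetical), the crossing probability follows directly from the normal CDF, with no sampling at all.

```python
from math import erf, sqrt

def normal_cdf(x, mu, sigma):
    """CDF of a Gaussian with mean mu and standard deviation sigma."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

def edge_crossing_probability(mu1, s1, mu2, s2, isovalue):
    """Closed-form probability that an isosurface crosses the edge
    between two grid vertices whose values are modeled as independent
    Gaussians: exactly one endpoint lies below the isovalue."""
    p1_below = normal_cdf(isovalue, mu1, s1)
    p2_below = normal_cdf(isovalue, mu2, s2)
    return p1_below * (1.0 - p2_below) + (1.0 - p1_below) * p2_below
```

For example, two endpoints straddling the isovalue with small noise give a crossing probability near one, while two identical endpoints centered on the isovalue give exactly one half; the same style of reasoning extends, case by case, to the full cell configurations of marching cubes.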

In addition to adapting existing algorithms for visual analysis of uncertain data, I take a specific interest in uncertainty visualization for domain-specific problems. One example of my work is statistical uncertainty analysis in the domain of deep brain stimulation (DBS). DBS is a neuromodulation therapy for treating patients with movement disorders, e.g., Parkinson’s disease (PD). I have developed an interactive visualization system for analyzing positional uncertainty in DBS electrodes implanted in a patient’s head, with the goal of reducing treatment time for PD patients.