Cognitive Systems Engineering (CSE) has a history built, in part, on leveraging representational design to improve system performance. Traditionally, however, CSE has focused on visual representation of "monitored" processes -- active, ongoing, and interconnected activities occurring in a system of interest and monitored by human controllers -- such as operating a power plant, controlling a petrochemical process, or monitoring a patient's respiration. Comparatively little attention has been directed toward representing processes in which direct monitoring and control are not central functions. Of particular interest is the Macrocognitive Sensemaking process of information analysis, especially as currently practiced in the U.S. Intelligence Community.
In addressing how to effectively represent such processes, this research begins with a more fundamental question: "What defines high-quality information analysis?" It takes a decidedly narrow view of this issue, positing a strict definition of analysis quality: simply, that the quality of an analysis process is, or perhaps should be, defined by the quality of the output it produces. To the extent that process measures predictive of product quality can be identified, those measures provide the base data needed to create representations that offer outcome-diagnostic insight into analysis activities.
The relationship between measures of analytical process and assessments of analytical products was investigated via a two-part, laboratory-based, Staged World study of web-based information analysis. The purpose of the study was to identify meaningful relationships between process and product measures -- in particular, automatically collectable indicators of analytic quality -- in order to enable the discovery of domain-independent techniques for representing information analysis activity. The study built on prior research with professional intelligence analysts that identified analytical strategies predictive of performance, as well as on studies showing that providing insight into analytical activities is a potentially powerful mechanism for influencing perceptions of rigor.
In the first part of the study, participants used a computer workstation configured with a standard web browser to perform one or more web-based analysis tasks under varied scenario conditions. In the second part, a separate pool of participants used a modified Q-sort method to assess the relative quality of the analysis products generated in the first part. These independent assessments were combined to yield a measure of concordance, a critical component of the overall quality score assigned to each product. The data were then analyzed by comparing the process measures collected during the first part of the study with the product assessments generated during the second.
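The abstract does not specify how concordance among the independent quality assessments was computed; one standard statistic for agreement among multiple rankings of the same items is Kendall's coefficient of concordance (W), which ranges from 0 (no agreement) to 1 (perfect agreement). The sketch below is purely illustrative -- the function and the example rankings are hypothetical, not drawn from the study:

```python
def kendalls_w(rankings):
    """Kendall's coefficient of concordance for m assessors ranking n items.

    rankings: list of m lists, each assigning ranks 1..n (no ties) to n
    analysis products. Returns W in [0, 1].
    """
    m = len(rankings)        # number of assessors
    n = len(rankings[0])     # number of analysis products
    # Total rank each product received across all assessors.
    totals = [sum(r[i] for r in rankings) for i in range(n)]
    mean_total = sum(totals) / n
    # Sum of squared deviations of rank totals from their mean.
    s = sum((t - mean_total) ** 2 for t in totals)
    return 12 * s / (m ** 2 * (n ** 3 - n))

# Three hypothetical assessors ranking four products in close agreement:
print(kendalls_w([[1, 2, 3, 4], [1, 2, 4, 3], [2, 1, 3, 4]]))  # high W
```

A high W would indicate that the independent Q-sort assessments largely agree, lending weight to the combined product-quality measure.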
This study found that automatically collected measures of information analysis activity both predicted performance and differentiated between analytical strategies. In addition, the results suggest applications for the research method beyond the scope of this study, as well as implications for visualizing information analysis activity that extend across domains. Finally, the findings offer several novel insights that should refine how CSE conceptualizes and measures expertise in naturalistic information analysis specifically, and in Macrocognitive Sensemaking more broadly.