
All fields are free text except year.

No need to annotate with your own words if you are not sure about the meaning. Cut and paste first to let your subconscious mind internalize and synthesize, then paraphrase later when writing the article. Keep your (annotator's) name in the filename. It's easier to keep the year and interesting pages on paper and put them back into the sheet later, instead of going back and forth between the article and Excel, unless you have 2 monitors. Finally, when pasting blocks of text from a PDF file, paste into a text editor first (TextWrangler, NotePad++, Jedit or gVim are all good for this), then substitute "\r\n" with " " to prevent the newlines from causing problems in Excel.

Columns: year | author & title & link | Interesting Pages | Abstract | Analytic Problems Posed | Solutions and Methods Posed
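The same substitution can also be scripted instead of done by hand in the editor. A minimal Python sketch, assuming a hypothetical input file named extract.txt (the file names and encoding are illustrative, not part of the workflow above):

# Replace Windows-style newlines ("\r\n") with spaces before pasting into Excel.
# newline="" keeps the "\r\n" sequences intact so the replace actually sees them.
with open("extract.txt", "r", encoding="utf-8", newline="") as f:
    text = f.read()

cleaned = text.replace("\r\n", " ")  # lone "\n" could be handled the same way if needed

with open("extract_clean.txt", "w", encoding="utf-8") as f:
    f.write(cleaned)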

2011
Flood, M., Jagadish, H.V., Kyle, A., Olken, F., and Raschid, L. (2011). Using Data for Systemic Financial Risk Management. In Proceedings of CIDR (2011), 144-147.
Interesting Pages: 145 (problem summary), 146-147 (research challenges in detail)
Abstract: The recent financial collapse has laid bare the inadequacies of the infrastructure supporting the US financial system. Technical challenges around large-scale data systems interact with significant economic forces involving innovation, transparency, confidentiality, complexity, and organizational change, to create a very difficult problem. The post-crisis reform legislation has created a unique opportunity to rebuild financial risk management on a solid foundation of management principles. This should help reduce operating costs and operational risk. More importantly, it will support both the monitoring and the containment of financial risk on a previously unprecedented scale. These objectives will pose several management challenges, including issues of knowledge representation, quality, data integration, and presentation. This paper presents a vision of a rich financial risk management system, and a research agenda to facilitate its realization.

2011
Fisher, B., Green, T.M., and Arias-Hernández, R. (2011). VA as a translational cognitive science. Topics in Cognitive Science, 3(3), 609-625.
Abstract: Visual analytics is a new interdisciplinary field of study that calls for a more structured scientific approach to understanding the effects of interaction with complex graphical displays on human cognitive processes. Its primary goal is to support the design and evaluation of graphical information ...

2011
Arias-Hernández, R., Kaastra, L.T., Green, T.M., and Fisher, B. (2011). Pair analytics: Capturing reasoning processes in collaborative VA. In Proceedings of the 44th Annual Hawaii International Conference on System Sciences: January 4-7, 2011, Koloa ...
Abstract: Studying how humans interact with abstract, visual representations of massive amounts of data provides knowledge about how cognition works in visual analytics. This knowledge provides guidelines for cognitive-aware design and evaluation of visual analytic tools. Different methods have been used to ...

1990
Wehrend, S., and Lewis, C. (1990). A problem-oriented classification of visualization techniques. In Proceedings of the 1st IEEE Conference on Visualization (Visualization '90): October 23-26, 1990, San Francisco, California, A. Kaufman (ed), IEEE Computer ...
Abstract: Progress in scientific visualization could be accelerated if workers could more readily find visualization techniques relevant to a given problem. This paper describes an approach to this problem based on a classification of visualization techniques that is independent of particular application ...

2005
Amar, R., Eagan, J., and Stasko, J. (2005). Low-level components of analytic activity in InfoVis. In Proceedings of the 11th IEEE Symposium on InfoVis 2005 (INFOVIS '05): October 23-25, 2005, Minneapolis, Minnesota, J. Stasko (ed), IEEE Computer ...
Abstract: Existing system-level taxonomies of visualization tasks are geared more towards the design of particular representations than the facilitation of user analytic activity. We present a set of ten low-level analysis tasks that largely capture people's activities while employing information visualization ...

2004
Amar, R., and Stasko, J. (2004). A knowledge task-based framework for design and evaluation of visualizations. In Proceedings of the 10th IEEE Symposium on InfoVis 2004 (INFOVIS '04): October 10-12, 2004, Austin, Texas, M. Ward, T. Munzner (eds), IEEE ...
Abstract: The design and evaluation of most current information visualization systems descend from an emphasis on a user's ability to unpack the representations of data of interest and operate on them independently. Too often, successful decision-making and analysis are more a matter of ...

2005
Bavoil, L., Callahan, S., Crossno, P., Freire, J., Scheidegger, C., Silva, C., and Vo, H. (2005). VisTrails: enabling interactive multiple-view visualizations. Proc. IEEE Visualization (2005), 135-142.

2009
Dou, W., Jeong, D.H., Stukes, F., Ribarsky, W., Lipford, H.R., and Chang, R. (2009). Recovering reasoning process from user interactions. IEEE Computer Graphics and Applications, 29(3): 52-61.

2009
Fink, G.A., North, C.L., Endert, A., and Rose, S. (2009). Visualizing cyber security: Usable workspaces. Proc. Visualization for Cyber Security (2009), 45-56.

2008
Garg, S., Nam, J., Ramakrishnan, I., and Mueller, K. (2008). Model-driven visual analytics. Proc. IEEE Visual Analytics Science and Technology (2008), 19-26.

2008
Gotz, D., and Zhou, M. (2008). Characterizing users' visual analytic activity for insight provenance. Proc. Visual Analytics Science and Technology (2008), 123-130.

2009
Grabler, F., Agrawala, M., Li, W., Dontcheva, M., and Igarashi, T. (2009). Generating Photo Manipulation Tutorials by Demonstration. SIGGRAPH (2009), 66:1-66:9.

2008
Heer, J., Mackinlay, J., Stolte, C., and Agrawala, M. (2008). Graphical histories for visualization: Supporting analysis, communication, and evaluation. IEEE Transactions on Visualization and Computer Graphics, 14(6): 1189-1196.

2007
Jankun-Kelly, T., Ma, K., and Gertz, M. (2007). A model and framework for visualization exploration. IEEE Transactions on Visualization and Computer Graphics, 13(2): 357-369.

2009
Kadivar, N., Chen, V., Dunsmuir, D., Lee, E., Qian, C., Dill, J., Shaw, C., and Woodbury, R. (2009). Capturing and supporting the analysis process. Proc. IEEE Visual Analytics Science and Technology (2009), 131-138.

2009
Pike, W.A., Stasko, J., Chang, R., and O'Connell, T.A. (2009). The science of interaction. Information Visualization, 8(1): 263-274.

2008
Shrinivasan, Y., and van Wijk, J. (2008). Supporting the analytical reasoning process in information visualization. Proc. ACM CHI (2008), 1237-1246.

Abstract (Bavoil et al.): VisTrails is a new system that enables interactive multiple-view visualizations by simplifying the creation and maintenance of visualization pipelines, and by optimizing their execution. It provides a general infrastructure that can be combined with existing visualization systems and ...

Abstract (Fink et al.): The goal of cyber security visualization is to help analysts increase the safety and soundness of our digital infrastructures by providing effective tools and workspaces. Visualization researchers must make visual tools more usable and compelling than the text-based tools that currently dominate cyber ...

Abstract (Garg et al.): We describe a Visual Analytics (VA) infrastructure, rooted on techniques in machine learning and logic-based deductive reasoning, that will assist analysts to make sense of large, complex data sets by facilitating the generation and validation of models representing relationships in the data. We use ...

Abstract (Gotz and Zhou): Insight provenance, a historical record of the process and rationale by which an insight is derived, is an essential requirement in many visual analytics applications. While work in this area has relied on either manually recorded provenance (e.g., user notes) or automatically recorded event-based ...

2004
Thomas, J., and Cook, K. (2004). Illuminating the Path. National Visualization and Analytics Center.

2006
Xiao, L., Gerth, J., and Hanrahan, P. (2006). Enhancing visual analysis of network traffic using a knowledge representation. Proc. IEEE Visual Analytics Science and Technology (2006), 107-114.

Analytic Problems Posed

Flood et al. (2011): The banking crisis that erupted in 2008 underscored the need for reductions in systemic financial risk [4]. While there were myriad interacting causes, a central theme in the crisis was the proliferation of complex financial products that overwhelmed the system's capacity for appropriately diligent analysis of the risks involved. In many cases, the information available about products and counterparties was minimal.

For many years, both financial firms and their regulators have been hampered by a state of data anarchy, despite (or perhaps because of) the enormous volumes of mission-critical data the industry handles daily.

The widespread use of PCs has dispersed access, ownership, and control of data throughout the firm, creating multiple, overlapping data silos. The result is disparate, inconsistent and inaccurate information.

Furthermore, cross-institutional barriers often inhibited regulators from obtaining the data that could have enabled early recognition of systemic risk and, thus, a timely response to emergent failures.

Systemic risk monitoring is inherently complex. Financial institutions acquire information from hundreds of sources. Further complicating these daunting technical challenges, there are also strong incentives for many market participants to restrict transparency around risks [7]. The data quality gap in finance is an evolutionary outcome of years of mergers and internal realignments, exacerbated by business silos and inflexible IT architectures. Integration remains point-to-point and occurs tactically in response to emergencies. Many firms still lack an executive owner of data content and have no governance structure to address funding challenges, organizational alignment or battles over priorities.

Systemic risk monitoring is not a precise science, but there is a general consensus that it should consider at least [6]:
- forward-looking risk sensitivities to stressful events (e.g., what would a 1% rise in yields mean for my portfolio? a minimal worked sketch of this appears after this list);
- margins, leverage, and capital for individual participants (e.g., how large a liquidity shock could I absorb before defaulting?);
- the contractual interconnectedness of investors and firms (e.g., if Lehman Bros. fails, how will that propagate to me?);
- concentration of exposures, relative to market liquidity (e.g., how many banks are deeply exposed to California real estate?)
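For the first consideration, a duration-based approximation is one standard way to ask what a 1% yield rise would mean for a bond portfolio. This is only an illustrative sketch, not a method from Flood et al.; the holdings and their figures are invented:

# Approximate the price impact of a parallel 1% (100 bp) rise in yields using
# modified duration: change in value ~= -duration * delta_yield * market value.
# The holdings below are hypothetical examples.
holdings = [
    {"name": "10y Treasury", "market_value": 5_000_000, "modified_duration": 8.5},
    {"name": "5y Corporate", "market_value": 3_000_000, "modified_duration": 4.2},
]

dy = 0.01  # 1% parallel rise in yields

total_change = sum(-h["modified_duration"] * dy * h["market_value"] for h in holdings)
print(f"Approximate portfolio P&L for a 1% yield rise: {total_change:,.0f}")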
The OFR/DC will need the following types of information [2]:
- Financial instrument reference data: information on the legal and contractual structure of financial instruments, such as prospectuses or master agreements, including data about the issuing entity and its adjustments based on corporate actions;
- Legal entity reference data: identifying and descriptive information, such as legal names and charter types, for financial entities that participate in financial transactions, or that are otherwise referenced in financial instruments;
- Positions and transactions data: terms and conditions for both new contracts (transactions) and the accumulated financial exposure on an entity's books (positions);
- Prices and related data: transaction prices and related data used in the valuation of positions, development of models and scenarios, and the measurement of micro-prudential and macro-prudential exposures.

Research Challenges: Data Representation and Complex Models, Data Integration, Data Quality, Streaming, Change Management and Performance, Metadata Management (Ontologies, Open Standards and Provenance), and Data Presentation (see pages 146-147 for details).

Fisher et al. (2011): Visual analytics was defined as "the science of analytical reasoning facilitated by interactive visual interfaces" and its initial structure established by a panel of researchers drawn largely from the computer science visualization community. From the perspective of engineering design, interfaces that must support human analysts' interaction with information as a primary design specification ought to be informed by design guidelines and formative and summative evaluation criteria that incorporate what we know about human perceptual and cognitive capabilities. The development of these models is complicated by the need to take into account individual differences, in particular the skills and abilities acquired through extensive experience solving a particular type of problem using a given interface and dataset. Because visual analytics proposes to find commonalities in analytical cognition across a range of specific ...

Arias-Hernández et al. (2011): The study of analytical reasoning and interaction with visual analytic tools is a critical step in the advancement of visual analytics as a scientific field [25]. Creating knowledge and models about higher-order cognitive processes in visual analytics [13] and the leverage points in which computational tools can amplify human cognition [24] is currently one of the most pressing challenges in laying out the scientific foundations for this endeavor. To account for the influence of cognitive factors, experiments sometimes make use of inscription devices [20] that track or monitor neurological, physiological or interactional states (e.g. fMRI, eye-tracking, logging mechanisms) and tests that characterize psychological profiles of individuals.

Amar, Eagan and Stasko (2005): Existing system-level taxonomies of visualization tasks are geared more towards the design of particular representations than the facilitation of user analytic activity. Information visualization research, especially that dealing with the automatic generation of information presentations [10, 15], has produced several taxonomies of system tasks that map visualization operations to user cognitive processes. In one sense, these taxonomies might be considered low-level task taxonomies or hierarchies, since they form part of a compositional language upon which automatic generation systems build higher-order externalizations of data. In general, information visualization can benefit from understanding the tasks that users accomplish while doing actual analytic activity. Such understanding achieves two goals: first, it aids designers in creating novel presentations that amplify users' analytic abilities; second, it ...

Amar and Stasko (2004): The design and evaluation of most current information visualization systems descend from an emphasis on a user's ability to unpack the representations of data of interest and operate on ...

Bavoil et al. (2005): The growing demand for visualization has led to the development of new and freely available systems [14, 20, 24], which, due to increased computational power, wide availability of inexpensive Graphics Processing Units (GPUs), and more efficient visualization algorithms, allow users to generate and interactively explore complex visualizations. However, they lack the infrastructure to properly manage the pipelines, and they do not exploit optimization opportunities during pipeline execution. As a result, the creation, maintenance, and exploration of visualization data products are major bottlenecks in the scientific process, limiting the scientists' ability to fully exploit their data. Exploring data through visualization requires scientists to assemble dataflows that apply sequences of operations over a set of data products. Often, insight comes from comparing the results of a variety of visualizations [23].

Dou et al. (2009): The process through which an analyst arrives at a conclusion is just as important as the discoveries themselves. Understanding how an analyst performs a successful investigation will finally let us start bridging the gap between the art of analysis and the science of analytics. Unfortunately, understanding an analyst's reasoning process is not a trivial task, especially because few researchers have access to analysts performing their tasks using classified or highly confidential material. We hypothesize that when an analyst interacts with a well-designed VA tool, much of that analyst's reasoning process is embedded within his or her interactions with the tool. Therefore, through careful examination of the analyst's interaction logs, we should be able to retrieve a great deal of the analyst's reasoning process.

Fink et al. (2009): Cyber analysts who defend our computer infrastructures use primitive, command-line tools that are ineffective at the high volume and velocity of the data they must process. They have resisted using visualizations, partly because no visualization has yet met their complex needs. Large displays have been valuable in other applications with massive data [1], and we suspect that they can be helpful in this application, too. For cyber security professionals, a usable workspace should support multiple, simultaneous, open-ended investigations. Separate tasks that arise from distinct tip-off points may eventually connect, implying the need for an overview of all active tasks and their relationships. Individual clues are only valuable if they support other clues that point to the same root cause. To acquire multiple, complementary information items, cyber analysts rapidly switch between analytic inquiries, multi-tasking and ...

Garg et al. (2008): Modern-day enterprises, be they commerce, government, science, engineering or medical, have to cope with voluminous amounts of data. Effective decision making based on large, dynamic datasets with many parameters requires a conceptual, high-level understanding of the data. Acquiring such an understanding is a difficult problem, especially in the presence of incomplete, inconsistent, and noisy data acquired from disparate real-world sources. Locating patterns in high-dimensional space via visual inspection can be challenging. The method of parallel coordinates (PC) [8], which has seen various refinements in recent years (see e.g. [17]), maps a high-dimensional point into a piecewise-linear line that spans the vertical dimension axes; a minimal sketch of this mapping appears below. However, thus far the main emphasis in VA has been mostly on visualization, data management, and user interfaces (see e.g. [5] and other work mentioned ...
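To make the parallel-coordinates mapping concrete, the following minimal Python sketch (illustrative only, not code from Garg et al.; the sample data and dimension names are invented) turns each high-dimensional point into the polyline it would trace across equally spaced vertical axes:

import numpy as np
import matplotlib.pyplot as plt

# Hypothetical data: 3 points in 4 dimensions.
data = np.array([
    [2.0, 30.0, 0.1, 5.0],
    [3.5, 10.0, 0.4, 2.0],
    [1.0, 50.0, 0.9, 8.0],
])
dims = ["d1", "d2", "d3", "d4"]

# Normalize each dimension to [0, 1] so every axis spans the same vertical range.
lo, hi = data.min(axis=0), data.max(axis=0)
norm = (data - lo) / (hi - lo)

# Each row becomes a piecewise-linear line across the vertical axes at x = 0, 1, 2, 3.
x = np.arange(len(dims))
for row in norm:
    plt.plot(x, row)
plt.xticks(x, dims)
plt.title("Parallel coordinates (sketch)")
plt.show()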

Solutions and Methods Posed

Flood et al. (2011): In response, the Dodd-Frank Wall Street Reform Act, among its many regulatory changes, has created an Office of Financial Research (OFR) with the mandate to establish a sound data management infrastructure for systemic-risk monitoring. The OFR will contain a Data Center (OFR/DC) to manage data for the new agency.

Arias-Hernández et al. (2011): Pair analytics gives the study of visual analytics a perspective on what is actually happening in the field, and it also produces a short partnership in which professional knowledge is used to refine the processes of visual analytics. Pair Analytics is a research method for capturing reasoning processes in visual analytics that addresses some of the limitations of currently available methods. Pair analytics requires a dyad of participants: one Subject Matter Expert (SME) and one Visual Analytics Expert (VAE). These two participants are given one analytical task, one data set, and one computer with a visual analytics tool. Each participant is assigned a role. The VAE plays the role of the ...

Amar, Eagan and Stasko (2005): We present a set of ten primitive analysis task types, representative of the kinds of specific questions that a person may ask when working with a data set. The development of this set of primitives stemmed from ongoing research into the utility of information visualization systems. It also was driven by a large set of questions gathered from an assignment in which students generated queries about five sample data sets. 1. Retrieve Value, 2. Filter, 3. Compute Derived Value ... (a minimal sketch of these three tasks appears below).

Amar and Stasko (2004): Studies such as Kobsa's [13] test more how well users can unpack the representation of individual data than how users actually discern any higher-level trends or ...
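As a rough illustration of what the first three task types look like over a small tabular data set (a sketch only; the table and column names are invented and this is not code from Amar et al.):

import pandas as pd

# Hypothetical table of cars, used only to illustrate the task types.
cars = pd.DataFrame({
    "model": ["A", "B", "C"],
    "mpg": [30, 22, 35],
    "horsepower": [110, 180, 95],
})

# 1. Retrieve Value: what is the mpg of model "B"?
mpg_b = cars.loc[cars["model"] == "B", "mpg"].iloc[0]

# 2. Filter: which models have mpg above 25?
efficient = cars[cars["mpg"] > 25]

# 3. Compute Derived Value: average horsepower across the data set.
mean_hp = cars["horsepower"].mean()

print(mpg_b, list(efficient["model"]), mean_hp)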

Bavoil et al. (2005): VisTrails enables interactive multiple-view visualizations by simplifying the creation and maintenance of visualization pipelines, and by optimizing their execution. Our design goals included: creating an infrastructure that maintains the provenance of a large number of visualization data products; providing a general framework for efficiently executing a large number of ...

Dou et al. (2009): We found that when analysts based their investigations purely on visual patterns, our coders had a difficult time determining their methods. Although some of the coders' errors in extracting the analysts' reasoning process can be attributed to operator errors, the most consistent and common errors stem from the coders not being able to see the same visual representations as the analysts. One practical solution is to connect the operational-analysis tool ...

Fink et al. (2009): We found the behaviors of the cyber analysts we studied were distinct from behaviors of analysts in other domains such as intelligence analysis. Using the insight gained from the interviews with the cyber analysts, we derived the following set of design principles for usable workspaces for cyber analysts: 1. Provide history and traceability for investigations; 2. Support multiple, simultaneous investigation cases; 3. Design visualization tools to be flexible and ...

Garg et al. (2008): To make progress on this problem one must draw on the complementary strengths of computing machinery and human insight. We seek a visual interface that allows users to point out arbitrary high-dimensional patterns in a more direct manner. An early high-dimensional exploration paradigm providing a combined scatter plot of an arbitrary number of variables is the Grand Tour (GT) [2], which is part of the GGobi package [19]. The user is ...
