posted by user: Jscholtz0416

Metrics for Visual Analytics 2007 : InfoVis Workshop: Metrics for the Evaluation of Visual Analytics


Link: http://www.cs.umd.edu/hcil/InfoVisworkshop/
 
When Oct 28, 2007 - Oct 28, 2007
Where Sacramento, CA
Submission Deadline Sep 21, 2007
Notification Due Sep 30, 2007
Categories    HCI   visualization   information technology
 

Call For Papers

The workshop is a full day and is scheduled for Sunday, October 28th

The field of visual analytics is now recognized as a research area in many universities and organizations. As new fields develop, ways of assessing progress in those fields develop as well. In visual analytics, we are fortunate to already have lessons learned from evaluating visualizations. Unfortunately, those lessons point out that this is a difficult problem. Visual analytics compounds the problem by adding more dimensions: not only are we concerned with some measure of the visualizations themselves, but also with evaluating the impact these visualizations have in helping analysts in their work. User-centered evaluations are vital in visual analytics, as they contribute greatly to the adoption of research software. The issues we face in developing user-centered evaluations for visual analytics include selecting:
- The task: the tradeoff is between simple tasks that are easily evaluated and more realistic tasks that consume more time and are much less straightforward to evaluate.
- The corresponding dataset: the same issues as above, plus the challenge of developing a publicly releasable dataset that resembles a realistic one.
- The system and environment: how much the system or environment plays a role in the utility or success of the task.
- The participants: access to senior or junior analysts for evaluations, and ensuring that analysts are open to new technology.
- Training: how much training should be provided to analysts prior to evaluations, or whether analysts should be paired with technologists to operate the software.
- The metrics: what combination of quantitative and qualitative measures will be accepted? How can we ensure that qualitative measures are collected with, and meet, some standard of rigor? How can we measure insights derived from the visualization and from interactions with it? This is especially problematic because not all analysts approach problems in the same fashion. Most importantly, which measures are most helpful to the analytic community and to the research community?

Selected participants will receive copies of all accepted position papers. These participants will present their ideas or current research during the morning (about 10-15 minutes each). Based on the position papers and these presentations, the organizers will develop a list of possible metrics. An initial list will be distributed to the participants prior to the workshop. After all the presentations, this list will be discussed and refined by the participants.

In the afternoon session, the organizers will provide representative examples of different types of visual analytics systems (the VAST 2007 contest winners have agreed to let us use their submissions), and the workshop participants will test the list of metrics by evaluating these systems using the metrics. A discussion session will follow to identify successes and difficulties, and to refine the list of metrics. The organizers will generate a report evaluating the metrics based on the participants' usage and the discussions.

We will produce a poster from the workshop, to be included in the poster session. The poster will focus on the metrics used during the workshop and the lessons learned for each.

We will also consider a joint journal paper or future conference paper with the workshop participants contributing to the various metrics proposed by the workshop.


Submission of position papers:
Submissions should be no longer than 4 pages and should focus on metrics and methods for evaluating visual analysis environments. If participants have already used these methods, please include lessons learned and references. If the proposal has not yet been tried, please provide some estimate of the effort that would be needed to implement it. Position papers should be submitted to the organizers (see e-mail addresses below) no later than September 15th. Please see http://www.cs.umd.edu/hcil/InfoVisworkshop/ for details, posted in early September. Participants will be notified of acceptance no later than September 30th.


Organizers:
Jean Scholtz
Pacific Northwest National Laboratory, 340 Northslope Way, Rockaway Beach, OR 97136
Jean.scholtz@pnl.gov

Georges Grinstein
University of Massachusetts Lowell, Lowell MA 01854
grinstein@cs.uml.edu

Catherine Plaisant
University of Maryland, College Park, MD 20742, U.S.A.
plaisant@cs.umd.edu

