
ASSESS 2014 : Data Mining for Educational Assessment and Feedback


When Aug 24, 2014 - Aug 24, 2014
Where New York City
Submission Deadline Jun 20, 2014
Notification Due Jul 5, 2014

Call For Papers

Assessing educational achievement and providing feedback to learners is crucial to emerging coursework, MOOCs, and labor market-making systems. Automating assessment and feedback for open-ended tasks (e.g., short-answer and essay questions) would open up both learning and recruitment to a much larger set of people who currently miss out due to resource constraints. Although automated and semi-automated assessment/feedback remains a nascent field, there have been recent advances drawing on techniques from data mining, knowledge discovery, machine learning, and crowdsourcing for peer grading. Nevertheless, the technical requirements for accuracy and expressiveness across the many purposes of assessment (some high-stakes and some low-stakes; some formative and some summative) are not well defined.

In this workshop, we will bring together a diverse group of researchers and industry practitioners in data mining, machine learning, and psychology to (1) discuss the current state of the art in automated assessment, (2) identify a vision for future research, and (3) lay out the steps required to build good automated or peer-grading-based assessment. The organizers hope that a unified framework for thinking about assessment and feedback will emerge by the end of the workshop.

Call for papers
We invite submissions of papers describing innovative research on all aspects of assessing educational achievement and providing feedback. Position papers and papers that provoke discussion are strongly encouraged, but should be grounded in fresh insight and/or new data. Topics of interest include, but are not limited to:
- Automated and semi-automated assessment of open-ended responses
- Assessment by crowdsourcing and peer-grading
- Automatic feedback generation
- Automatic problem (item) generation/design, calibration, modeling
- Assessment design, test blueprints, rubric design, validity and reliability
- Test integrity, equity, proctoring, and high-stakes testing
- Non-cognitive assessments in personality, behavior, motivation, etc.
- New areas: gamification, simulation based assessments, etc.
- Insights derived from large-scale assessment data for learning, recruitment, performance prediction, etc.
Submitted papers should present original work and must differ from papers that have been previously published or are under review at a journal or another conference/workshop.

Accepted papers will be given a speaking slot at the workshop.

Important Dates
Paper submission deadline June 15th, 2014, 23:59 Pacific Standard Time
Acceptance notification July 05, 2014
Final paper due July 15, 2014

Invited speaker:
Damian Bebell, Boston College
Sumit Gulwani, Microsoft Research
Eliana Feasley, Khan Academy
Donovan Stevens, edX
(More to be decided)

Organizers:
Divyanshu Vats, Postdoctoral Fellow, Rice University
Lav R. Varshney, Assistant Professor, University of Illinois at Urbana-Champaign
Steven E. Stemler, Associate Professor, Wesleyan University
Varun Aggarwal, CTO, Aspiring Minds
