IAAI 2011 : 23rd Annual Conference on Innovative Applications of Artificial Intelligence
IAAI-11 Call for Papers
* December 2010 – January 31, 2011: Authors register on the AAAI web site
* February 1, 2011: Electronic papers due
* April 15, 2011: Notification of acceptance or rejection
* April 26, 2011: Camera-ready copy due at AAAI office
The Twenty-Third Annual Conference on Innovative Applications of Artificial Intelligence (IAAI-11) will focus on successful applications of AI technology. The conference will use technical papers, invited talks, and panel discussions to explore issues, methods, and lessons learned in the development and deployment of AI applications, and to promote an interchange of ideas between basic and applied AI.
IAAI-11 will consider papers in two tracks: (1) deployed application case studies and (2) emerging applications or methodologies. Submissions should clearly identify which track they are intended for, as the two tracks are judged on different criteria (see definitions and criteria, below). All submissions must be original.
Deployed Application Case Study Papers
Case-study papers must describe deployed applications with measurable benefits that include some aspect of AI technology. The application needs to have been in production use by its final end-users (not the people who created it) for sufficiently long that the experience in use can be meaningfully collected and reported. This period typically spans at least three months. The case study may evaluate either a stand-alone application or a component of a complex system. In addition to the criteria listed below for Emerging Track papers, the deployed applications will also be evaluated on the following:
Task or Problem Description: Describe the task the application performs or the problem it solves. State the objectives of the application and explain why an AI solution was important. If other solutions were tried and failed, outline these solutions and the reasons for their failure.
Application Description: Describe the application, providing key technical details about design and implementation. What are the system components, what are their functions, and how do they interact? What languages and tools are used in the application? How is knowledge represented? What is the hardware and software environment in which the system is deployed? Provide examples to illustrate how the system is used.
Uses of AI Technology: On what AI research results does the application depend? What key aspects of AI technology allowed the application to succeed? How were the techniques modified to fit the needs of the application? If applicable, describe how AI technology is integrated with other technology. If a commercial tool is used, explain the decision criteria used to select it. Describe any insights gained about the application of AI technology. What AI approaches or techniques were tried and did not work? Why not?
Application Use and Payoff: How long has this application been deployed? Explain how widely, how often, and by whom the application is being used. Also describe the application’s payoff. What measurable benefits have resulted from its use? What additional benefits do you expect over time? What impacts has it had on the users’ business processes?
Application Development and Deployment: Describe the development and deployment process. How long did they take? How many developers were involved? What were the costs? What were the difficulties, and how were they overcome? What are the lessons learned? What, if any, formal development methods were used?
Maintenance: Describe your experience with and plans for maintenance of the application. Who maintains the application? How often are updates needed? Is domain knowledge expected to change over time? How does the design of the application facilitate updates?
Original papers on the deployment issues in AI applications are welcome, even if other papers on the AI technology have been presented at or submitted to other conferences. We encourage updates on applications that have been in use for an extended period of time (that is, multiple years). Each of the accepted deployed application papers will receive the IAAI “Innovative Application” Award. A copy of the review form for Deployed Application Case Study Papers will be posted on the IAAI website.
Emerging Application Case Study Papers
The goal of the emerging application track is to "bridge the gap" between basic AI research and deployed AI applications by discussing efforts to apply AI tools, techniques, or methods to real-world problems. Emerging application papers report on aspects of AI applications that are not appropriate for deployed application case studies, or on applications that are not yet sufficiently deployed to be submitted as case studies. This track is distinguished from reports of scientific AI research appropriate for the AAAI-11 Conference in that the objective of the efforts reported here should be the engineering of AI applications.
Emerging application papers may address any aspect of the technology, engineering, or deployment of AI applications, including:
* prototype applications
* performance evaluation of AI applications
* ongoing efforts to develop large-scale or domain-specific knowledge bases or ontologies
* development of domain- or task-focused tools, techniques, or methods
* evaluations of AI tools, techniques, or methods for domain suitability
* unsuccessful attempts to apply particular tools, techniques, or methods to specific domains (which shed insight on the applicability and limitations of the tool, technique, or method)
* system architectures that work
* scalability of techniques
* integration of AI with other technologies
* development methodologies
* validation and verification
* lessons learned
* social and other technology transition issues
The following questions will appear on the review form for emerging application papers. Authors are advised to bear these questions in mind while writing their papers. Reviewers will look for papers that meet at least some (although not necessarily all) of the criteria in each category.
Significance: How important is the problem being addressed? Is it a difficult or simple problem? Is it central or peripheral to a category of applications? Is the tool or methodology presented generally applicable or domain specific? Does the tool or methodology offer the potential for new or more powerful applications of AI?
AI Technology: Does the paper identify AI research needed for a particular application or class of applications? Does the paper characterize the needs of application domains for solutions of particular AI problems? Does the paper evaluate the applicability of an AI tool or methodology for an application domain? Does the paper describe AI technology that could enable new or more powerful AI applications?
Innovation: Does the tool, technique, or method advance the state of the art or state of the practice of AI technology? Does the tool, technique, or method address a new or previously reported problem? If it is a previously reported problem, does the tool, technique, or method solve it in a different, new, more effective, or more efficient way? Does the reported work integrate AI with other AI or non-AI technologies in a new way? Does the work provide a new perspective on an application domain? Does the work apply AI to a new domain?
Content: Does the paper motivate the need for the tool or methodology? Does the paper adequately describe the task it performs or the problem it solves? Does it provide technical details about the design and implementation of the tool or methodology? Does the paper clearly identify the AI research results on which the tool or methodology depends? Does it relate the tool or methodology to the needs of application domains? Does it provide insights about the use of AI technology in general or for a particular application domain? Does it describe the development process and costs? Does it discuss estimated or measured benefits? Does it detail the evaluation method and results?
Evaluation: Has the tool or methodology been tested on real data? Has it been evaluated by end users? Has it been incorporated into a deployed application? Has it been compared to other competing tools or methods?
Technical Quality: Is the paper technically sound? Does it carefully evaluate the strengths and limitations of its contribution? Are the results described and evaluated? Are its claims backed up? Does it identify and describe relevant previous work?
Clarity: Is the paper clearly written? Is it organized logically? Are there sufficient figures and examples to illustrate the key points? Is the paper accessible to those outside the application domain? Is it accessible to those in other technical specialties?
A copy of the review form for Emerging Application Case Study Papers will be posted on the IAAI website.
Electronic submissions are required. Papers must be submitted as trouble-free, high-resolution PDF files, formatted for United States Letter (8.5" x 11") paper in AAAI two-column format (www.aaai.org/Publications/Author/author.php). Deployed application papers may be up to eight (8) pages. Emerging application papers are limited to six (6) complimentary pages, plus up to two (2) optional additional pages at $275 each.
Authors should register on the IAAI-11 web-based paper submission site. A login and password, as well as detailed instructions about how to submit an electronic paper, will be sent to the author in a subsequent e-mail message. Authors must then submit a formatted electronic version of their paper through this software no later than Tuesday, February 1, 2011. We cannot accept papers submitted by e-mail or fax.
Submissions received after the deadline, or that do not meet the length or formatting requirements detailed above and on the IAAI-11 website, will not be accepted for review. Notification of receipt of the electronic paper will be mailed to the first author (or designated author) soon after receipt. If there are problems with the electronic submission, AAAI will contact the primary author by e-mail. Papers will be reviewed by the Program Committee, and notification of acceptance or rejection will be mailed to the contact author in late March. PDFs of accepted papers will be due on April 26, 2011. Authors will be required to transfer copyright (no exceptions).
Registration or clarification inquiries may be sent to AAAI at email@example.com, 650-328-3123, or 650-321-4457 (fax).
IAAI-11 Program Chairs
Daniel Shapiro (ISLE)
Markus Fromherz (Palo Alto Research Center)