
NTCIR 2025: NTCIR-18 Task Proposal


Link: https://research.nii.ac.jp/ntcir/index-en.html
 
When Jun 10, 2025 - Jun 13, 2025
Where Tokyo
Submission Deadline Feb 9, 2024
Notification Due Mar 8, 2024
 

Call For Papers

CALL FOR NTCIR-18 TASK PROPOSALS
(Task Proposals Due: Feb 9th, 2024)

NTCIR (NII Testbeds and Community for Information Access Research) is a series of evaluation conferences that mainly focus on information access with East Asian languages and English. The first NTCIR conference (NTCIR-1) took place in August/September 1999, and the latest NTCIR-17 conference was held in December 2023. Research teams from all over the world participate in one or more NTCIR tasks to advance the state of the art and to learn from one another's experiences.
Now it is time to call for task proposals for the next NTCIR (NTCIR-18), which will start in March 2024 and conclude in June 2025. Task proposals will be reviewed by the NTCIR Program Committee, following the schedule below:

* IMPORTANT DATES:
Feb 9, 2024 Task Proposal Submission Due (Anywhere on Earth)
Mar 8, 2024 Acceptance Notification of Task Proposals

* SUBMISSION LINK:
https://easychair.org/conferences/?conf=ntcir18proposal

* NTCIR-18 TENTATIVE SCHEDULE:
March 2024: Kickoff Event
May 2024: Dataset release*
Jun-Dec 2024: Dry run*
Sep 2024-Feb 2025: Formal run*
Feb 1, 2025: Evaluation results return
Feb 1, 2025: Task overview release (draft)
Mar 1, 2025: Submission due of participant papers (draft)
May 1, 2025: Camera-ready participant paper due
Jun 10-13, 2025: NTCIR-18 Conference
(* indicates that the schedule can be different for different tasks)

* WHO SHOULD SUBMIT NTCIR-18 TASK PROPOSALS?
We invite new task proposals within the expansive field of information access. Organizing an evaluation task entails pinpointing significant research challenges, strategically addressing them through collaboration with fellow researchers (including co-organizers and participants), developing the requisite evaluation framework to propel advancements in the state of the art, and generating a meaningful impact on both the research community and future developments.

Prospective applicants are urged to underscore the real-world applicability of their proposed tasks by utilizing authentic data, focusing on practical tasks, and solving tangible problems. Additionally, they should confront challenges in evaluating information access technology, such as the extensive number of assessments needed for evaluation, ensuring privacy while using proprietary data, and conducting live tests with actual users.

In the era of large language models (LLMs), these models are anticipated to significantly influence daily human activities. Nonetheless, the content produced by LLMs often exhibits issues, such as hallucinations. NTCIR-18 specifically encourages tasks that focus on evaluating the quality of content generated by LLMs.


* PROPOSAL TYPES:
We will accept two types of task proposals:

- Proposal of a Core task:
This is for fostering research on a particular information access problem by providing researchers with a common ground for evaluation. New test collections and evaluation methods may be developed through collaboration between task organizers (proposers) and task participants. At NTCIR-17, the core tasks were Lifelog-5, QA Lab-PoliInfo-4, MedNLP-SC, SS-2, and FinArg-1 (details can be found at http://research.nii.ac.jp/ntcir/NTCIR-17/tasks.html).

- Proposal of a Pilot task:
This is recommended for organizers who wish to focus on a novel information access problem for which there are uncertainties in either task design or organization. A pilot task may address a sub-problem of an information access problem and may attract a smaller group of participating teams than a core task. However, it may grow into a core task in the next round of NTCIR. At NTCIR-17, the pilot tasks were FairWeb-1, Transfer, UFO, and ULTRE-2 (details can be found at http://research.nii.ac.jp/ntcir/NTCIR-17/tasks.html).

Organizers are expected to run their tasks mainly with their own funding and to make the task as self-sustaining as possible. Part of the cost can be covered by NTCIR as "seed funding", which is usually limited to purposes such as hiring relevance assessors. The amount of seed funding allocated to each task varies depending on requirements and the total number of accepted tasks; typical amounts are around 1M JPY for a core task and around 0.5M JPY for a pilot task (note that the amount is subject to change).

Please submit your task proposal as a pdf file via EasyChair by Feb 9, 2024 (Anywhere on Earth).
https://easychair.org/conferences/?conf=ntcir18proposal

* TASK PROPOSAL FORMAT:
The proposal should not exceed six pages in A4 single-column format. The first five pages should contain the main part and the appendix, and the last page should contain only a description of the data to be used in the task. Please describe the data in as much detail as possible so that we can help with your data release process after the proposal is accepted. In past NTCIR rounds, preparing memorandums for data release took considerable time, which sometimes slowed down task organization.

Main part
- Task name and short name
- Task type (core or pilot)
- Abstract
- Motivation
- Methodology
- Expected results

Appendix
- Names and contact information of the organizers
- Prospective participants
- Data to be used and/or constructed
- Budget planning
- Schedule
- Other notes

Data (to be used in your task)
- Details
(Please describe the details of the data, which should include the source of the data, methods to collect the data, range of the data, etc.)
- License
(Please make sure that you have a license to distribute the data, and provide the details of the license. If you do not yet have permission to release the data, please describe your plan for obtaining it.)
- Distribution
(Please describe how you plan to distribute the data to participants. There are three main options: distribution by the data provider, by the organizers, or by NII.)
- Legal / Ethical issues
(If the data could raise legal or ethical problems, please describe how you propose to address them. For example, some medical data may need approval from an ethics committee, and some Web data may need filtering to exclude discriminatory messages.)
If you want NII to distribute your data to task participants on your behalf, please send an email to ntc-admin@nii.ac.jp prior to your task proposal submission, with the task proposal attached.

* REVIEW CRITERIA:
- Importance of the task to the information access community and to the society
- Timeliness of the task
- Organizers' commitment to ensuring a successful task
- Financial sustainability (self-sustainable tasks are encouraged)
- Soundness of the evaluation methodology
- Detailed description about the data to be used
- Language scope

* NTCIR-18 PROGRAM Co-Chairs
Qingyao Ai (Tsinghua University, China)
Chung-Chi Chen (National Institute of Advanced Industrial Science and Technology (AIST), Japan)
Shoko Wakamiya (Nara Institute of Science and Technology (NAIST), Japan)
* NTCIR-18 GENERAL CHAIRS:
Charles Clarke (University of Waterloo, Canada)
Noriko Kando (National Institute of Informatics, Japan)
Makoto P. Kato (University of Tsukuba, Japan)
Yiqun Liu (Tsinghua University, China)
