ICNLSP 2015: International Conference on Natural Language and Speech Processing
Call For Papers
Aims and Scope
The first International Conference on Natural Language and Speech Processing (ICNLSP 2015) aims to create synergy between the Natural Language Processing and Speech Recognition communities. It highlights new approaches to language in both of its aspects, text and speech, spanning basic theory to applications. Regular and poster sessions will be organized, in addition to keynotes by senior international researchers. The conference program will include presentations of all accepted papers, whether oral or poster.
ICNLSP 2015 will be held on October 18-19, 2015, in Algiers, the capital of Algeria. The conference venue, the Hilton hotel, faces the beach, not far from Houari Boumediene International Airport and very close to the Ardis shopping center.
Participants of ICNLSP 2015 will benefit from exceptional offers: they will be accommodated free of charge in one of the most luxurious hotels, such as the Hilton, Aldjazair (Saint-George), Mercure or Ibis. Meals and coffee breaks will also be provided free of charge throughout the conference. The only fee to pay is the registration fee of 350 Euros.
In addition, a sightseeing tour and a gala dinner are planned for the last day of the conference.
An award, whose value will be announced later, will be given to the participant presenting the best paper. The second, third, fourth and fifth best papers will also receive awards.
ICNLSP 2015 invites papers on the science and technology of speech and natural language, regardless of the language studied; however, work on Arabic is particularly encouraged.
Topics of interest include, but are not limited to, the following:
- Signal processing and acoustic modeling
- Architecture of speech recognition systems
- Deep learning for speech recognition
- Speech comprehension and summarization
- Speech translation
- Speaker and language identification
- Phonetics, phonology and prosody
- Cognition and natural language processing
- Information retrieval
- Under-resourced languages: tools and corpora
- New language models
Keynote Speakers
- Stephan Vogel (QCRI, Qatar)
- Jean Paul Haton (University of Lorraine, France)
- Mohamed Afify (Microsoft Advanced Technology Lab., Cairo)
Program Committee
Chair: Kamel Smaili (Professor, University of Lorraine, France)
Mourad Abbas (Researcher at CRSTDLA, Algeria)
Ahmed Abdelali (Researcher at QCRI, Qatar)
Driss Aboutajadine (Professor at University of Rabat, Head of CNRST, Morocco)
Mohamed Afify (Researcher at Microsoft Advanced Technology Lab., Cairo)
Melissa Barkat (Researcher at CNRS, France)
Fréderic Béchet (Professor at University of Aix-Marseille, France)
Laurent Besacier (Professor at University of Grenoble, France)
Khalid Daoudi (Researcher at INRIA Bordeaux, France)
Yannick Estève (Professor at University of Le Mans, France)
Dominique Fohr (Researcher at CNRS, France)
Corinne Fredouille (Assistant Professor at University of Avignon, France)
Eric Gaussier (Professor at University of Grenoble, France)
Jean-Paul Haton (Professor Emeritus, University of Lorraine, France)
Salma Jamoussi (Assistant Professor at University of Sfax, Tunisia)
Denis Jouvet (Researcher at INRIA Lorraine, France)
David Langlois (Assistant Professor at University of Lorraine, France)
Chiraz Latiri (Professor at University of Tunis, Tunisia)
Georges Linarès (Professor at University of Avignon, France)
Fatiha Sadat (Associate Professor at UQAM, Canada)
Olivier Siohan (Researcher at Google, USA)
Yahya Slimani (Professor at University of Tunis, Tunisia)
Stephan Vogel (Research director at QCRI, Qatar)
Imed Zitouni (Researcher at Microsoft, USA)
Paper Submission
Papers for the ICNLSP proceedings should be prepared in LaTeX using the IEEE template, and should be 4 to 6 pages long, including references.
Papers must be submitted via the EasyChair online submission system.
Each submitted paper will be reviewed by three program committee members.
Chair of ICNLSP: Mourad Abbas (email@example.com)