posted by user: danilo_dessi

XSA 2021 : Explainable Deep Learning for Sentiment Analysis


When N/A
Where N/A
Submission Deadline Dec 31, 2021
Categories: sentiment analysis, deep learning, explainability, embeddings

Call For Papers

CfP: Special Issue on Deep Learning and Explainability for Sentiment Analysis - Electronics (IF: 2.397)


Deadline: Dec 31, 2021



People use online social platforms to express opinions about products and services across a wide range of domains, influencing the views and behavior of their peers. Understanding individuals’ satisfaction is therefore a key element for businesses, policy makers, organizations, and social institutions when making decisions. This has attracted growing interest within the scientific community and, as a result, has raised a host of new challenges. Sentiment analysis has long been investigated by researchers to provide methodologies and resources to stakeholders. In machine learning, deep learning models that combine several neural networks have become the state of the art for a variety of natural language processing tasks across domains. The most prominent deep learning solutions are combined with word embeddings. However, how to include sentiment information in word-embedding representations to boost the performance of deep learning models, and how to explain what deep learning models (often employed as black boxes) actually learn, remain open questions that need further research and development.

Investigating these key points will answer why and how design choices should be made when creating embedding representations and designing deep learning models. This goes in the direction of Explainable Deep Learning (XDL), whose aim is to address how deep learning systems make decisions. This Special Issue aims to foster discussion about the design, development, and use of deep learning models and embedding representations that can improve state-of-the-art results while, at the same time, enabling the interpretation and explanation of why deep learning is effective for sentiment analysis. We invite theoretical works, implementations, and practical use cases that demonstrate the benefits of deep learning, with a strong focus on explainability, across various domains.

The Special Issue focuses on, but is not limited to, the following topics:

Deep learning topics:
- Aspect-based DL and XDL models;
- Bias detection within DL and XDL for sentiment analysis;
- DL and XDL for toxicity and hate speech detection;
- Multilingual DL and XDL for sentiment analysis;
- DL and XDL for emotion detection;
- Weakly-supervised DL and XDL for sentiment analysis;
- XDL design methodologies for sentiment analysis;
- Analysis of DL models for sentiment analysis.
Data representation topics:
- Word embeddings for sentiment analysis;
- Knowledge graphs and knowledge graph embeddings for sentiment analysis;
- Use of external knowledge (e.g., knowledge graphs) to feed DL for sentiment analysis;
- Combination of existing sentiment analysis resources (e.g., SenticNet) with embedding representations;
- Analysis of the performance of data representations for sentiment analysis tasks;
- Lexicon-based explainability for sentiment analysis.
Case studies:
- Educational environments;
- Healthcare systems;
- Scholarly discussions (e.g., peer review process discussions, mailing lists, etc.);
- News platforms;
- Mental health systems;
- Social networks.
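As a concrete illustration of one topic above, lexicon-based explainability pairs a sentiment prediction with the lexicon entries that produced it, so the decision can be inspected directly. The following minimal sketch is not part of the call itself; the tiny polarity lexicon, scoring scheme, and function name are illustrative assumptions (real work would use a resource such as SenticNet, mentioned above).

```python
# Minimal sketch of lexicon-based explainability for sentiment analysis.
# LEXICON, its scores, and explain_sentiment() are hypothetical examples,
# not a resource or API referenced by this call.

# Toy polarity lexicon: word -> score in [-1, 1].
LEXICON = {
    "great": 0.8, "love": 0.9, "good": 0.6,
    "bad": -0.6, "terrible": -0.9, "slow": -0.4,
}

def explain_sentiment(text):
    """Score a sentence and return per-token contributions as the explanation."""
    tokens = text.lower().split()
    contributions = [(t, LEXICON.get(t, 0.0)) for t in tokens]
    score = sum(c for _, c in contributions)
    label = "positive" if score > 0 else "negative" if score < 0 else "neutral"
    # The explanation is the set of tokens that actually moved the score.
    evidence = [(t, c) for t, c in contributions if c != 0.0]
    return label, score, evidence

label, score, evidence = explain_sentiment("the camera is great but the app is slow")
print(label, evidence)
```

Unlike a black-box deep model, every prediction here comes with its supporting evidence; the research question raised by this Special Issue is how to obtain comparably faithful explanations from DL and XDL models.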


Important dates:
Deadline for paper submission: Dec 31, 2021.

Papers submitted before the deadline will be reviewed upon receipt and published continuously in the journal as soon as accepted.


Submission information:
Please use the journal's LaTeX or Microsoft Word template.


Guest Editors:

Prof. Dr. Diego Reforgiato Recupero, University of Cagliari (Italy)
Prof. Dr. Harald Sack, FIZ Karlsruhe - Leibniz Institute for Information Infrastructure & Karlsruhe Institute of Technology (Germany)
Dr. Danilo Dessì, FIZ Karlsruhe - Leibniz Institute for Information Infrastructure & Karlsruhe Institute of Technology (Germany)

