DICTA 2021 : International Conference on Digital Image Computing: Techniques and Applications
Conference Series : Digital Image Computing: Techniques and Applications
Call For Papers
The International Conference on Digital Image Computing: Techniques and Applications (DICTA) is the flagship Australian conference on computer vision, image processing, pattern recognition, and related areas. DICTA was established in 1991 as the premier conference of the Australian Pattern Recognition Society (APRS). DICTA 2021 is technically cosponsored by the Institute of Electrical and Electronics Engineers (IEEE) and the International Association for Pattern Recognition (IAPR). Accepted papers will be submitted for inclusion in IEEE Xplore, subject to meeting IEEE Xplore's scope and quality requirements.
Topics include, but are not limited to, the following:
Image coding and processing
Statistical and structural pattern recognition
Shape and texture analysis
Content-based image retrieval and image databases
Biomedical and e-health applications
Astronomical image analysis
Surveillance, defence and industrial applications
Special Session on Deep Learning for Earth Observation and Remote Sensing (Best Paper Award sponsored by SmartSat CRC/AI4Space Research Network)
Special Session on Deep Learning for Medical Image Analysis
Authors are invited to submit papers of up to 8 pages according to the guidelines on the conference website. Each paper will undergo double-blind peer review. It is a condition of publication that each accepted paper is presented at the conference by one of its authors. Each accepted paper must be registered by an author before the early registration deadline to avoid withdrawal from the conference proceedings and technical program.
DICTA 2021 Special Sessions
Deep Learning for Earth Observation and Remote Sensing
Thanks to advances in sensing technologies, a growing variety of remote sensing imagery is becoming available at increasingly higher resolution and at a faster pace, empowering many applications such as earth observation. To manage the challenges imposed by the exponential growth of such image data, to maximise the value of this abundant data, and to develop innovative and advanced solutions for earth observation, new technologies are essential for improved processing, analysis, and understanding of remote sensing images. In recent years, deep learning has led to many successful breakthroughs across a wide range of vision tasks, and many deep learning algorithms have likewise been proposed in the fields of earth observation and remote sensing. This special session aims to solicit and present high-quality, innovative papers on topics including, but not limited to:
Deep Learning for Environment Monitoring (e.g., Land, Forest, Soil, Water and Ocean)
Deep Learning for Natural Disaster Prediction and Detection
Deep Learning for Crop Health Monitoring
Deep Learning for Remote Sensing based Urban Computing
Deep Learning for Remote Sensing Object Detection
Deep Learning for Remote Sensing Image Classification
Deep Learning for Remote Sensing Image Change Detection
Deep Learning for Remote Sensing Image Compression
Deep Learning for Remote Sensing Image Fusion
Deep Learning for Remote Sensing Image Super-Resolution
Deep Learning for Satellite Missions and Global Navigation Satellite System (GNSS)
With the generous support of the SmartSat CRC and the AI4Space Research Network, a Best Paper Award has been established for submissions to this special session. We are also planning a challenge and will provide more details in due course.
Zhiyong Wang, The University of Sydney
Mohammad Awrangjeb, Griffith University
Jun Zhou, Griffith University
Xiuping Jia, The University of New South Wales
Clinton Fookes, Queensland University of Technology
For all paper submissions, please visit the DICTA2021 website to be directed to the submission portal.
On behalf of the DICTA2021 Organising Committee, we are excited to see you all and celebrate your research accomplishments.