2018 IEEE GRSS Data Fusion Contest
Advanced multi-sensor optical remote sensing for urban land use and land cover classification
The Contest: Goals and Organization
The 2018 IEEE GRSS Data Fusion Contest, organized by the Image Analysis and Data Fusion Technical Committee, aims to promote progress on fusion and analysis methodologies for multi-source remote sensing data.
The 2018 Data Fusion Contest consists of a classification benchmark. The task to be performed is urban land use and land cover classification. The following advanced multi-source optical remote sensing data are provided to the community:
- Multispectral LiDAR point cloud data, intensity rasters and digital surface models (DSMs) at a 0.5-m ground sampling distance (GSD)
- Hyperspectral data at a 1-m GSD
- Very high-resolution RGB imagery at a 5-cm GSD
To test their synergy as well as individual potential for urban land use and land cover classification, classification results can be submitted to three parallel and independent competitions:
- Data Fusion Classification Challenge (use of at least two data sets)
- Multispectral LiDAR Classification Challenge
- Hyperspectral Classification Challenge
Scientific papers describing the best entries (as quantified by the scores of the confusion matrix and accuracy parameters) will be included in the Technical Program of IGARSS 2018, presented in an oral Invited Session, and published in the IGARSS 2018 Proceedings.
The contest aims to promote innovation in classification algorithms, as well as to provide objective and fair comparisons among methods. The ranking is based on quantitative accuracy parameters computed with respect to undisclosed test samples. Participants will be given a limited time to submit their classification maps after the competition starts. The contest consists of two steps:
- Step 1: Participants are provided with a subset of the data, including ground truth to train and test their algorithms.
- Step 2: Participants receive the full data set (without the corresponding ground truth) and submit their classification maps within two weeks of the release of the full data set, together with a short description of the approach used. After evaluation of the results, four winners are announced. The winners then have one month to write a manuscript that will be included in the IGARSS proceedings. Manuscripts are four pages long, formatted in IEEE style, and describe the addressed problem, the proposed method, and the experimental results.
- January 15th – Contest opening: release of training data (Step 1)
- March 13th – Release of test data (Step 2); the evaluation server opens
- March 25th – Deadline for submission of classification maps; the submission server closes
- March 30th – Winner announcement
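The announcement specifies only that ranking is based on "the scores of the confusion matrix and accuracy parameters" without naming the exact metrics. As an illustration, the sketch below computes three common ones from a confusion matrix: overall accuracy, average (per-class) accuracy, and Cohen's kappa. The function names and the handling of class 0 ("Unclassified") are assumptions for this sketch, not part of the official evaluation protocol.

```python
def confusion_matrix(reference, predicted, n_classes):
    """Build an n_classes x n_classes confusion matrix.
    Rows index reference labels, columns index predicted labels.
    Pixels labeled 0 ("Unclassified") in the reference are skipped here
    (an assumption; the official scoring may differ)."""
    cm = [[0] * n_classes for _ in range(n_classes)]
    for r, p in zip(reference, predicted):
        if r == 0:  # no ground truth for this pixel
            continue
        cm[r - 1][p - 1] += 1
    return cm

def accuracy_parameters(cm):
    """Return (overall accuracy, average accuracy, Cohen's kappa)."""
    n = len(cm)
    total = sum(sum(row) for row in cm)
    diag = sum(cm[i][i] for i in range(n))
    oa = diag / total  # fraction of correctly classified samples
    # producer's accuracy per class, averaged over classes with samples
    pa = [cm[i][i] / sum(cm[i]) for i in range(n) if sum(cm[i]) > 0]
    aa = sum(pa) / len(pa)
    # chance agreement from row/column marginals, then kappa
    pe = sum(sum(cm[i]) * sum(row[i] for row in cm) for i in range(n)) / total**2
    kappa = (oa - pe) / (1 - pe)
    return oa, aa, kappa
```

For example, with reference labels `[1, 1, 2, 2, 0]` and predictions `[1, 2, 2, 2, 1]`, the unlabeled pixel is skipped and the remaining four yield an overall accuracy of 0.75.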
We provide the following data, acquired by the National Center for Airborne Laser Mapping (NCALM):
- Multispectral LiDAR point cloud data at 1550 nm, 1064 nm, and 532 nm; intensity rasters from the first return of each channel; and DSMs at a 50-cm GSD.
- Hyperspectral data covering a 380-1050 nm spectral range with 48 bands at a 1-m GSD.
- Very high resolution RGB imagery at a 5-cm GSD. The image is organized into several separate tiles.
The data were acquired by NCALM on February 16, 2017, between 16:31 and 18:18 GMT. The sensors used in this campaign were an Optech Titan MW (14SEN/CON340) with integrated camera (a LiDAR sensor operating at three different laser wavelengths), a DiMAC ULTRALIGHT+ (a very high-resolution color imager) with a 70-mm focal length, and an ITRES CASI 1500 (a hyperspectral imager). The sensors were aboard a Piper PA-31-350 Navajo Chieftain aircraft.
Moreover, for the training region only, we also provide ground truth for 20 urban land use and land cover classes. It is provided as a raster at a 0.5-m GSD, co-registered with the airborne images.
The ground truth for the test set remains undisclosed and will be used for evaluation of the results.
Urban Land Use and Land Cover Classes:
The classes in the contest are a refined version of those used in [Debes et al., 2014]:
- 0 – Unclassified
- 1 – Healthy grass
- 2 – Stressed grass
- 3 – Artificial turf
- 4 – Evergreen trees
- 5 – Deciduous trees
- 6 – Bare earth
- 7 – Water
- 8 – Residential buildings
- 9 – Non-residential buildings
- 10 – Roads
- 11 – Sidewalks
- 12 – Crosswalks
- 13 – Major thoroughfares
- 14 – Highways
- 15 – Railways
- 16 – Paved parking lots
- 17 – Unpaved parking lots
- 18 – Cars
- 19 – Trains
- 20 – Stadium seats
A color composite and the ground truth for the training region are shown in Fig. 1.
Fig. 1 (Left) Color composite of MS-LiDAR intensity data (R = 1550 nm, G = 1064 nm, B = 532 nm) and (right) ground truth for training.
[Debes et al., 2014] C. Debes, A. Merentitis, R. Heremans, J. Hahn, N. Frangiadakis, T. van Kasteren, W. Liao, R. Bellens, A. Pizurica, S. Gautama, W. Philips, S. Prasad, Q. Du, and F. Pacifici, "Hyperspectral and LiDAR Data Fusion: Outcome of the 2013 GRSS Data Fusion Contest," IEEE J. Sel. Topics Appl. Earth Observ. Remote Sens., vol. 7, no. 6, pp. 2405–2418, June 2014.
Results, Awards, and Prizes:
The following four teams will be declared winners:
- The first- and second-ranked teams in the Data Fusion Classification Challenge
- The first-ranked team in the Multispectral LiDAR Classification Challenge
- The first-ranked team in the Hyperspectral Classification Challenge
The authors of the four winning submissions will:
- Be awarded IEEE Certificates of Recognition. The award ceremony will take place during the Technical Committees and Chapter Chairs Dinner at IGARSS 2018, Valencia, Spain in July 2018
- Present their manuscripts in an oral Invited Session dedicated to the Contest at IGARSS 2018
- Publish their manuscripts in the Proceedings of IGARSS 2018
The first- and second-ranked teams across all classification challenges will co-author a journal paper (with a limit of 3 co-authors per submission), which will summarize the outcome of the Contest and be submitted to IEEE JSTARS. To maximize impact and promote the potential of current multi-source remote sensing technologies, the open-access option will be used for this journal submission.
The authors of the winning submission (first-ranked) will receive an NVIDIA GPU graphics card as a prize.
The costs of open-access publication and of the winners' participation in the Technical Committees and Chapter Chairs Dinner at IGARSS 2018 will be covered by the GRSS. The winning team's prize is kindly sponsored by the GRSS.
The rules of the game:
- Data can be requested by registering for the Contest on the IEEE GRSS DASE website: http://dase.grss-ieee.org/. Participants must read and accept the Contest Terms and Conditions.
- Participants are expected to submit classification maps (in raster format, matching the TIFF file of the training set) for the whole study area. Results are submitted to the IEEE GRSS DASE website for evaluation: http://dase.grss-ieee.org/. Participants are ranked based on the scores of the confusion matrix and accuracy parameters.
- The classification results are expected to be at a 0.5-m GSD for all classification challenges.
- Institutional or business e-mail accounts must be used for registration.
- Only one e-mail account is allowed per team.
- Upon the first submission from an account, the account owner must send an e-mail specifying the members of his/her team to email@example.com.
- Each team may submit at most ten entries per classification challenge.
- Manual labeling is not allowed.
- The deadline for classification result submission is March 25, 2018, 23:59 UTC−12 hours (e.g., March 25, 2018, 7:59 in New York City, 13:59 in Paris, or 19:59 in Beijing). The submission server will be open from March 13, 2018.
- Each classification result submission is authored by one or more co-authors (a team of participants). One and only one submission from each team will be admitted to the Contest; should multiple entries from the same team be received, only the first-ranked submission will be considered.
- For the four winners, the internal deadline for full-paper submission is April 16, 2018, 23:59 UTC−12 hours (e.g., April 16, 2018, 7:59 in New York City, 13:59 in Paris, or 19:59 in Beijing). The IGARSS full-paper submission deadline is May 4, 2018.
When submitting a classification result, each team acknowledges that, should the result be among the four best-ranked ones, at least one team member will participate in the Data Fusion Invited Session at IGARSS 2018.
Failure to follow any of these rules will automatically make the submission invalid, resulting in the manuscript not being evaluated.
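Note that the classification results are expected at a 0.5-m GSD, while the RGB imagery is provided at a 5-cm GSD, so predictions made on the finer grid must be aggregated to the submission grid. One possible aggregation is a majority vote over 10 × 10 pixel blocks, sketched below; the function name and the list-of-lists raster representation are illustrative only, not part of the contest tooling.

```python
from collections import Counter

def majority_downsample(labels, block=10):
    """Downsample a 2-D label map by taking the most frequent label
    in each block x block window (block=10 maps a 5-cm grid to 0.5 m).
    `labels` is a list of equal-length rows whose dimensions are
    assumed to be multiples of `block`."""
    rows, cols = len(labels), len(labels[0])
    out = []
    for i in range(0, rows, block):
        out_row = []
        for j in range(0, cols, block):
            # collect all labels in the current block x block window
            window = [labels[i + di][j + dj]
                      for di in range(block) for dj in range(block)]
            # keep the most frequent label as the coarse-grid value
            out_row.append(Counter(window).most_common(1)[0][0])
        out.append(out_row)
    return out
```

For example, a 2 × 4 map downsampled with `block=2` yields a 1 × 2 map whose cells hold the majority label of each 2 × 2 window.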
Participants in the Contest are not required to submit an extended abstract to IGARSS 2018 by the corresponding conference deadline in January 2018. Only contest winners (the participants behind the four best-ranked submissions) will submit a 4-page paper describing their approach by April 16, 2018. The received manuscripts will be reviewed by the Award Committee of the Contest, and the reviews sent to the winners. The winners will then submit the final versions of their full papers to the IGARSS Data Fusion Contest Invited Session by May 4, 2018, for inclusion in the IGARSS Technical Program and Proceedings.
The data set was collected by NCALM at the University of Houston (UH) on February 16, 2017, covering the University of Houston campus and its surrounding areas. The data were prepared and preprocessed by UH students, staff, and faculty at NCALM and the Hyperspectral Image Analysis Laboratory.
The IADF TC chairs would like to thank NCALM and the Hyperspectral Image Analysis Laboratory at UH for providing the data, their staff, students, and faculty for preparing it, and the IEEE GRSS for continuously supporting the annual Data Fusion Contest through funding and resources.
Contest Terms and Conditions
The data are provided only for the purpose of participation in the 2018 Data Fusion Contest. Participants acknowledge that they have read and agree to the following Contest Terms and Conditions:
- The owners of the data and of the copyright on the data are NCALM and the Hyperspectral Image Analysis Laboratory at UH.
- Class annotations maps are licensed under the Open Data Commons Open Database License (ODbL) by UH and IEEE GRSS.
- Any dissemination or distribution of the data packages by any registered user is strictly forbidden.
- The data can be used in scientific publications subject to approval by the IEEE GRSS Image Analysis and Data Fusion Technical Committee and by the data owners on a case-by-case basis. To submit a scientific publication for approval, the publication shall be sent as an attachment to an e-mail addressed to firstname.lastname@example.org and email@example.com.
- In any scientific publication using the data, the data shall be identified as “grss_dfc_2018” and shall be referenced as follows: “[REF. NO.] 2018 IEEE GRSS Data Fusion Contest. Online: http://www.grss-ieee.org/community/technical-committees/data-fusion”.
- Any scientific publication using the data shall include a section “Acknowledgement”. This section shall include the following sentence: “The authors would like to thank the National Center for Airborne Laser Mapping and the Hyperspectral Image Analysis Laboratory at the University of Houston for acquiring and providing the data used in this study, and the IEEE GRSS Image Analysis and Data Fusion Technical Committee.”
- Any scientific publication using the data shall refer to the following paper: Y. Xu, B. Du, L. Zhang, D. Cerra, M. Pato, E. Carmona, S. Prasad, N. Yokoya, R. Hänsch, and B. Le Saux, “Advanced Multi-Sensor Optical Remote Sensing for Urban Land Use and Land Cover Classification: Outcome of the 2018 IEEE GRSS Data Fusion Contest,” in IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing. https://doi.org/10.1109/