
Fully convolutional and feedforward networks for the semantic segmentation of remotely sensed images

Abstract: This paper presents a novel semantic segmentation method for very high resolution remotely sensed images based on fully convolutional networks (FCNs) and feedforward neural networks (FFNNs). The proposed model exploits the intrinsic multiscale information extracted at the different convolutional blocks of an FCN by integrating FFNNs, thus incorporating information at different scales. The goal is to obtain accurate classification results on realistic data sets characterized by sparse ground truth (GT) data, by benefiting from multiscale and long-range spatial information. The final loss function is computed as a linear combination of the weighted cross-entropy losses of the FFNNs and of the FCN. Spatial-contextual information is further modeled through an additional loss term that integrates spatial information between neighboring pixels. The experimental validation is conducted on the ISPRS 2D Semantic Labeling Challenge data set over the city of Vaihingen, Germany. The results are promising: the proposed approach achieves higher average classification accuracy than the state-of-the-art techniques considered, especially in the case of scarce, suboptimal GTs.
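The loss described in the abstract — a linear combination of weighted cross-entropy terms from the FCN and the FFNN branches, computed only on labeled pixels of a sparse GT — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the branch weights `alpha_fcn`/`alpha_ffnn`, the class weights, and the toy predictions are all assumed for the example, and the additional spatial-context term is omitted.

```python
import math

def weighted_cross_entropy(probs, labels, class_weights):
    """Weighted cross-entropy averaged over labeled pixels.
    probs: per-pixel class-probability lists; labels: class index per pixel,
    or None for pixels without ground truth (skipped, as with sparse GTs)."""
    total, count = 0.0, 0
    for p, y in zip(probs, labels):
        if y is None:  # unlabeled pixel: contributes nothing to the loss
            continue
        total += -class_weights[y] * math.log(p[y])
        count += 1
    return total / max(count, 1)

# Toy predictions for three pixels from the FCN and one FFNN branch (assumed)
fcn_probs  = [[0.7, 0.3], [0.2, 0.8], [0.6, 0.4]]
ffnn_probs = [[0.6, 0.4], [0.3, 0.7], [0.5, 0.5]]
labels = [0, 1, None]       # third pixel has no ground truth
w = [1.0, 2.0]              # illustrative per-class weights

# Linear combination of the branch losses (combination weights assumed)
alpha_fcn, alpha_ffnn = 0.6, 0.4
loss = (alpha_fcn * weighted_cross_entropy(fcn_probs, labels, w)
        + alpha_ffnn * weighted_cross_entropy(ffnn_probs, labels, w))
```

In a multi-branch setup like the one described, one such term would be added per FFNN (one per convolutional block tapped), all combined linearly with the FCN loss.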
Contributor: Jules Mabon
Submitted on : Tuesday, July 12, 2022 - 1:58:33 PM
Last modification on : Tuesday, August 2, 2022 - 3:26:08 PM




  • HAL Id: hal-03720693, version 1


Martina Pastorino, Gabriele Moser, Sebastiano B. Serpico, Josiane Zerubia. Fully convolutional and feedforward networks for the semantic segmentation of remotely sensed images. ICIP 2022 - IEEE International Conference on Image Processing, Oct 2022, Bordeaux, France. ⟨hal-03720693⟩


