Chenming Li, Xiaoyu Qu, Yao Yang, Hongmin Gao, Yongchang Wang, Dan Yao, and Wenjing Yuan
SReLU activation function, multi-layer perceptron, high-resolution remote sensing image segmentation
This study proposes a new activation function, the S-type rectified linear unit (SReLU), to alleviate gradient dispersion in neural network models and improve the segmentation precision of high-resolution remote sensing images (HRIs). The advantages and drawbacks of widely used activation functions in neural network models are analysed and compared. A multi-layer perceptron is designed on the basis of the proposed activation function, and principal component analysis is introduced for segmentation experiments on an open high-resolution remote sensing dataset. Results show that the new activation function accelerates the convergence of the neural network model and effectively improves the accuracy of HRI segmentation.
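The abstract does not spell out the SReLU formula. As a minimal sketch only, the function below implements the commonly cited S-shaped rectified linear unit: a piecewise-linear curve with an identity middle segment (which keeps gradients at 1, countering gradient dispersion) and two shallower outer segments. The thresholds `t_l`, `t_r` and slopes `a_l`, `a_r` are illustrative assumptions, not parameters taken from this paper.

```python
import numpy as np

def srelu(x, t_l=-1.0, a_l=0.1, t_r=1.0, a_r=0.1):
    """S-shaped rectified linear unit (illustrative parameterization).

    Three linear segments:
      x <= t_l        -> t_l + a_l * (x - t_l)   (shallow left slope)
      t_l < x < t_r   -> x                       (identity, gradient = 1)
      x >= t_r        -> t_r + a_r * (x - t_r)   (shallow right slope)
    """
    x = np.asarray(x, dtype=float)
    return np.where(x >= t_r, t_r + a_r * (x - t_r),
           np.where(x <= t_l, t_l + a_l * (x - t_l), x))
```

In the middle region the derivative is exactly 1, so backpropagated gradients are neither squashed (as with sigmoid/tanh saturation) nor zeroed (as with ReLU for negative inputs); the paper's actual definition of SReLU may differ in form or parameters.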