Coastal Image Captioning
Qiaoqiao Yang, Guangxing Wang, Xiaoyu Zhang, Christos Grecos, Peng Ren
Abstract

Yang, Q.; Wang, G.; Zhang, X.; Grecos, C., and Ren, P., 2020. Coastal image captioning. In: Jung, H.-S.; Lee, S.; Ryu, J.-H., and Cui, T. (eds.), Advances in Geospatial Research of Coastal Environments. Journal of Coastal Research, Special Issue No. 102, pp. 145-150. Coconut Creek (Florida), ISSN 0749-0208.

Coastal images convey rich semantic information about the corresponding coastal areas. This article presents a preliminary approach to coastal image captioning that describes the salient semantic content of coastal images with accurate and meaningful sentences. Specifically, this article exploits a state-of-the-art method referred to as self-critical sequence training for coastal image captioning. First, a convolutional neural network (CNN) produces fixed-length features from the coastal images. Second, the fixed-length features are fed into a long short-term memory (LSTM) network to generate captions, i.e., the accurate and meaningful sentences. Finally, the LSTM is optimized in a reinforcement learning framework. Experimental evaluation on the Moye Island remote sensing dataset validates the effectiveness of the proposed approach.
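The pipeline summarized above (CNN encoder, LSTM decoder, self-critical sequence training) can be sketched roughly as follows. This is a minimal illustrative sketch rather than the authors' implementation, assuming PyTorch; the ResNet backbone, vocabulary size, feature dimensions, caption length, and the placeholder reward function are all assumptions. A real self-critical sequence training setup would score the sampled and greedy captions with a metric such as CIDEr or BLEU.

import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models

VOCAB, FEAT, HID, MAX_LEN = 1000, 256, 512, 20   # illustrative sizes, not from the paper

class Encoder(nn.Module):
    """CNN that maps a coastal image to a fixed-length feature vector."""
    def __init__(self):
        super().__init__()
        cnn = models.resnet18(weights=None)        # any CNN backbone would do here
        self.backbone = nn.Sequential(*list(cnn.children())[:-1])
        self.fc = nn.Linear(cnn.fc.in_features, FEAT)

    def forward(self, images):
        return self.fc(self.backbone(images).flatten(1))   # (B, FEAT)

class Decoder(nn.Module):
    """LSTM that unrolls the image feature into a word sequence (the caption)."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, FEAT)
        self.lstm = nn.LSTMCell(FEAT, HID)
        self.out = nn.Linear(HID, VOCAB)

    def forward(self, feat, sample=True):
        B = feat.size(0)
        h, c = feat.new_zeros(B, HID), feat.new_zeros(B, HID)
        x = feat                                   # feed the image feature at the first step
        words, logps = [], []
        for _ in range(MAX_LEN):
            h, c = self.lstm(x, (h, c))
            logp = F.log_softmax(self.out(h), dim=-1)
            word = (torch.multinomial(logp.exp(), 1).squeeze(1) if sample
                    else logp.argmax(-1))          # sampled caption vs. greedy baseline
            words.append(word)
            logps.append(logp.gather(1, word.unsqueeze(1)).squeeze(1))
            x = self.embed(word)
        return torch.stack(words, 1), torch.stack(logps, 1)

def scst_loss(sample_logps, sample_reward, greedy_reward):
    """REINFORCE with the greedy caption's reward as the self-critical baseline."""
    advantage = (sample_reward - greedy_reward).unsqueeze(1)
    return -(advantage * sample_logps).mean()

# Toy usage with a placeholder reward; a real SCST setup scores captions with CIDEr/BLEU.
enc, dec = Encoder(), Decoder()
feat = enc(torch.randn(2, 3, 224, 224))
sampled, logps = dec(feat, sample=True)
with torch.no_grad():
    greedy, _ = dec(feat, sample=False)
reward = lambda caps: caps.float().mean(1)         # stand-in for a caption-quality metric
scst_loss(logps, reward(sampled), reward(greedy)).backward()

The key design choice in self-critical sequence training is the baseline: instead of learning a separate value estimator, the reward of the model's own greedy caption is subtracted from the reward of the sampled caption, so gradient updates favor sampled captions only when they beat what the model would produce at test time.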

©Coastal Education and Research Foundation, Inc. 2020
Qiaoqiao Yang, Guangxing Wang, Xiaoyu Zhang, Christos Grecos, and Peng Ren "Coastal Image Captioning," Journal of Coastal Research 102(sp1), 145-150, (14 December 2020). https://doi.org/10.2112/SI102-018.1
Received: 1 September 2020; Accepted: 20 October 2020; Published: 14 December 2020
KEYWORDS
coastal images
image captioning
long short-term memory
reinforcement learning