Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/105505
Title: RANet: Region attention network for semantic segmentation
Authors: Shen, D
Ji, Y
Li, P 
Wang, Y
Lin, D
Issue Date: 2020
Source: Advances in neural information processing systems, 2020, v. 33, p. 13927-13938
Abstract: Recent semantic segmentation methods model the relationships between pixels to construct contextual representations. In this paper, we introduce the Region Attention Network (RANet), a novel attention network for modeling the relationships between object regions. RANet divides the image into object regions, from which we select representative information. In contrast to previous methods, RANet configures information pathways between pixels in different regions, enabling region interaction to exchange regional context and enhance every pixel in the image. We jointly train the construction of object regions, the selection of representative regional contents, the configuration of information pathways, and the context exchange between pixels to improve segmentation accuracy. We extensively evaluate our method on challenging segmentation benchmarks, demonstrating that RANet helps achieve state-of-the-art results.
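The abstract's pipeline — group pixels into regions, summarize each region with representative content, then let every pixel exchange context with all regions — can be illustrated with a toy sketch. This is an assumption-laden simplification, not the paper's exact formulation: region labels are given rather than learned, region descriptors are plain means, and the pixel-to-region exchange is ordinary scaled dot-product attention.

```python
import numpy as np

def region_attention(feats, regions):
    """Toy pixel-to-region attention (illustrative, not RANet's exact method).

    feats:   (N, C) array of per-pixel features
    regions: (N,)   integer region label per pixel
    Returns (N, C) features enhanced with attended regional context.
    """
    labels = np.unique(regions)
    # Region descriptors: mean feature of each region's member pixels
    # (a stand-in for the paper's learned "representative regional contents").
    descs = np.stack([feats[regions == r].mean(axis=0) for r in labels])  # (R, C)
    # Each pixel scores every region descriptor (scaled dot product).
    scores = feats @ descs.T / np.sqrt(feats.shape[1])                    # (N, R)
    # Softmax over regions gives the "information pathways" weights.
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    # Exchange: every pixel aggregates context from all regions.
    context = weights @ descs                                             # (N, C)
    return feats + context

feats = np.random.default_rng(0).normal(size=(6, 4))
regions = np.array([0, 0, 1, 1, 2, 2])
out = region_attention(feats, regions)
```

Because the attention weights span all regions, pixels in one region receive context from every other region, which is the cross-region exchange the abstract emphasizes over purely pixel-pairwise schemes.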
Publisher: NeurIPS
Journal: Advances in neural information processing systems 
Description: 34th Conference on Neural Information Processing Systems (NeurIPS 2020), 6-12 December 2020, Online
Rights: Posted with permission of the author.
Appears in Collections:Conference Paper

Files in This Item:
File: Shen_Ranet_Region_Attention.pdf (24.91 MB, Adobe PDF)
Open Access Information
Status: open access
File Version: Version of Record

Page views: 95 (last week: 5), as of Nov 9, 2025
Downloads: 66, as of Nov 9, 2025

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.