Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/102178
DC Field | Value | Language |
---|---|---|
dc.contributor | Department of Building Environment and Energy Engineering | en_US |
dc.creator | Zhu, H | en_US |
dc.creator | Xie, W | en_US |
dc.creator | Li, J | en_US |
dc.creator | Shi, J | en_US |
dc.creator | Fu, M | en_US |
dc.creator | Qian, X | en_US |
dc.creator | Zhang, H | en_US |
dc.creator | Wang, K | en_US |
dc.creator | Chen, G | en_US |
dc.date.accessioned | 2023-10-11T04:14:36Z | - |
dc.date.available | 2023-10-11T04:14:36Z | - |
dc.identifier.issn | 1424-8220 | en_US |
dc.identifier.uri | http://hdl.handle.net/10397/102178 | - |
dc.language.iso | en | en_US |
dc.publisher | Molecular Diversity Preservation International (MDPI) | en_US |
dc.rights | © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). | en_US |
dc.rights | The following publication Zhu, H., Xie, W., Li, J., Shi, J., Fu, M., Qian, X., ... & Chen, G. (2023). Advanced computer vision-based subsea gas leaks monitoring: a comparison of two approaches. Sensors, 23(5), 2566 is available at https://doi.org/10.3390/s23052566. | en_US |
dc.subject | Advanced computer vision | en_US |
dc.subject | Faster R-CNN | en_US |
dc.subject | Optical camera detection | en_US |
dc.subject | Subsea gas leak monitoring | en_US |
dc.subject | YOLOv4 | en_US |
dc.title | Advanced computer vision-based subsea gas leaks monitoring : a comparison of two approaches | en_US |
dc.type | Journal/Magazine Article | en_US |
dc.identifier.volume | 23 | en_US |
dc.identifier.issue | 5 | en_US |
dc.identifier.doi | 10.3390/s23052566 | en_US |
dcterms.abstract | Recent years have witnessed an increasing risk of subsea gas leaks as offshore gas exploration expands, posing a potential threat to human life, corporate assets, and the environment. Optical imaging-based monitoring has become widespread for detecting underwater gas leakage, but it suffers from high labor costs and frequent false alarms because it relies on operators' manual observation and judgment. This study aimed to develop an advanced computer vision-based monitoring approach to achieve automatic and real-time monitoring of underwater gas leaks. A comparison between the Faster Region-based Convolutional Neural Network (Faster R-CNN) and You Only Look Once version 4 (YOLOv4) was conducted. The results demonstrated that the Faster R-CNN model, developed with an image size of 1280 × 720 and no noise, was optimal for automatic and real-time monitoring of underwater gas leakage. This optimal model could accurately classify small and large leakage gas plumes in real-world datasets and locate the area of these underwater gas plumes (see the illustrative inference sketch after this table). | en_US |
dcterms.accessRights | open access | en_US |
dcterms.bibliographicCitation | Sensors (Switzerland), Mar. 2023, v. 23, no. 5, 2566 | en_US |
dcterms.isPartOf | Sensors (Switzerland) | en_US |
dcterms.issued | 2023-03 | - |
dc.identifier.scopus | 2-s2.0-85149718020 | - |
dc.identifier.pmid | 36904768 | - |
dc.identifier.artn | 2566 | en_US |
dc.description.validate | 202310 bckw | en_US |
dc.description.oa | Version of Record | en_US |
dc.identifier.FolderNumber | OA_Others | - |
dc.description.fundingSource | Others | en_US |
dc.description.fundingText | Hubei Province unveiling project | en_US |
dc.description.pubStatus | Published | en_US |
dc.description.oaCategory | CC | en_US |
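The abstract above describes detecting and localizing underwater gas plumes with a Faster R-CNN trained on 1280 × 720 frames. The following is a minimal, hypothetical inference sketch using torchvision's generic `fasterrcnn_resnet50_fpn`; the backbone choice, the class names (`small_plume`, `large_plume`), the weights file, and the score threshold are assumptions for illustration and are not taken from the authors' code.

```python
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.transforms.functional import to_tensor
from PIL import Image

# Hypothetical class list for a plume detector (background + two plume
# sizes); the actual label set used in the paper may differ.
CLASS_NAMES = ["__background__", "small_plume", "large_plume"]


def load_plume_detector(weights_path: str, num_classes: int = 3):
    """Build a Faster R-CNN detector and load fine-tuned weights.

    Assumes torchvision >= 0.13; the ResNet-50 FPN backbone and the
    weights file are placeholders, not the authors' released model.
    """
    model = fasterrcnn_resnet50_fpn(weights=None, num_classes=num_classes)
    model.load_state_dict(torch.load(weights_path, map_location="cpu"))
    model.eval()
    return model


@torch.no_grad()
def detect_plumes(model, frame_path: str, score_threshold: float = 0.5):
    """Run inference on a single camera frame (e.g. 1280 x 720) and
    return (label, score, box) tuples above the confidence threshold."""
    image = Image.open(frame_path).convert("RGB")
    prediction = model([to_tensor(image)])[0]
    results = []
    for label, score, box in zip(
        prediction["labels"], prediction["scores"], prediction["boxes"]
    ):
        if score >= score_threshold:
            results.append((CLASS_NAMES[int(label)], float(score), box.tolist()))
    return results
```

In practice the model would first be fine-tuned on labeled plume frames; the sketch only shows how detections (class, confidence, bounding box) would be read out for monitoring.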
Appears in Collections: | Journal/Magazine Article |
Files in This Item:
File | Description | Size | Format |
---|---|---|---|
sensors-23-02566-v2.pdf | | 9.65 MB | Adobe PDF |
Page views: 118 (as of May 11, 2025)
Downloads: 35 (as of May 11, 2025)
SCOPUS™ Citations: 9 (as of May 15, 2025)
Web of Science™ Citations: 6 (as of May 15, 2025)
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.