Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/114433
DC Field | Value | Language
dc.contributor | Department of Building Environment and Energy Engineering | -
dc.creator | Deng, R | en_US
dc.creator | Ding, S | en_US
dc.creator | Ding, Y | en_US
dc.creator | Wang, M | en_US
dc.creator | Huang, X | en_US
dc.creator | Usmani, AS | en_US
dc.date.accessioned | 2025-08-06T09:12:14Z | -
dc.date.available | 2025-08-06T09:12:14Z | -
dc.identifier.issn | 0957-4174 | en_US
dc.identifier.uri | http://hdl.handle.net/10397/114433 | -
dc.language.iso | en | en_US
dc.publisher | Elsevier Ltd | en_US
dc.subject | Autonomous navigation | en_US
dc.subject | Firefighting robot | en_US
dc.subject | Smart firefighting | en_US
dc.subject | Vision sharing | en_US
dc.subject | Visual guidance | en_US
dc.title | Autonomous navigation of non-perception firefighting robot through CCTV-informed vision sharing | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.volume | 296 | en_US
dc.identifier.doi | 10.1016/j.eswa.2025.129210 | en_US
dcterms.abstract | Deploying smart firefighting robots indoors is a prospective measure to realize fast-response and zero-casualty firefighting operations. Most existing firefighting robots require manual control or pre-input information for navigation. This work proposes a CCTV-informed autonomous navigation system that guides a robot to the target flame via the safest and fastest route. The proposed system uses the vision of a building's CCTV-camera network and provides a full-process navigation pipeline to guide imperceptive firefighting robots. Demonstrations show that the navigation system gives the robot a complete perception of the floorplan and an understanding of the evolving fire scene. The robot can autonomously move towards the target flame even if its carry-on vision sensors are blocked or damaged. Finally, the system's dynamic responsiveness is validated to prove its effectiveness in addressing randomly moving obstacles. This framework can be further integrated with smart building maintenance systems, offering low-cost, early-response, and more resilient solutions for firefighting and safety patrol robots. | -
dcterms.accessRights | embargoed access | en_US
dcterms.bibliographicCitation | Expert systems with applications, 15 Jan. 2026, v. 296, pt. D, 129210 | en_US
dcterms.isPartOf | Expert systems with applications | en_US
dcterms.issued | 2026-01-15 | -
dc.identifier.eissn | 1873-6793 | en_US
dc.identifier.artn | 129210 | en_US
dc.description.validate | 202508 bcch | -
dc.identifier.FolderNumber | a3963b | -
dc.identifier.SubFormID | 51838 | -
dc.description.fundingSource | RGC | en_US
dc.description.fundingSource | Others | en_US
dc.description.fundingText | Key-Area Research and Development Program of Guangdong Province | en_US
dc.description.pubStatus | Published | en_US
dc.date.embargo | 2028-01-15 | en_US
dc.description.oaCategory | Green (AAM) | en_US
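
The abstract above describes planning the "safest and fastest route" to a flame on a floorplan, using fire and obstacle information shared by the building's CCTV cameras rather than the robot's own sensors. The following is a minimal, hypothetical sketch of that idea, not the paper's implementation: it runs an A*-style search over an occupancy grid and penalizes cells near reported fire. The grid encoding, the function plan_route, and the fire_penalty parameter are all illustrative assumptions.

```python
# Minimal sketch (assumed, not from the paper): path planning on a floorplan
# occupancy grid where the flame and fire cells are taken as CCTV reports.
import heapq

FREE, WALL = 0, 1  # assumed grid encoding

def plan_route(grid, start, flame, fire_cells, fire_penalty=5.0):
    """A*-style search from start to the flame cell.

    Cells adjacent to reported fire incur an extra traversal cost, so the
    planner trades path length against proximity to the fire, a crude proxy
    for the "safest and fastest route" criterion in the abstract.
    """
    rows, cols = len(grid), len(grid[0])

    def neighbors(cell):
        r, c = cell
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == FREE:
                yield (nr, nc)

    def step_cost(cell):
        if cell == flame:
            return 1.0  # the goal cell itself stays reachable
        near_fire = any(abs(cell[0] - fr) + abs(cell[1] - fc) <= 1
                        for fr, fc in fire_cells)
        return 1.0 + (fire_penalty if near_fire else 0.0)

    def heuristic(cell):
        # Manhattan distance; admissible because every step costs at least 1.
        return abs(cell[0] - flame[0]) + abs(cell[1] - flame[1])

    open_set = [(heuristic(start), 0.0, start, [start])]
    best_cost = {start: 0.0}
    while open_set:
        _, cost, cell, path = heapq.heappop(open_set)
        if cell == flame:
            return path
        for nxt in neighbors(cell):
            new_cost = cost + step_cost(nxt)
            if new_cost < best_cost.get(nxt, float("inf")):
                best_cost[nxt] = new_cost
                heapq.heappush(open_set,
                               (new_cost + heuristic(nxt), new_cost,
                                nxt, path + [nxt]))
    return None  # no route found


if __name__ == "__main__":
    # 0 = free space, 1 = wall; a CCTV camera reports a flame at (0, 4).
    floorplan = [
        [0, 0, 0, 0, 0],
        [0, 1, 1, 1, 0],
        [0, 0, 0, 1, 0],
        [1, 1, 0, 0, 0],
    ]
    route = plan_route(floorplan, start=(3, 2), flame=(0, 4),
                       fire_cells={(0, 4)})
    print(route)
```

In this sketch the penalty simply discourages routes that pass next to known fire cells; the actual system described in the abstract also handles randomly moving obstacles, which would require replanning as CCTV detections update.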
Appears in Collections: Journal/Magazine Article
Open Access Information
Status: embargoed access
Embargo End Date: 2028-01-15
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.