Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/40743
Title: Automatic flame tracking technique for atrium fire from video images
Authors: Li, J
Lu, P
Fong, NK 
Chow, W 
Wong, L 
Xu, D
Keywords: Video
Sensors
Issue Date: 2004
Publisher: SPIE-International Society for Optical Engineering
Source: Proceedings of SPIE : the International Society for Optical Engineering, 2005, v. 5637, 257
Journal: Proceedings of SPIE : the International Society for Optical Engineering 
Abstract: Smoke control is one of the important aspects of an atrium fire. For an efficient smoke control strategy, it is very important to identify the smoke and fire source within a very short period of time. However, traditional methods such as point-type detectors are not effective for smoke and fire detection in large spaces such as atria. Therefore, video smoke and fire detection systems are proposed. For the development of such a system, automatic extraction and tracking of flames are two important problems that need to be solved. Based on entropy theory, region growing and the Otsu method, a new automatic integrated algorithm for tracking flames from video images is proposed in this paper. It can successfully identify flames in different environments, against different backgrounds and in different forms. The experimental results show that this integrated algorithm is robust and widely adaptable. In addition, because of its low computational demand, the algorithm can also be used as part of a robust, real-time smoke and fire detection system.
Description: Conference on Asia-Pacific Optical Communications, Beijing, China, 7-11 November 2004
URI: http://hdl.handle.net/10397/40743
ISSN: 0277-786X
EISSN: 1996-756X
DOI: 10.1117/12.573869
Appears in Collections:Conference Paper
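The abstract's integrated algorithm combines entropy theory, region growing, and the Otsu method. As an illustration of one of those components only, below is a minimal pure-Python sketch of Otsu's method (choosing the gray level that maximizes between-class variance of the histogram); the function name and the toy pixel data are assumptions for demonstration and are not taken from the paper, which does not publish its implementation here.

```python
def otsu_threshold(pixels, levels=256):
    """Return the Otsu threshold for a flat list of gray levels in [0, levels).

    Otsu's method picks the threshold t that maximizes the between-class
    variance  w_bg * w_fg * (mean_bg - mean_fg)^2  of the two classes
    (background: <= t, foreground: > t). In a flame-segmentation pipeline
    such a threshold could separate bright flame pixels from the scene
    before a region-growing step refines the flame region.
    """
    # Build the gray-level histogram.
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1

    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))  # sum of all gray levels

    sum_bg = 0.0      # running sum of gray levels in the background class
    weight_bg = 0     # running pixel count of the background class
    best_t, best_var = 0, -1.0

    for t in range(levels):
        weight_bg += hist[t]
        if weight_bg == 0:
            continue
        weight_fg = total - weight_bg
        if weight_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / weight_bg
        mean_fg = (sum_all - sum_bg) / weight_fg
        var_between = weight_bg * weight_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t


if __name__ == "__main__":
    # Bimodal toy image: dark background (10) and bright "flame" (200).
    frame = [10] * 50 + [200] * 50
    t = otsu_threshold(frame)
    mask = [1 if p > t else 0 for p in frame]  # binary flame mask
    print(t, sum(mask))
```

In practice a real-time system would compute the histogram per frame on the luminance channel and hand the thresholded mask to the region-growing and entropy stages described in the abstract.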

