Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/116605
| DC Field | Value | Language |
|---|---|---|
| dc.contributor | Department of Civil and Environmental Engineering | - |
| dc.creator | Zhu, Z | en_US |
| dc.creator | Zhu, S | en_US |
| dc.date.accessioned | 2026-01-06T02:09:13Z | - |
| dc.date.available | 2026-01-06T02:09:13Z | - |
| dc.identifier.isbn | | en_US |
| dc.identifier.issn | 0888-3270 | en_US |
| dc.identifier.uri | http://hdl.handle.net/10397/116605 | - |
| dc.language.iso | en | en_US |
| dc.publisher | Academic Press | en_US |
| dc.subject | Asynchronous Kalman filtering | en_US |
| dc.subject | Response reconstruction | en_US |
| dc.subject | Sensor data recovery | en_US |
| dc.subject | Smoothing | en_US |
| dc.subject | Virtual sensing | en_US |
| dc.title | Asynchronous Kalman filtering for dynamic response reconstruction by fusing multi-type sensor data with arbitrary sampling frequencies | en_US |
| dc.type | Journal/Magazine Article | en_US |
| dc.identifier.spage | | en_US |
| dc.identifier.epage | | en_US |
| dc.identifier.volume | 215 | en_US |
| dc.identifier.issue | | en_US |
| dc.identifier.doi | 10.1016/j.ymssp.2024.111395 | en_US |
| dcterms.abstract | This study proposes a state-of-the-art asynchronous Kalman filtering (ASKF) technique for reconstructing the dynamic responses of multi-degree-of-freedom structures by fusing multi-type sensor data with arbitrary sampling frequencies. Response reconstruction, also known as state estimation or virtual sensing, has been gaining popularity in civil structural health monitoring (SHM). However, nearly all existing response reconstruction algorithms assume that all sensor types operate at the same sampling frequency and that the sensor data are synchronized, assumptions that are often not satisfied in practical implementations. The proposed ASKF presents the first Kalman filter (KF)-based response reconstruction algorithm that directly fuses asynchronous sensor data sampled at arbitrary or even varying frequencies. The ASKF also enables the fusion and recovery of intermittent sensor data in the time domain. A new time vector is first formed by augmenting the observation time vectors of the various sensor types. Different observation equations are then defined and selected based on the observation data available at each time step. Discretization is conducted at each time step, with simplification achieved by truncating the Taylor polynomials. To improve the filter performance, the Rauch–Tung–Striebel smoothing procedure is applied in the presented ASKF algorithm. The effectiveness and robustness of the proposed algorithm have been verified through numerical and experimental studies of shear frames. | - |
| dcterms.accessRights | embargoed access | en_US |
| dcterms.bibliographicCitation | Mechanical systems and signal processing, 1 June 2024, v. 215, 111395 | en_US |
| dcterms.isPartOf | Mechanical systems and signal processing | en_US |
| dcterms.issued | 2024-06-01 | - |
| dc.identifier.scopus | 2-s2.0-85190065516 | - |
| dc.identifier.pmid | | - |
| dc.identifier.eissn | 1096-1216 | en_US |
| dc.identifier.artn | 111395 | en_US |
| dc.description.validate | 202601 bcch | - |
| dc.identifier.FolderNumber | a4247 | - |
| dc.identifier.SubFormID | 52431 | - |
| dc.description.fundingSource | RGC | en_US |
| dc.description.fundingSource | Others | en_US |
| dc.description.fundingText | This research was supported by the Research Grants Council of Hong Kong through the Theme-based Research Scheme (T22-501/23-R), Theme-based Research Scheme (T22-502/18-R), NSFC/RGC CRS (CRS_PolyU503/23), and by the Hong Kong Branch of the National Rail Transit Electrification and Automation Engineering Technology Research Center (No. K-BBY1). | en_US |
| dc.description.pubStatus | Published | en_US |
| dc.date.embargo | 2026-06-01 | en_US |
| dc.description.oaCategory | Green (AAM) | en_US |
Appears in Collections: Journal/Magazine Article
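The abstract outlines the core idea of asynchronous Kalman filtering: merge the observation time vectors of all sensor types into one augmented time vector, discretize the state equation over each (possibly varying) time step via a truncated Taylor series, and select the observation equation at each step from whichever sensors actually fired. The following is a minimal illustrative sketch of that idea, not the paper's formulation: it assumes a hypothetical single-degree-of-freedom oscillator with a 10 Hz displacement sensor and a 25 Hz velocity sensor, with all parameters (mass, damping, noise levels, Taylor order) invented for the example.

```python
import numpy as np

# Hypothetical 1-DOF oscillator: state = [displacement, velocity].
# Parameters are illustrative only, not from the cited paper.
m, c, k = 1.0, 0.4, 40.0
Ac = np.array([[0.0, 1.0], [-k / m, -c / m]])  # continuous-time state matrix

def transition(dt, order=4):
    """Discretize exp(Ac*dt) with a truncated Taylor series (per-step dt)."""
    A = np.eye(2)
    term = np.eye(2)
    for n in range(1, order + 1):
        term = term @ (Ac * dt) / n   # (Ac*dt)^n / n!
        A = A + term
    return A

# Two sensors with different sampling frequencies; the augmented time
# vector is the merged union of their observation time vectors.
t_disp = np.arange(0.0, 2.0, 1 / 10)   # 10 Hz displacement samples
t_vel  = np.arange(0.0, 2.0, 1 / 25)   # 25 Hz velocity samples
t_all  = np.unique(np.concatenate([t_disp, t_vel]))

# Ground-truth response, propagated step by step on the merged grid.
truth = np.zeros((len(t_all), 2))
truth[0] = [1.0, 0.0]
for i in range(1, len(t_all)):
    truth[i] = transition(t_all[i] - t_all[i - 1], order=8) @ truth[i - 1]

# Asynchronous Kalman filter over the augmented time vector.
rng = np.random.default_rng(0)
sig_d, sig_v = 0.02, 0.05              # sensor noise std devs (assumed)
x = np.array([0.5, 0.0])               # deliberately wrong initial guess
P = np.eye(2)
Q = 1e-4 * np.eye(2)                   # process noise intensity (assumed)
est = np.zeros_like(truth)
est[0] = x
for i in range(1, len(t_all)):
    t, dt = t_all[i], t_all[i] - t_all[i - 1]
    A = transition(dt)                 # prediction over a varying step
    x = A @ x
    P = A @ P @ A.T + Q * dt
    # Select the observation equation from whichever sensors fired at t.
    rows, zs, rs = [], [], []
    if np.any(np.isclose(t, t_disp)):
        rows.append([1.0, 0.0])
        zs.append(truth[i, 0] + rng.normal(0, sig_d))
        rs.append(sig_d ** 2)
    if np.any(np.isclose(t, t_vel)):
        rows.append([0.0, 1.0])
        zs.append(truth[i, 1] + rng.normal(0, sig_v))
        rs.append(sig_v ** 2)
    if rows:                           # standard KF update, variable-size H
        H, z, R = np.array(rows), np.array(zs), np.diag(rs)
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x = x + K @ (z - H @ x)
        P = (np.eye(2) - K @ H) @ P
    est[i] = x

rms = np.sqrt(np.mean((est[:, 0] - truth[:, 0]) ** 2))
```

The paper additionally applies Rauch–Tung–Striebel smoothing as a backward pass over the stored predicted and filtered estimates, which is omitted here for brevity; this sketch shows only the forward asynchronous-fusion pass.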
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.