Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/118361
| DC Field | Value | Language |
|---|---|---|
| dc.contributor | Department of Land Surveying and Geo-Informatics | - |
| dc.creator | Zhang, H | - |
| dc.creator | Liu, Z | - |
| dc.date.accessioned | 2026-04-09T06:11:55Z | - |
| dc.date.available | 2026-04-09T06:11:55Z | - |
| dc.identifier.issn | 1940-3151 | - |
| dc.identifier.uri | http://hdl.handle.net/10397/118361 | - |
| dc.language.iso | en | en_US |
| dc.publisher | American Institute of Aeronautics and Astronautics, Inc. | en_US |
| dc.rights | Copyright © 2024 by the American Institute of Aeronautics and Astronautics, Inc. All rights reserved. This is the final accepted manuscript of the following article: Zhang, H., & Liu, Z. (2025). Four-dimensional aircraft trajectory prediction with a generative deep learning and clustering approach. Journal of Aerospace Information Systems, 22(2), 90-102, which has been published in final form at https://doi.org/10.2514/1.I011454. | en_US |
| dc.title | Four-dimensional aircraft trajectory prediction with a generative deep learning and clustering approach | en_US |
| dc.type | Journal/Magazine Article | en_US |
| dc.description.otherinformation | Title on author's file: 4D Aircraft Trajectory Prediction with a Generative Deep Learning and Clustering Approach | - |
| dc.identifier.spage | 90 | - |
| dc.identifier.epage | 102 | - |
| dc.identifier.volume | 22 | - |
| dc.identifier.issue | 2 | - |
| dc.identifier.doi | 10.2514/1.I011454 | - |
| dcterms.abstract | Medium- and long-term four-dimensional (4D) aircraft trajectory prediction (TP) is a critical technology in air traffic management (ATM). This paper addresses the difficulty that existing medium- and long-term TP methods have in accurately fitting aircraft trajectory data distributions. We propose a 4D TP method based on K-medoids clustering and conditional tabular generative adversarial networks (CTGAN), called C-CTGAN. Comparative experiments with four long short-term memory (LSTM)-based models and the original CTGAN model show that the proposed model’s TP accuracy is significantly higher than that of the others when predicting medium- and long-term trajectories. On the trajectory datasets without holding, with a prediction time span of 10 min, the C-CTGAN model reduces the mean absolute errors (MAEs) of core trajectory parameters, namely latitude, longitude, geometric altitude, and ground speed, by 69.89, 15.00, 74.07, and 84.21%, respectively, compared to the convolutional neural network (CNN)-LSTM model. Compared to the original CTGAN model, the MAEs are reduced by 20.43, 39.09, 31.98, and 17.07%, respectively. On the trajectory datasets with holding, the C-CTGAN model shows MAE reductions of 14.08, 23.68, 31.46, and 2.86%, respectively, compared to the CNN-LSTM model, and of 34.88, 2.69, 23.16, and 73.91%, respectively, compared to the original CTGAN. | - |
| dcterms.accessRights | open access | en_US |
| dcterms.bibliographicCitation | Journal of aerospace information systems, Feb. 2025, v. 22, no. 2, p. 90-102 | - |
| dcterms.isPartOf | Journal of aerospace information systems | - |
| dcterms.issued | 2025-02 | - |
| dc.identifier.scopus | 2-s2.0-85218420302 | - |
| dc.identifier.eissn | 2327-3097 | - |
| dc.description.validate | 202604 bcjz | - |
| dc.description.oa | Accepted Manuscript | en_US |
| dc.identifier.SubFormID | G001401/2026-03 | en_US |
| dc.description.fundingSource | RGC | en_US |
| dc.description.fundingText | The grant support from the Hong Kong Research Grants Council (RGC) General Research Fund (GRF) (15212622/B-Q94L) is greatly acknowledged. | en_US |
| dc.description.pubStatus | Published | en_US |
| dc.description.oaCategory | Green (AAM) | en_US |
| Appears in Collections: | Journal/Magazine Article | |
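As background for the clustering step named in the abstract, the following is a minimal K-medoids sketch in Python. It illustrates the general algorithm only, not the authors' C-CTGAN pipeline; the toy 2D points and all names here are illustrative stand-ins for trajectory feature vectors.

```python
import numpy as np

def k_medoids(X, k, n_iter=100, seed=0):
    """Plain alternating K-medoids: assign each point to its nearest medoid,
    then move each medoid to the cluster member that minimizes the summed
    within-cluster distance. Stops when medoids no longer change."""
    rng = np.random.default_rng(seed)
    # Pairwise Euclidean distance matrix between all points.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    medoids = rng.choice(len(X), size=k, replace=False)
    labels = np.argmin(d[:, medoids], axis=1)
    for _ in range(n_iter):
        new_medoids = medoids.copy()
        for j in range(k):
            members = np.where(labels == j)[0]
            if members.size:  # skip empty clusters
                # Member with the smallest summed distance to its cluster.
                within = d[np.ix_(members, members)].sum(axis=1)
                new_medoids[j] = members[np.argmin(within)]
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
        labels = np.argmin(d[:, medoids], axis=1)
    return medoids, labels

# Two well-separated toy clusters.
X = np.array([[0., 0.], [0., 1.], [1., 0.],
              [10., 10.], [10., 11.], [11., 10.]])
medoids, labels = k_medoids(X, k=2)
```

Unlike K-means, each cluster center is an actual data point (a medoid), which is why the method is commonly paired with trajectory data: the representative of each cluster is itself a real observed pattern rather than an averaged, possibly unflyable, one.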
Files in This Item:
| File | Description | Size | Format |
|---|---|---|---|
| Zhang_Four-dimensional_Aircraft_Trajectory.pdf | Pre-Published version | 2.38 MB | Adobe PDF |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.