Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/98858
| DC Field | Value | Language |
|---|---|---|
| dc.contributor | Department of Aeronautical and Aviation Engineering | en_US |
| dc.creator | Hasan, F | en_US |
| dc.creator | Huang, H | en_US |
| dc.date.accessioned | 2023-06-01T06:04:31Z | - |
| dc.date.available | 2023-06-01T06:04:31Z | - |
| dc.identifier.uri | http://hdl.handle.net/10397/98858 | - |
| dc.language.iso | en | en_US |
| dc.publisher | Molecular Diversity Preservation International (MDPI) | en_US |
| dc.rights | © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). | en_US |
| dc.rights | The following publication Hasan, F., & Huang, H. (2023). MALS-Net: A Multi-Head Attention-Based LSTM Sequence-to-Sequence Network for Socio-Temporal Interaction Modelling and Trajectory Prediction. Sensors, 23(1), 530 is available at https://doi.org/10.3390/s23010530. | en_US |
| dc.subject | Autonomous driving | en_US |
| dc.subject | LSTM | en_US |
| dc.subject | Multi-head attention | en_US |
| dc.subject | Transformer | en_US |
| dc.subject | Vehicle trajectory prediction | en_US |
| dc.title | MALS-Net: a Multi-Head Attention-based LSTM sequence-to-sequence network for socio-temporal interaction modelling and trajectory prediction | en_US |
| dc.type | Journal/Magazine Article | en_US |
| dc.identifier.volume | 23 | en_US |
| dc.identifier.issue | 1 | en_US |
| dc.identifier.doi | 10.3390/s23010530 | en_US |
| dcterms.abstract | Predicting the trajectories of surrounding vehicles is an essential task in autonomous driving, especially in highway settings, where minor deviations in motion can cause serious road accidents. Future trajectory prediction is often based not only on historical trajectories but also on a representation of the interaction between neighbouring vehicles. Current state-of-the-art methods have extensively used RNNs, CNNs and GNNs to model this interaction and predict future trajectories, relying on the very popular NGSIM dataset, which, however, has been criticized as noisy and prone to overfitting. Moreover, transformers, which gained popularity from their benchmark performance on various NLP tasks, have hardly been explored for this problem, presumably because of the error accumulation inherent in their autoregressive decoding of time series. We therefore propose MALS-Net, a Multi-Head Attention-based LSTM Sequence-to-Sequence model that exploits the transformer's attention mechanism without suffering from accumulated errors, by using an attention-based LSTM encoder-decoder architecture. The proposed model was evaluated on BLVD, a more practical dataset without the overfitting issue of NGSIM. Compared to other relevant approaches, our model achieves state-of-the-art performance for both short- and long-term prediction. | en_US |
| dcterms.accessRights | open access | en_US |
| dcterms.bibliographicCitation | Sensors, Jan. 2023, v. 23, no. 1, 530 | en_US |
| dcterms.isPartOf | Sensors | en_US |
| dcterms.issued | 2023-01 | - |
| dc.identifier.scopus | 2-s2.0-85145974156 | - |
| dc.identifier.pmid | 36617127 | - |
| dc.identifier.eissn | 1424-8220 | en_US |
| dc.identifier.artn | 530 | en_US |
| dc.description.validate | 202306 bckw | en_US |
| dc.description.oa | Version of Record | en_US |
| dc.identifier.FolderNumber | a2052 | - |
| dc.identifier.SubFormID | 46390 | - |
| dc.description.fundingSource | Others | en_US |
| dc.description.fundingText | The Hong Kong Polytechnic University College of Undergraduate Researchers & Innovators (PolyU CURI)’s Undergraduate Research & Innovation Scheme (URIS) | en_US |
| dc.description.pubStatus | Published | en_US |
| dc.description.oaCategory | CC | en_US |
| Appears in Collections: | Journal/Magazine Article | |
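The abstract above centres on multi-head scaled dot-product attention as the core mechanism reused from the transformer. As an illustrative sketch only, not the authors' actual MALS-Net implementation (which the paper combines with an LSTM encoder-decoder and learned projection matrices, omitted here), the mechanism can be written in plain Python:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(query, keys, values):
    # Scaled dot-product attention for a single query vector.
    # query: d-dim list; keys/values: lists of d-dim lists.
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    out = [sum(w * v[i] for w, v in zip(weights, values))
           for i in range(len(values[0]))]
    return out, weights

def multi_head_attention(query, keys, values, num_heads):
    # Split the feature dimension into `num_heads` slices, attend per head,
    # then concatenate the head outputs (projections omitted for brevity).
    d = len(query)
    assert d % num_heads == 0
    head_dim = d // num_heads
    out = []
    for h in range(num_heads):
        lo, hi = h * head_dim, (h + 1) * head_dim
        head_out, _ = attention(query[lo:hi],
                                [k[lo:hi] for k in keys],
                                [v[lo:hi] for v in values])
        out.extend(head_out)
    return out
```

In a trajectory-prediction setting, the keys and values would typically be encoded states of neighbouring vehicles and the query the target vehicle's state, so the attention weights express socio-temporal interaction strengths.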
Files in This Item:
| File | Description | Size | Format |  |
|---|---|---|---|---|
| sensors-23-00530-v3.pdf |  | 522.67 kB | Adobe PDF | View/Open |
Page views: 145 (last week: 27; last month: 27; as of Feb 9, 2026)
Downloads: 54 (as of Feb 9, 2026)
SCOPUS™ citations: 36 (as of May 8, 2026)
Web of Science™ citations: 25 (as of Apr 23, 2026)
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.