Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/28020
Title: Bus arrival time prediction at bus stop with multiple routes
Authors: Yu, B
Lam, WHK 
Tam, ML
Keywords: Artificial neural network
Bus arrival time prediction
K nearest neighbours algorithm
Multiple bus routes
Support vector machine
Issue Date: 2011
Publisher: Pergamon Press
Source: Transportation research. Part C, Emerging technologies, 2011, v. 19, no. 6, p. 1157-1170
Journal: Transportation research. Part C, Emerging technologies 
Abstract: Provision of accurate bus arrival information is vital to passengers, as it reduces their anxiety and waiting times at bus stops. This paper proposes models to predict bus arrival times at the same bus stop across different routes. In the proposed models, the bus running times of multiple routes are used to predict the bus arrival time of each of these routes. Several methods, including support vector machine (SVM), artificial neural network (ANN), the k nearest neighbours algorithm (k-NN) and linear regression (LR), are adopted for bus arrival time prediction. Observation surveys were conducted to collect bus running and arrival time data for validation of the proposed models. The results show that the proposed models are more accurate than models based on the bus running times of a single route. Moreover, the SVM model is found to perform the best among the four proposed models for predicting bus arrival times at a bus stop served by multiple routes.
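
The following is a minimal illustrative sketch of the idea described in the abstract, not the authors' implementation: it trains a support vector regression model (scikit-learn's SVR, a standard SVM regressor) on the running times of several routes serving the same stop, and compares it against a single-route baseline. The data is synthetic, and the feature layout (three hypothetical routes sharing an upstream segment), kernel, and hyperparameters are assumptions for demonstration only; the paper's actual features and settings are not reproduced here.

import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)

# Synthetic data (assumption): each row holds the latest observed running
# times (seconds) of three hypothetical routes serving the same stop over a
# shared upstream road segment.
n = 500
X = rng.normal(loc=[300.0, 320.0, 310.0], scale=30.0, size=(n, 3))

# Synthetic target: arrival time of the route of interest, correlated with
# all three routes' running times plus noise, mimicking the shared traffic
# conditions that motivate using multiple routes as predictors.
y = 0.5 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * X[:, 2] + rng.normal(0.0, 10.0, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Multiple-route model: uses all three routes' running times as features.
multi = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0))
multi.fit(X_train, y_train)

# Single-route baseline: uses only the target route's own running time.
single = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0))
single.fit(X_train[:, :1], y_train)

print("MAE, multiple routes:", mean_absolute_error(y_test, multi.predict(X_test)))
print("MAE, single route:   ", mean_absolute_error(y_test, single.predict(X_test[:, :1])))

On this synthetic data the multiple-route model attains a lower mean absolute error than the single-route baseline, mirroring (but not reproducing) the paper's finding that pooling running times from routes sharing a stop improves arrival time prediction.
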
URI: http://hdl.handle.net/10397/28020
ISSN: 0968-090X
DOI: 10.1016/j.trc.2011.01.003
Appears in Collections:Journal/Magazine Article
