Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/109883
DC Field | Value | Language |
---|---|---|
dc.contributor | Department of Land Surveying and Geo-Informatics | - |
dc.creator | Gao, QL | - |
dc.creator | Zhong, C | - |
dc.creator | Yue, Y | - |
dc.creator | Cao, R | - |
dc.creator | Zhang, B | - |
dc.date.accessioned | 2024-11-20T07:30:09Z | - |
dc.date.available | 2024-11-20T07:30:09Z | - |
dc.identifier.issn | 0143-6228 | - |
dc.identifier.uri | http://hdl.handle.net/10397/109883 | - |
dc.language.iso | en | en_US |
dc.publisher | Elsevier Ltd | en_US |
dc.rights | © 2023 The Authors. Published by Elsevier Ltd. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/). | en_US |
dc.rights | The following publication Gao, Q.-L., Zhong, C., Yue, Y., Cao, R., & Zhang, B. (2024). Income estimation based on human mobility patterns and machine learning models. Applied Geography, 163, 103179 is available at https://doi.org/10.1016/j.apgeog.2023.103179. | en_US |
dc.subject | Human mobility patterns | en_US |
dc.subject | Income estimation | en_US |
dc.subject | Machine learning | en_US |
dc.subject | Public transit | en_US |
dc.title | Income estimation based on human mobility patterns and machine learning models | en_US |
dc.type | Journal/Magazine Article | en_US |
dc.identifier.volume | 163 | - |
dc.identifier.doi | 10.1016/j.apgeog.2023.103179 | - |
dcterms.abstract | Sustainable and inclusive urban development requires a thorough understanding of income distribution and poverty. Recent research has extensively explored the use of automatically generated sensor data as a proxy for economic activity. Notably, human mobility patterns have been found to exhibit strong associations with socioeconomic attributes and great potential for income estimation. However, the representation of complex human mobility patterns and its effectiveness in income estimation need further investigation. To address this, we propose three representations of human mobility: mobility indicators, activity footprints, and travel graphs. These representations are fed into three models: XGBoost (a traditional machine learning model), a convolutional neural network (CNN), and a time-series graph neural network (GCRN). Leveraging public transit data from Shenzhen, our study demonstrates that graph-based representations and deep learning models outperform the other approaches in income estimation, excelling in minimising information loss and handling complex data structures. Spatial contextual attributes, such as transport accessibility, are the most influential factors, while indicators related to activity extent, temporal rhythm, and intensity contribute comparatively less. In summary, this study highlights the potential of cutting-edge artificial intelligence tools and emerging human mobility data as an alternative approach to estimating income distribution and addressing poverty-related concerns. | - |
dcterms.accessRights | open access | en_US |
dcterms.bibliographicCitation | Applied geography, Feb. 2024, v. 163, 103179 | - |
dcterms.isPartOf | Applied geography | - |
dcterms.issued | 2024-02 | - |
dc.identifier.scopus | 2-s2.0-85180786503 | - |
dc.identifier.eissn | 1873-7730 | - |
dc.identifier.artn | 103179 | - |
dc.description.validate | 202411 bcch | - |
dc.description.oa | Version of Record | en_US |
dc.identifier.FolderNumber | OA_Scopus/WOS | en_US |
dc.description.fundingSource | Others | en_US |
dc.description.fundingText | National Natural Science Foundation of China; European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme; ESRC under JPI Urban Europe/NSFC | en_US |
dc.description.pubStatus | Published | en_US |
dc.description.oaCategory | CC | en_US |
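As a concrete illustration of the indicator-based pipeline summarised in the abstract above, the sketch below derives a few per-user mobility indicators (trip count, radius of gyration, temporal entropy) from smart-card-style transit records and fits an XGBoost regressor to an income target. The synthetic data, feature names, and model settings are illustrative assumptions only, not the authors' actual feature set or training setup.

```python
# Hypothetical sketch, NOT the paper's pipeline: simple mobility
# indicators from smart-card-style trip records, fed to XGBoost.
import numpy as np
import pandas as pd
from xgboost import XGBRegressor

rng = np.random.default_rng(0)

# Synthetic trip records: one row per transit trip (user, stop coords, hour).
trips = pd.DataFrame({
    "user_id": rng.integers(0, 200, size=5000),
    "x": rng.normal(0, 5, size=5000),   # stop easting, km (assumed units)
    "y": rng.normal(0, 5, size=5000),   # stop northing, km (assumed units)
    "hour": rng.integers(5, 24, size=5000),
})

def indicators(g: pd.DataFrame) -> pd.Series:
    """Per-user mobility indicators (one possible operationalisation)."""
    xy = g[["x", "y"]].to_numpy()
    centroid = xy.mean(axis=0)
    # Radius of gyration: RMS distance of visited stops from their centroid.
    rog = np.sqrt(((xy - centroid) ** 2).sum(axis=1).mean())
    # Shannon entropy of departure hours (temporal rhythm of activity).
    p = g["hour"].value_counts(normalize=True).to_numpy()
    entropy = -(p * np.log(p)).sum()
    return pd.Series({"n_trips": len(g), "rog_km": rog, "hour_entropy": entropy})

X = trips.groupby("user_id").apply(indicators)

# Placeholder target: in the study this would be an observed income label/proxy.
y = 2.0 * X["rog_km"] + 0.5 * X["hour_entropy"] + rng.normal(0, 1, size=len(X))

model = XGBRegressor(n_estimators=200, max_depth=4, learning_rate=0.1)
model.fit(X, y)
print(model.predict(X.head()))
```

Of the three representations the paper compares, this indicator table is the simplest; the graph-based alternative would instead encode each user's trips as a travel graph and feed it to a time-series graph neural network (GCRN), which the study found most effective.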
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Size | Format
---|---|---
1-s2.0-S0143622823003107-main.pdf | 4.43 MB | Adobe PDF