Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/115607
Title: Improved regression tree models using generalization error-based splitting criteria
Authors: Yang, Y 
Wang, S 
Laporte, G
Issue Date: 2025
Source: Naval research logistics, First published: 10 June 2025, Early View, https://doi.org/10.1002/nav.22270
Abstract: Despite the widespread application of machine learning (ML) approaches such as the regression tree (RT) in the field of data-driven optimization, overfitting may impair the effectiveness of ML models and thus hinder the deployment of ML for decision-making. In particular, we address the overfitting issue that arises, under a limited sample size, from the traditional RT splitting criterion, which considers only the training mean squared error, and we precisely specify the mathematical formula for the generalization error. We introduce two novel splitting criteria based on generalization error, which offer higher-quality approximations of the generalization error than the traditional training error does. One criterion is formulated through a mathematical derivation based on the RT model, and the second is established through leave-one-out cross-validation (LOOCV). We construct RT models using our proposed generalization error-based splitting criteria on extensive ML benchmark instances and report the experimental results, including the models' computational efficiency, prediction accuracy, and robustness. Our findings endorse the superior efficacy and robustness of the RT model based on the refined LOOCV-informed splitting criterion, marking substantial improvements over those of the traditional RT model. Additionally, our tree structure analysis provides insights into how the proposed LOOCV-informed splitting criterion guides the model in striking a balance between a complex tree structure and accurate predictions.
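The abstract describes the core idea at a high level: score each candidate split by an estimate of generalization error rather than by training mean squared error alone. The article's derived criteria are not reproduced here; the sketch below is only a minimal illustration of the LOOCV flavor of that idea for a mean-predicting leaf, where the leave-one-out residual of a sample has the closed form n(y_i - ȳ)/(n - 1) and therefore requires no refitting. All function names (training_sse, loocv_sse, best_split) are illustrative assumptions, not code from the article.

```python
import numpy as np

def training_sse(y):
    """Sum of squared errors when a leaf predicts the mean of its samples."""
    return float(np.sum((y - y.mean()) ** 2)) if len(y) else 0.0

def loocv_sse(y):
    """Leave-one-out SSE for a mean-predicting leaf.

    Predicting y_i by the mean of the other n - 1 samples gives the residual
    n * (y_i - mean) / (n - 1), so the LOOCV score needs no explicit refitting.
    """
    n = len(y)
    if n < 2:
        return float("inf")  # a single sample cannot be cross-validated
    resid = n * (y - y.mean()) / (n - 1)
    return float(np.sum(resid ** 2))

def best_split(x, y, criterion=loocv_sse):
    """Best threshold on one feature, scored by the given per-leaf criterion."""
    order = np.argsort(x)
    x_sorted, y_sorted = x[order], y[order]
    best_threshold, best_score = None, float("inf")
    for i in range(1, len(x_sorted)):
        if x_sorted[i] == x_sorted[i - 1]:
            continue  # no valid threshold between tied feature values
        score = criterion(y_sorted[:i]) + criterion(y_sorted[i:])
        if score < best_score:
            best_threshold = (x_sorted[i - 1] + x_sorted[i]) / 2
            best_score = score
    return best_threshold, best_score

if __name__ == "__main__":
    # Toy usage: a step function with noise; compare the two criteria.
    rng = np.random.default_rng(0)
    x = rng.uniform(0, 1, 40)
    y = np.where(x < 0.5, 1.0, 3.0) + rng.normal(0, 0.3, 40)
    print("training-MSE split:", best_split(x, y, criterion=training_sse))
    print("LOOCV split:       ", best_split(x, y, criterion=loocv_sse))
```

Because the LOOCV score penalizes small child nodes (a single-sample child receives an infinite score here), it discourages the overly fine splits that drive the overfitting the abstract refers to.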
Keywords: Generalization error
Leave-one-out cross-validation
Mean squared error
Regression tree
Publisher: John Wiley & Sons, Inc.
Journal: Naval research logistics 
ISSN: 0894-069X
EISSN: 1520-6750
DOI: 10.1002/nav.22270
Rights: This is an open access article under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
© 2025 The Author(s). Naval Research Logistics published by Wiley Periodicals LLC.
The following publication Yang, Y., Wang, S. and Laporte, G. (2025), Improved Regression Tree Models Using Generalization Error-Based Splitting Criteria. Naval Research Logistics is available at https://doi.org/10.1002/nav.22270.
Appears in Collections: Journal/Magazine Article

Files in This Item:
File: Yang_Improved_Regression_Tree.pdf
Size: 662.39 kB
Format: Adobe PDF
Open Access Information
Status: Open access
File Version: Version of Record