Addressing overfitting in classification models for transport mode choice prediction: a practical application in the Aburrá Valley, Colombia

  • Universidad Nacional de Colombia
  • Universidad Javeriana

Research output: Contribution to journal › Article › peer-review

3 Scopus citations

Abstract

Overfitting poses a significant limitation in mode choice prediction using classification models, often worsened by the proliferation of features from encoding categorical variables. While dimensionality reduction techniques are widely utilized, their effects on travel-mode choice models’ performance have yet to be comparatively studied. This research compares the impact of dimensionality reduction methods (PCA, CATPCA, FAMD, LDA) on the performance of multinomial models and various supervised learning classifiers (XGBoost, Random Forest, Naive Bayes, K-Nearest Neighbors, Multinomial Logit) for predicting travel mode choice. Utilizing survey data from the Aburrá Valley in Colombia, we detail the process of analyzing derived dimensions and selecting optimal models for both overall and class-specific predictions. Results indicate that dimension reduction enhances predictive power, particularly for less common transport modes, providing a strategy to address class imbalance without modifying data distribution. This methodology deepens understanding of travel behavior, offering valuable insights for modelers and policymakers in developing regions with similar characteristics.
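The abstract's core pipeline — one-hot encoding a categorical survey variable (which inflates the feature count) and then compressing the result with PCA before classification — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the toy "trip purpose" column, the numeric features, and the choice of two components are invented for demonstration, and PCA is computed directly via SVD.

```python
import numpy as np

# Hypothetical toy survey: each row is one trip. One-hot encoding the
# categorical "purpose" column mimics how encoding categorical variables
# proliferates features (the overfitting risk discussed in the abstract).
purposes = ["work", "study", "shopping", "work", "leisure", "study",
            "work", "shopping", "leisure", "work"]
categories = sorted(set(purposes))
one_hot = np.array([[1.0 if p == c else 0.0 for c in categories]
                    for p in purposes])

# Two numeric features (e.g. trip distance in km, traveller age),
# invented purely for illustration.
numeric = np.array([[5.2, 34], [1.1, 21], [3.4, 45], [8.0, 29],
                    [2.2, 53], [0.9, 19], [6.5, 41], [4.1, 37],
                    [2.8, 60], [7.3, 26]])
X = np.hstack([numeric, one_hot])      # 10 samples x 6 features

# PCA via SVD on the centered matrix: project onto the top-k components,
# which would then feed a downstream classifier (XGBoost, RF, etc.).
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2
X_reduced = Xc @ Vt[:k].T              # reduced design matrix, 10 x 2

print(X.shape, "->", X_reduced.shape)
```

In a full workflow, the reduced matrix `X_reduced` would replace the raw encoded features as classifier input; the paper compares this against CATPCA, FAMD, and LDA, which handle mixed or categorical data more directly than plain PCA.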

Original language: English
Pages (from-to): 1213-1230
Number of pages: 18
Journal: Transportation Letters
Volume: 17
Issue number: 7
DOIs
State: Published - 2025

Keywords

  • Machine learning
  • dimension reduction
  • imbalanced data
  • supervised learning
  • transport modes
  • travel behavior
