Embedded feature selection for robust probability learning machines

dc.coverage: DOI: 10.1016/j.patcog.2024.111157
dc.creator: Carrasco, Miguel
dc.creator: Ivorra, Benjamin
dc.creator: López, Julio
dc.creator: Ramos, Angel M.
dc.date: 2025
dc.date.accessioned: 2026-01-05T21:06:56Z
dc.date.available: 2026-01-05T21:06:56Z
dc.description (eng): <p>Methods: Feature selection is essential for building effective machine learning models in binary classification. Eliminating unnecessary features can reduce the risk of overfitting and improve classification performance. Moreover, the data we handle typically contains a stochastic component, making it important to develop robust models that are insensitive to data perturbations. Although there are numerous methods and tools for feature selection, relatively few studies address embedded feature selection within robust classification models using penalization techniques. Objective: In this work, we introduce robust classifiers with integrated feature selection capabilities, utilizing probability machines based on different penalization techniques, such as the ℓ<sub>1</sub>-norm or the elastic-net, combined with a novel Direct Feature Elimination process to improve model resilience and efficiency. Findings: Numerical experiments on standard datasets demonstrate the effectiveness and robustness of the proposed models in classification tasks even when using a reduced number of features. These experiments were evaluated using original performance indicators, highlighting the models’ ability to maintain high performance with fewer features. Novelty: The study discusses the trade-offs involved in combining different penalties to select the most relevant features while minimizing empirical risk. In particular, the integration of elastic-net and ℓ<sub>1</sub>-norm penalties within a unified framework, combined with the original Direct Feature Elimination approach, presents a novel method for improving both model accuracy and robustness.</p>
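As background for the abstract above: "embedded" feature selection via an ℓ<sub>1</sub> penalty means sparsity emerges from the training objective itself, with zeroed coefficients marking discarded features. The sketch below is a generic illustration of that idea, not the paper's method (the article's probability machines are trained via second-order cone programming, and its Direct Feature Elimination procedure is not reproduced here). It fits an ℓ<sub>1</sub>-penalized logistic regression with proximal gradient descent (ISTA); all names and parameter values are illustrative assumptions.

```python
import numpy as np

def l1_logistic(X, y, lam=0.05, lr=0.1, n_iter=2000):
    """ISTA for l1-penalized logistic regression:
    a gradient step on the logistic loss followed by soft-thresholding
    (the proximal operator of the l1 penalty), which drives irrelevant
    coefficients exactly to zero -- the embedded selection mechanism."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ w))        # predicted probabilities
        grad = X.T @ (p - y) / n                # logistic-loss gradient
        w = w - lr * grad
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)  # soft threshold
    return w

# Toy data: only the first two of ten features carry signal.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = (X[:, 0] - X[:, 1] > 0).astype(float)

w = l1_logistic(X, y)
selected = np.flatnonzero(np.abs(w) > 1e-3)   # surviving (selected) features
print("selected features:", selected)
```

An elastic-net variant, as discussed in the abstract, would add a quadratic term `0.5 * mu * w**2` to the penalty, trading some sparsity for stability when features are correlated.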
dc.identifier: https://investigadores.uandes.cl/en/publications/4eb3e1a5-e41a-4494-9f72-fbf5b1ea1bd5
dc.identifier.uri: https://repositorio.uandes.cl/handle/uandes/62663
dc.language: eng
dc.rights: info:eu-repo/semantics/restrictedAccess
dc.source: vol. 159 (2025)
dc.subject: Cobb–Douglas
dc.subject: Feature selection
dc.subject: Minimax Probability Machine
dc.subject: Minimum Error Minimax Probability Machine
dc.subject: Second-order cone programming
dc.subject: Support vector machines
dc.title (eng): Embedded feature selection for robust probability learning machines
dc.type (eng): Article
dc.type (spa): Artículo