
Case-Based Statistical Learning: A Non-Parametric Implementation with a Conditional-Error Rate SVM

Accepted version
Peer-reviewed

Type

Article

Change log

Authors

Gorriz, JM 
Ramirez, J 
Illan, IA 
Ortiz, A 

Abstract

© 2013 IEEE. Machine learning has been successfully applied to many areas of science and engineering, including time series prediction, optical character recognition, and signal and image classification in biomedical applications for diagnosis and prognosis. In the theory of semi-supervised learning, a training set and a set of unlabeled data are used to fit a prediction model, or learner, with the help of an iterative algorithm such as the expectation-maximization algorithm. In this paper, a novel non-parametric approach to so-called case-based statistical learning is proposed for a low-dimensional classification problem. This supervised feature selection scheme analyzes the discrete set of outcomes of the classification problem by hypothesis testing and makes assumptions on these outcome values to obtain the most likely prediction model at the training stage. A novel prediction model is described in terms of the output scores of a confidence-based support vector machine classifier under class-hypothesis testing. To obtain a more accurate prediction by considering the unlabeled points, the distribution of the unlabeled examples must be relevant to the classification problem. The estimation of error rates from a well-trained support vector machine allows us to propose a non-parametric approach that avoids the use of Gaussian-density-function-based models in the likelihood ratio test.
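The abstract's core idea, testing each class hypothesis for an unlabeled point and comparing the confidence of the resulting SVM models, can be illustrated with a minimal sketch. The snippet below is not the authors' implementation: the function `case_based_predict`, the use of scikit-learn's `SVC`, and the decision rule based on retrained decision scores are illustrative assumptions standing in for the paper's conditional-error-rate likelihood ratio test.

```python
# Illustrative sketch only (not the paper's exact formulation): for a test
# point, temporarily add it to the training set under each class hypothesis,
# retrain a linear SVM, and compare the resulting decision scores as a crude
# stand-in for the conditional-error-rate likelihood ratio test.
import numpy as np
from sklearn.svm import SVC

def case_based_predict(X_train, y_train, x_test, classes=(0, 1)):
    """Return the class hypothesis whose retrained SVM is most confident."""
    scores = {}
    for c in classes:
        # Hypothesis H_c: the unlabeled point belongs to class c.
        X_aug = np.vstack([X_train, x_test[None, :]])
        y_aug = np.append(y_train, c)
        clf = SVC(kernel="linear").fit(X_aug, y_aug)
        # Signed distance of the test point to the margin under H_c;
        # positive values favour classes[1], negative values favour classes[0].
        margin = clf.decision_function(x_test[None, :])[0]
        scores[c] = margin if c == classes[1] else -margin
    return max(scores, key=scores.get)

# Usage on toy two-class data
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (20, 2)), rng.normal(1, 1, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
print(case_based_predict(X, y, np.array([0.8, 0.9])))
```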

Description

Keywords

Statistical learning and decision theory, support vector machines (SVM), hypothesis testing, partial least squares, conditional-error rate

Journal Title

IEEE Access

Conference Name

Journal ISSN

2169-3536

Volume Title

5

Publisher

Institute of Electrical and Electronics Engineers (IEEE)