Reliable Confidence Predictions Using Conformal Prediction
2016 (English). In: Advances in Knowledge Discovery and Data Mining: 20th Pacific-Asia Conference, PAKDD 2016, Auckland, New Zealand, April 19-22, 2016, Proceedings, Part I / [ed] James Bailey, Latifur Khan, Takashi Washio, Gill Dobbie, Joshua Zhexue Huang, Ruili Wang, Springer, 2016, 77-88 p. Conference paper (Refereed)
Conformal classifiers output confidence prediction regions, i.e., multi-valued predictions that are guaranteed to contain the true output value of each test pattern with some predefined probability. To fully utilize the predictions provided by a conformal classifier, it is essential that those predictions are reliable, i.e., that a user is able to assess the quality of the predictions made. Although conformal classifiers are statistically valid by default, the error probability of the output prediction regions depends on their size, in such a way that smaller, and thus potentially more interesting, predictions are more likely to be incorrect. This paper proposes and evaluates a method for producing refined error probability estimates of prediction regions that takes their size into account. The end result is a binary conformal confidence predictor that is able to provide accurate error probability estimates for those prediction regions containing only a single class label.
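To make the abstract's setup concrete, the following is a minimal sketch of an inductive (split) conformal classifier. It is illustrative only, not the paper's method: it assumes a toy one-dimensional dataset and a simple distance-to-class-mean nonconformity score, and the function names (`class_means`, `prediction_region`, etc.) are invented for this sketch. A label is included in the prediction region whenever its calibration p-value exceeds the significance level epsilon, which gives the coverage guarantee the abstract describes.

```python
# Sketch of inductive conformal classification (illustrative assumptions:
# 1-D data, distance-to-class-mean nonconformity; not the paper's method).

def class_means(X, y):
    # Per-class mean feature value, estimated on the proper training set.
    means = {}
    for label in set(y):
        pts = [x for x, lab in zip(X, y) if lab == label]
        means[label] = sum(pts) / len(pts)
    return means

def nonconformity(x, label, means):
    # Higher score = the example conforms less with the hypothesised label.
    return abs(x - means[label])

def p_value(score, cal_scores):
    # Fraction of calibration scores at least as large as the test score
    # (the test example itself counts once, hence the +1 terms).
    ge = sum(1 for s in cal_scores if s >= score)
    return (ge + 1) / (len(cal_scores) + 1)

def prediction_region(x, means, cal_scores, epsilon):
    # Include every label whose p-value exceeds the significance level;
    # the region then contains the true label with probability >= 1 - epsilon.
    return {lab for lab in means
            if p_value(nonconformity(x, lab, means), cal_scores) > epsilon}

# Toy data: class 0 clusters near 0.0, class 1 near 5.0.
X_train, y_train = [0.1, 0.2, 4.9, 5.1], [0, 0, 1, 1]
X_cal,   y_cal   = [0.0, 0.3, 5.0, 4.8], [0, 0, 1, 1]

means = class_means(X_train, y_train)
cal_scores = [nonconformity(x, lab, means) for x, lab in zip(X_cal, y_cal)]

print(prediction_region(0.05, means, cal_scores, epsilon=0.2))  # {0}
print(prediction_region(0.05, means, cal_scores, epsilon=0.1))  # {0, 1}
```

Note how the same test point yields a singleton region at epsilon = 0.2 but a two-label region at the stricter epsilon = 0.1: demanding higher confidence forces larger regions, which is exactly the size/error trade-off the paper's refined error probability estimates address.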
Place, publisher, year, edition, pages
Springer, 2016. 77-88 p.
Lecture Notes in Computer Science, ISSN 0302-9743 ; 9651
Research subject: Computer and Systems Sciences
Identifiers
URN: urn:nbn:se:su:diva-137489
DOI: 10.1007/978-3-319-31753-3_7
ISBN: 978-3-319-31752-6
ISBN: 978-3-319-31753-3
OAI: oai:DiVA.org:su-137489
DiVA: diva2:1062764
20th Pacific-Asia Conference, PAKDD 2016, Auckland, New Zealand, April 19-22, 2016