Classification Under Partial Reject Options
Stockholm University, Faculty of Science, Department of Mathematics. ORCID iD: 0000-0001-9662-507X
Stockholm University, Faculty of Science, Department of Mathematics. ORCID iD: 0000-0003-2767-8818
Number of Authors: 2
2023 (English) In: Journal of Classification, ISSN 0176-4268, E-ISSN 1432-1343, article id s00357-023-09455-x
Article in journal (Refereed) Epub ahead of print
Abstract [en]

In many applications there is ambiguity about which (if any) of a finite number N of hypotheses best fits an observation. It is then of interest to possibly output a whole set of categories, that is, a scenario where the size of the classified set of categories ranges from 0 to N. An empty set corresponds to an outlier, a set of size 1 represents a firm decision that singles out one hypothesis, a set of size N corresponds to a rejection to classify, whereas sets of sizes 2, ..., N - 1 represent a partial rejection to classify, where some hypotheses are excluded from further analysis. In this paper, we review and unify several proposed methods of Bayesian set-valued classification, where the objective is to find the optimal Bayesian classifier that maximizes the expected reward. We study a large class of reward functions with rewards for sets that include the true category, whereas additive or multiplicative penalties are incurred for sets depending on their size. For models with one homogeneous block of hypotheses, we provide general expressions for the accompanying Bayesian classifier, several of which extend previous results in the literature. We then derive novel results for the more general setting in which hypotheses are partitioned into blocks, where ambiguity within and between blocks is of different severity. We also discuss how well-known methods of classification, such as conformal prediction, indifference zones, and hierarchical classification, fit into our framework. Finally, set-valued classification is illustrated on an ornithological data set, with taxa partitioned into blocks and parameters estimated using MCMC. The tuning parameters of the associated reward function are chosen through cross-validation.
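The set-valued idea in the abstract can be sketched concretely. The following is a minimal illustrative example, not the authors' exact formulation: it assumes an additive penalty that is linear in set size (the paper studies a broader family of additive and multiplicative penalties), in which case the expected reward of a set S, namely the posterior mass of S minus c times |S|, is maximized by including every category whose posterior probability exceeds c.

```python
def set_valued_classifier(posteriors, c):
    """Return the set of category indices whose posterior probability exceeds c.

    Under the assumed reward R(S) = sum_{i in S} p_i - c * |S|, this
    thresholding rule is the expected-reward-maximizing classifier.
    """
    return {i for i, p in enumerate(posteriors) if p > c}

# Two plausible categories -> a partial rejection (set of size 2).
print(set_valued_classifier([0.45, 0.40, 0.10, 0.05], c=0.2))  # {0, 1}
# One dominant posterior -> a firm decision (set of size 1).
print(set_valued_classifier([0.90, 0.05, 0.03, 0.02], c=0.2))  # {0}
# All posteriors below the cutoff -> an empty set, flagging an outlier.
print(set_valued_classifier([0.10, 0.10, 0.05], c=0.2))        # set()
```

The three calls illustrate the size-0, size-1, and intermediate-size outcomes described in the abstract; a uniform posterior with a low penalty c would return the full set of N categories, i.e. a rejection to classify.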

Place, publisher, year, edition, pages
2023. article id s00357-023-09455-x
Keywords [en]
Blockwise cross-validation, Bayesian classification, Conformal prediction, Classes of hypotheses, Indifference zones, Markov Chain Monte Carlo, Reward functions with set-valued inputs, Set-valued classifiers
National Category
Mathematics; Psychology
Identifiers
URN: urn:nbn:se:su:diva-225421
DOI: 10.1007/s00357-023-09455-x
ISI: 001113203500001
OAI: oai:DiVA.org:su-225421
DiVA, id: diva2:1828356
Available from: 2024-01-16 Created: 2024-01-16 Last updated: 2024-01-16

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text

Authority records

Karlsson, Måns; Hössjer, Ola

Search in DiVA

By author/editor
Karlsson, Måns; Hössjer, Ola
By organisation
Department of Mathematics
In the same journal
Journal of Classification
Mathematics; Psychology

Search outside of DiVA

Google; Google Scholar
