Investigating Ensembles of Single-class Classifiers for Multi-class Classification

Authors

Novotny, Alexander

Issue Date

2023

Type

Thesis

Keywords

Classification, Deep Learning, Ensemble, Machine Learning, Novelty Detection

Abstract

Traditional methods of multi-class classification in machine learning use a monolithic feature extractor and classifier head trained on data from all of the classes at once. These architectures (especially the classifier head) depend on the number and types of classes, and are therefore rigid with respect to changes to the class set. For best performance, networks with these architectures must be retrained from scratch, incurring a large cost in training time. These networks can also be biased towards classes with a large imbalance in training data compared to other classes. Instead, ensembles of so-called "single-class" classifiers can be used for multi-class classification by training an individual network for each class.

We show that these ensembles of single-class classifiers are more flexible to changes to the class set than traditional models, and can be quickly retrained to accommodate small changes, such as adding, removing, splitting, or fusing classes. We also show that these ensembles are less biased towards classes with large imbalances in their training data than traditional models, and we introduce a new, more powerful single-class classification architecture. These models are trained and tested on a plant disease dataset with high variance in the number of classes and the amount of data in each class, as well as on an Alzheimer's dataset with little data and a large imbalance in data between classes.
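To make the one-classifier-per-class idea concrete, the sketch below shows a minimal ensemble of single-class (one-vs-rest) classifiers combined for multi-class prediction by taking the highest per-class membership score. This is an illustrative assumption, not the thesis's actual architecture; all class names, layer sizes, and the `SingleClassEnsemble`/`SingleClassClassifier` names are hypothetical.

```python
# Minimal sketch (assumed, not the thesis's architecture): an ensemble of
# per-class binary classifiers used for multi-class classification.
import torch
import torch.nn as nn


class SingleClassClassifier(nn.Module):
    """One binary classifier per class: outputs a membership score in [0, 1]."""

    def __init__(self, in_features: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_features, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x).squeeze(-1)  # shape: (batch,)


class SingleClassEnsemble(nn.Module):
    """One independently trained classifier per class.

    Adding or removing a class only touches the corresponding member,
    so the rest of the ensemble does not need retraining.
    """

    def __init__(self, in_features: int, class_names: list[str]):
        super().__init__()
        self.in_features = in_features
        self.class_names = list(class_names)
        self.members = nn.ModuleDict(
            {name: SingleClassClassifier(in_features) for name in class_names}
        )

    def add_class(self, name: str) -> None:
        # Only the new member needs training; existing members are untouched.
        self.class_names.append(name)
        self.members[name] = SingleClassClassifier(self.in_features)

    def remove_class(self, name: str) -> None:
        self.class_names.remove(name)
        del self.members[name]

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Per-class membership scores, shape (batch, num_classes).
        return torch.stack([self.members[n](x) for n in self.class_names], dim=1)

    def predict(self, x: torch.Tensor) -> list[str]:
        # Multi-class decision: pick the class whose classifier scores highest.
        idx = self.forward(x).argmax(dim=1)
        return [self.class_names[i] for i in idx.tolist()]


if __name__ == "__main__":
    ensemble = SingleClassEnsemble(in_features=128, class_names=["healthy", "rust", "blight"])
    x = torch.randn(4, 128)
    print(ensemble.predict(x))
```

Under this assumed design, changes to the class set (adding, removing, splitting, or fusing classes) only require training or dropping the affected members, rather than retraining a monolithic classifier head from scratch.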

License

Creative Commons Attribution 4.0 United States
