Novelty Detection and Analysis in Convolutional Neural Networks


Bibliographic Details
Main Author: Eshed, Noam
Other Authors: Hariharan, Bharath, Field, David
Format: Thesis
Language: English
Published: 2020
Subjects:
Online Access: https://hdl.handle.net/1813/70292
http://dissertations.umi.com/cornell:10906
https://doi.org/10.7298/7273-6v18
Description
Summary: 97 pages. While many computer vision researchers race to architect improved convolutional neural networks (CNNs) and increase task accuracy, relatively little work has been done to understand and quantify the types of errors these networks make. For classification networks, some incorrect responses are better than others. In an animal classification task, for instance, the network categorizes an input image as one of a set of trained labels. When such a network gets the label wrong, there are many incorrect labels it can choose from, and there are varying degrees of incorrectness: a network that classifies a German shepherd as a poodle is better than one that classifies it as a blue whale. In this work, I explore the responses of CNNs to sets of images in both known classes and novel classes (those the network was not trained on). I analyze the predicted labels for these sets of images, as well as the predicted label distribution over many images in a given class. This thesis also discusses the hierarchy of predicted labels and true labels, and what a network’s response in terms of higher-level categories reveals about its ability to generalize. Finally, I discuss how humans and networks each measure the similarity between images, and show the effect of novelty on networks’ agreement with people.
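
As a rough illustration of two analyses described in the summary, the Python sketch below computes the predicted-label distribution over a set of images from one class and grades each mistake by how far the predicted label sits from the true label in a semantic hierarchy. This is a minimal sketch under stated assumptions, not the thesis's actual pipeline: the (true, predicted) label pairs are hypothetical, and WordNet path similarity stands in for whatever hierarchy-aware measure the work itself uses.

    # Minimal sketch (assumptions: hypothetical label pairs; WordNet path
    # similarity as a stand-in for a hierarchy-aware error measure).
    # Requires the NLTK WordNet corpus: nltk.download('wordnet').
    from collections import Counter

    from nltk.corpus import wordnet as wn

    # Hypothetical (true_label, predicted_label) pairs for images of one class.
    predictions = [
        ("german_shepherd", "german_shepherd"),
        ("german_shepherd", "poodle"),
        ("german_shepherd", "poodle"),
        ("german_shepherd", "blue_whale"),
    ]

    # Predicted-label distribution over the class.
    distribution = Counter(pred for _, pred in predictions)
    print(distribution)

    def error_severity(true_label, pred_label):
        """1 - WordNet path similarity between the two noun synsets:
        0.0 for a correct label, larger for semantically distant mistakes."""
        true_syn = wn.synsets(true_label, pos=wn.NOUN)[0]
        pred_syn = wn.synsets(pred_label, pos=wn.NOUN)[0]
        return 1.0 - true_syn.path_similarity(pred_syn)

    # A poodle prediction should register as a milder error than a blue whale.
    for true_label, pred_label in predictions:
        print(true_label, "->", pred_label,
              round(error_severity(true_label, pred_label), 2))

In this sketch the poodle error receives a lower severity than the blue whale error, mirroring the intuition above that some mistakes are more forgivable than others; a fuller analysis could substitute the ImageNet/WordNet hierarchy depth or another semantic distance of choice.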