ePrints@IISc

Voronoi Networks and Their Probability of Misclassification

Krishna, K and Thathachar, MAL and Ramakrishnan, KR (2000) Voronoi Networks and Their Probability of Misclassification. In: IEEE Transactions on Neural Networks, 11 (6), pp. 1361-1372.

PDF: Voronoi.pdf (292 kB)

Abstract

Nearest neighbor classifiers that use all the training samples for classification require large memory and incur a large online testing cost. To reduce the memory requirement and the computational cost, many algorithms have been developed that perform nearest neighbor classification using only a small number of representative samples obtained from the training set. We call the classification model underlying all these algorithms Voronoi networks (Vnets), because these algorithms discretize the feature space into Voronoi regions and assign the samples in each region to a class. In this paper we analyze the generalization capabilities of these networks by bounding the generalization error. The class of problems that can be "efficiently" solved by Vnets is characterized by the extent to which the set of points on the decision boundaries fills the feature space, thus quantifying how efficiently a problem can be solved using Vnets. We show that Vnets converge asymptotically to the Bayes classifier with arbitrarily high probability, provided the number of representative samples grows more slowly than the square root of the number of training samples, and we also give the optimal growth rate of the number of representative samples. We repeat the analysis for decision tree (DT) classifiers and compare them with Vnets. The bias/variance dilemma and the curse of dimensionality with respect to Vnets and DTs are also discussed.
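The classification rule underlying a Vnet can be illustrated with a minimal sketch: given a small set of representative samples (prototypes), a test point is assigned the class label of its nearest prototype, so each prototype's Voronoi region is implicitly labeled. This is an illustrative example assuming Euclidean distance and NumPy; the prototype-selection algorithms surveyed in the paper are not reproduced here, and the prototypes below are hypothetical.

```python
import numpy as np

def vnet_classify(x, prototypes, labels):
    """Assign x the label of its nearest representative sample.

    Each row of `prototypes` defines a Voronoi region; `labels[i]`
    is the class assigned to the region of prototypes[i].
    (Illustrative sketch; distance metric assumed Euclidean.)
    """
    distances = np.linalg.norm(prototypes - x, axis=1)
    return labels[np.argmin(distances)]

# Hypothetical two-class example with two prototypes per class.
prototypes = np.array([[0.0, 0.0], [1.0, 1.0],   # class 0
                       [3.0, 3.0], [4.0, 4.0]])  # class 1
labels = np.array([0, 0, 1, 1])

print(vnet_classify(np.array([0.5, 0.2]), prototypes, labels))  # nearest prototype is class 0
```

Using only a few prototypes instead of the full training set is what reduces the memory and online testing cost that the abstract refers to.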

Item Type: Journal Article
Additional Information: Copyright 2000 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.
Keywords: Neural networks;Pattern recognition;Statistical learning theory
Department/Centre: Division of Electrical Sciences > Electrical Engineering
Date Deposited: 16 Feb 2006
Last Modified: 19 Sep 2010 04:23
URI: http://eprints.iisc.ernet.in/id/eprint/5403
