Divergence based feature selection for multimodal class densities


Abstract:

A new feature selection procedure based on the Kullback J-divergence between two class conditional density functions approximated by a finite mixture of parameterized densities of a special type is presented. This procedure is suitable especially for multimodal data. Apart from finding a feature subset of any cardinality without involving any search procedure, it also simultaneously yields a pseudo-Bayes decision rule. Its performance is tested on real data.
Page(s): 218 - 223
Date of Publication: 29 February 1996
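The core idea described in the abstract, scoring features by the symmetric Kullback J-divergence between the two class-conditional densities, can be sketched as follows. This is only an illustrative sketch under simplifying assumptions: each class-conditional density is modeled as a single univariate Gaussian per feature (the paper itself fits finite mixtures of parameterized densities precisely to handle multimodal data), and the function names are hypothetical, not from the paper.

```python
import numpy as np

def j_divergence_gaussian(mu1, var1, mu2, var2):
    # Symmetric Kullback J-divergence J(p, q) = KL(p||q) + KL(q||p)
    # in closed form for two univariate Gaussians.
    return (0.5 * (var1 / var2 + var2 / var1 - 2.0)
            + 0.5 * (mu1 - mu2) ** 2 * (1.0 / var1 + 1.0 / var2))

def rank_features(X1, X2):
    # Score each feature by the J-divergence between the two
    # class-conditional densities (here crudely approximated by a
    # single Gaussian per class, unlike the paper's mixture model),
    # then rank features from most to least discriminative.
    scores = np.array([
        j_divergence_gaussian(X1[:, j].mean(), X1[:, j].var(),
                              X2[:, j].mean(), X2[:, j].var())
        for j in range(X1.shape[1])
    ])
    order = np.argsort(scores)[::-1]
    return order, scores
```

Under this simplification a feature whose class-conditional distributions coincide scores zero, while well-separated class means or mismatched variances drive the score up; the mixture-based version in the paper replaces the single-Gaussian J-divergence with one computed between the fitted mixtures.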
