Survey and Comparative Analysis of Entropy and Relative Entropy Thresholding Techniques
In: DTIC (2006)
Abstract: Entropy-based image thresholding has received considerable interest in recent years. Two types of entropy are generally used as thresholding criteria: Shannon's entropy and relative entropy, also known as the Kullback-Leibler information distance. The former measures the uncertainty of an information source, and the optimal threshold is obtained by maximizing Shannon's entropy; the latter measures the information discrepancy between two sources, and the optimal threshold is obtained by minimizing the relative entropy. Many thresholding methods based on both criteria have been reported in the literature. This work investigates the two entropy-based thresholding criteria and explores the relationship between entropy and relative entropy thresholding methods. In particular, a survey and comparative analysis is conducted of several widely used methods, including Pun and Kapur's maximum entropy, Kittler and Illingworth's minimum error thresholding, Pal and Pal's entropy thresholding, and Chang et al.'s relative entropy thresholding methods. To assess these methods objectively, two measures, uniformity and shape, are used for performance evaluation.
Published in: IEE Proceedings Vision, Image and Signal Processing, v153 n6, p837-850, 6 Dec 2006.
Keyword: *COMPARISON; *ENTROPY; *IMAGE PROCESSING; *RELATIVE ENTROPY; *THRESHOLDING; Cybernetics; IMAGE SEGMENTATION; INFORMATION THEORY; REPRINTS; TEST AND EVALUATION; UNCERTAINTY
URL: http://www.dtic.mil/docs/citations/ADA464347
http://oai.dtic.mil/oai/oai?&verb=getRecord&metadataPrefix=html&identifier=ADA464347
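Note: As a concrete reference point for the first criterion described in the abstract above, the sketch below shows how a maximum-entropy threshold can be selected from a grey-level histogram: each candidate threshold splits the histogram into two classes, and the threshold that maximizes the sum of the two class Shannon entropies is chosen. This is a minimal NumPy sketch of the general idea only, not a reproduction of the specific methods compared in the paper; the function name kapur_threshold and its parameters are illustrative assumptions.

```python
import numpy as np

def kapur_threshold(image, bins=256):
    """Illustrative maximum-entropy (Kapur-style) threshold selection.

    For each candidate threshold t, the normalized grey-level histogram
    is split into two classes; the t that maximizes the sum of the two
    class entropies is returned. Names and defaults are assumptions for
    this sketch, not taken from the surveyed paper.
    """
    hist, _ = np.histogram(image, bins=bins, range=(0, bins))
    p = hist.astype(float) / hist.sum()      # normalized histogram

    best_t, best_h = 0, -np.inf
    for t in range(1, bins - 1):
        p0, p1 = p[:t].sum(), p[t:].sum()    # class probabilities
        if p0 == 0.0 or p1 == 0.0:
            continue
        q0 = p[:t][p[:t] > 0] / p0           # within-class distributions
        q1 = p[t:][p[t:] > 0] / p1
        h = -(q0 * np.log(q0)).sum() - (q1 * np.log(q1)).sum()
        if h > best_h:
            best_t, best_h = t, h
    return best_t
```

Relative entropy thresholding works in the opposite direction: it selects the threshold that minimizes the Kullback-Leibler information distance between the original and the thresholded image distributions.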