
Semantic Scholar uses AI to extract papers important to this topic.

Highly Cited

2007

Chapter 3 deals with probability distributions, discrete and continuous densities, distribution functions, bivariate…

Highly Cited

2004

Rough set theory is a relatively new mathematical tool for use in computer applications in circumstances which are characterized…

Review

1997

We assume that H(f) is well-defined and is finite. The concept of differential entropy was introduced in Shannon’s original paper…

Highly Cited

1994

Abstract In statistical physics, useful notions of entropy are defined with respect to some coarse-graining procedure over a…

Highly Cited

1992

Adapted waveform analysis uses a library of orthonormal bases and an efficiency functional to match a basis to a given signal or…

Review

1991

Preface to the Second Edition. Preface to the First Edition. Acknowledgments for the Second Edition. Acknowledgments for the…

Highly Cited

1991

A novel class of information-theoretic divergence measures based on the Shannon entropy is introduced. Unlike the well-known…

Highly Cited

1986

Abstract A new nonprobabilistic entropy measure is introduced in the context of fuzzy sets or messages. Fuzzy units, or fits…

Highly Cited

1980

Jaynes's principle of maximum entropy and Kullback's principle of minimum cross-entropy (minimum directed divergence) are shown to…

Review

1977

1. Entropy and mutual information
2. Discrete memoryless channels and their capacity-cost functions
3. Discrete memoryless…