An Extension to the Revised Approach in the Assessment of Informational Entropy

Baran T., Harmancioglu N. B., Çetinkaya C. P., Barbaros F.

ENTROPY, vol.19, no.12, 2017 (SCI-Expanded)

  • Publication Type: Article / Article
  • Volume: 19 Issue: 12
  • Publication Date: 2017
  • Doi Number: 10.3390/e19120634
  • Journal Name: ENTROPY
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus
  • Keywords: uncertainty, information, informational entropy, variation of information, continuous probability distribution functions, confidence intervals, DESIGN
  • Dokuz Eylül University Affiliated: Yes


This study attempts to extend the prevailing definition of informational entropy, where entropy relates to the amount of reduction of uncertainty or, indirectly, to the amount of information gained through measurements of a random variable. The approach adopted herein describes informational entropy not as an absolute measure of information, but as a measure of the variation of information. This makes it possible to obtain a single value for informational entropy, instead of several values that vary with the selection of the discretizing interval, when discrete probabilities of hydrological events are estimated through relative class frequencies and discretizing intervals. Furthermore, the present work introduces confidence limits for the informational entropy function, which facilitates a comparison between the uncertainties of various hydrological processes with different scales of magnitude and different probability structures. The work addresses hydrologists and environmental engineers more than it does mathematicians and statisticians. In particular, it is intended to help solve information-related problems in hydrological monitoring design and assessment. This paper first considers the selection of probability distributions of best fit to hydrological data, using generated synthetic time series. Next, it attempts to assess hydrometric monitoring duration in a network, this time using observed runoff data series. In both applications, it focuses primarily on the theoretical background for the extended definition of informational entropy. The methodology is shown to give valid results in each case.
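The dependence on the discretizing interval that motivates the paper can be illustrated with a minimal sketch (not the authors' method): estimating Shannon entropy from relative class frequencies of a synthetic, lognormal-like "runoff" sample, where the estimate shifts as the number of classes changes. All names and parameters here are illustrative assumptions.

```python
import numpy as np

def discrete_entropy(sample, n_bins):
    """Shannon entropy (in nats) estimated from relative class
    frequencies of a histogram with n_bins discretizing intervals."""
    counts, _ = np.histogram(sample, bins=n_bins)
    p = counts[counts > 0] / counts.sum()  # relative class frequencies
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(42)
# Synthetic positively skewed series, a common stand-in for runoff data
sample = rng.lognormal(mean=1.0, sigma=0.5, size=5000)

# The entropy estimate varies with the choice of discretizing interval,
# which is the ambiguity the paper's extended definition aims to remove:
for n_bins in (10, 50, 250):
    print(n_bins, round(discrete_entropy(sample, n_bins), 3))
```

Coarser binning yields a smaller estimate, so no single "absolute" entropy value emerges from this classical discrete formulation; this is the motivation for treating entropy as a measure of the variation of information instead.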