  1. information theory - Intuitive explanation of entropy

    Mar 15, 2013 · For a verbose explanation of the intuition behind Shannon's entropy equation, you could check out this document: Understanding Shannon's Entropy metric for Information.

  2. What is information theoretic entropy and its physical significance ...

    The entropy of a message is a measure of how much information it carries. One way of saying this (per your textbook) is to say that a message has high entropy if each word …
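    As a concrete reading of that first sentence, here is a minimal Python sketch (not from the thread; treating the message as a bag of words is an assumption) that computes the entropy of a message's empirical word distribution:

    ```python
    from collections import Counter
    from math import log2

    def message_entropy(message: str) -> float:
        """Shannon entropy, in bits per word, of the empirical word distribution."""
        words = message.split()
        counts = Counter(words)
        total = len(words)
        return -sum((c / total) * log2(c / total) for c in counts.values())

    # Repeated words lower the entropy; a message whose words are all distinct
    # attains the maximum, log2(number of words).
    print(message_entropy("the cat sat on the mat"))       # ≈ 2.252 bits/word
    print(message_entropy("one two three four five six"))  # = log2(6) ≈ 2.585
    ```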

  3. information theory - How do the notions of uncertainty and …

    Nov 2, 2021 · If one deals with the information content of news, one comes across the so-called entropy again and again on the internet. When this is explored further, it is often referred to as …

  4. information theory - In what situations does Shannon entropy …

    Oct 7, 2020 · In what situations would we find informational disorder/transmission (information entropy) increase or decrease, given that Shannon entropy is commonly viewed as a non …

  5. In information entropy, how do nats relate to any representation …

    Mar 26, 2022 · Calculating the information entropy depends on taking the logarithms of probabilities in some base. If I use base 2, then the entropy is in "bits". The measure of bits is …
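    To make the base dependence explicit (a standard identity, added here for context rather than quoted from the question): since $\log_b x = \ln x / \ln b$, changing the base only rescales the entropy by a constant,

    $$H_b(X) = -\sum_i p_i \log_b p_i = \frac{1}{\ln b}\Bigl(-\sum_i p_i \ln p_i\Bigr) = \frac{H_{\text{nats}}(X)}{\ln b}.$$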

  6. information theory - How is the formula of Shannon Entropy …

    From this slide, it is said that the smallest possible number of bits per symbol is given by the Shannon entropy formula. I've read this post and still don't quite understand how this formula …
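    For reference, the formula in question is presumably the usual definition, together with the source coding bound that gives it the "smallest possible bits per symbol" reading:

    $$H(X) = -\sum_x p(x) \log_2 p(x), \qquad \mathbb{E}[L] \ge H(X),$$

    where $L$ is the codeword length of any uniquely decodable binary code for $X$; Shannon's source coding theorem says $H(X)$ is the smallest achievable expected number of bits per symbol.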

  7. What is the role of the logarithm in Shannon's entropy?

    Apr 26, 2022 · Shannon's entropy is logarithmic because the chances of multiple information events occurring simultaneously are multiplied ... but should all those events occur, the total …
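    The multiplicative-to-additive point can be stated in one line (the standard self-information definition, added for context): for independent events, probabilities multiply while the logarithm makes information add,

    $$p(A, B) = p(A)\,p(B) \;\Longrightarrow\; I(A, B) = -\log p(A, B) = I(A) + I(B).$$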

  8. entropy - Link between channel capacity and mutual information ...

    Aug 21, 2024 · Information channel capacity is defined as the supremum of mutual information (entropy reduction). Mutual information measures the quantity of information in a static context …
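    The standard definitions behind that link (stated here for context, not quoted from the thread):

    $$C = \sup_{p(x)} I(X; Y), \qquad I(X; Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X),$$

    so capacity is the largest reduction in uncertainty about the input that any choice of input distribution lets the output provide.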

  9. Understanding information entropy - Mathematics Stack Exchange

    Jun 27, 2017 · I have tried very hard to get a good intuitive grasp on entropy. I read all the information-theoretic intuitions on the web, but I have not exactly been able to understand …

  10. Converting between bits, nats and dits - Physics Forums

    Aug 12, 2011 · Given a number representing information entropy in some base, is there a well-defined way to convert this to the number that would represent the same entropy according …
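    One well-defined recipe: a quantity expressed in unit $a$ converts to unit $b$ by multiplying by the log, taken in $b$'s base, of $a$'s base. A minimal Python sketch (the helper name and unit table are illustrative assumptions, not from the thread):

    ```python
    from math import e, log

    # Each unit corresponds to a logarithm base: bits -> 2, nats -> e, dits -> 10.
    UNIT_BASE = {"bits": 2.0, "nats": e, "dits": 10.0}

    def convert_entropy(value: float, src: str, dst: str) -> float:
        """Convert an entropy value between units via a change of logarithm base."""
        return value * log(UNIT_BASE[src], UNIT_BASE[dst])

    print(convert_entropy(1.0, "bits", "nats"))  # ln 2   ≈ 0.6931 nats per bit
    print(convert_entropy(1.0, "dits", "bits"))  # log2 10 ≈ 3.3219 bits per dit
    ```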