"conditional entropy" meaning in All languages combined

Noun [English]

Forms: conditional entropies [plural]
Head templates: {{en-noun}} conditional entropy (plural conditional entropies)
  1. (information theory) The portion of a random variable's own Shannon entropy which is independent from another, given, random variable. Wikipedia link: conditional entropy Categories (topical): Information theory Related terms: conditional probability
    Sense id: en-conditional_entropy-en-noun-Qk-LJtUn Categories (other): English entries with incorrect language header Topics: computing, engineering, information-theory, mathematics, natural-sciences, physical-sciences, sciences
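The usage example below states the identity H(Y|X) = H(Y) − I(Y;X). This can be checked numerically from a joint distribution; a minimal sketch (the function names and the example distribution are illustrative, not part of the entry):

```python
import math

def entropy(probs):
    """Shannon entropy (in bits) of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def conditional_entropy(joint):
    """H(Y|X) from a joint pmf given as {(x, y): p}, via H(Y|X) = H(X,Y) - H(X)."""
    px = {}
    for (x, _), p in joint.items():
        px[x] = px.get(x, 0.0) + p
    return entropy(joint.values()) - entropy(px.values())

def mutual_information(joint):
    """I(Y;X), computed as H(Y) - H(Y|X)."""
    py = {}
    for (_, y), p in joint.items():
        py[y] = py.get(y, 0.0) + p
    return entropy(py.values()) - conditional_entropy(joint)

# A correlated pair of binary variables: H(Y|X) equals H(Y) - I(Y;X).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
```

For this distribution, `conditional_entropy(joint)` and `entropy([0.5, 0.5]) - mutual_information(joint)` agree, matching the identity in the example; when X and Y are independent, the mutual information term vanishes and H(Y|X) reduces to H(Y).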

{
  "forms": [
    {
      "form": "conditional entropies",
      "tags": [
        "plural"
      ]
    }
  ],
  "head_templates": [
    {
      "args": {},
      "expansion": "conditional entropy (plural conditional entropies)",
      "name": "en-noun"
    }
  ],
  "lang": "English",
  "lang_code": "en",
  "pos": "noun",
  "senses": [
    {
      "categories": [
        {
          "kind": "other",
          "name": "English entries with incorrect language header",
          "parents": [
            "Entries with incorrect language header",
            "Entry maintenance"
          ],
          "source": "w"
        },
        {
          "kind": "topical",
          "langcode": "en",
          "name": "Information theory",
          "orig": "en:Information theory",
          "parents": [
            "Applied mathematics",
            "Mathematics",
            "Formal sciences",
            "Sciences",
            "All topics",
            "Fundamental"
          ],
          "source": "w"
        }
      ],
      "examples": [
        {
          "text": "The conditional entropy of random variable Y given X (i.e., conditioned by X), denoted as H(Y|X), is equal to H(Y)-I(Y;X) where I(Y;X) is the mutual information between Y and X."
        }
      ],
      "glosses": [
        "The portion of a random variable's own Shannon entropy which is independent from another, given, random variable."
      ],
      "id": "en-conditional_entropy-en-noun-Qk-LJtUn",
      "links": [
        [
          "information theory",
          "information theory"
        ],
        [
          "random variable",
          "random variable"
        ],
        [
          "Shannon entropy",
          "Shannon entropy"
        ]
      ],
      "raw_glosses": [
        "(information theory) The portion of a random variable's own Shannon entropy which is independent from another, given, random variable."
      ],
      "related": [
        {
          "word": "conditional probability"
        }
      ],
      "topics": [
        "computing",
        "engineering",
        "information-theory",
        "mathematics",
        "natural-sciences",
        "physical-sciences",
        "sciences"
      ],
      "wikipedia": [
        "conditional entropy"
      ]
    }
  ],
  "word": "conditional entropy"
}

This page is a part of the kaikki.org machine-readable All languages combined dictionary. This dictionary is based on structured data extracted on 2024-05-12 from the enwiktionary dump dated 2024-05-02 using wiktextract (ae36afe and 304864d). The data shown on this site has been post-processed and various details (e.g., extra categories) removed, some information disambiguated, and additional data merged from other sources. See the raw data download page for the unprocessed wiktextract data.

If you use this data in academic research, please cite Tatu Ylonen: Wiktextract: Wiktionary as Machine-Readable Structured Data, Proceedings of the 13th Conference on Language Resources and Evaluation (LREC), pp. 1317-1325, Marseille, 20-25 June 2022. Linking to the relevant page(s) under https://kaikki.org would also be greatly appreciated.