"conditional entropy" meaning in All languages combined

See conditional entropy on Wiktionary

Noun [English]

Forms: conditional entropies [plural]
Head templates: {{en-noun}} conditional entropy (plural conditional entropies)
  1. (information theory) The portion of a random variable's own Shannon entropy that is independent of another, given random variable. Wikipedia link: conditional entropy Categories (topical): Information theory Related terms: conditional probability
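The example in the sense data states that H(Y|X) = H(Y) − I(Y;X); equivalently, by the chain rule, H(Y|X) = H(X,Y) − H(X). As a minimal illustrative sketch (not part of the dictionary data), the quantity can be computed from a joint distribution like so — the function names and the dict-of-pairs representation are this sketch's own choices:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of an iterable of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def conditional_entropy(joint):
    """H(Y|X) for a joint distribution given as {(x, y): probability}.

    Uses the chain rule H(Y|X) = H(X, Y) - H(X).
    """
    # Marginal distribution of X
    px = {}
    for (x, _y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
    return entropy(joint.values()) - entropy(px.values())

# Y copies a fair coin X: knowing X leaves no uncertainty about Y.
print(conditional_entropy({(0, 0): 0.5, (1, 1): 0.5}))  # 0.0
# Two independent fair coins: X tells us nothing about Y.
print(conditional_entropy({(x, y): 0.25 for x in (0, 1) for y in (0, 1)}))  # 1.0
```

The first case shows the "independent portion" in the gloss shrinking to zero (Y is fully determined by X); the second shows it equal to H(Y) itself (X carries no information about Y).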

Inflected forms

{
  "forms": [
    {
      "form": "conditional entropies",
      "tags": [
        "plural"
      ]
    }
  ],
  "head_templates": [
    {
      "args": {},
      "expansion": "conditional entropy (plural conditional entropies)",
      "name": "en-noun"
    }
  ],
  "lang": "English",
  "lang_code": "en",
  "pos": "noun",
  "senses": [
    {
      "categories": [
        {
          "kind": "other",
          "name": "English entries with incorrect language header",
          "parents": [
            "Entries with incorrect language header",
            "Entry maintenance"
          ],
          "source": "w"
        },
        {
          "kind": "other",
          "name": "Pages with 1 entry",
          "parents": [],
          "source": "w"
        },
        {
          "kind": "other",
          "name": "Pages with entries",
          "parents": [],
          "source": "w"
        },
        {
          "kind": "topical",
          "langcode": "en",
          "name": "Information theory",
          "orig": "en:Information theory",
          "parents": [
            "Applied mathematics",
            "Mathematics",
            "Formal sciences",
            "Sciences",
            "All topics",
            "Fundamental"
          ],
          "source": "w"
        }
      ],
      "examples": [
        {
          "text": "The conditional entropy of random variable Y given X (i.e., conditioned by X), denoted as H(Y|X), is equal to H(Y)-I(Y;X) where I(Y;X) is the mutual information between Y and X."
        }
      ],
      "glosses": [
        "The portion of a random variable's own Shannon entropy which is independent from another, given, random variable."
      ],
      "id": "en-conditional_entropy-en-noun-Qk-LJtUn",
      "links": [
        [
          "information theory",
          "information theory"
        ],
        [
          "random variable",
          "random variable"
        ],
        [
          "Shannon entropy",
          "Shannon entropy"
        ]
      ],
      "raw_glosses": [
        "(information theory) The portion of a random variable's own Shannon entropy which is independent from another, given, random variable."
      ],
      "related": [
        {
          "word": "conditional probability"
        }
      ],
      "topics": [
        "computing",
        "engineering",
        "information-theory",
        "mathematics",
        "natural-sciences",
        "physical-sciences",
        "sciences"
      ],
      "wikipedia": [
        "conditional entropy"
      ]
    }
  ],
  "word": "conditional entropy"
}
{
  "forms": [
    {
      "form": "conditional entropies",
      "tags": [
        "plural"
      ]
    }
  ],
  "head_templates": [
    {
      "args": {},
      "expansion": "conditional entropy (plural conditional entropies)",
      "name": "en-noun"
    }
  ],
  "lang": "English",
  "lang_code": "en",
  "pos": "noun",
  "related": [
    {
      "word": "conditional probability"
    }
  ],
  "senses": [
    {
      "categories": [
        "English countable nouns",
        "English entries with incorrect language header",
        "English lemmas",
        "English multiword terms",
        "English nouns",
        "Pages with 1 entry",
        "Pages with entries",
        "en:Information theory"
      ],
      "examples": [
        {
          "text": "The conditional entropy of random variable Y given X (i.e., conditioned by X), denoted as H(Y|X), is equal to H(Y)-I(Y;X) where I(Y;X) is the mutual information between Y and X."
        }
      ],
      "glosses": [
        "The portion of a random variable's own Shannon entropy which is independent from another, given, random variable."
      ],
      "links": [
        [
          "information theory",
          "information theory"
        ],
        [
          "random variable",
          "random variable"
        ],
        [
          "Shannon entropy",
          "Shannon entropy"
        ]
      ],
      "raw_glosses": [
        "(information theory) The portion of a random variable's own Shannon entropy which is independent from another, given, random variable."
      ],
      "topics": [
        "computing",
        "engineering",
        "information-theory",
        "mathematics",
        "natural-sciences",
        "physical-sciences",
        "sciences"
      ],
      "wikipedia": [
        "conditional entropy"
      ]
    }
  ],
  "word": "conditional entropy"
}



This page is part of the kaikki.org machine-readable All languages combined dictionary. This dictionary is based on structured data extracted on 2024-12-21 from the enwiktionary dump dated 2024-12-04 using wiktextract (d8cb2f3 and 4e554ae). The data shown on this site has been post-processed: various details (e.g., extra categories) have been removed, some information disambiguated, and additional data merged from other sources. See the raw data download page for the unprocessed wiktextract data.

If you use this data in academic research, please cite Tatu Ylonen: Wiktextract: Wiktionary as Machine-Readable Structured Data, Proceedings of the 13th Conference on Language Resources and Evaluation (LREC), pp. 1317-1325, Marseille, 20-25 June 2022. Linking to the relevant page(s) under https://kaikki.org would also be greatly appreciated.