"joint entropy" meaning in English

Noun

Forms: joint entropies [plural]
Head templates: {{en-noun|~}} joint entropy (countable and uncountable, plural joint entropies)
  1. (information theory) The Shannon entropy of a "script" whose "characters" are elements of the Cartesian product of the sets of characters of the component scripts.
     Wikipedia link: joint entropy. Tags: countable, uncountable. Categories (topical): Information theory. Related terms: joint probability.
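To make the gloss concrete: for discrete random variables, the joint entropy is the Shannon entropy of the joint distribution over the Cartesian product of the component alphabets. The following Python sketch is illustrative and not part of the Wiktionary entry; the function and variable names are our own.

```python
import math
from itertools import product

def entropy(dist):
    """Shannon entropy (in bits) of a distribution {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def marginal(joint, axis):
    """Marginal distribution of one component of a joint
    distribution {(x, y): probability}."""
    m = {}
    for outcome, p in joint.items():
        m[outcome[axis]] = m.get(outcome[axis], 0.0) + p
    return m

# Independent X and Y: p(x, y) = p(x) * p(y), defined over the
# Cartesian product of the two alphabets.
px = {0: 0.5, 1: 0.5}
py = {0: 0.25, 1: 0.75}
joint = {(x, y): px[x] * py[y] for x, y in product(px, py)}

H_xy = entropy(joint)              # joint entropy H(X,Y)
H_x = entropy(marginal(joint, 0))  # H(X)
H_y = entropy(marginal(joint, 1))  # H(Y)
# For independent variables H(X,Y) = H(X) + H(Y); in general
# H(X,Y) = H(X) + H(Y) - I(X;Y), where I(X;Y) >= 0 is the
# mutual information of X and Y.
```

For dependent variables, the shortfall H(X) + H(Y) - H(X,Y) is exactly the mutual information I(X;Y).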

Inflected forms

Download JSON data for joint entropy meaning in English (1.8kB)

{
  "forms": [
    {
      "form": "joint entropies",
      "tags": [
        "plural"
      ]
    }
  ],
  "head_templates": [
    {
      "args": {
        "1": "~"
      },
      "expansion": "joint entropy (countable and uncountable, plural joint entropies)",
      "name": "en-noun"
    }
  ],
  "lang": "English",
  "lang_code": "en",
  "pos": "noun",
  "senses": [
    {
      "categories": [
        {
          "kind": "other",
          "name": "English entries with incorrect language header",
          "parents": [
            "Entries with incorrect language header",
            "Entry maintenance"
          ],
          "source": "w"
        },
        {
          "kind": "topical",
          "langcode": "en",
          "name": "Information theory",
          "orig": "en:Information theory",
          "parents": [
            "Applied mathematics",
            "Mathematics",
            "Formal sciences",
            "Sciences",
            "All topics",
            "Fundamental"
          ],
          "source": "w"
        }
      ],
      "examples": [
        {
          "text": "If random variables X and Y are mutually independent, then their joint entropy H(X,Y) is just the sum H(X)+H(Y) of its component entropies. If they are not mutually independent, then their joint entropy will be H(X)+H(Y)-I(X;Y) where I(X;Y) is the mutual information of X and Y."
        }
      ],
      "glosses": [
        "The Shannon entropy of a \"script\" whose \"characters\" are elements of the Cartesian product of the sets of characters of the component scripts."
      ],
      "id": "en-joint_entropy-en-noun-kCXHYSjX",
      "links": [
        [
          "information theory",
          "information theory"
        ],
        [
          "Shannon entropy",
          "Shannon entropy"
        ],
        [
          "script",
          "script"
        ],
        [
          "character",
          "character"
        ],
        [
          "Cartesian product",
          "Cartesian product"
        ]
      ],
      "raw_glosses": [
        "(information theory) The Shannon entropy of a \"script\" whose \"characters\" are elements of the Cartesian product of the sets of characters of the component scripts."
      ],
      "related": [
        {
          "word": "joint probability"
        }
      ],
      "tags": [
        "countable",
        "uncountable"
      ],
      "topics": [
        "computing",
        "engineering",
        "information-theory",
        "mathematics",
        "natural-sciences",
        "physical-sciences",
        "sciences"
      ],
      "wikipedia": [
        "joint entropy"
      ]
    }
  ],
  "word": "joint entropy"
}
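The record above is plain wiktextract JSON, so it can be consumed with a standard JSON parser. A minimal sketch, using a trimmed copy of the record (field names are taken from the data above):

```python
import json

# A trimmed copy of the wiktextract record shown above; the full
# record carries additional fields (forms, links, topics, etc.).
raw = '''{
  "word": "joint entropy",
  "pos": "noun",
  "senses": [
    {
      "raw_glosses": ["(information theory) The Shannon entropy of a \\"script\\" whose \\"characters\\" are elements of the Cartesian product of the sets of characters of the component scripts."],
      "related": [{"word": "joint probability"}]
    }
  ]
}'''

entry = json.loads(raw)
for sense in entry["senses"]:
    print(entry["word"], f"({entry['pos']}):", sense["raw_glosses"][0])
```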

This page is part of the kaikki.org machine-readable English dictionary. The dictionary is based on structured data extracted on 2024-04-26 from the enwiktionary dump dated 2024-04-21 using wiktextract (93a6c53 and 21a9316). The data shown on this site has been post-processed: various details (e.g., extra categories) have been removed, some information has been disambiguated, and additional data has been merged from other sources. See the raw data download page for the unprocessed wiktextract data.

If you use this data in academic research, please cite Tatu Ylonen: Wiktextract: Wiktionary as Machine-Readable Structured Data, Proceedings of the 13th Conference on Language Resources and Evaluation (LREC), pp. 1317-1325, Marseille, 20-25 June 2022. Linking to the relevant page(s) under https://kaikki.org would also be greatly appreciated.