"mutual information" meaning in All languages combined

See mutual information on Wiktionary

Noun [English]

Forms: mutual informations [plural]
Head templates: {{en-noun|-|s}} mutual information (usually uncountable, plural mutual informations)
  1. (information theory) A measure of the entropic (informational) correlation between two random variables.
     Wikipedia link: mutual information
     Tags: uncountable, usually
     Categories (topical): Information theory
     Translations (measure of the entropic correlation): 互信息 (hù xìnxī) (Chinese Mandarin), 互資訊 (Chinese Mandarin), 互资讯 (hù zīxùn) (Chinese Mandarin)
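
The sense above can be made concrete with a small numerical sketch. The snippet below is illustrative only and not part of the dictionary entry: the function names entropy and mutual_information and the example joint distributions are our own. It computes I(X;Y) via the standard identity I(X;Y) = H(X) + H(Y) - H(X,Y), which is equivalent to the formula quoted in the usage example further down.

import math

def entropy(probs, base=2):
    """Shannon entropy H = -sum p * log_b(p), skipping zero-probability outcomes."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

def mutual_information(joint, base=2):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint distribution
    given as a 2-D list of probabilities p(x, y)."""
    px = [sum(row) for row in joint]          # marginal distribution of X
    py = [sum(col) for col in zip(*joint)]    # marginal distribution of Y
    pxy = [p for row in joint for p in row]   # joint distribution, flattened
    return entropy(px, base) + entropy(py, base) - entropy(pxy, base)

# Perfectly correlated fair bits share one full bit: prints 1.0
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))
# Independent fair bits share nothing: prints 0.0 (up to floating-point rounding)
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))

The two extremes match the gloss: mutual information is maximal when each variable determines the other and zero when they are independent.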

{
  "forms": [
    {
      "form": "mutual informations",
      "tags": [
        "plural"
      ]
    }
  ],
  "head_templates": [
    {
      "args": {
        "1": "-",
        "2": "s"
      },
      "expansion": "mutual information (usually uncountable, plural mutual informations)",
      "name": "en-noun"
    }
  ],
  "lang": "English",
  "lang_code": "en",
  "pos": "noun",
  "senses": [
    {
      "categories": [
        {
          "kind": "other",
          "name": "English entries with incorrect language header",
          "parents": [
            "Entries with incorrect language header",
            "Entry maintenance"
          ],
          "source": "w"
        },
        {
          "kind": "other",
          "name": "Entries with translation boxes",
          "parents": [],
          "source": "w"
        },
        {
          "kind": "other",
          "name": "Pages with 1 entry",
          "parents": [],
          "source": "w"
        },
        {
          "kind": "other",
          "name": "Pages with entries",
          "parents": [],
          "source": "w"
        },
        {
          "kind": "other",
          "name": "Terms with Mandarin translations",
          "parents": [],
          "source": "w"
        },
        {
          "kind": "topical",
          "langcode": "en",
          "name": "Information theory",
          "orig": "en:Information theory",
          "parents": [
            "Applied mathematics",
            "Mathematics",
            "Formal sciences",
            "Sciences",
            "All topics",
            "Fundamental"
          ],
          "source": "w"
        }
      ],
      "examples": [
        {
          "text": "Mutual information I(X;Y) between two random variables X and Y is what is left over when their mutual conditional entropies H(Y|X) and H(X|Y) are subtracted from their joint entropy H(X,Y). It can be given by the formula I(X;Y)=-∑ₓ∑_yp_X,Y(x,y) log _bp_X,Y(x,y)/p_X|Y(x|y)p_Y|X(y|x)."
        },
        {
          "ref": "2018, Clarence Green, James Lambert, “Position vectors, homologous chromosomes and gamma rays: Promoting disciplinary literacy through Secondary Phrase Lists”, in English for Specific Purposes, →DOI, page 6:",
          "text": "From these lists, all combinations of the four major parts of speech were extracted and each phrase was checked for the frequency, dispersion and range criteria respectively. Those that passed the criteria thresholds then had their mutual information scores computed using the tool Collocate (Barlow, 2004) and those failing to meet the threshold were removed.",
          "type": "quote"
        }
      ],
      "glosses": [
        "A measure of the entropic (informational) correlation between two random variables."
      ],
      "id": "en-mutual_information-en-noun-M0Hy3OGG",
      "links": [
        [
          "information theory",
          "information theory"
        ],
        [
          "entropic",
          "entropic"
        ],
        [
          "correlation",
          "correlation"
        ],
        [
          "random variable",
          "random variable"
        ]
      ],
      "raw_glosses": [
        "(information theory) A measure of the entropic (informational) correlation between two random variables."
      ],
      "tags": [
        "uncountable",
        "usually"
      ],
      "topics": [
        "computing",
        "engineering",
        "information-theory",
        "mathematics",
        "natural-sciences",
        "physical-sciences",
        "sciences"
      ],
      "translations": [
        {
          "code": "cmn",
          "lang": "Chinese Mandarin",
          "roman": "hù xìnxī",
          "sense": "measure of the entropic correlation",
          "word": "互信息"
        },
        {
          "code": "cmn",
          "lang": "Chinese Mandarin",
          "sense": "measure of the entropic correlation",
          "word": "互資訊"
        },
        {
          "code": "cmn",
          "lang": "Chinese Mandarin",
          "roman": "hù zīxùn",
          "sense": "measure of the entropic correlation",
          "word": "互资讯"
        }
      ],
      "wikipedia": [
        "mutual information"
      ]
    }
  ],
  "word": "mutual information"
}
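
For readability, here is the formula from the first usage example reconstructed in standard LaTeX notation (a reformatting of the entry's own text, not new material). Expanding H(X,Y), H(X|Y) and H(Y|X) as expectations of -log p_{X,Y}(x,y), -log p_{X|Y}(x|y) and -log p_{Y|X}(y|x) respectively, and combining the logarithms, yields the quoted form:

I(X;Y) = H(X,Y) - H(X \mid Y) - H(Y \mid X)
       = -\sum_x \sum_y p_{X,Y}(x,y)\, \log_b \frac{p_{X,Y}(x,y)}{p_{X \mid Y}(x \mid y)\; p_{Y \mid X}(y \mid x)}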

Download raw JSONL data for mutual information meaning in All languages combined (2.5kB)


This page is part of the kaikki.org machine-readable All languages combined dictionary, based on structured data extracted on 2024-11-06 from the enwiktionary dump dated 2024-10-02 using wiktextract (fbeafe8 and 7f03c9b). The data shown on this site has been post-processed: various details (e.g., extra categories) have been removed, some information has been disambiguated, and additional data has been merged from other sources. See the raw data download page for the unprocessed wiktextract data.

If you use this data in academic research, please cite Tatu Ylonen: Wiktextract: Wiktionary as Machine-Readable Structured Data, Proceedings of the 13th Conference on Language Resources and Evaluation (LREC), pp. 1317-1325, Marseille, 20-25 June 2022. Linking to the relevant page(s) under https://kaikki.org would also be greatly appreciated.