"mutual information" meaning in English

Noun

Forms: mutual informations [plural]
Head templates: {{en-noun|-|s}} mutual information (usually uncountable, plural mutual informations)
  1. (information theory) A measure of the entropic (informational) correlation between two random variables.
     Wikipedia link: mutual information
     Tags: uncountable, usually
     Categories (topical): Information theory
     Translations (measure of the entropic correlation): Chinese Mandarin: 互信息 (hù xìnxī), 互資訊, 互资讯 (hù zīxùn)
     Sense id: en-mutual_information-en-noun-M0Hy3OGG
     Categories (other): English entries with incorrect language header
     Topics: computing, engineering, information-theory, mathematics, natural-sciences, physical-sciences, sciences
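
As a quick illustration of the definition (added here for clarity; not part of the Wiktionary entry), the following minimal Python sketch computes the mutual information of two discrete random variables from their joint distribution, using the standard form I(X;Y) = ∑_x ∑_y p(x,y) log_2 [p(x,y) / (p(x) p(y))]. The joint distribution is an invented toy example.

import math

# Toy joint distribution p(x, y) over X in {0, 1} and Y in {0, 1}.
# The probabilities are made up for illustration and sum to 1.
p_xy = {
    (0, 0): 0.4,
    (0, 1): 0.1,
    (1, 0): 0.1,
    (1, 1): 0.4,
}

# Marginal distributions p(x) and p(y).
p_x = {}
p_y = {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

# I(X;Y) = sum over (x, y) of p(x,y) * log2(p(x,y) / (p(x) * p(y))).
mi = sum(
    p * math.log2(p / (p_x[x] * p_y[y]))
    for (x, y), p in p_xy.items()
    if p > 0
)
print(f"I(X;Y) = {mi:.4f} bits")  # ≈ 0.2781 bits for this distribution

For independent variables the result is exactly 0; the more that knowing one variable tells you about the other, the larger the value.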

{
  "forms": [
    {
      "form": "mutual informations",
      "tags": [
        "plural"
      ]
    }
  ],
  "head_templates": [
    {
      "args": {
        "1": "-",
        "2": "s"
      },
      "expansion": "mutual information (usually uncountable, plural mutual informations)",
      "name": "en-noun"
    }
  ],
  "lang": "English",
  "lang_code": "en",
  "pos": "noun",
  "senses": [
    {
      "categories": [
        {
          "kind": "other",
          "name": "English entries with incorrect language header",
          "parents": [
            "Entries with incorrect language header",
            "Entry maintenance"
          ],
          "source": "w"
        },
        {
          "kind": "topical",
          "langcode": "en",
          "name": "Information theory",
          "orig": "en:Information theory",
          "parents": [
            "Applied mathematics",
            "Mathematics",
            "Formal sciences",
            "Sciences",
            "All topics",
            "Fundamental"
          ],
          "source": "w"
        }
      ],
      "examples": [
        {
          "text": "Mutual information I(X;Y) between two random variables X and Y is what is left over when their mutual conditional entropies H(Y|X) and H(X|Y) are subtracted from their joint entropy H(X,Y). It can be given by the formula I(X;Y)=-∑ₓ∑_yp_X,Y(x,y) log _bp_X,Y(x,y)/p_X|Y(x|y)p_Y|X(y|x)."
        },
        {
          "ref": "2018, Clarence Green, James Lambert, “Position vectors, homologous chromosomes and gamma rays: Promoting disciplinary literacy through Secondary Phrase Lists”, in English for Specific Purposes, →DOI, page 6",
          "text": "From these lists, all combinations of the four major parts of speech were extracted and each phrase was checked for the frequency, dispersion and range criteria respectively. Those that passed the criteria thresholds then had their mutual information scores computed using the tool Collocate (Barlow, 2004) and those failing to meet the threshold were removed.",
          "type": "quotation"
        }
      ],
      "glosses": [
        "A measure of the entropic (informational) correlation between two random variables."
      ],
      "id": "en-mutual_information-en-noun-M0Hy3OGG",
      "links": [
        [
          "information theory",
          "information theory"
        ],
        [
          "entropic",
          "entropic"
        ],
        [
          "correlation",
          "correlation"
        ],
        [
          "random variable",
          "random variable"
        ]
      ],
      "raw_glosses": [
        "(information theory) A measure of the entropic (informational) correlation between two random variables."
      ],
      "tags": [
        "uncountable",
        "usually"
      ],
      "topics": [
        "computing",
        "engineering",
        "information-theory",
        "mathematics",
        "natural-sciences",
        "physical-sciences",
        "sciences"
      ],
      "translations": [
        {
          "code": "cmn",
          "lang": "Chinese Mandarin",
          "roman": "hù xìnxī",
          "sense": "measure of the entropic correlation",
          "word": "互信息"
        },
        {
          "code": "cmn",
          "lang": "Chinese Mandarin",
          "sense": "measure of the entropic correlation",
          "word": "互資訊"
        },
        {
          "code": "cmn",
          "lang": "Chinese Mandarin",
          "roman": "hù zīxùn",
          "sense": "measure of the entropic correlation",
          "word": "互资讯"
        }
      ],
      "wikipedia": [
        "mutual information"
      ]
    }
  ],
  "word": "mutual information"
}
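
The usage example embedded in the JSON above expresses the same quantity two ways: as what is left of the joint entropy after subtracting both conditional entropies, I(X;Y) = H(X,Y) - H(X|Y) - H(Y|X), and as a summation over conditional probabilities. As a sanity check (again an added illustration, not part of the entry), this sketch evaluates both forms on the same toy distribution and confirms they agree:

import math

# Same invented toy joint distribution as in the sketch above.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
xs = sorted({x for x, _ in p_xy})
ys = sorted({y for _, y in p_xy})
p_x = {x: sum(p_xy[(x, y)] for y in ys) for x in xs}  # marginal p(x)
p_y = {y: sum(p_xy[(x, y)] for x in xs) for y in ys}  # marginal p(y)

# Joint entropy H(X,Y) and conditional entropies H(X|Y), H(Y|X), in bits.
h_xy = -sum(p * math.log2(p) for p in p_xy.values() if p > 0)
h_x_given_y = -sum(p * math.log2(p / p_y[y]) for (x, y), p in p_xy.items() if p > 0)
h_y_given_x = -sum(p * math.log2(p / p_x[x]) for (x, y), p in p_xy.items() if p > 0)

# Entropy form from the usage example: I(X;Y) = H(X,Y) - H(X|Y) - H(Y|X).
mi_from_entropies = h_xy - h_x_given_y - h_y_given_x

# Summation form, with p(x|y) = p(x,y)/p(y) and p(y|x) = p(x,y)/p(x):
# I(X;Y) = -∑_x ∑_y p(x,y) log2[p(x,y) / (p(x|y) p(y|x))].
mi_from_sum = -sum(
    p * math.log2(p / ((p / p_y[y]) * (p / p_x[x])))
    for (x, y), p in p_xy.items()
    if p > 0
)

print(f"{mi_from_entropies:.4f} == {mi_from_sum:.4f}")  # both ≈ 0.2781 bits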

This page is part of the kaikki.org machine-readable English dictionary. The dictionary is based on structured data extracted on 2024-05-06 from the enwiktionary dump dated 2024-05-02 using wiktextract (f4fd8c9 and c9440ce). The data shown on this site has been post-processed: various details (e.g., extra categories) have been removed, some information has been disambiguated, and additional data has been merged from other sources. See the raw data download page for the unprocessed wiktextract data.

If you use this data in academic research, please cite Tatu Ylonen: Wiktextract: Wiktionary as Machine-Readable Structured Data, Proceedings of the 13th Conference on Language Resources and Evaluation (LREC), pp. 1317-1325, Marseille, 20-25 June 2022. Linking to the relevant page(s) under https://kaikki.org would also be greatly appreciated.