"MoE" meaning in All languages combined


Noun [English]

Head templates: {{en-noun|?}} MoE
  1. Initialism of Ministry of Education. Tags: abbreviation, alt-of, initialism. Alternative form of: Ministry of Education.
    Sense id: en-MoE-en-noun-Bz2Sw6eL. Categories (other): English links with redundant alt parameters; English links with redundant wikilinks.
  2. (machine learning) Initialism of mixture of experts. Tags: abbreviation, alt-of, initialism. Alternative form of: mixture of experts. Categories (topical): Artificial intelligence.
    Sense id: en-MoE-en-noun-xnwZWyrP. Categories (other): English entries with incorrect language header. Disambiguation of English entries with incorrect language header: 33 49 18.
  3. (statistics) Initialism of margin of error. Tags: abbreviation, alt-of, initialism. Alternative form of: margin of error. Categories (topical): Statistics.
    Sense id: en-MoE-en-noun-s1e1OXDO. Topics: mathematics, sciences, statistics.
The following are not (yet) sense-disambiguated
Synonyms: MOE


JSON data for MoE meaning in All languages combined (2.6 kB). Two JSON objects follow: the post-processed dataset, with sense ids, category hierarchies, and disambiguation weights, followed by the plain dataset with flat category lists.

{
  "head_templates": [
    {
      "args": {
        "1": "?"
      },
      "expansion": "MoE",
      "name": "en-noun"
    }
  ],
  "lang": "English",
  "lang_code": "en",
  "pos": "noun",
  "senses": [
    {
      "alt_of": [
        {
          "word": "Ministry of Education"
        }
      ],
      "categories": [
        {
          "kind": "other",
          "name": "English links with redundant alt parameters",
          "parents": [
            "Links with redundant alt parameters",
            "Entry maintenance"
          ],
          "source": "w"
        },
        {
          "kind": "other",
          "name": "English links with redundant wikilinks",
          "parents": [
            "Links with redundant wikilinks",
            "Entry maintenance"
          ],
          "source": "w"
        }
      ],
      "glosses": [
        "Initialism of Ministry of Education."
      ],
      "id": "en-MoE-en-noun-Bz2Sw6eL",
      "links": [
        [
          "Ministry of Education",
          "w:Ministry of Education"
        ]
      ],
      "tags": [
        "abbreviation",
        "alt-of",
        "initialism"
      ]
    },
    {
      "alt_of": [
        {
          "word": "mixture of experts"
        }
      ],
      "categories": [
        {
          "kind": "topical",
          "langcode": "en",
          "name": "Artificial intelligence",
          "orig": "en:Artificial intelligence",
          "parents": [
            "Computer science",
            "Cybernetics",
            "Computing",
            "Sciences",
            "Applied mathematics",
            "Systems theory",
            "Technology",
            "All topics",
            "Mathematics",
            "Systems",
            "Fundamental",
            "Formal sciences",
            "Interdisciplinary fields",
            "Society"
          ],
          "source": "w"
        },
        {
          "_dis": "33 49 18",
          "kind": "other",
          "name": "English entries with incorrect language header",
          "parents": [
            "Entries with incorrect language header",
            "Entry maintenance"
          ],
          "source": "w+disamb"
        }
      ],
      "examples": [
        {
          "ref": "2023, Gerhard Paaß, Sven Giesselbach, Foundation Models for Natural Language Processing: Pre-trained Language Models Integrating Media, Springer Nature, page 130",
          "text": "GLaM [51] is an autoregressive mixture-of-experts (MoE) model with up to 1200B parameters.",
          "type": "quotation"
        }
      ],
      "glosses": [
        "Initialism of mixture of experts."
      ],
      "id": "en-MoE-en-noun-xnwZWyrP",
      "links": [
        [
          "machine learning",
          "machine learning"
        ],
        [
          "mixture of experts",
          "w:mixture of experts"
        ]
      ],
      "qualifier": "machine learning",
      "raw_glosses": [
        "(machine learning) Initialism of mixture of experts."
      ],
      "tags": [
        "abbreviation",
        "alt-of",
        "initialism"
      ]
    },
    {
      "alt_of": [
        {
          "word": "margin of error"
        }
      ],
      "categories": [
        {
          "kind": "topical",
          "langcode": "en",
          "name": "Statistics",
          "orig": "en:Statistics",
          "parents": [
            "Formal sciences",
            "Mathematics",
            "Sciences",
            "All topics",
            "Fundamental"
          ],
          "source": "w"
        }
      ],
      "glosses": [
        "Initialism of margin of error."
      ],
      "id": "en-MoE-en-noun-s1e1OXDO",
      "links": [
        [
          "statistics",
          "statistics"
        ],
        [
          "margin of error",
          "margin of error#English"
        ]
      ],
      "raw_glosses": [
        "(statistics) Initialism of margin of error."
      ],
      "tags": [
        "abbreviation",
        "alt-of",
        "initialism"
      ],
      "topics": [
        "mathematics",
        "sciences",
        "statistics"
      ]
    }
  ],
  "synonyms": [
    {
      "_dis1": "42 9 49",
      "word": "MOE"
    }
  ],
  "word": "MoE"
}
{
  "categories": [
    "English countable nouns",
    "English entries with incorrect language header",
    "English lemmas",
    "English nouns",
    "English nouns with unknown or uncertain plurals"
  ],
  "head_templates": [
    {
      "args": {
        "1": "?"
      },
      "expansion": "MoE",
      "name": "en-noun"
    }
  ],
  "lang": "English",
  "lang_code": "en",
  "pos": "noun",
  "senses": [
    {
      "alt_of": [
        {
          "word": "Ministry of Education"
        }
      ],
      "categories": [
        "English initialisms",
        "English links with redundant alt parameters",
        "English links with redundant wikilinks"
      ],
      "glosses": [
        "Initialism of Ministry of Education."
      ],
      "links": [
        [
          "Ministry of Education",
          "w:Ministry of Education"
        ]
      ],
      "tags": [
        "abbreviation",
        "alt-of",
        "initialism"
      ]
    },
    {
      "alt_of": [
        {
          "word": "mixture of experts"
        }
      ],
      "categories": [
        "English initialisms",
        "English terms with quotations",
        "en:Artificial intelligence"
      ],
      "examples": [
        {
          "ref": "2023, Gerhard Paaß, Sven Giesselbach, Foundation Models for Natural Language Processing: Pre-trained Language Models Integrating Media, Springer Nature, page 130",
          "text": "GLaM [51] is an autoregressive mixture-of-experts (MoE) model with up to 1200B parameters.",
          "type": "quotation"
        }
      ],
      "glosses": [
        "Initialism of mixture of experts."
      ],
      "links": [
        [
          "machine learning",
          "machine learning"
        ],
        [
          "mixture of experts",
          "w:mixture of experts"
        ]
      ],
      "qualifier": "machine learning",
      "raw_glosses": [
        "(machine learning) Initialism of mixture of experts."
      ],
      "tags": [
        "abbreviation",
        "alt-of",
        "initialism"
      ]
    },
    {
      "alt_of": [
        {
          "word": "margin of error"
        }
      ],
      "categories": [
        "English initialisms",
        "en:Statistics"
      ],
      "glosses": [
        "Initialism of margin of error."
      ],
      "links": [
        [
          "statistics",
          "statistics"
        ],
        [
          "margin of error",
          "margin of error#English"
        ]
      ],
      "raw_glosses": [
        "(statistics) Initialism of margin of error."
      ],
      "tags": [
        "abbreviation",
        "alt-of",
        "initialism"
      ],
      "topics": [
        "mathematics",
        "sciences",
        "statistics"
      ]
    }
  ],
  "synonyms": [
    {
      "word": "MOE"
    }
  ],
  "word": "MoE"
}
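For working with this data programmatically, here is a minimal Python sketch, assuming the downloaded file was saved locally as MoE.json (a hypothetical filename) and contains the two objects shown above, either concatenated or one per line. Since json.load stops with an error at the second object, json.JSONDecoder.raw_decode is used to read them in sequence.

import json

# Read the concatenated JSON objects from the download; the filename
# "MoE.json" is an assumed local path, not part of the dataset itself.
decoder = json.JSONDecoder()
with open("MoE.json", encoding="utf-8") as f:
    text = f.read().strip()

entries, pos = [], 0
while pos < len(text):
    obj, end = decoder.raw_decode(text, pos)
    entries.append(obj)
    pos = end
    while pos < len(text) and text[pos].isspace():
        pos += 1  # skip whitespace/newlines between objects

# Print each sense's gloss and the term it abbreviates.
for entry in entries:
    for sense in entry["senses"]:
        gloss = sense["glosses"][0]
        expansion = sense["alt_of"][0]["word"]
        print(f'{entry["word"]}: {gloss} ({expansion})')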

This page is part of the kaikki.org machine-readable All languages combined dictionary. The dictionary is based on structured data extracted on 2024-05-03 from the enwiktionary dump dated 2024-05-02 using wiktextract (f4fd8c9 and c9440ce). The data shown on this site has been post-processed: various details (e.g., extra categories) have been removed, some information has been disambiguated, and additional data has been merged from other sources. See the raw data download page for the unprocessed wiktextract data.
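If you work from a full dump rather than the per-word download, a similar sketch applies, assuming the dump is newline-delimited JSON with one entry object per line; the filename below is a placeholder for whichever file you fetch from the raw data download page.

import json

# Scan a full wiktextract dump for one headword. The path is a hypothetical
# placeholder; substitute the actual file downloaded from kaikki.org.
DUMP_PATH = "raw-wiktextract-data.jsonl"

def entries_for(word, path=DUMP_PATH):
    # Yield every entry whose headword matches the given word.
    with open(path, encoding="utf-8") as f:
        for line in f:
            entry = json.loads(line)
            if entry.get("word") == word:
                yield entry

for entry in entries_for("MoE"):
    glosses = [s["glosses"][0] for s in entry.get("senses", []) if s.get("glosses")]
    print(entry.get("lang"), entry.get("pos"), glosses)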

If you use this data in academic research, please cite Tatu Ylonen: Wiktextract: Wiktionary as Machine-Readable Structured Data, Proceedings of the 13th Conference on Language Resources and Evaluation (LREC), pp. 1317-1325, Marseille, 20-25 June 2022. Linking to the relevant page(s) under https://kaikki.org would also be greatly appreciated.