"MoE" meaning in English


Noun

IPA: /məʊ/ [UK], /moʊ/ [US] Homophones: mow Forms: MOE [alternative]
Head templates: {{en-noun|?}} MoE
  1. Initialism of Ministry of Education. Tags: abbreviation, alt-of, initialism Alternative form of: Ministry of Education
    Sense id: en-MoE-en-noun-Bz2Sw6eL
  2. (machine learning) Initialism of mixture of experts. Tags: abbreviation, alt-of, initialism Alternative form of: mixture of experts
    Sense id: en-MoE-en-noun-xnwZWyrP Categories (other): Machine learning
  3. (statistics) Initialism of margin of error. Tags: abbreviation, alt-of, initialism Alternative form of: margin of error
    Sense id: en-MoE-en-noun-s1e1OXDO Categories (other): Statistics Topics: mathematics, sciences, statistics

Alternative forms

{
  "forms": [
    {
      "form": "MOE",
      "tags": [
        "alternative"
      ]
    }
  ],
  "head_templates": [
    {
      "args": {
        "1": "?"
      },
      "expansion": "MoE",
      "name": "en-noun"
    }
  ],
  "lang": "English",
  "lang_code": "en",
  "pos": "noun",
  "senses": [
    {
      "alt_of": [
        {
          "word": "Ministry of Education"
        }
      ],
      "categories": [],
      "glosses": [
        "Initialism of Ministry of Education."
      ],
      "id": "en-MoE-en-noun-Bz2Sw6eL",
      "links": [
        [
          "Ministry of Education",
          "w:Ministry of Education"
        ]
      ],
      "tags": [
        "abbreviation",
        "alt-of",
        "initialism"
      ]
    },
    {
      "alt_of": [
        {
          "word": "mixture of experts"
        }
      ],
      "categories": [
        {
          "kind": "other",
          "langcode": "en",
          "name": "Machine learning",
          "orig": "en:Machine learning",
          "parents": [],
          "source": "w"
        }
      ],
      "examples": [
        {
          "bold_text_offsets": [
            [
              51,
              54
            ]
          ],
          "ref": "2023, Gerhard Paaß, Sven Giesselbach, Foundation Models for Natural Language Processing: Pre-trained Language Models Integrating Media, Springer Nature, →ISBN, page 130:",
          "text": "GLaM [51] is an autoregressive mixture-of-experts (MoE) model with up to 1200B parameters.",
          "type": "quotation"
        }
      ],
      "glosses": [
        "Initialism of mixture of experts."
      ],
      "id": "en-MoE-en-noun-xnwZWyrP",
      "links": [
        [
          "machine learning",
          "machine learning"
        ],
        [
          "mixture of experts",
          "w:mixture of experts"
        ]
      ],
      "qualifier": "machine learning",
      "raw_glosses": [
        "(machine learning) Initialism of mixture of experts."
      ],
      "tags": [
        "abbreviation",
        "alt-of",
        "initialism"
      ]
    },
    {
      "alt_of": [
        {
          "word": "margin of error"
        }
      ],
      "categories": [
        {
          "kind": "other",
          "langcode": "en",
          "name": "Statistics",
          "orig": "en:Statistics",
          "parents": [],
          "source": "w"
        },
        {
          "_dis": "11 32 57",
          "kind": "other",
          "name": "English entries with incorrect language header",
          "parents": [],
          "source": "w+disamb"
        },
        {
          "_dis": "9 27 64",
          "kind": "other",
          "name": "Pages with 1 entry",
          "parents": [],
          "source": "w+disamb"
        },
        {
          "_dis": "7 18 75",
          "kind": "other",
          "name": "Pages with entries",
          "parents": [],
          "source": "w+disamb"
        }
      ],
      "glosses": [
        "Initialism of margin of error."
      ],
      "id": "en-MoE-en-noun-s1e1OXDO",
      "links": [
        [
          "statistics",
          "statistics"
        ],
        [
          "margin of error",
          "margin of error#English"
        ]
      ],
      "raw_glosses": [
        "(statistics) Initialism of margin of error."
      ],
      "tags": [
        "abbreviation",
        "alt-of",
        "initialism"
      ],
      "topics": [
        "mathematics",
        "sciences",
        "statistics"
      ]
    }
  ],
  "sounds": [
    {
      "ipa": "/məʊ/",
      "tags": [
        "UK"
      ]
    },
    {
      "ipa": "/moʊ/",
      "tags": [
        "US"
      ]
    },
    {
      "homophone": "mow"
    }
  ],
  "word": "MoE"
}
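
For anyone working with this data programmatically, the following is a minimal Python sketch of how one might walk an entry shaped like the JSON above and print each sense's gloss together with the expanded form it abbreviates. The trimmed inline entry and all variable names are illustrative only, not part of the extracted data.

import json

# A trimmed entry shaped like the JSON shown above (illustrative subset).
entry_json = '''
{
  "word": "MoE",
  "lang": "English",
  "pos": "noun",
  "senses": [
    {"glosses": ["Initialism of Ministry of Education."],
     "alt_of": [{"word": "Ministry of Education"}],
     "tags": ["abbreviation", "alt-of", "initialism"]},
    {"glosses": ["Initialism of mixture of experts."],
     "alt_of": [{"word": "mixture of experts"}],
     "qualifier": "machine learning",
     "tags": ["abbreviation", "alt-of", "initialism"]}
  ]
}
'''

entry = json.loads(entry_json)

# Each sense carries its gloss text; "alt_of" (when present) names the
# expanded form that the initialism stands for, and "qualifier" holds a
# usage label such as "machine learning".
for sense in entry["senses"]:
    gloss = "; ".join(sense.get("glosses", []))
    expansions = ", ".join(alt["word"] for alt in sense.get("alt_of", []))
    qualifier = sense.get("qualifier", "")
    label = f"({qualifier}) " if qualifier else ""
    print(f"{entry['word']}: {label}{gloss} -> {expansions}")
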
{
  "categories": [
    "English entries with incorrect language header",
    "English lemmas",
    "English nouns",
    "English nouns with unknown or uncertain plurals",
    "English terms with homophones",
    "Pages with 1 entry",
    "Pages with entries"
  ],
  "forms": [
    {
      "form": "MOE",
      "tags": [
        "alternative"
      ]
    }
  ],
  "head_templates": [
    {
      "args": {
        "1": "?"
      },
      "expansion": "MoE",
      "name": "en-noun"
    }
  ],
  "lang": "English",
  "lang_code": "en",
  "pos": "noun",
  "senses": [
    {
      "alt_of": [
        {
          "word": "Ministry of Education"
        }
      ],
      "categories": [
        "English initialisms"
      ],
      "glosses": [
        "Initialism of Ministry of Education."
      ],
      "links": [
        [
          "Ministry of Education",
          "w:Ministry of Education"
        ]
      ],
      "tags": [
        "abbreviation",
        "alt-of",
        "initialism"
      ]
    },
    {
      "alt_of": [
        {
          "word": "mixture of experts"
        }
      ],
      "categories": [
        "English initialisms",
        "English terms with quotations",
        "en:Machine learning"
      ],
      "examples": [
        {
          "bold_text_offsets": [
            [
              51,
              54
            ]
          ],
          "ref": "2023, Gerhard Paaß, Sven Giesselbach, Foundation Models for Natural Language Processing: Pre-trained Language Models Integrating Media, Springer Nature, →ISBN, page 130:",
          "text": "GLaM [51] is an autoregressive mixture-of-experts (MoE) model with up to 1200B parameters.",
          "type": "quotation"
        }
      ],
      "glosses": [
        "Initialism of mixture of experts."
      ],
      "links": [
        [
          "machine learning",
          "machine learning"
        ],
        [
          "mixture of experts",
          "w:mixture of experts"
        ]
      ],
      "qualifier": "machine learning",
      "raw_glosses": [
        "(machine learning) Initialism of mixture of experts."
      ],
      "tags": [
        "abbreviation",
        "alt-of",
        "initialism"
      ]
    },
    {
      "alt_of": [
        {
          "word": "margin of error"
        }
      ],
      "categories": [
        "English initialisms",
        "en:Statistics"
      ],
      "glosses": [
        "Initialism of margin of error."
      ],
      "links": [
        [
          "statistics",
          "statistics"
        ],
        [
          "margin of error",
          "margin of error#English"
        ]
      ],
      "raw_glosses": [
        "(statistics) Initialism of margin of error."
      ],
      "tags": [
        "abbreviation",
        "alt-of",
        "initialism"
      ],
      "topics": [
        "mathematics",
        "sciences",
        "statistics"
      ]
    }
  ],
  "sounds": [
    {
      "ipa": "/məʊ/",
      "tags": [
        "UK"
      ]
    },
    {
      "ipa": "/moʊ/",
      "tags": [
        "US"
      ]
    },
    {
      "homophone": "mow"
    }
  ],
  "word": "MoE"
}

Download raw JSONL data for MoE meaning in English (2.0kB)
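
The raw download is JSONL: one complete JSON entry object per line, in the same shape as the raw data block above, so it can be streamed without loading the whole file. A minimal Python sketch, assuming a local file named kaikki-MoE.jsonl (the filename is an assumption for illustration):

import json

# Stream the JSONL file line by line; each non-empty line is a complete
# entry object. The filename below is illustrative.
with open("kaikki-MoE.jsonl", encoding="utf-8") as fh:
    for line in fh:
        line = line.strip()
        if not line:
            continue
        entry = json.loads(line)
        # Keep only the English noun entry for "MoE".
        if entry.get("word") != "MoE" or entry.get("lang_code") != "en":
            continue
        for sense in entry.get("senses", []):
            for gloss in sense.get("glosses", []):
                print(f'{entry["word"]} ({entry["pos"]}): {gloss}')
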


This page is part of the kaikki.org machine-readable English dictionary. This dictionary is based on structured data extracted on 2026-02-14 from the enwiktionary dump dated 2026-02-01 using wiktextract (f492ef9 and 59dc20b). The data shown on this site has been post-processed: various details (e.g., extra categories) have been removed, some information has been disambiguated, and additional data has been merged from other sources. See the raw data download page for the unprocessed wiktextract data.

If you use this data in academic research, please cite Tatu Ylonen: Wiktextract: Wiktionary as Machine-Readable Structured Data, Proceedings of the 13th Conference on Language Resources and Evaluation (LREC), pp. 1317-1325, Marseille, 20-25 June 2022. Linking to the relevant page(s) under https://kaikki.org would also be greatly appreciated.