{
  "etymology_text": "Shortening of backpropagation.",
  "head_templates": [
    { "args": { "1": "-" }, "expansion": "backprop (uncountable)", "name": "en-noun" }
  ],
  "lang": "English",
  "lang_code": "en",
  "pos": "noun",
  "senses": [
    {
      "categories": [
        { "kind": "other", "name": "English entries with incorrect language header", "parents": [ "Entries with incorrect language header", "Entry maintenance" ], "source": "w" },
        { "kind": "other", "name": "Pages with 1 entry", "parents": [], "source": "w" },
        { "kind": "other", "name": "Pages with entries", "parents": [], "source": "w" }
      ],
      "examples": [
        {
          "ref": "2015, Sanjeev Arora, Yingyu Liang, Tengyu Ma, “Why are deep nets reversible: A simple theory, with implications for training”, in arXiv:",
          "text": "The generative model suggests a simple modification for training---use an input to produce several synthetic inputs with the same label, and include them in the backprop training.",
          "type": "quote"
        },
        {
          "ref": "2020, Timothy P. Lillicrap, Adam Santoro, Luke Marris, Colin J. Akerman, Geoffrey Hinton, “Backpropagation and the brain”, in Nature:",
          "text": "In machine learning, backpropagation of error (‘backprop’) is the algorithm most often used to train deep neural networks and is the most successful learning procedure for these networks.",
          "type": "quote"
        }
      ],
      "glosses": [ "backpropagation" ],
      "id": "en-backprop-en-noun-K9Bj4-wG",
      "links": [ [ "backpropagation", "backpropagation" ] ],
      "tags": [ "uncountable" ]
    }
  ],
  "word": "backprop"
}
This page is part of the kaikki.org machine-readable All languages combined dictionary. This dictionary is based on structured data extracted on 2025-03-06 from the enwiktionary dump dated 2025-03-02 using wiktextract (b81b832 and 633533e). The data shown on this site has been post-processed: various details (e.g., extra categories) have been removed, some information disambiguated, and additional data merged from other sources. See the raw data download page for the unprocessed wiktextract data.
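As a minimal sketch of working with the JSONL format described above: each line of the raw dump is one complete word entry, so it can be parsed line by line with the standard `json` module. The record below is abridged to the fields shown on this page (the full entry also carries categories, a sense id, and links).

```python
import json

# One JSONL line for the "backprop" entry, abridged to the fields used below.
line = (
    '{"word": "backprop", "lang": "English", "pos": "noun", '
    '"etymology_text": "Shortening of backpropagation.", '
    '"senses": [{"glosses": ["backpropagation"], "tags": ["uncountable"]}]}'
)

# Each line of the dump parses independently as one JSON object.
entry = json.loads(line)

print(entry["word"])                     # backprop
print(entry["pos"])                      # noun
print(entry["senses"][0]["glosses"][0])  # backpropagation
```

To process the full download, iterate over the file and call `json.loads` on each line; there is no enclosing JSON array, which is what distinguishes JSONL from ordinary JSON.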
If you use this data in academic research, please cite Tatu Ylonen: "Wiktextract: Wiktionary as Machine-Readable Structured Data", Proceedings of the 13th Conference on Language Resources and Evaluation (LREC), pp. 1317–1325, Marseille, 20–25 June 2022. Linking to the relevant page(s) under https://kaikki.org would also be greatly appreciated.