Abstract Wikipedia/Related and previous work/Natural language generation/ru
Abstract Wikipedia will generate natural language text from an abstract representation. This is not a novel idea, and it has been tried a number of times before.
This page aims to collect different existing approaches. It summarizes the core ideas of the different approaches, together with their advantages and disadvantages, and points to existing implementations. This page, written by and for the community, should help in choosing which approach to focus on first.
Implementations
- Arria NLG
- Wikipedia: Arria NLG [ de ] [ en ] [ nn ]
- Website: https://www.arria.com/
- License: Proprietary, 30 patents apply
- Supported languages: English
- ASTROGEN
- Chimera
- Website: https://github.com/AmitMY/chimera
- License: MIT License
- Elvex
- FUF/SURGE
- GenI
- Website: http://kowey.github.io/GenI/
- GoPhi
- Website: https://github.com/rali-udem/gophi
- Grammar Explorer
- Grammatical Framework
- Wikipedia: Grammatical Framework [ en ] [ nn ]
- Website: https://www.grammaticalframework.org/
- License: GNU General Public License: see text
- Supported languages: Afrikaans, Amharic (partial), Arabic (partial), Basque (partial), Bulgarian, Catalan, Chinese, Czech (partial), Danish, Dutch, English, Estonian, Finnish, French, German, Greek ancient (partial), Greek modern, Hebrew (fragments), Hindi, Hungarian (partial), Interlingua, Italian, Japanese, Korean (partial), Latin (partial), Latvian, Maltese, Mongolian, Nepali, Norwegian bokmål, Norwegian nynorsk, Persian, Polish, Punjabi, Romanian, Russian, Sindhi, Slovak (partial), Slovene (partial), Somali (partial), Spanish, Swahili (fragments), Swedish, Thai, Turkish (fragments), and Urdu.
- jsRealB
- KPML
- Website: http://www.fb10.uni-bremen.de/anglistik/langpro/kpml/README.html
- Supported languages (as of 2014):
- More advanced: Czech, English, German?, Spanish
- Prototype: Bulgarian, Chinese, Dutch, Portuguese, Russian
- Less advanced: French, Greek, Japanese
- Linguistic Knowledge Builder
- Website: http://moin.delph-in.net/LkbTop
- Multimodal Unification Grammar
- NaturalOWL
- NLGen and NLGen2
- OpenCCG
- Website: http://openccg.sourceforge.net/
- rLDCP
- RosaeNLG
- Website: https://rosaenlg.org/
- Supported languages: English, French, German and Italian
- Semantic Web Authoring Tool (SWAT)
- Wikipedia: WYSIWYM [ en ] [ nn ]. SWAT is a tool that implements the WYSIWYM ("what you see is what you meant") interaction technique, in which humans develop formal representations by successively refining NLG outputs.
- Website: http://mcs.open.ac.uk/nlg/SWAT/
- Supported languages: OWL Simplified English
- SimpleNLG
- Website: https://github.com/simplenlg/simplenlg
- Supported languages: English, French
- SPUD
- Suregen-2
- Website: http://www.suregen.de/index.html
- Supported languages: German, English
- Syntax Maker
- Website: https://github.com/mikahama/syntaxmaker
- Supported languages: Finnish
- TGen
- Website: https://github.com/UFAL-DSG/tgen
- Universal Networking Language
- UralicNLP
- Website: https://uralicnlp.com/ and https://github.com/mikahama/uralicNLP
- Supported languages: Finnish, Russian, German, English, Norwegian, Swedish, Arabic, Ingrian, Meadow & Eastern Mari, Votic, Olonets-Karelian, Erzya, Moksha, Hill Mari, Udmurt, Tundra Nenets, Komi-Permyak, North Sami, South Sami and Skolt Sami[1]
Theoretical background
Please note that the six pipeline tasks listed below have articles only in the English Wikipedia (as of 24 July 2020).
Natural language generation [ de ] [ en ] [ es ] [ fr ] [ 日本語 ] [ nn ] [ 中文 ] is a sub-field of natural language processing. See the broader topic on Scholia.[2]
Pipeline model
In their 2018 survey,[3] Gatt[4] and Krahmer[5] begin by describing natural language generation as the "task of generating text or speech from non-linguistic input." They identify six sub-problems (after Reiter & Dale 1997, 2000[6]) [2. NLG Tasks, pp. 70–82]:[3]
- Content determination (content determination (Q5165077))
- Text structuring (document structuring (Q5287648))
- Sentence aggregation (aggregation (Q4692263))
- Lexicalisation (lexical choice (Q6537688))
- Referring expression generation (referring expression generation (Q7307185))
- Linguistic realisation (realization (Q7301282))
These six sub-problems segment the “pipeline”: the “early” tasks are aligned with the communicative purpose of the output, while the “late” tasks are aligned with its final linguistic form. A summary form might be: “What (1), ordered (2) and segmented (3) how, with which words (4 & 5), in which forms (6)”. This summary form does not clearly distinguish lexicalisation (4) from referring expression generation (REG) (5). The key idea in REG is avoiding both repetition and ambiguity, or rather managing the tension between these conflicting aims. This corresponds to the Gricean maxim (Grice, 1975[7]) that “speakers should make sure that their contributions are sufficiently informative for the purposes of the exchange, but not more so” (or, as Roger Sessions put it (1950), after Albert Einstein (1933): “everything should be as simple as it can be but not simpler!”). A sketch of the full pipeline follows below.
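To make the division of labour concrete, here is a minimal, library-free Python sketch of the six stages. The input data, function names, and trivial heuristics are all illustrative assumptions, not taken from any of the systems listed above.

    # Hypothetical pipeline sketch: each function stands in for one of the
    # six sub-problems of the Reiter & Dale pipeline model.

    FACTS = [  # non-linguistic input: (subject, relation, object) triples
        ("Marie Curie", "birth_year", "1867"),
        ("Marie Curie", "award", "the Nobel Prize"),
        ("Marie Curie", "field", "physics"),
    ]

    def determine_content(facts):
        """1. Content determination: select which facts to express."""
        return [f for f in facts if f[1] in ("birth_year", "award")]

    def structure_text(facts):
        """2. Text structuring: order the selected facts."""
        order = {"birth_year": 0, "award": 1}
        return sorted(facts, key=lambda f: order[f[1]])

    def aggregate(facts):
        """3. Sentence aggregation: group facts into sentence-sized
        messages (trivially, one fact per sentence here)."""
        return [[f] for f in facts]

    def lexicalise(relation):
        """4. Lexicalisation: choose words for a relation."""
        return {"birth_year": "was born in", "award": "won"}[relation]

    def refer(entity, mentioned):
        """5. Referring expression generation: pronominalise repeat
        mentions, avoiding repetition without creating ambiguity."""
        return "she" if entity in mentioned else entity

    def realise(subject, verb, obj):
        """6. Linguistic realisation: produce the surface sentence."""
        sentence = f"{subject} {verb} {obj}."
        return sentence[0].upper() + sentence[1:]

    mentioned = set()
    output = []
    for group in aggregate(structure_text(determine_content(FACTS))):
        subj, rel, obj = group[0]
        output.append(realise(refer(subj, mentioned), lexicalise(rel), obj))
        mentioned.add(subj)

    print(" ".join(output))
    # Marie Curie was born in 1867. She won the Nobel Prize.

Real systems replace each of these toy functions with substantial machinery; the point here is only the order of decisions, narrowing from communicative goals to surface form.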
Content determination
Document structuring
Aggregation
Lexical choice
Referring expression generation
Realization
- “In linguistics, realization is the process by which some kind of surface representation is derived from its underlying representation; that is, the way in which some abstract object of linguistic analysis comes to be produced in actual language. Phonemes are often said to be realized by speech sounds. The different sounds that can realize a particular phoneme are called its allophones.”
- “Realization is also a subtask of natural language generation, which involves creating an actual text in a human language (English, French, etc.) from a syntactic representation.”
- English Wikipedia
- (Wikipedia contributors, “Realization”, Wikipedia, The Free Encyclopedia, 26 May 2020, 02:46 UTC, <https://en.wikipedia.org/w/index.php?title=Realization&oldid=958866516> [accessed 31 August 2020].)
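As a concrete illustration of the realisation subtask alone, the sketch below maps an abstract clause specification to an English sentence, handling third-person agreement and punctuation. The clause-building style loosely echoes the subject/verb/object API of realisers such as SimpleNLG, but the function and field names here are invented for illustration.

    # Hypothetical realiser sketch: an abstract clause specification is
    # turned into a surface string (the final pipeline stage only).

    def realise_clause(spec):
        subject = f"the {spec['subject']}"
        verb = spec["verb"]
        if spec.get("number") == "singular":
            # crude third-person-singular agreement
            verb += "es" if verb.endswith(("s", "sh", "ch", "x")) else "s"
        obj = f"the {spec['object']}"
        sentence = f"{subject} {verb} {obj}"
        return sentence[0].upper() + sentence[1:] + "."

    print(realise_clause(
        {"subject": "dog", "verb": "chase", "object": "cat", "number": "singular"}
    ))
    # The dog chases the cat.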
Black-box approach
In a later survey, Gârbacea and Mei[8] suggested “Neural language generation” as an emerging sub-field of NLG. Eleven of the papers cited in their survey have titles with “neural language” in them, the earliest from 2016 (Édouard Grave, Armand Joulin, and Nicolas Usunier)[9]. The earliest citation in which “neural language generation” appears is from 2017 (Jessica Ficler and Yoav Goldberg)[10].
As of mid-2020, “neural language generation” is not mature enough to be used to generate natural-language renditions of language-neutral content.
References
- Jessica Ficler and Yoav Goldberg, 2017[10]
- Édouard Grave, Armand Joulin, and Nicolas Usunier, 2016[9]
- Gârbacea and Mei, 2020[8]
- Gardent et al., 2017[11]
- Gatt & Krahmer, 2018[3]
- Grice, 1975[7]
- Reiter & Dale, 2000[6] (PDF ends at the end of the first section.)
External links
- ACL Special Interest Group on Natural Language Generation. ACL is the Association for Computational Linguistics.
- Ehud Reiter's Blog. Ehud Reiter has no English Wikipedia page (apart from his user page).
- Natural Language Generation (CLAN Group), School of Natural and Computing Sciences, The University of Aberdeen.
- Institute for Language, Cognition and Computation (ILCC), School of Informatics, The University of Edinburgh.
- Harvard NLP, Harvard University.
- The Interaction Lab, School of Mathematical and Computer Sciences, Heriot-Watt University.
- Institute of Linguistics and Language Technology, University of Malta (Albert Gatt, Director).
- The Open University Natural Language Generation Group.
- TALN Research Group, Department of Information and Communication Technologies, Universitat Pompeu Fabra, Barcelona.
- The Natural Language Processing Group, The University of Sheffield.
- The Natural Language Group, Information Sciences Institute, University of Southern California.
- SyNaLP (Symbolic and statistical NLP), Laboratoire Lorrain d'Informatique et ses Applications (LORIA).
- Paul G. Allen School of Computer Science and Engineering, University of Washington.
Notes
- ↑ https://models.uralicnlp.com/nightly/
- ↑ As of 27 July 2020, the Scholia view on Natural-language generation lacked the standard sources and leading authors. Instead, see Google Scholar.
- ↑ a b c Gatt, Albert; Krahmer, Emiel (January 2018), "Survey of the State of the Art in Natural Language Generation: Core tasks, applications and evaluation", Journal of Artificial Intelligence Research 61: 65–170, archived from the original on 2020-06-23, retrieved 2020-07-24
- ↑ Gatt's publications
- ↑ Emiel Krahmer (Q51689943) selected publications
- ↑ a b Reiter, EB; Dale, R (2000), Building Natural-Language Generation Systems (PDF), Cambridge University Press, archived from the original (PDF) on 2019-07-11, retrieved 2020-07-27
- ↑ a b Grice, H. Paul (1975), Logic and conversation (PDF), retrieved 2020-08-10
- ↑ a b Gârbacea, Cristina; Mei, Qiaozhu, Neural Language Generation: Formulation, Methods, and Evaluation (PDF), pp. 1–70, retrieved 2020-08-08,
Compared to the survey of (Gatt and Krahmer, 2018), our overview is a more comprehensive and updated coverage of neural network methods and evaluation centered around the novel problem definitions and task formulations.
- ↑ a b Grave, Édouard; Joulin, Armand; Usunier, Nicolas (2016), Improving neural language models with a continuous cache (PDF)
- ↑ a b Ficler, Jessica; Goldberg, Yoav (2017), "Controlling linguistic style aspects in neural language generation" (PDF), Proceedings of the Workshop on Stylistic Variation: 94–104. Published slightly earlier that year was Tran, Van-Khanh; Nguyen, Le-Minh (2017), Semantic Refinement GRU-based Neural Language Generation for Spoken Dialogue Systems (PDF)
- ↑ Gardent, Claire; Shimorina, Anastasia; Narayan, Shashi; Perez-Beltrachini, Laura (2017), "The WebNLG Challenge: Generating Text from RDF data." (PDF), Proceedings of the 10th International Conference on Natural Language Generation: 124–133