Human Language Understanding & Reasoning.
In: Daedalus: Journal of the American Academy of Arts & Sciences, Vol. 151 (2022-04-01), Issue 2, pp. 127-138
The last decade has yielded dramatic and quite surprising breakthroughs in natural language processing through the use of simple artificial neural network computations, replicated on a very large scale and trained over exceedingly large amounts of data. The resulting pretrained language models, such as BERT and GPT-3, have provided a powerful universal language understanding and generation base, which can easily be adapted to many understanding, writing, and reasoning tasks. These models show the first inklings of a more general form of artificial intelligence, which may lead to powerful foundation models in domains of sensory experience beyond just language. [ABSTRACT FROM AUTHOR]
Copyright of Daedalus: Journal of the American Academy of Arts & Sciences is the property of MIT Press and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
| Field | Value |
|---|---|
| Title | Human Language Understanding & Reasoning. |
| Author / Contributor | Manning, Christopher D. |
| Journal | Daedalus: Journal of the American Academy of Arts & Sciences, Vol. 151 (2022-04-01), Issue 2, pp. 127-138 |
| Publication year | 2022 |
| Media type | Academic journal |
| ISSN | 0011-5266 (print) |
| DOI | 10.1162/daed_a_01905 |