The Successor Representation and Temporal Context.
In: Neural Computation, Vol. 24 (2012-04-01), Issue 4, pp. 1-16
The successor representation was introduced into reinforcement learning by Dayan (1993) as a means of facilitating generalization between states with similar successors. Although reinforcement learning in general has been used extensively as a model of psychological and neural processes, the psychological validity of the successor representation has yet to be explored. An interesting possibility is that the successor representation can be used not only for reinforcement learning but for episodic learning as well. Our main contribution is to show that a variant of the temporal context model (TCM; Howard & Kahana, 2002), an influential model of episodic memory, can be understood as directly estimating the successor representation using the temporal difference learning algorithm (Sutton & Barto, 1998). This insight leads to a generalization of TCM and new experimental predictions. In addition to casting a new normative light on TCM, this equivalence suggests a previously unexplored point of contact between different learning systems. [ABSTRACT FROM AUTHOR]
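The abstract's central idea, estimating the successor representation with temporal difference learning, can be sketched in a few lines. The toy example below is our own illustration, not code from the paper: the ring-shaped random-walk environment, the learning rate `alpha`, and the discount `gamma` are all assumptions chosen for demonstration. It applies the standard SR TD(0) update, M(s,·) ← M(s,·) + α[1ₛ + γM(s′,·) − M(s,·)], whose fixed point gives each state's expected discounted future occupancies.

```python
import numpy as np

# Hypothetical illustration (not from the paper): TD(0) estimation of the
# successor representation M, where M[s, s'] approximates the expected
# discounted future occupancy of state s' when starting from state s
# (Dayan, 1993). Environment and hyperparameters are assumptions.

n_states = 5
gamma, alpha = 0.9, 0.1
M = np.zeros((n_states, n_states))

rng = np.random.default_rng(0)
s = 0
for _ in range(20000):
    s_next = (s + rng.choice([-1, 1])) % n_states  # random walk on a ring
    one_hot = np.eye(n_states)[s]                  # occupancy of current state
    # TD update: target = immediate occupancy + discounted successor row
    M[s] += alpha * (one_hot + gamma * M[s_next] - M[s])
    s = s_next

# At the fixed point, every row sums to 1 / (1 - gamma)
print(M.sum(axis=1))
```

Because each visit contributes 1 unit of occupancy discounted by γ per step, every row of the converged M sums to 1/(1 − γ) = 10 here, which makes for a quick sanity check on the learned estimate.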
Title: The Successor Representation and Temporal Context
Authors / Contributors: Gershman, Samuel J.; Moore, Christopher D.; Todd, Michael T.; Norman, Kenneth A.; Sederberg, Per B.
Journal: Neural Computation, Vol. 24 (2012-04-01), Issue 4, pp. 1-16
Published: 2012
Media type: Academic journal
ISSN: 0899-7667 (print)