From the Publisher

"A fascinating exploration of the conceptual systems that emerge when a neural network is trained to predict the properties of objects. Rogers and McClelland show that incremental, error-driven learning leads to internal distributed representations that explain a whole variety of empirical phenomena. Their network exhibits remarkable common sense in the way it generalizes, and distinctly human frailty in the way its knowledge disintegrates when it is physically degraded. Conceptual tendencies that many researchers assume to be innate are produced by the interplay between error-driven learning and the higher-order statistical structure of the set of facts that the network learns. The book uses very little technical jargon and the reasoning is clear, detailed, and compelling."—Geoffrey Hinton, FRS, Canada Research Chair in Machine Learning, Department of Computer Science, University of Toronto
"This book — by one of the founders of the field of computational developmental psychology — provides a comprehensive and thorough overview of the field, as well as an articulate and persuasive statement of Shultz's own approach. Both beginning students and advanced researchers will find it stimulating, informative, and thought-provoking."—Jeff Elman, Professor of Cognitive Science, University of California, San Diego