Proceedings of the 1993 Connectionist Models Summer School

Hardcover (Print)

Overview

The papers in this volume, the result of the 1993 Connectionist Models Summer School, exemplify the tremendous breadth and depth of research underway in the field of neural networks. Although the summer school has always leaned toward cognitive science and artificial intelligence, the diverse scientific backgrounds and research interests of the accepted students and invited faculty reflect the broad spectrum of areas contributing to neural networks, including artificial intelligence, cognitive science, computer science, engineering, mathematics, neuroscience, and physics. Providing an accurate picture of the state of the art in this fast-moving field, the proceedings of this intense two-week program of lectures, workshops, and informal discussions contain timely and high-quality work by the best and the brightest in the neural networks field.

Product Details

  • ISBN-13: 9780805815900
  • Publisher: Taylor & Francis
  • Publication date: 11/28/1993
  • Pages: 424
  • Lexile: 1410L
  • Product dimensions: 8.50 (w) x 10.96 (h) x 0.89 (d) inches

Table of Contents

Contents:

Part I: Neuroscience. T. Rebotier, J. Droulez, Sigma-Pi Properties of Spiking Neurons. H.S. Wan, D.S. Touretzky, A.D. Redish, Towards a Computational Theory of Rat Navigation. H.T. Blair, Evaluating Connectionist Models in Psychology and Neuroscience.

Part II: Vision. J. Sirosh, R. Miikkulainen, Self-Organizing Feature Maps with Lateral Connections: Modeling Ocular Dominance. A.K. Bhattacharjya, B. Roysam, Joint Solution of Low, Intermediate, and High Level Vision Tasks by Global Optimization: Application to Computer Vision at Low SNR. T.B. Ghiselli-Crippa, P.W. Munro, Learning Global Spatial Structures from Local Associations.

Part III: Cognitive Modeling. D. Ascher, A Connectionist Model of Auditory Morse Code Perception. V. Dragoi, J.E.R. Staddon, A Competitive Neural Network Model for the Process of Recurrent Choice. A.M. Lindemann, A Neural Network Simulation of Numerical Verbal-to-Arabic Transcoding. T. Lund, Combining Models of Single-Digit Arithmetic and Magnitude Comparison. I.E. Dror, Neural Network Models as Tools for Understanding High-Level Cognition: Developing Paradigms for Cognitive Interpretation of Neural Network Models.

Part IV: Language. F.J. Eisenhart, Modeling Language as Sensorimotor Coordination. A. Govindjee, G. Dell, Structure and Content in Word Production: Why It's Hard to Say Dlorm. P. Gupta, Investigating Phonological Representations: A Modeling Agenda. H. Schütze, Y. Singer, Part-of-Speech Tagging Using a Variable Context Markov Model. M. Spivey-Knowlton, Quantitative Predictions from a Constraint-Based Theory of Syntactic Ambiguity Resolution. B.B. Tesar, Optimality Semantics.

Part V: Symbolic Computation and Rules. K.G. Daugherty, M. Hare, What's in a Rule? The Past Tense by Some Other Name Might Be Called a Connectionist Net. A. Almor, M. Rindner, On the Proper Treatment of Symbolism — A Lesson from Linguistics. L.F. Niklasson, Structure Sensitivity in Connectionist Models. M. Crucianu, Looking for Structured Representations in Recurrent Networks. I. Tchoumatchenko, Back Propagation with Understandable Results. M.W. Craven, J.W. Shavlik, Understanding Neural Networks via Rule Extraction and Pruning. A-H. Tan, Rule Learning and Extraction with Self-Organizing Neural Networks.

Part VI: Recurrent Networks and Temporal Pattern Processing. J.F. Kolen, Recurrent Networks: State Machines or Iterated Function Systems? F. Cummins, R.F. Port, On the Treatment of Time in Recurrent Neural Networks. J.D. McAuley, Finding Metrical Structure in Time. C. Stevens, J. Wiles, Representations of Tonal Music: A Case Study in the Development of Temporal Relationships. M.A.S. Potts, D.S. Broomhead, J.P. Huke, Applications of Radial Basis Function Fitting to the Analysis of Dynamical Systems. M.E. Young, T.M. Bailey, Event Prediction: Faster Learning in a Layered Hebbian Network with Memory.

Part VII: Control. S. Thrun, A. Schwartz, Issues in Using Function Approximation for Reinforcement Learning. P. Sabes, Approximating Q-Values with Basis Function Representations. K.L. Markey, Efficient Learning of Multiple Degree-of-Freedom Control Problems with Quasi-Independent Q-Agents. A.L. Tascillo, V.A. Skormin, Neural Adaptive Control of Systems with Drifting Parameters.

Part VIII: Learning Algorithms and Architectures. R.C. O'Reilly, Temporally Local Unsupervised Learning: The MaxIn Algorithm for Maximizing Input Information. V.R. de Sa, Minimizing Disagreement for Self-Supervised Classification. S.N. Lindstaedt, Comparison of Two Unsupervised Neural Network Models for Redundancy Reduction. Z. Ghahramani, Solving Inverse Problems Using an EM Approach to Density Estimation. M. Finke, K-R. Müller, Estimating A-Posteriori Probabilities Using Stochastic Network Models.

Part IX: Learning Theory. A.S. Weigend, On Overfitting and the Effective Number of Hidden Units. R. Dodier, Increase of Apparent Complexity Is Due to Decrease of Training Set Error. G.B. Orr, T.K. Leen, Momentum and Optimal Stochastic Search. R. Garcés, Scheme to Improve the Generalization Error. M.P. Perrone, General Averaging Results for Convex Optimization. R.A. Caruana, Multitask Connectionist Learning. Z. Cataltepe, Y.S. Abu-Mostafa, Estimating Learning Performance Using Hints.

Part X: Simulation Tools. A. Jagota, A Simulator for Asynchronous Hopfield Models. A. Linden, An Object-Oriented Dataflow Approach for Better Designs of Neural Net Architectures.
