
The Informational Complexity of Learning: Perspectives on Neural Networks and Generative Grammar / Edition 1

Hardcover (Print)
Used and New from Other Sellers
from $63.31
Usually ships in 1-2 business days
(Save 61%)
Other sellers (Hardcover)
  • All (11) from $63.31   
  • New (4) from $134.11   
  • Used (7) from $63.31   
Showing 1 – 9 of 11 (2 pages)
Note: Marketplace items are not eligible for any BN.com coupons and promotions
$63.31
Seller since 2014

Feedback rating: (6)

Condition grades:

  • New — never opened or used, in original packaging.
  • Like New — packaging may have been opened; a "Like New" item is suitable to give as a gift.
  • Very Good — may have minor signs of wear on packaging, but the item works perfectly and has no damage.
  • Good — item is in good condition, but packaging may show shelf wear, aging, or tears. All specific defects should be noted in the Comments section for each item.
  • Acceptable — item is in working order but may show signs of wear such as scratches or torn packaging. All specific defects should be noted in the Comments section for each item.
  • Used — an item that has been opened and may show signs of wear. All specific defects should be noted in the Comments section for each item.
  • Refurbished — a used item that has been renewed or updated and verified to be in proper working condition; not necessarily refurbished by the original manufacturer.

Condition: Acceptable
1997 Hardcover. Fair; fading spine. Item is intact but may show shelf wear. Pages may include notes and highlighting. May or may not include supplemental or companion material. ... Access codes may or may not work. Connecting readers since 1972. Customer service is our top priority.

Ships from: Wauwatosa, WI

Usually ships in 1-2 business days

  • Standard, 48 States
  • Standard (AK, HI)
  • Express, 48 States
  • Express (AK, HI)
$65.29
Seller since 2009

Feedback rating: (3472)

Condition: Good
Ships same day or next business day! UPS expedited shipping available (Priority Mail for AK/HI/APO/PO Boxes). Used sticker and some writing and/or highlighting. Used books may not ... include working access code or dust jacket.

Ships from: Columbia, MO

Usually ships in 1-2 business days

  • Standard, 48 States
  • Standard (AK, HI)
  • Express, 48 States
  • Express (AK, HI)
$67.37
Seller since 2005

Feedback rating: (49521)

Condition: Very Good
Ships same day or next business day via UPS (Priority Mail for AK/HI/APO/PO Boxes)! Used sticker and some writing and/or highlighting. Used books may not include working access ... code or dust jacket.

Ships from: Columbia, MO

Usually ships in 1-2 business days

  • Standard, 48 States
  • Standard (AK, HI)
  • Express, 48 States
  • Express (AK, HI)
$67.67
Seller since 2006

Feedback rating: (60901)

Condition: Very Good
Former Library book. Great condition for a used book! Minimal wear. 100% Money Back Guarantee. Shipped to over one million happy customers. Your purchase benefits world literacy!

Ships from: Mishawaka, IN

Usually ships in 1-2 business days

  • Canadian
  • International
  • Standard, 48 States
  • Standard (AK, HI)
  • Express, 48 States
  • Express (AK, HI)
$132.79
Seller since 2008

Feedback rating: (17845)

Condition: Acceptable
Used, Acceptable Condition, may show signs of wear and previous use. Please allow 4-14 business days for delivery. 100% Money Back Guarantee, Over 1,000,000 customers served.

Ships from: Westminster, MD

Usually ships in 1-2 business days

  • Canadian
  • International
  • Standard, 48 States
  • Standard (AK, HI)
$134.11
Seller since 2008

Feedback rating: (17845)

Condition: New
Brand New, Perfect Condition, Please allow 4-14 business days for delivery. 100% Money Back Guarantee, Over 1,000,000 customers served.

Ships from: Westminster, MD

Usually ships in 1-2 business days

  • Canadian
  • International
  • Standard, 48 States
  • Standard (AK, HI)
$149.82
Seller since 2011

Feedback rating: (52)

Condition: New
Ships next business day! Brand New!

Ships from: Vancouver, WA

Usually ships in 1-2 business days

  • Standard, 48 States
  • Standard (AK, HI)
  • Express, 48 States
  • Express (AK, HI)
$154.59
Seller since 2009

Feedback rating: (10631)

Condition: New
New Book. Shipped from US within 4 to 14 business days. Established seller since 2000

Ships from: Secaucus, NJ

Usually ships in 1-2 business days

  • Standard, 48 States
  • Standard (AK, HI)
$183.73
Seller since 2014

Feedback rating: (7)

Condition: Very Good
1997 Hardback, NEAR FINE. This listing is a new book, a title currently in print which we order directly and immediately from the publisher. PLEASE NOTE: This item ... is shipping from an authorized seller in Europe. In the event that a return is necessary, you will be able to return your item within the US. To learn more about our European sellers and policies, see the BookQuest FAQ section.

Ships from: Stroud, Glos, United Kingdom

Usually ships in 1-2 business days

  • Canadian
  • Standard, 48 States
  • Standard (AK, HI)

Overview

Among other topics, The Informational Complexity of Learning: Perspectives on Neural Networks and Generative Grammar brings together two important but very different learning problems within the same analytical framework. The first is the problem of learning functional mappings using neural networks; the second is learning natural language grammars in the principles and parameters tradition of Chomsky. These two problems are seemingly very different: neural networks are real-valued, infinite-dimensional, continuous mappings, whereas grammars are boolean-valued, finite-dimensional, discrete symbolic mappings, and the research communities that work in the two areas almost never overlap. The book's objective is to bridge this gap. It uses the formal techniques developed in statistical learning theory and theoretical computer science over the last decade to analyze both kinds of learning problems. By asking the same question of both, namely how much information it takes to learn, it highlights their similarities and differences. Specific results include model selection in neural networks, active learning, language learning, and evolutionary models of language change.
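
As a rough illustration of the question "how much information does it take to learn," consider the standard sample-complexity bound from statistical learning theory for a finite hypothesis class (a generic textbook bound given here for orientation, not a result quoted from the book):

\[
m \;\ge\; \frac{1}{\varepsilon}\left( \ln\lvert\mathcal{H}\rvert + \ln\frac{1}{\delta} \right)
\]

labeled examples suffice so that, with probability at least \(1-\delta\), any hypothesis in \(\mathcal{H}\) consistent with the sample has true error at most \(\varepsilon\). For a parametric grammar family with \(n\) binary parameters, \(\lvert\mathcal{H}\rvert = 2^{n}\), so the bound grows only linearly in \(n\); for real-valued classes such as neural networks, \(\ln\lvert\mathcal{H}\rvert\) is replaced by a capacity measure such as the VC dimension. Bounds of this general shape are what the phrase "informational complexity" points to.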

Product Details

  • ISBN-13: 9780792380818
  • Publisher: Springer US
  • Publication date: 1/1/1998
  • Edition description: 1998
  • Edition number: 1
  • Pages: 224
  • Product dimensions: 9.21 (w) x 6.14 (h) x 0.69 (d) inches

Table of Contents

1. Introduction.- 1.1 The Components of a Learning Paradigm.- 1.1.1 Concepts, Hypotheses, and Learners.- 1.1.2 Generalization, Learnability, Successful Learning.- 1.1.3 Informational Complexity.- 1.2 Parametric Hypothesis Spaces.- 1.3 Technical Contents and Major Contributions.- 1.3.1 A Final Word.
2. Generalization Error for Neural Nets.- 2.1 Introduction.- 2.2 Definitions and Statement of the Problem.- 2.2.1 Random Variables and Probability Distributions.- 2.2.2 Learning from Examples and Estimators.- 2.2.3 The Expected Risk and the Regression Function.- 2.2.4 The Empirical Risk.- 2.2.5 The Problem.- 2.2.6 Bounding the Generalization Error.- 2.2.7 A Note on Models and Model Complexity.- 2.3 Stating the Problem for Radial Basis Functions.- 2.4 Main Result.- 2.5 Remarks.- 2.5.1 Observations on the Main Result.- 2.5.2 Extensions.- 2.5.3 Connections with Other Results.- 2.6 Implications of the Theorem in Practice: Putting in the Numbers.- 2.6.1 Rate of Growth of n for Guaranteed Convergence.- 2.6.2 Optimal Choice of n.- 2.6.3 Experiments.- 2.7 Conclusion.- 2-A Notations.- 2-B A Useful Decomposition of the Expected Risk.- 2-C A Useful Inequality.- 2-D Proof of the Main Theorem.- 2-D.1 Bounding the Approximation Error.- 2-D.2 Bounding the Estimation Error.- 2-D.3 Bounding the Generalization Error.
3. Active Learning.- 3.1 A General Framework for Active Approximation.- 3.1.1 Preliminaries.- 3.1.2 The Problem of Collecting Examples.- 3.1.3 In Context.- 3.2 Example 1: A Class of Monotonically Increasing Bounded Functions.- 3.2.1 Lower Bound for Passive Learning.- 3.2.2 Active Learning Algorithms.- 3.2.2.1 Derivation of an Optimal Sampling Strategy.- 3.2.3 Empirical Simulations and Other Investigations.- 3.2.3.1 Distribution of Points Selected.- 3.2.3.2 Classical Optimal Recovery.- 3.2.3.3 Error Rates and Sample Complexities for Some Arbitrary Functions: Some Simulations.- 3.3 Example 2: A Class of Functions with Bounded First Derivative.- 3.3.1 Lower Bounds.- 3.3.2 Active Learning Algorithms.- 3.3.2.1 Derivation of an Optimal Sampling Strategy.- 3.3.3 Some Simulations.- 3.3.3.1 Distribution of Points Selected.- 3.3.3.2 Error Rates.- 3.4 Conclusions, Extensions, and Open Problems.- 3.5 A Simple Example.- 3.6 Generalizations.- 3.6.1 Localized Function Classes.- 3.6.2 The General—-focusing Strategy.- 3.6.3 Generalizations and Open Problems.
4. Language Learning.- 4.1 Language Learning and the Poverty of Stimulus.- 4.2 Constrained Grammars: Principles and Parameters.- 4.2.1 Example: A 3-Parameter System from Syntax.- 4.2.2 Example: Parameterized Metrical Stress in Phonology.- 4.3 Learning in the Principles and Parameters Framework.- 4.4 Formal Analysis of the Triggering Learning Algorithm.- 4.4.1 Background.- 4.4.2 The Markov Formulation.- 4.4.2.1 Parameterized Grammars and Their Corresponding Markov Chains.- 4.4.2.2 Markov Chain Criteria for Learnability.- 4.4.2.3 The Markov Chain for the 3-Parameter Example.- 4.4.3 Derivation of the Transition Probabilities for the Markov TLA Structure.- 4.4.3.1 Formalization.- 4.4.3.2 Additional Properties of the Learning System.- 4.5 Characterizing Convergence Times for the Markov Chain Model.- 4.5.1 Some Transition Matrices and Their Convergence Curves.- 4.5.2 Absorption Times.- 4.5.3 Eigenvalue Rates of Convergence.- 4.5.3.1 Eigenvalues and Eigenvectors.- 4.5.3.2 Representation of Tk.- 4.5.3.3 Initial Conditions and Limiting Distributions.- 4.5.3.4 Rate of Convergence.- 4.5.3.5 Transition Matrix Recipes.- 4.6 Exploring Other Points.- 4.6.1 Changing the Algorithm.- 4.6.2 Distributional Assumptions.- 4.6.3 Natural Distributions: The CHILDES Corpus.- 4.7 Batch Learning Upper and Lower Bounds: An Aside.- 4.8 Conclusions, Open Questions, and Future Directions.- 4-A Unembedded Sentences for Parametric Grammars.- 4-B Memoryless Algorithms and Markov Chains.- 4-C Proof of Learnability Theorem.- 4-C.1 Markov State Terminology.- 4-C.2 Canonical Decomposition.- 4-D Formal Proof.
5. Language Change.- 5.1 Introduction.- 5.2 Language Change in Parametric Systems.- 5.3 Example 1: A Three-Parameter System.- 5.3.1 Starting with Homogeneous Populations.- 5.3.1.1 A = TLA; Pi = Uniform; Finite Sample = 128.- 5.3.1.2 A = Greedy, No S.V.; Pi = Uniform; Finite Sample = 128.- 5.3.1.3 A = a) R.W., b) S.V. Only; Pi = Uniform; Finite Sample = 128.- 5.3.1.4 Rates of Change.- 5.3.2 Non-Homogeneous Populations: Phase-Space Plots.- 5.3.2.1 Phase-Space Plots: Grammatical Trajectories.- 5.3.2.2 Issues of Stability.- 5.4 Example 2: The Case of Modern French.- 5.4.1 The Parametric Subspace and Data.- 5.4.2 The Case of Diachronic Syntax Change in French.- 5.4.3 Some Dynamical System Simulations.- 5.4.3.1 Homogeneous Populations [Initial: Old French].- 5.4.3.2 Heterogeneous Populations (Mixtures).- 5.5 Conclusions.
6. Conclusions.- 6.1 Emergent Themes.- 6.2 Extensions.- 6.3 A Concluding Note.
References.
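
Chapter 4 analyzes the Triggering Learning Algorithm by treating the learner's current parameter settings as the states of a Markov chain, with the target grammar as an absorbing state, so that learnability and convergence time become questions about that chain. The sketch below illustrates this style of calculation with a small, hypothetical transition matrix (the numbers are made up for illustration and are not the book's 3-parameter example), computing expected absorption times from the fundamental matrix:

import numpy as np

# Hypothetical absorbing Markov chain over four candidate grammars.
# State 3 is the target grammar (absorbing); the probabilities are
# illustrative only, not the book's 3-parameter TLA chain.
P = np.array([
    [0.6, 0.2, 0.1, 0.1],
    [0.1, 0.7, 0.1, 0.1],
    [0.0, 0.1, 0.6, 0.3],
    [0.0, 0.0, 0.0, 1.0],  # absorbing target state
])

# Q holds the transitions among the transient (non-target) states.
Q = P[:3, :3]

# Fundamental matrix N = (I - Q)^(-1); its row sums give the expected number
# of example sentences consumed before absorption, from each starting grammar.
N = np.linalg.inv(np.eye(3) - Q)
expected_examples = N.sum(axis=1)

for state, t in enumerate(expected_examples):
    print(f"start in grammar {state}: expected {t:.1f} examples to converge")

In this formulation, a parameter space is learnable exactly when the target state is reachable from every starting state (so absorption happens with probability 1), and the eigenvalues of the transient block govern the rate of convergence, matching the "Markov Chain Criteria for Learnability" and "Eigenvalue Rates of Convergence" sections listed above.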


Customer Reviews

Be the first to write a review (0).