Foundations of Statistical Natural Language Processing / Edition 1

Hardcover (Print)

Overview

Statistical approaches to processing natural language text have become dominant in recent years. This foundational text is the first comprehensive introduction to statistical natural language processing (NLP) to appear. The book contains all the theory and algorithms needed for building NLP tools. It provides broad but rigorous coverage of mathematical and linguistic foundations, as well as detailed discussion of statistical methods, allowing students and researchers to construct their own implementations. The book covers collocation finding, word sense disambiguation, probabilistic parsing, information retrieval, and other applications.


What People Are Saying

From the Publisher
"Statistical natural-language processing is, in my estimation, one of the most fast-moving and exciting areas of computer science these days. Anyone who wants to learn this field would be well advised to get this book. For that matter, the same goes for anyone who is already in the field. I know that it is going to be one of the most well-thumbed books on my bookshelf."
Eugene Charniak, Department of Computer Science, Brown University

Product Details

  • ISBN-13: 9780262133609
  • Publisher: MIT Press
  • Publication date: 6/18/1999
  • Edition description: New Edition
  • Edition number: 1
  • Pages: 720
  • Product dimensions: 8.00 (w) x 9.00 (h) x 0.87 (d) inches

Table of Contents

List of Tables
List of Figures
Table of Notations
Preface
Road Map
I Preliminaries
1 Introduction
1.1 Rationalist and Empiricist Approaches to Language
1.2 Scientific Content
1.3 The Ambiguity of Language: Why NLP Is Difficult
1.4 Dirty Hands
1.5 Further Reading
1.6 Exercises
2 Mathematical Foundations
2.1 Elementary Probability Theory
2.2 Essential Information Theory
2.3 Further Reading
3 Linguistics Essentials
3.1 Parts of Speech and Morphology
3.2 Phrase Structure
3.3 Semantics and Pragmatics
3.4 Other Areas
3.5 Further Reading
3.6 Exercises
4 Corpus-Based Work
4.1 Getting Set Up
4.2 Looking at Text
4.3 Marked-Up Data
4.4 Further Reading
4.5 Exercises
II Words
5 Collocations
5.1 Frequency
5.2 Mean and Variance
5.3 Hypothesis Testing
5.4 Mutual Information
5.5 The Notion of Collocation
5.6 Further Reading
6 Statistical Inference: n-gram Models over Sparse Data
6.1 Bins: Forming Equivalence Classes
6.2 Statistical Estimators
6.3 Combining Estimators
6.4 Conclusions
6.5 Further Reading
6.6 Exercises
7 Word Sense Disambiguation
7.1 Methodological Preliminaries
7.2 Supervised Disambiguation
7.3 Dictionary-Based Disambiguation
7.4 Unsupervised Disambiguation
7.5 What Is a Word Sense?
7.6 Further Reading
7.7 Exercises
8 Lexical Acquisition
8.1 Evaluation Measures
8.2 Verb Subcategorization
8.3 Attachment Ambiguity
8.4 Selectional Preferences
8.5 Semantic Similarity
8.6 The Role of Lexical Acquisition in Statistical NLP
8.7 Further Reading
III Grammar
9 Markov Models
9.1 Markov Models
9.2 Hidden Markov Models
9.3 The Three Fundamental Questions for HMMs
9.4 HMMs: Implementation, Properties, and Variants
9.5 Further Reading
10 Part-of-Speech Tagging
10.1 The Information Sources in Tagging
10.2 Markov Model Taggers
10.3 Hidden Markov Model Taggers
10.4 Transformation-Based Learning of Tags
10.5 Other Methods, Other Languages
10.6 Tagging Accuracy and Uses of Taggers
10.7 Further Reading
10.8 Exercises
11 Probabilistic Context-Free Grammars
11.1 Some Features of PCFGs
11.2 Questions for PCFGs
11.3 The Probability of a String
11.4 Problems with the Inside-Outside Algorithm
11.5 Further Reading
11.6 Exercises
12 Probabilistic Parsing
12.1 Some Concepts
12.2 Some Approaches
12.3 Further Reading
12.4 Exercises
IV Applications and Techniques
13 Statistical Alignment and Machine Translation
13.1 Text Alignment
13.2 Word Alignment
13.3 Statistical Machine Translation
13.4 Further Reading
14 Clustering
14.1 Hierarchical Clustering
14.2 Non-Hierarchical Clustering
14.3 Further Reading
14.4 Exercises
15 Topics in Information Retrieval
15.1 Some Background on Information Retrieval
15.2 The Vector Space Model
15.3 Term Distribution Models
15.4 Latent Semantic Indexing
15.5 Discourse Segmentation
15.6 Further Reading
15.7 Exercises
16 Text Categorization
16.1 Decision Trees
16.2 Maximum Entropy Modeling
16.3 Perceptrons
16.4 k Nearest Neighbor Classification
16.5 Further Reading
Tiny Statistical Tables
Bibliography
Index
