
Jacob Eisenstein - Introduction to Natural Language Processing (Adaptive Computation and Machine Learning series)

Here you can read online the full text of Jacob Eisenstein's Introduction to Natural Language Processing (Adaptive Computation and Machine Learning series) in English for free, or download it as a PDF or EPUB, and find the description, cover, and reviews of this ebook. Year: 2019; publisher: The MIT Press. The description of the work and its preface, as well as reviews, are available below.


  • Book:
    Introduction to Natural Language Processing (Adaptive Computation and Machine Learning series)
  • Author:
    Jacob Eisenstein
  • Publisher:
    The MIT Press
  • Year:
    2019
  • Rating:
    4 / 5

Introduction to Natural Language Processing (Adaptive Computation and Machine Learning series): summary, description and annotation

Here you can read the annotation, description, summary, or preface (depending on what the author of the book "Introduction to Natural Language Processing (Adaptive Computation and Machine Learning series)" provided). If you haven't found the information you need about the book, write in the comments and we will try to find it.

A survey of computational methods for understanding, generating, and manipulating human language, which offers a synthesis of classical representations and algorithms with contemporary machine learning techniques.

This textbook provides a technical perspective on natural language processing: methods for building computer software that understands, generates, and manipulates human language. It emphasizes contemporary data-driven approaches, focusing on techniques from supervised and unsupervised machine learning. The first section establishes a foundation in machine learning by building a set of tools that will be used throughout the book and applying them to word-based textual analysis. The second section introduces structured representations of language, including sequences, trees, and graphs. The third section explores different approaches to the representation and analysis of linguistic meaning, ranging from formal logic to neural word embeddings. The final section offers chapter-length treatments of three transformative applications of natural language processing: information extraction, machine translation, and text generation. End-of-chapter exercises include both paper-and-pencil analysis and software implementation.

The text synthesizes and distills a broad and diverse research literature, linking contemporary machine learning techniques with the field's linguistic and computational foundations. It is suitable for use in advanced undergraduate and graduate-level courses and as a reference for software engineers and data scientists. Readers should have a background in computer programming and college-level mathematics. After mastering the material presented, students will have the technical skill to build and analyze novel natural language processing systems and to understand the latest research in the field.

Jacob Eisenstein: author's other books


Who wrote Introduction to Natural Language Processing (Adaptive Computation and Machine Learning series)? Find out the author's name and browse a list of all of the author's works, organized by series.

Introduction to Natural Language Processing (Adaptive Computation and Machine Learning series) — read the complete book online for free

Below is the text of the book, divided into pages. The system saves your place at the last page read, allowing you to conveniently read the book "Introduction to Natural Language Processing (Adaptive Computation and Machine Learning series)" online for free without having to search each time for where you left off. Set a bookmark, and you can return to the page where you finished reading at any time.

Introduction to Natural Language Processing
Jacob Eisenstein
The MIT Press
Cambridge, Massachusetts
London, England

© 2019 The Massachusetts Institute of Technology

All rights reserved. No part of this book may be reproduced in any form by any electronic or mechanical means (including photocopying, recording, or information storage and retrieval) without permission in writing from the publisher.

This book was set in Times New Roman by Westchester Publishing Services.

Library of Congress Cataloging-in-Publication Data

Names: Eisenstein, Jacob, author.

Title: Introduction to natural language processing / Jacob Eisenstein.

Description: Cambridge, MA : The MIT Press, [2019] | Series: Adaptive computation and machine learning | Includes bibliographical references and index.

Identifiers: LCCN 2018059552 | ISBN 9780262042840 (hardcover : alk. paper)

Subjects: LCSH: Natural language processing (Computer science)

Classification: LCC QA76.9.N38 E46 2019 | DDC 006.3/5 dc23

LC record available at https://lccn.loc.gov/2018059552

Contents
Preface
Notation
1 Introduction
1.1 Natural Language Processing and Its Neighbors
1.2 Three Themes in Natural Language Processing
I LEARNING
2 Linear Text Classification
2.1 The Bag of Words
2.2 Naïve Bayes
2.3 Discriminative Learning
2.4 Loss Functions and Large-Margin Classification
2.5 Logistic Regression
2.6 Optimization
2.7 *Additional Topics in Classification
2.8 Summary of Learning Algorithms
3 Nonlinear Classification
3.1 Feedforward Neural Networks
3.2 Designing Neural Networks
3.3 Learning Neural Networks
3.4 Convolutional Neural Networks
4 Linguistic Applications of Classification
4.1 Sentiment and Opinion Analysis
4.2 Word Sense Disambiguation
4.3 Design Decisions for Text Classification
4.4 Evaluating Classifiers
4.5 Building Datasets
5 Learning without Supervision
5.1 Unsupervised Learning
5.2 Applications of Expectation-Maximization
5.3 Semi-Supervised Learning
5.4 Domain Adaptation
5.5 *Other Approaches to Learning with Latent Variables
II SEQUENCES AND TREES
6 Language Models
6.1 N-Gram Language Models
6.2 Smoothing and Discounting
6.3 Recurrent Neural Network Language Models
6.4 Evaluating Language Models
6.5 Out-of-Vocabulary Words
7 Sequence Labeling
7.1 Sequence Labeling as Classification
7.2 Sequence Labeling as Structure Prediction
7.3 The Viterbi Algorithm
7.4 Hidden Markov Models
7.5 Discriminative Sequence Labeling with Features
7.6 Neural Sequence Labeling
7.7 *Unsupervised Sequence Labeling
8 Applications of Sequence Labeling
8.1 Part-of-Speech Tagging
8.2 Morphosyntactic Attributes
8.3 Named Entity Recognition
8.4 Tokenization
8.5 Code Switching
8.6 Dialogue Acts
9 Formal Language Theory
9.1 Regular Languages
9.2 Context-Free Languages
9.3 *Mildly Context-Sensitive Languages
10 Context-Free Parsing
10.1 Deterministic Bottom-Up Parsing
10.2 Ambiguity
10.3 Weighted Context-Free Grammars
10.4 Learning Weighted Context-Free Grammars
10.5 Grammar Refinement
10.6 Beyond Context-Free Parsing
11 Dependency Parsing
11.1 Dependency Grammar
11.2 Graph-Based Dependency Parsing
11.3 Transition-Based Dependency Parsing
11.4 Applications
III MEANING
12 Logical Semantics
12.1 Meaning and Denotation
12.2 Logical Representations of Meaning
12.3 Semantic Parsing and the Lambda Calculus
12.4 Learning Semantic Parsers
13 Predicate-Argument Semantics
13.1 Semantic Roles
13.2 Semantic Role Labeling
13.3 Abstract Meaning Representation
14 Distributional and Distributed Semantics
14.1 The Distributional Hypothesis
14.2 Design Decisions for Word Representations
14.3 Latent Semantic Analysis
14.4 Brown Clusters
14.5 Neural Word Embeddings
14.6 Evaluating Word Embeddings
14.7 Distributed Representations beyond Distributional Statistics
14.8 Distributed Representations of Multiword Units
15 Reference Resolution
15.1 Forms of Referring Expressions
15.2 Algorithms for Coreference Resolution
15.3 Representations for Coreference Resolution
15.4 Evaluating Coreference Resolution
16 Discourse
16.1 Segments
16.2 Entities and Reference
16.3 Relations
IV APPLICATIONS
17 Information Extraction
17.1 Entities
17.2 Relations
17.3 Events
17.4 Hedges, Denials, and Hypotheticals
17.5 Question Answering and Machine Reading
18 Machine Translation
18.1 Machine Translation as a Task
18.2 Statistical Machine Translation
18.3 Neural Machine Translation
18.4 Decoding
18.5 Training toward the Evaluation Metric
19 Text Generation
19.1 Data-to-Text Generation
19.2 Text-to-Text Generation
19.3 Dialogue
Appendix A: Probability
A.1 Probabilities of Event Combinations
A.2 Conditional Probability and Bayes' Rule
A.3 Independence
A.4 Random Variables
A.5 Expectations
A.6 Modeling and Estimation
Appendix B: Numerical Optimization
B.1 Gradient Descent
B.2 Constrained Optimization
B.3 Example: Passive-Aggressive Online Learning
Bibliography
Index
Preface

The goal of this text is to focus on a core subset of natural language processing, unified by the concepts of learning and search. A remarkable number of problems in natural language processing can be solved by a compact set of methods:

Search: Viterbi, CKY, minimum spanning tree, shift-reduce, integer linear programming, beam search.

Learning: Maximum likelihood estimation, logistic regression, perceptron, expectation-maximization, matrix factorization, backpropagation.

This text explains how these methods work and how they can be applied to a wide range of tasks: document classification, word sense disambiguation, part-of-speech tagging, named entity recognition, parsing, coreference resolution, relation extraction, discourse analysis, language modeling, and machine translation.
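As a taste of how compact these methods are, here is a minimal sketch of one of the search algorithms named above, Viterbi, applied to sequence labeling. This example is not taken from the book: the function name, tag set, and scores are illustrative assumptions, and the algorithm itself is developed formally in chapter 7.

    import numpy as np

    def viterbi(scores, trans):
        """Return the highest-scoring tag sequence as a list of tag indices.

        scores: (T, K) array of per-position tag scores (e.g., log-probabilities)
        trans:  (K, K) array of tag-to-tag transition scores
        """
        T, K = scores.shape
        best = np.full((T, K), -np.inf)     # best[t, k]: score of the best path ending in tag k
        back = np.zeros((T, K), dtype=int)  # backpointers for recovering that path
        best[0] = scores[0]
        for t in range(1, T):
            # cand[i, j]: extend the best path ending in tag i with tag j at position t
            cand = best[t - 1][:, None] + trans + scores[t][None, :]
            back[t] = cand.argmax(axis=0)
            best[t] = cand.max(axis=0)
        # Trace the backpointers from the best final tag.
        path = [int(best[-1].argmax())]
        for t in range(T - 1, 0, -1):
            path.append(int(back[t, path[-1]]))
        return path[::-1]

    # Toy run: two tags, three positions, hypothetical scores.
    emission = np.log(np.array([[0.7, 0.3], [0.4, 0.6], [0.5, 0.5]]))
    transition = np.log(np.array([[0.8, 0.2], [0.3, 0.7]]))
    print(viterbi(emission, transition))  # -> [0, 0, 0]

The dynamic program touches each of the T positions once and considers K × K transitions at each, so it runs in O(TK²) time while implicitly searching all Kᵀ possible tag sequences.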
Background

Because natural language processing draws on many different intellectual traditions, almost everyone who approaches it feels underprepared in one way or another. Here is a summary of what is expected, and where you can learn more:

Mathematics and machine learning: The text assumes a background in multivariate calculus and linear algebra: vectors, matrices, derivatives, and partial derivatives. You should

Similar books «Introduction to Natural Language Processing (Adaptive Computation and Machine Learning series)»

Take a look at books similar to Introduction to Natural Language Processing (Adaptive Computation and Machine Learning series). We have selected literature similar in title and subject in the hope of giving readers more options for finding new, interesting, as-yet-unread works.


Reviews about «Introduction to Natural Language Processing (Adaptive Computation and Machine Learning series)»

Discussion and reviews of the book Introduction to Natural Language Processing (Adaptive Computation and Machine Learning series), along with readers' own opinions. Leave a comment and share what you think about the work, its meaning, or its main characters. Explain exactly what you liked and what you didn't, and why.