Get hands-on knowledge of how BERT (Bidirectional Encoder Representations from Transformers) can be used to develop question answering (QA) systems using natural language processing (NLP) and deep learning.

The book begins with an overview of the technology landscape behind BERT. It takes you through the basics of NLP, including natural language understanding with tokenization, stemming and lemmatization, and bag of words. Next, you'll look at neural networks for NLP, starting with variants such as recurrent neural networks, encoders and decoders, bidirectional encoders and decoders, and transformer models. Along the way, you'll cover word embeddings and their types, along with the basics of BERT.
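To give a flavor of the preprocessing steps named above, here is a minimal illustrative sketch (not taken from the book) of a crude tokenizer and a bag-of-words representation, using only the Python standard library:

```python
import re
from collections import Counter

def tokenize(text):
    # Lowercase and split on non-alphanumeric runs (a deliberately crude tokenizer)
    return re.findall(r"[a-z0-9]+", text.lower())

def bag_of_words(text):
    # A bag of words discards word order and keeps only token counts
    return Counter(tokenize(text))

counts = bag_of_words("The cat sat on the mat. The cat slept.")
print(counts["the"])  # 3
print(counts["cat"])  # 2
```

A real pipeline would add stemming or lemmatization (for example, via a library such as NLTK); the point here is only that bag of words reduces text to unordered counts.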

After this solid foundation, you'll be ready to take a deep dive into BERT algorithms such as masked language modeling and next sentence prediction. You'll see different BERT variants, followed by a hands-on example of a question answering system.
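To give a sense of the masked language model objective mentioned above: BERT pretraining corrupts roughly 15% of input tokens (most often replacing them with a [MASK] token) and trains the model to recover the originals. The following is a hypothetical, simplified sketch of just the masking step; `mask_tokens` is illustrative, not the book's code or any library's API:

```python
import random

def mask_tokens(tokens, mask_prob=0.15, seed=1):
    """Randomly replace about mask_prob of the tokens with '[MASK]'.

    Returns the corrupted sequence plus a {position: original_token}
    map of labels the model would be trained to predict. (Full BERT
    also sometimes keeps or randomizes the selected token.)"""
    rng = random.Random(seed)
    masked, labels = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            masked.append("[MASK]")
            labels[i] = tok  # training target at this position
        else:
            masked.append(tok)
    return masked, labels

tokens = "the quick brown fox jumps over the lazy dog".split()
masked, labels = mask_tokens(tokens)
print(masked)  # some tokens replaced by '[MASK]'
print(labels)  # original tokens at the masked positions
```

Next sentence prediction, the second pretraining objective, complements this with a binary classifier that judges whether one sentence follows another.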

Hands-on Question Answering Systems with BERT is a good starting point for developers and data scientists who want to develop and design NLP systems using BERT. It provides step-by-step guidance for using BERT.

You will:

  • Examine the fundamentals of word embeddings
  • Apply neural networks and BERT for various NLP tasks
  • Develop a question-answering system from scratch
  • Train question-answering systems for your own data



About the Authors

Navin is the chief architect for HCL DryICE Autonomics. He is an innovator, thought leader, author, and consultant in the areas of AI, machine learning, cloud computing, big data analytics, and software product development. He is responsible for IP development and service delivery in the areas of AI and machine learning, automation, AIOps, and the public clouds GCP, AWS, and Microsoft Azure. Navin has authored 15+ books in the areas of cloud computing, cognitive virtual agents, IBM Watson, GCP, containers, and microservices.

Amit Agrawal is a senior data scientist and researcher delivering solutions in the fields of AI and machine learning. He is responsible for designing end-to-end solutions and architecture for enterprise products. He has also authored and reviewed books in the area of cognitive virtual assistants.





Contents
Chapter 1: Introduction to Natural Language Processing
Chapter Goal: To introduce the basics of natural language processing
1.1 What is natural language processing
1.2 What is natural language understanding
1.3 Natural language processing tasks
1.3.1 Tokenization
1.3.2 Stemming and lemmatization
1.3.3 Bag of words
1.3.4 Word/sentence vectorization

Chapter 2: Introduction to Word Embeddings
Chapter Goal: To introduce the basics of word embeddings
2.1 What are word embeddings
2.2 Different methods of word embeddings
2.2.1 Word2vec
2.2.2 GloVe
2.2.3 ELMo
2.2.4 Universal sentence encoders
2.2.5 BERT
2.3 Bidirectional Encoder Representations from Transformers (BERT)
2.3.1 BERT base
2.3.2 BERT large

Chapter 3: BERT Algorithms Explained
Chapter Goal: Details on BERT model algorithms
3.1 Masked language model
3.2 Next sentence prediction (NSP)
3.3 Text classification using BERT
3.4 Various types of BERT-based models
3.4.1 ALBERT
3.4.2 RoBERTa
3.4.3 DistilBERT

Chapter 4: BERT Model Applications - Question Answering System
Chapter Goal: Details on question answering system
4.1 Introduction
4.2 Types of QA systems
4.3 QA system design using BERT
4.4 DrQA system
4.5 DeepPavlov QA system

Chapter 5: BERT Model Applications - Other tasks
Chapter Goal: Details on NLP tasks performed by BERT.
5.1 Introduction
5.2 Other NLP tasks
5.2.1 Sentiment analysis
5.2.2 Named entity recognition
5.2.3 Tag generation
5.2.4 Classification
5.2.5 Text summarization
5.2.6 Language translation

Chapter 6: Future of BERT models
Chapter Goal: Provides an introduction to new advances in the areas of NLP using BERT
6.1 BERT - Future capabilities





Title
Hands-on Question Answering Systems with BERT
Subtitle
Applications in Neural Networks and Natural Language Processing
EAN
9781484266649
Format
E-book (PDF)
Publisher
Publication date
12.01.2021
Digital copy protection
Watermark
File size
4.87 MB
Number of pages
184