
INSTITUTE OF AERONAUTICAL ENGINEERING

AAT TECH TALK

Name: N. Upender
Roll No: 22955A6706
Subject: Compiler Design
Subject Code: ACSC04
TITLE

Learning Executable Semantic Parsers for Natural Language Understanding
Introduction
 Compiler design is the process of creating a program that converts source code written in one
programming language into machine code that a computer can execute.
 In the field of natural language processing (NLP), compiler design plays an important role in
creating efficient and accurate language models.
 Compiler design and natural language processing are two distinct but fascinating fields within
computer science. Combining elements from both can lead to innovative advancements,
especially in creating more intuitive programming environments and improving code
understanding and generation.
 The integration of NLP with compiler design holds significant promise for the future of software
development by leveraging natural language understanding.
Executable semantic parsers for natural language processing

1. Lexical Analysis: Breaks code into tokens.
2. Syntax Analysis: Ensures tokens form valid structures.
3. Semantic Analysis: Checks for correct usage of language rules.
4. Intermediate Code Generation: Produces a simpler representation.
5. Optimization: Improves code performance.
6. Code Generation: Converts to machine code.
7. Linking and Loading: Prepares the code for execution.
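As a minimal sketch of the first phase above, here is a hypothetical tokenizer for a tiny arithmetic language (the token categories and patterns are illustrative, not from any particular compiler):

```python
import re

# Token categories for a tiny arithmetic language (illustrative only).
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=]"),
    ("LPAREN", r"\("),
    ("RPAREN", r"\)"),
    ("SKIP",   r"\s+"),
]
MASTER = re.compile("|".join(f"(?P<{n}>{p})" for n, p in TOKEN_SPEC))

def tokenize(code):
    """Break source text into (kind, value) tokens -- lexical analysis."""
    tokens = []
    for m in MASTER.finditer(code):
        if m.lastgroup != "SKIP":
            tokens.append((m.lastgroup, m.group()))
    return tokens

print(tokenize("x = 3 + 42"))
# [('IDENT', 'x'), ('OP', '='), ('NUMBER', '3'), ('OP', '+'), ('NUMBER', '42')]
```

The later phases (syntax and semantic analysis) would then consume this token stream.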
Semantic parsing components
Executor: computes the denotation (action) y = ⟦z⟧_c given a logical form z and context c.
This defines the semantic representation (logical forms along with their denotations).

Grammar: a set of rules G that produces D(x, c), a set of candidate derivations of logical forms.

Model: specifies a distribution p_θ(d | x, c) over derivations d, parameterized by θ.

Parser: searches for high-probability derivations d under the model p_θ.

Learner: estimates the parameters θ (and possibly rules in G) given training examples of
utterances paired with desired actions.
Executor:
Let the semantic representation be the language of mathematics, and the executor be the
standard interpretation, where the interpretations of predicates are given.
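For this arithmetic domain, a minimal executor can be sketched in Python. The nested-tuple encoding of logical forms and the predicate table are assumptions made for illustration:

```python
# Logical forms as nested tuples, e.g. ("add", 3, ("mul", 4, 5)).
# The executor computes the denotation y = [[z]]_c; here the context c
# is simply the table of predicate interpretations.
PREDICATES = {
    "add": lambda a, b: a + b,
    "sub": lambda a, b: a - b,
    "mul": lambda a, b: a * b,
}

def execute(z, context=PREDICATES):
    """Recursively evaluate a logical form z against the context."""
    if isinstance(z, (int, float)):   # constants denote themselves
        return z
    pred, *args = z
    return context[pred](*(execute(a, context) for a in args))

print(execute(("add", 3, ("mul", 4, 5))))  # 23
```

Note that the executor knows nothing about natural language; it only interprets logical forms.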

Grammar:
The grammar G connects utterances to possible derivations of logical forms.
Formally, the grammar is a set of rules of the form α ⇒ β.
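A handful of such rules can be written down for a toy arithmetic domain. The rule inventory below is an illustrative assumption, not taken from any particular system:

```python
# Lexical rules (alpha => beta): each phrase maps to a logical-form fragment.
LEXICAL_RULES = {
    "three": 3,
    "four": 4,
    "plus": "add",
    "times": "mul",
}

def derive(words):
    """Apply the compositional rule: Num Op Num => (Op, Num, Num)."""
    a, op, b = (LEXICAL_RULES[w] for w in words)
    return (op, a, b)

print(derive("three plus four".split()))  # ('add', 3, 4)
```

A real grammar would generate many candidate derivations per utterance; scoring them is the model's job.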
Model:

 The model scores the set of candidate derivations generated by the grammar.
 A common choice, used by virtually all existing semantic parsers, is the log-linear model
(a generalization of logistic regression).
 In a log-linear model, we define a feature vector φ(x, c, d) for each possible derivation d. The
parameter vector θ assigns a weight to each feature, representing how reliable that feature is;
a derivation is scored by the weighted combination θ · φ(x, c, d).
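A log-linear model over derivations can be sketched as follows; the feature names and weights are made up for illustration:

```python
import math

def score(theta, features):
    """Weighted combination theta . phi(x, c, d)."""
    return sum(theta.get(f, 0.0) * v for f, v in features.items())

def p_theta(theta, candidate_features):
    """p_theta(d | x, c): softmax over the candidate derivations' scores."""
    scores = [score(theta, phi) for phi in candidate_features]
    norm = sum(math.exp(s) for s in scores)
    return [math.exp(s) / norm for s in scores]

theta = {"matches_keyword": 2.0, "skipped_word": -1.0}
candidates = [{"matches_keyword": 1.0}, {"skipped_word": 1.0}]
print(p_theta(theta, candidates))  # first derivation gets most of the mass
```

The probabilities are obtained by exponentiating and normalizing the scores, which is exactly what makes the model "log-linear".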
Parser:
 Given a trained model p_θ, the parser (approximately) computes the highest-probability
derivation(s) for an utterance x under p_θ.
 Assume the utterance x is represented as a sequence of tokens (words). A standard approach is
to use a chart parser, which recursively builds derivations for each span of the utterance.
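The chart-parsing idea can be sketched for the toy arithmetic domain; the lexicon and binary combination rules below are illustrative assumptions:

```python
# Toy chart parser: builds derivations bottom-up over spans (i, j) of the
# utterance, CKY-style, using binary combination rules.
LEXICON = {"three": 3, "four": 4, "five": 5, "plus": "add", "times": "mul"}

def is_num(x):
    """A constant or a complete logical form counts as a number."""
    return isinstance(x, int) or (isinstance(x, tuple) and len(x) == 3)

def combine(left, right):
    """Binary rules: Num Op => partial form; partial Num => complete form."""
    out = []
    if is_num(left) and isinstance(right, str):
        out.append((right, left))              # 3, "add"      -> ("add", 3)
    if isinstance(left, tuple) and len(left) == 2 and is_num(right):
        out.append((left[0], left[1], right))  # ("add", 3), 4 -> ("add", 3, 4)
    return out

def chart_parse(words):
    n = len(words)
    chart = {(i, i + 1): [LEXICON[w]] for i, w in enumerate(words)}
    for length in range(2, n + 1):             # build longer spans from shorter
        for i in range(n - length + 1):
            j = i + length
            chart[(i, j)] = [d for k in range(i + 1, j)
                             for l in chart[(i, k)]
                             for r in chart[(k, j)]
                             for d in combine(l, r)]
    return chart[(0, n)]

print(chart_parse("three plus four".split()))  # [('add', 3, 4)]
```

A full parser would keep model scores in each chart cell and prune to the highest-scoring derivations per span.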

Learner:
 While the parser turns parameters into derivations, the learner solves the inverse problem.
 The dominant paradigm in machine learning is to set up an objective function and optimize it. A
standard principle is to maximize the likelihood of the training data.
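Maximizing the likelihood of the training data for a log-linear model has a clean gradient: observed features minus expected features under the model. A single stochastic-gradient step can be sketched as follows (the features and data are illustrative):

```python
import math

def sgd_step(theta, candidate_features, correct_idx, lr=0.1):
    """One gradient step on log p_theta(d* | x)."""
    scores = [sum(theta.get(f, 0.0) * v for f, v in phi.items())
              for phi in candidate_features]
    norm = sum(math.exp(s) for s in scores)
    probs = [math.exp(s) / norm for s in scores]
    # Expected feature counts under the current model.
    expected = {}
    for p, phi in zip(probs, candidate_features):
        for f, v in phi.items():
            expected[f] = expected.get(f, 0.0) + p * v
    # Gradient = phi(d*) - E_p[phi(d)].
    for f in set(candidate_features[correct_idx]) | set(expected):
        grad = candidate_features[correct_idx].get(f, 0.0) - expected.get(f, 0.0)
        theta[f] = theta.get(f, 0.0) + lr * grad
    return theta

theta = {}
candidates = [{"good": 1.0}, {"bad": 1.0}]
for _ in range(50):
    sgd_step(theta, candidates, correct_idx=0)
print(theta["good"] > theta["bad"])  # True
```

In practice the correct derivation is not observed directly; only the desired action y is, so the gradient marginalizes over all derivations that execute to y.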
Semantic Parsing System

 The components of a semantic parsing system are relatively loosely coupled.
 The executor is concerned purely with what we want to express, independent of how it would be
expressed in natural language.
 The grammar describes how candidate logical forms are constructed from the utterance but does
not provide algorithmic guidance nor specify a way to score the candidates.
 The model focuses on a particular derivation and defines features that could be helpful for
predicting accurately. The parser and the learner provide algorithms largely independent of the
semantic representation. This modularity allows us to improve each component in isolation.
Semantic representations

 One of the main difficulties with semantic parsing is the divergence between the structure of
the natural language and that of the logical forms; purely compositional semantics will not work.
 This has led to some efforts to introduce an intermediate layer between utterances and
logical forms.
 One idea is to use general paraphrasing models to map input utterances to the “canonical
utterances” of logical forms.
 This reduces semantic parsing to a text-only problem for which there is much more data and
resources. One could also use domain-general logical forms that capture the basic predicate-
argument structures of sentences.
 Much of the progress in semantic parsing has been due to being able to learn from weaker
supervision. In the framework we presented, this supervision is the desired actions y (for
example, answers to questions). One can use a large corpus of text to exploit even weaker
supervision.
 More generally, one can think about language interpretation in a reinforcement learning
setting [9], where an agent presented with an utterance in some context performs some action,
and receives a corresponding reward signal. This framework highlights the importance of
context-dependence in language interpretation.
 Due to their empirical success, there has been a recent surge of interest in using recurrent
neural networks and their extensions for solving NLP tasks such as machine translation and
question answering.
Natural language understanding
 Natural language understanding (NLU) is a branch of artificial intelligence (AI) that uses computer
software to understand input in the form of sentences in text or speech. NLU enables human-
computer interaction by analyzing language as a whole rather than isolated words.
 NLU enables computers to understand the sentiments expressed in a natural language used by
humans, such as English, French or Mandarin, without the formalized syntax of computer languages.
 NLU also enables computers to communicate back to humans in their own languages.
 A basic form of NLU is called parsing, which takes written text and converts it into a structured
format for computers to understand. Instead of relying on computer language syntax, NLU enables a
computer to comprehend and respond to human-written text.
 One of the main purposes of NLU is to create chat- and voice-enabled bots that can interact with
people without supervision. Many startups, as well as major IT companies, such as Amazon, Apple,
Google and Microsoft, either have or are working on NLU projects and language models.
 NLP involves the interaction between computers and human (natural) languages. The main
components of NLP include:
1. Tokenization: Splitting text into individual words or phrases.
2. Part-of-Speech Tagging: Identifying the grammatical category of each word (e.g., noun,
verb).
3. Named Entity Recognition (NER): Identifying entities such as names, dates, and
organizations in text.
4. Parsing: Analyzing the grammatical structure of sentences.
5. Sentiment Analysis: Determining the sentiment expressed in text (e.g., positive, negative).
6. Machine Translation: Translating text from one language to another.
7. Text Generation: Generating human-like text from a model.
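Two of the components above, tokenization and sentiment analysis, can be sketched together in a toy pipeline (the word lists are illustrative assumptions, not a real sentiment lexicon):

```python
import re

# Tiny illustrative sentiment lexicon.
POSITIVE = {"great", "good", "love"}
NEGATIVE = {"bad", "terrible", "hate"}

def tokenize(text):
    """Component 1: split text into individual word tokens."""
    return re.findall(r"[a-z']+", text.lower())

def sentiment(text):
    """Component 5: lexicon-based sentiment from token counts."""
    tokens = tokenize(text)
    polarity = sum((t in POSITIVE) - (t in NEGATIVE) for t in tokens)
    return "positive" if polarity > 0 else "negative" if polarity < 0 else "neutral"

print(tokenize("I love this!"))        # ['i', 'love', 'this']
print(sentiment("This is terrible."))  # negative
```

Production systems replace the word lists with learned models, but the pipeline shape (tokenize, then analyze) is the same.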
Applications
 Customer Support Chatbots: Chatbots that provide instant support and responses to
customer queries.
 Virtual Assistants: Voice-activated assistants, such as Siri, Alexa, or Google Assistant, that
perform tasks in real-time based on spoken commands.
 Autonomous Vehicles: Vehicles that use natural language commands to assist with
navigation and other driving functions.
 Financial Services: Real-time processing of financial queries and transactions through
natural language interfaces.
 Healthcare: Systems that provide real-time analysis and response to medical queries and
tasks.
Conclusion

 Executable semantic parsers are a pivotal advancement in the field of natural language
understanding, providing a robust mechanism for converting human language into
executable commands or code.
 These systems are designed to bridge the gap between how humans communicate and how
machines operate, making interactions more intuitive and effective. By leveraging advanced
machine learning techniques, such as supervised and reinforcement learning, these parsers
can learn to interpret and execute complex instructions accurately.
 This ability to process and act on natural language inputs has profound implications across
numerous real-time applications, enhancing both user experience and operational efficiency.
