Compiler Design File
EXPERIMENT NO.1
AIM: To study the LEX and YACC compiler-construction tools.
THEORY:
Lex (software)
Lex is a computer program that generates lexical analyzers ("scanners" or "lexers"). Lex helps
write programs whose control flow is directed by instances of regular expressions in the input
stream. It is well suited for editor-script type transformations and for segmenting input in
preparation for a parsing routine.
Lex source is a table of regular expressions and corresponding program fragments. The table is
translated to a program which reads an input stream, copying it to an output stream and
partitioning the input into strings which match the given expressions. As each such string is
recognized the corresponding program fragment is executed. The recognition of the expressions
is performed by a deterministic finite automaton generated by Lex. The program fragments
written by the user are executed in the order in which the corresponding regular expressions
occur in the input stream. Lex reads a specification of the lexical analyzer and outputs
source code implementing the lexer in the C programming language.
Open source:
Though originally distributed as proprietary software, some versions of Lex are now open
source. Open-source versions based on the original AT&T code are distributed with systems
such as OpenSolaris and Plan 9 from Bell Labs. One popular open-source version of Lex,
called flex (the "fast lexical analyzer"), is not derived from the proprietary code.
Structure of a Lex file:
The structure of a Lex file is intentionally similar to that of a yacc file; files are divided into three
sections, separated by lines that contain only two percent signs, as follows:
1. The definition section defines macros and imports header files written in C. It is also
possible to write any C code here, which will be copied verbatim into the generated source
file.
2. The rules section associates regular-expression patterns with C statements. When the lexer
sees text in the input matching a given pattern, it executes the associated C code.
3. The C code section contains C statements and functions that are copied verbatim to the
generated source file. These typically contain code called by the rules in the rules section.
In large programs it is more convenient to place this code in a separate file linked in at
compile time.
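As a sketch, a minimal Lex specification showing all three sections might look like this (the patterns, actions, and the word counter are illustrative, not from the original file):

```lex
%{
/* definition section: C declarations copied verbatim */
#include <stdio.h>
int word_count = 0;
%}

%%
 /* rules section: pattern followed by its C action */
[a-zA-Z]+   { word_count++; printf("WORD: %s\n", yytext); }
[0-9]+      { printf("NUMBER: %s\n", yytext); }
[ \t\n]     ;   /* skip whitespace */
.           { printf("OTHER: %s\n", yytext); }
%%

/* C code section: copied verbatim; main() drives the generated lexer */
int main(void) {
    yylex();
    printf("words seen: %d\n", word_count);
    return 0;
}
int yywrap(void) { return 1; }
```

A file like this would be processed with, for example, `lex scanner.l && cc lex.yy.c -o scanner`.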
EXPERIMENT NO.2
AIM: Write a program for dividing the given input program into lexemes.
#include <stdio.h>
#include <ctype.h>
#include <string.h>
#include <stdlib.h>
EXPERIMENT NO. 5
AIM: Write a program to check whether a string is a keyword or not.
#include <stdio.h>
#include <string.h>

int main(void)
{
    int i, flag = 0;
    const char *keywords[5] = {"if", "else", "goto", "continue", "return"};
    char st[10];

    printf("\n enter the string :");
    if (fgets(st, sizeof st, stdin) == NULL)
        return 1;
    st[strcspn(st, "\n")] = '\0';   /* strip the trailing newline */

    for (i = 0; i < 5; i++)
        if (strcmp(st, keywords[i]) == 0)
            flag = 1;

    if (flag == 0)
        printf("\n it is not a keyword\n");
    else
        printf("\n it is a keyword\n");
    return 0;
}
OUTPUT: (screenshot not preserved; in a representative run, entering "if" prints "it is a keyword" and entering "hello" prints "it is not a keyword")