
ACKNOWLEDGEMENT

I take this opportunity to express my cordial gratitude and deep sense of indebtedness to my guide, Mr. Piyush Jain, for his valuable guidance and inspiration throughout the project duration. I am thankful to him for the innovative ideas that led to the successful completion of this project work. I feel proud and fortunate to have worked under such an outstanding mentor in the field of “VLSI DESIGN”. He always welcomed my problems and helped me clear my doubts. I will always be grateful to him for providing me with moral support and sufficient time.

I owe sincere thanks to the Head of Department, Mr. Jitendra Soni, and Principal Dr. D.S. Raghav. Their enthusiasm, guidance and encouragement have been invaluable to this project.

I would also like to thank all the other faculty members and staff of the department for their direct and indirect support. My utmost thanks go to my family for their caring encouragement and moral support throughout my academic year.

Last but not least, I wish to thank all those noble hearts who directly or indirectly helped me to complete this project work.
INTRODUCTION
Very-large-scale integration (VLSI) is the process of creating integrated circuits by combining thousands of transistors into a single chip. VLSI began in the 1970s when complex semiconductor and communication technologies were being developed. The microprocessor is a VLSI device. The term is no longer as common as it once was, as chips have grown in complexity to billions of transistors.

The first semiconductor chips held two transistors each. Subsequent advances added more and more transistors, and, as a consequence, more individual functions or systems were integrated over time. The first integrated circuits held only a few devices, perhaps as many as ten diodes, transistors, resistors and capacitors, making it possible to fabricate one or more logic gates on a single device. Now known retrospectively as small-scale integration (SSI), improvements in technique led to devices with hundreds of logic gates, known as medium-scale integration (MSI). Further improvements led to large-scale integration (LSI), i.e. systems with at least a thousand logic gates. Current technology has moved far past this mark and today's microprocessors have many millions of gates and billions of individual transistors.

At one time, there was an effort to name and calibrate various levels of large-
scale integration above VLSI. Terms like ultra-large-scale integration (ULSI) were used. But the
huge number of gates and transistors available on common devices has rendered such fine
distinctions moot. Terms suggesting greater than VLSI levels of integration are no longer in
widespread use. Even VLSI is now somewhat quaint, given the common assumption that all
microprocessors are VLSI or better.

As of early 2008, billion-transistor processors are commercially available. This is expected to become more commonplace as semiconductor fabrication moves from the current generation of processes to the next generations (while experiencing new challenges such as increased variation across process corners). A notable example is Nvidia's 280 series GPU. This GPU is unique in that almost all of its 1.4 billion transistors are used for logic, in contrast to the Itanium, whose large transistor count is largely due to its 24 MB L3 cache. Current designs, as opposed to the earliest devices, use extensive design automation and automated logic synthesis to lay out the transistors, enabling higher levels of complexity in the resulting logic functionality. Certain high-performance logic blocks, like the SRAM (Static Random Access Memory) cell, are still designed by hand to ensure the highest efficiency (sometimes by bending or breaking established design rules to obtain the last bit of performance by trading off stability).

VLSI TECHNOLOGY

VLSI Technology was a company which designed and manufactured custom and semi-custom ICs. The company was based in Silicon Valley, with headquarters at 1109 McKay Drive in San Jose, California. Along with LSI Logic, VLSI Technology defined the leading edge of the application-specific integrated circuit (ASIC) business, which accelerated the push of powerful embedded systems into affordable products.

The company was founded in 1979 by a trio from Fairchild Semiconductor by way of Synertek (Jack Balletto, Dan Floyd and Gunnar Wetlesen) and by Doug Fairbairn of Xerox PARC and Lambda (later VLSI Design) magazine.

Alfred J. Stein became the CEO of the company in 1982. Subsequently, VLSI built its first fab in San Jose; eventually a second fab was built in San Antonio, Texas.

The company was later acquired by Royal Philips and survives to this day as part of NXP
Semiconductors.

Advanced tools for VLSI design:-


A VLSI VL82C106 Super I/O chip.

The original business plan was to be a contract wafer fabrication company, but the venture investors wanted the company to develop IC design tools to help fill the foundry. Thanks to its Caltech and UC Berkeley students, VLSI was an important pioneer in the electronic design automation industry. It offered a sophisticated package of tools, originally based on the 'lambda-based' design style advocated by Carver Mead and Lynn Conway.

VLSI became an early vendor of standard cells (cell-based technology) to the merchant market in the early 1980s, while the other ASIC-focused company, LSI Logic, was a leader in gate arrays. Prior to VLSI's cell-based offering, the technology had been primarily available only within large vertically integrated companies with semiconductor units, such as AT&T and IBM.

VLSI's design tools eventually included not only design entry and simulation but also cell-based routing (chip compiler), a datapath compiler, SRAM and ROM compilers, and a state machine compiler. The tools were an integrated design solution for IC design, not just point tools or more general-purpose system tools. A designer could edit transistor-level polygons and/or logic schematics, then run DRC and LVS, extract parasitics from the layout and run SPICE simulation, then back-annotate the timing or gate size changes into the logic schematic database. Characterization tools were integrated to generate FrameMaker data sheets for libraries. VLSI eventually spun off the CAD and library operation into Compass Design Automation, but it never reached IPO before it was purchased by Avant! Corporation.

VLSI's physical design tools were critical not only to its ASIC business, but also in setting the bar for the commercial EDA industry. When VLSI and its main ASIC competitor, LSI Logic, were establishing the ASIC industry, commercially available tools could not deliver the productivity necessary to support the physical design of hundreds of ASIC designs each year without the deployment of a substantial number of layout engineers. The companies' development of automated layout tools was a rational "make because there's nothing to buy" decision. The EDA industry finally caught up in the late 1980s when Tangent Systems released its TanCell and TanGate products. In 1989, Tangent was acquired by Cadence Design Systems.

Unfortunately, for all VLSI's initial competence in design tools, it was not a leader in semiconductor manufacturing technology. VLSI had not been timely in developing a 1.0 µm manufacturing process as the rest of the industry moved to that geometry in the late 1980s. VLSI entered a long-term technology partnership with Hitachi and finally released a 1.0 µm process and cell library (actually more of a 1.2 µm library with a 1.0 µm gate).

As VLSI struggled to gain parity with the rest of the industry in semiconductor technology, the design flow was moving rapidly to a Verilog HDL and synthesis flow. Cadence acquired Gateway, the leader in the Verilog hardware description language (HDL), and Synopsys was dominating the exploding field of design synthesis. As VLSI's tools were being eclipsed, VLSI waited too long to open the tools up to other fabs, and Compass Design Automation was never a viable competitor to the industry leaders.

Meanwhile, VLSI entered the merchant high-speed static RAM (SRAM) market, as it needed a product to drive its semiconductor process technology development. All the large semiconductor companies built high-speed SRAMs with cost structures VLSI could never match, and VLSI withdrew once it was clear that the Hitachi process technology partnership was working. ARM Ltd was formed in 1990 as a semiconductor intellectual property licensor, backed by Acorn, Apple and VLSI. VLSI became a licensee of the powerful ARM processor, and ARM finally funded processor tools. Initial adoption of the ARM processor was slow, since few applications could justify the overhead of an embedded 32-bit processor. In fact, despite the addition of further licensees, the ARM processor enjoyed little market success until the novel 'Thumb' extensions were developed. Ericsson adopted the ARM processor in a VLSI chipset for its GSM handset designs in the early 1990s. It was the GSM boost that laid the foundation for the ARM company and technology as they exist today. Only in PC chipsets did VLSI dominate in the early 1990s. This product line, developed by five engineers using the 'Megacells' in the VLSI library, led to a business unit at VLSI that almost equaled its ASIC business in revenue. VLSI eventually ceded the market to Intel, because Intel was able to package-sell its processors, chipsets, and even board-level products together.

VLSI also had an early partnership with PMC, a design group that had been nurtured by British Columbia Bell. When PMC wanted to divest its semiconductor intellectual property venture, VLSI's bid was beaten by a creative deal from Sierra Semiconductor, and the telecom business unit management at VLSI opted to go it alone. PMC-Sierra became one of the most important telecom ASSP vendors. Scientists and innovations from the 'design technology' part of VLSI found their way to Cadence Design Systems (by way of Redwood Design Automation).

Global expansion:-

VLSI maintained operations throughout the USA, and in Britain, France, Germany, Italy, Japan, Singapore and Taiwan. One of its key sites was in Tempe, Arizona, where a family of highly successful chipsets was developed for the IBM PC.

In 1990, VLSI Technology, along with Acorn Computers and Apple Computer, was among the founding partners in ARM Ltd. Ericsson of Sweden, after many years of fruitful collaboration, was by 1998 VLSI's largest customer, with annual revenue of $120 million. VLSI's datapath compiler (VDP) was the value-added differentiator that opened the door at Ericsson in 1987/8. The silicon revenue and GPM enabled by VDP must make it one of the most successful pieces of customer-configurable, non-memory silicon intellectual property (SIP) in the history of the industry. Within the Wireless Products division, based at Sophia Antipolis in France, VLSI developed a range of algorithms and circuits for the GSM standard and for cordless standards such as the European DECT and the Japanese PHS. Stimulated by its growth and success in the wireless handset IC area, Philips Electronics acquired VLSI in June 1999, for about $1 billion. The former components survive to this day as part of the Philips spin-off NXP Semiconductors.

INTEGRATED CIRCUIT

In electronics, an integrated circuit (also known as IC, chip, or microchip) is a miniaturized electronic circuit (consisting mainly of semiconductor devices, as well as passive components) that has been manufactured on the surface of a thin substrate of semiconductor material. Integrated circuits are used in almost all electronic equipment in use today and have revolutionized the world of electronics. Computers, cellular phones, and other digital appliances are now inextricable parts of the structure of modern societies, made possible by the low cost of production of integrated circuits.

A hybrid integrated circuit is a miniaturized electronic circuit constructed of individual semiconductor devices, as well as passive components, bonded to a substrate or circuit board. A monolithic integrated circuit is made of devices manufactured by diffusion of trace elements into a single piece of semiconductor substrate, a chip.

1. Introduction:-
Synthetic detail of an integrated circuit through four layers of planarized copper interconnect, down to the polysilicon (pink), wells (greyish), and substrate (green).

Integrated circuits were made possible by experimental discoveries which showed that semiconductor devices could perform the functions of vacuum tubes, and by mid-20th-century technology advancements in semiconductor device fabrication. The integration of large numbers of tiny transistors into a small chip was an enormous improvement over the manual assembly of circuits using electronic components. The integrated circuit's mass production capability, reliability, and building-block approach to circuit design ensured the rapid adoption of standardized ICs in place of designs using discrete transistors.

There are two main advantages of ICs over discrete circuits: cost and performance. Cost is low because the chips, with all their components, are printed as a unit by photolithography rather than being constructed one transistor at a time. Furthermore, much less material is used to construct a packaged IC die than a discrete circuit. Performance is high since the components switch quickly and consume little power (compared to their discrete counterparts) because the components are small and positioned close together. As of 2006, chip areas range from a few square millimeters to around 350 mm², with up to 1 million transistors per mm².

2. Invention:-

Early developments of the integrated circuit go back to 1949, when the German engineer Werner Jacobi (Siemens AG) filed a patent for an integrated-circuit-like semiconductor amplifying device showing five transistors on a common substrate in a 2-stage amplifier arrangement. Jacobi disclosed small and cheap hearing aids as typical industrial applications of his patent. A commercial use of his patent has not been reported.

The idea of the integrated circuit was conceived by a radar scientist working for the
Royal Radar Establishment of the British Ministry of Defence, Geoffrey W.A. Dummer (1909–
2002), who published it at the Symposium on Progress in Quality Electronic Components in
Washington, D.C. on May 7, 1952. He gave many symposia publicly to propagate his ideas.
Dummer unsuccessfully attempted to build such a circuit in 1956.

A precursor idea to the IC was to create small ceramic squares (wafers), each one containing a single miniaturized component. Components could then be integrated and wired into a two-dimensional or three-dimensional compact grid. This idea, which looked very promising in 1957, was proposed to the US Army by Jack Kilby, and led to the short-lived Micromodule Program (similar to 1951's Project Tinkertoy). However, as the project was gaining momentum, Kilby came up with a new, revolutionary design: the IC.

Robert Noyce credited Kurt Lehovec of Sprague Electric for the principle of p-n junction
isolation caused by the action of a biased p-n junction (the diode) as a key concept behind
the IC.

3. Generations:-

In the early days of integrated circuits, only a few transistors could be placed on a chip, as the scale used was large because of the contemporary technology. As the degree of integration was small, the design could be done easily. Later on, millions, and today billions, of transistors could be placed on one chip, and making a good design became a task to be planned thoroughly. This gave rise to new design methods.

4. Types of IC (SSI, MSI and LSI):-

The first integrated circuits contained only a few transistors. Called "Small-Scale Integration" (SSI), digital circuits containing transistors numbering in the tens provided a few logic gates, while early linear ICs such as the Plessey SL201 or the Philips TAA320 had as few as two transistors. The term Large-Scale Integration was first used by IBM scientist Rolf Landauer when describing the theoretical concept; from there came the terms SSI, MSI, VLSI, and ULSI.

SSI circuits were crucial to early aerospace projects, and vice versa. Both the Minuteman missile and the Apollo program needed lightweight digital computers for their inertial guidance systems; the Apollo guidance computer led and motivated the integrated-circuit technology, while the Minuteman missile forced it into mass production. The Minuteman missile program and various other Navy programs accounted for the total $4 million integrated circuit market in 1962, and by 1968, U.S. Government space and defense spending still accounted for 37% of the $312 million total production. This demand by the U.S. Government supported the nascent integrated circuit market until costs fell enough to allow firms to penetrate the industrial and eventually the consumer markets. The average price per integrated circuit dropped from $50.00 in 1962 to $2.33 in 1968. Integrated circuits began to appear in consumer products by the turn of the decade, a typical application being FM inter-carrier sound processing in television receivers.

The next step in the development of integrated circuits, taken in the late 1960s, introduced devices which contained hundreds of transistors on each chip, called "Medium-Scale Integration" (MSI). They were attractive economically because, while they cost little more to produce than SSI devices, they allowed more complex systems to be produced using smaller circuit boards, less assembly work (because of fewer separate components), and a number of other advantages. Further development, driven by the same economic factors, led to "Large-Scale Integration" (LSI) in the mid-1970s, with tens of thousands of transistors per chip.

Integrated circuits such as 1K-bit RAMs, calculator chips, and the first microprocessors, which began to be manufactured in moderate quantities in the early 1970s, had under 4,000 transistors. True LSI circuits, approaching 10,000 transistors, began to be produced around 1974, for computer main memories and second-generation microprocessors.

Moore's law

Moore's law describes a long-term trend in the history of computing hardware: the number of transistors that can be placed inexpensively on an integrated circuit has doubled approximately every two years. The trend has continued for more than half a century and is not expected to stop until 2015 or later. The capabilities of many digital electronic devices are strongly linked to Moore's law: processing speed, memory capacity, and even the number and size of pixels in digital cameras. All of these are improving at (roughly) exponential rates as well. This has dramatically increased the usefulness of digital electronics in nearly every segment of the world economy. Moore's law describes a driving force of technological and social change in the late 20th and early 21st centuries.
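
The doubling rule can be written compactly. In the sketch below the notation is assumed, not taken from the original text: $N_0$ is the transistor count at a reference year $t_0$, and $N(t)$ is the count $t - t_0$ years later:

$$N(t) = N_0 \cdot 2^{(t - t_0)/2}$$

For example, starting from the roughly 2,300 transistors of the Intel 4004 in 1971, 36 years gives 18 doublings, so the formula predicts about $2{,}300 \cdot 2^{18} \approx 6 \times 10^8$ transistors by 2007, which is the right order of magnitude for processors of that era.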

The law is named after Intel co-founder Gordon E. Moore, who described the trend in his 1965 paper. The paper noted that the number of components in integrated circuits had doubled every year from the invention of the integrated circuit in 1958 until 1965, and predicted that the trend would continue "for at least ten years". His prediction has proved to be uncannily accurate, in part because the law is now used in the semiconductor industry to guide long-term planning and to set targets for research and development. This fact would support an alternative view that the "law" unfolds as a self-fulfilling prophecy, where the goal set by the prediction charts the course for realized capability.
VHDL
VHDL was originally developed at the behest of the U.S. Department of Defense in order to document the behavior of the ASICs that supplier companies were including in equipment. That is to say, VHDL was developed as an alternative to huge, complex manuals which were subject to implementation-specific details.

The idea of being able to simulate this documentation was so obviously attractive that logic
simulators were developed that could read the VHDL files. The next step was the development of
logic synthesis tools that read the VHDL, and output a definition of the physical implementation of
the circuit. Because the Department of Defense required as much of the syntax as possible to be
based on Ada, in order to avoid re-inventing concepts that had already been thoroughly tested in
the development of Ada, VHDL borrows heavily from the Ada programming language in both
concepts and syntax.

The initial version of VHDL, designed to IEEE standard 1076-1987, included a wide range of data types, including numerical (integer and real), logical (bit and boolean), character and time, plus arrays of bit called bit_vector and of character called string. A problem not solved by this edition, however, was "multi-valued logic", where a signal's drive strength (none, weak or strong) and unknown values are also considered. This required IEEE standard 1164, which defined the 9-value logic types: scalar std_ulogic and its vector version std_ulogic_vector.
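
As a small illustration of why multi-valued logic matters, consider a tri-state buffer. The following is a minimal sketch (the entity and port names are assumptions, not taken from the original text); it could not be expressed with the plain bit type, because bit has no high-impedance value:

library IEEE;
use IEEE.std_logic_1164.all;

-- a tri-state buffer: drives D when enabled, otherwise releases the line
entity TRIBUF is
  port (
    EN : in std_logic;
    D : in std_logic;
    Y : out std_logic);
end TRIBUF;

architecture RTL of TRIBUF is
begin
  -- 'Z' (high impedance) is one of the nine IEEE 1164 values
  Y <= D when EN = '1' else 'Z';
end RTL;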

An update of IEEE 1076, in 1993, made the syntax more consistent, allowed more flexibility in naming, extended the character type to allow ISO-8859-1 printable characters, added the xnor operator, etc. Minor changes in the standard (2000 and 2002) added the idea of protected types (similar to the concept of class in C++) and removed some restrictions from port mapping rules.

In addition to IEEE standard 1164, several child standards were introduced to extend
functionality of the language. IEEE standard 1076.2 added better handling of real and complex
data types. IEEE standard 1076.3 introduced signed and unsigned types to facilitate arithmetical
operations on vectors. IEEE standard 1076.1 (known as VHDL-AMS) provided analog and mixed-
signal circuit design extensions.
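
For instance, the signed and unsigned types from IEEE 1076.3 (the numeric_std package) let a designer do arithmetic directly on bit vectors. The following is a hedged sketch with assumed entity and port names:

library IEEE;
use IEEE.std_logic_1164.all;
use IEEE.numeric_std.all; -- IEEE 1076.3 signed/unsigned types

-- an 8-bit unsigned adder; "+" is defined for unsigned by numeric_std
entity ADDER8 is
  port (
    A, B : in unsigned(7 downto 0);
    SUM : out unsigned(7 downto 0));
end ADDER8;

architecture RTL of ADDER8 is
begin
  SUM <= A + B; -- wraps modulo 2**8; no carry out in this sketch
end RTL;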

Some other standards support wider use of VHDL, notably VITAL (VHDL Initiative
Towards ASIC Libraries) and microwave circuit design extensions.
In June 2006, the VHDL Technical Committee of Accellera (delegated by the IEEE to work on the next update of the standard) approved the so-called Draft 3.0 of VHDL-2006. While maintaining full compatibility with older versions, this proposed standard provides numerous extensions that make writing and managing VHDL code easier. Key changes include incorporation of the child standards (1164, 1076.2, 1076.3) into the main 1076 standard, an extended set of operators, more flexible syntax for case and generate statements, incorporation of VHPI (an interface to the C/C++ languages) and a subset of PSL (Property Specification Language). These changes should improve the quality of synthesizable VHDL code, make testbenches more flexible, and allow wider use of VHDL for system-level descriptions.

In February 2008, Accellera approved VHDL 4.0, also informally known as VHDL 2008, which addressed more than 90 issues discovered during the trial period for version 3.0 and included enhanced generic types. In 2008, Accellera released VHDL 4.0 to the IEEE for balloting for inclusion in IEEE 1076-2008. The VHDL standard IEEE 1076-2008 was approved in September 2008.

1. Design:-

VHDL is commonly used to write text models that describe a logic circuit. Such a model is processed by a synthesis program only if it is part of the logic design. A simulation program is used to test the logic design, using simulation models to represent the logic circuits that interface to the design. This collection of simulation models is commonly called a testbench.

VHDL has constructs to handle the parallelism inherent in hardware designs, but these constructs (processes) differ in syntax from the parallel constructs in Ada (tasks). Like Ada, VHDL is strongly typed and is not case sensitive. In order to directly represent operations which are common in hardware, there are many features of VHDL which are not found in Ada, such as an extended set of Boolean operators including nand and nor. VHDL also allows arrays to be indexed in either ascending or descending direction; both conventions are used in hardware, whereas in Ada and most programming languages only ascending indexing is available.
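
A short sketch of the two indexing conventions, as signal declarations in an architecture (the signal names are illustrative only):

-- descending range, the usual convention for buses
signal BUS_DOWN : std_logic_vector(7 downto 0);
-- ascending range, also legal and occasionally convenient
signal BUS_UP : std_logic_vector(0 to 7);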

VHDL has file input and output capabilities, and can be used as a general-purpose language for text processing, but files are more commonly used by a simulation testbench for stimulus or verification data. There are some VHDL compilers which build executable binaries. In this case, it might be possible to use VHDL to write a testbench to verify the functionality of the design, using files on the host computer to define stimuli, to interact with the user, and to compare results with those expected. However, most designers leave this job to the simulator.

It is relatively easy for an inexperienced developer to produce code that simulates successfully but that cannot be synthesized into a real device, or is too large to be practical. One particular pitfall is the accidental production of transparent latches rather than D-type flip-flops as storage elements.
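
A minimal sketch of that pitfall (the signal names are assumptions): in a combinational process, a signal that is not assigned on every path through the code must hold its previous value, so synthesis infers a transparent latch instead of pure logic:

-- latch inferred by accident: Y has no assignment when SEL = '0'
process(SEL, A)
begin
  if SEL = '1' then
    Y <= A;
  end if; -- a missing "else Y <= ...;" makes Y a transparent latch
end process;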

One can design hardware in a VHDL IDE (for FPGA implementation, such as Xilinx ISE, Altera Quartus, Synopsys Synplify or Mentor Graphics HDL Designer) to produce the RTL schematic of the desired circuit. After that, the generated schematic can be verified using simulation software, which shows the waveforms of inputs and outputs of the circuit after generating the appropriate testbench. To generate an appropriate testbench for a particular circuit or VHDL code, the inputs have to be defined correctly. For example, for a clock input, a loop process or an iterative statement is required.
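
For example, a testbench clock is commonly generated with a process that has no sensitivity list and therefore loops forever. This is a sketch with an assumed 20 ns period:

-- free-running testbench clock, 20 ns period (10 ns low, 10 ns high)
CLK_GEN : process
begin
  CLK <= '0';
  wait for 10 ns;
  CLK <= '1';
  wait for 10 ns;
end process;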

A final point is that when a VHDL model is translated into the "gates and wires" that are
mapped onto a programmable logic device such as a CPLD or FPGA, then it is the actual
hardware being configured, rather than the VHDL code being "executed" as if on some form of a
processor chip.

2. Design examples:-

In VHDL, a design consists at a minimum of an entity, which describes the interface, and an architecture, which contains the actual implementation. In addition, most designs import library modules. Some designs also contain multiple architectures and configurations.

A simple AND gate in VHDL would look something like this:

-- (this is a VHDL comment)

-- import std_logic from the IEEE library
library IEEE;
use IEEE.std_logic_1164.all;

-- this is the entity
entity ANDGATE is
  port (
    IN1 : in std_logic;
    IN2 : in std_logic;
    OUT1: out std_logic);
end ANDGATE;

architecture RTL of ANDGATE is
begin
  OUT1 <= IN1 and IN2;
end RTL;

While the example above may seem very verbose to HDL beginners, many parts are either optional or need to be written only once. Generally, simple functions like this are part of a larger behavioral module, instead of having a separate module for something so simple. In addition, use of elements such as the std_logic type might at first seem to be overkill. One could easily use the built-in bit type and avoid the library import at the beginning. However, using this 9-valued logic (U,X,0,1,Z,W,H,L,-) instead of simple bits (0,1) offers a very powerful simulation and debugging tool to the designer which currently does not exist in any other HDL.

In the examples that follow, you will see that VHDL code can be written in a very compact form. However, experienced designers usually avoid these compact forms and use a more verbose coding style for the sake of readability and maintainability. Another advantage of the verbose coding style is the smaller amount of resources used when programming to a Programmable Logic Device such as a CPLD.

MUX templates

The multiplexer, or 'MUX' as it is usually called, is a simple construct very common in hardware design. The example below demonstrates a simple two-to-one MUX, with inputs A and B, selector S and output X:

-- template 1:
X <= A when S = '1' else B;

-- template 2:
with S select
  X <= A when '1',
       B when others;

-- template 3:
process(A,B,S)
begin
  case S is
    when '1' => X <= A;
    when others => X <= B;
  end case;
end process;

-- template 4:
process(A,B,S)
begin
  if S = '1' then
    X <= A;
  else
    X <= B;
  end if;
end process;

-- template 5 - 4:1 MUX, where S is a 2-bit std_logic_vector:
process(A,B,C,D,S)
begin
  case S is
    when "00" => X <= A;
    when "01" => X <= B;
    when "10" => X <= C;
    when others => X <= D;
  end case;
end process;

The last three templates make use of what VHDL calls 'sequential' code. The sequential sections are always placed inside a process and have a slightly different syntax which may resemble more traditional programming languages.
Latch templates

A transparent latch is basically one bit of memory which is updated when an enable signal is
raised:

-- latch template 1:
Q <= D when Enable = '1' else Q;

-- latch template 2:
process(D,Enable)
begin
  if Enable = '1' then
    Q <= D;
  end if;
end process;

An SR-latch uses set and reset signals instead:

-- SR-latch template 1:
Q <= '1' when S = '1' else
     '0' when R = '1' else Q;

-- SR-latch template 2:
process(S,R)
begin
  if S = '1' then
    Q <= '1';
  elsif R = '1' then
    Q <= '0';
  end if;
end process;

Template 2 has an implicit "else Q <= Q;" which may be explicitly added if desired.

-- This one is an RS-latch (i.e. reset dominates):
process(S,R)
begin
  if R = '1' then
    Q <= '0';
  elsif S = '1' then
    Q <= '1';
  end if;
end process;

D-type flip-flops

The D-type flip-flop samples an incoming signal at the rising or falling edge of a clock. The DFF is
the basis for all synchronous logic.

-- simplest DFF template (not recommended):
Q <= D when rising_edge(CLK);

-- recommended DFF template:
process(CLK)
begin
  -- use falling_edge(CLK) to sample at the falling edge instead
  if rising_edge(CLK) then
    Q <= D;
  end if;
end process;

-- alternative DFF template:
process
begin
  wait until CLK='1';
  Q <= D;
end process;

-- alternative template which expands the rising_edge function above:
process(CLK)
begin
  -- rising edge; use "if CLK = '0' and CLK'event" instead for the falling edge
  if CLK = '1' and CLK'event then
    Q <= D;
  end if;
end process;

Some flip-flops also have asynchronous or synchronous Set and Reset signals:

-- "Textbook" template for asynchronous reset.


-- This style is prone to error if some signals assigned under the rising_edge
-- condition are omitted (either intentionally or mistakenly) under the reset
-- condition. Such signals will synthesize as flip-flops having feedback MUXes
-- or clock enables (see below), which was probably not intended.
-- This is very similar to the 'transparent latch' mistake mentioned earlier.
process(CLK, RESET)
begin
if RESET = '1' then -- or '0' if RESET is active low...
Q <= '0';
elsif rising_edge(CLK) then
Q <= D;
end if;
end process;

-- A safer description of reset uses overwrite rather than
-- if-else semantics and avoids the gotcha described above:
process(CLK, RESET)
begin
  if rising_edge(CLK) then
    Q <= D;
  end if;
  if RESET = '1' then -- or '0' if RESET is active low...
    Q <= '0';
  end if;
end process;

-- template for synchronous reset:
process(CLK)
begin
  if rising_edge(CLK) then
    Q <= D;
    if RESET = '1' then -- or '0' if RESET is active low...
      Q <= '0';
    end if;
  end if;
end process;

Another common feature for flip-flops is an Enable signal:

-- template for flip-flop with clock enable:
process(CLK)
begin
  if rising_edge(CLK) then
    if Enable = '1' then -- or '0' if Enable is active low...
      Q <= D;
    end if;
  end if;
end process;

Flip-flops can also be described with a combination of features:

-- template with clock enable and asynchronous reset combined:
process(CLK, RESET)
begin
  if rising_edge(CLK) then
    if Enable = '1' then -- or '0' if Enable is active low...
      Q <= D;
    end if;
  end if;
  if RESET = '1' then -- or '0' if RESET is active low...
    Q <= '0';
  end if;
end process;

Example: a counter

The following example is an up-counter with asynchronous reset, parallel load and configurable width. It demonstrates the use of the 'unsigned' type and VHDL generics. Generics are very close to arguments or templates in traditional programming languages like C or C++.

library IEEE;
use IEEE.std_logic_1164.all;
use IEEE.numeric_std.all; -- for the unsigned type

entity counter_example is
  generic ( WIDTH : integer := 32);
  port (
    CLK, RESET, LOAD : in std_logic;
    DATA : in unsigned(WIDTH-1 downto 0);
    Q : out unsigned(WIDTH-1 downto 0));
end entity counter_example;

architecture counter_example_a of counter_example is
  signal cnt : unsigned(WIDTH-1 downto 0);
begin
  process(RESET, CLK) is
  begin
    if RESET = '1' then
      cnt <= (others => '0');
    elsif rising_edge(CLK) then
      if LOAD = '1' then
        cnt <= DATA;
      else
        cnt <= cnt + 1;
      end if;
    end if;
  end process;

  Q <= cnt;

end architecture counter_example_a;


Introduction to VHDL Simulation and Synthesis

1. Introduction:-

The purpose of this lab is to introduce you to VHDL simulation and synthesis using the ALDEC VHDL simulator and the Xilinx Foundation software for synthesis. There are several definitions that may be helpful:

• Simulation is the execution of a model in the software environment. This is done using the ALDEC VHDL simulator.
• A test bench is a program whose purpose is to verify that the behavior of our system is as expected. The test bench is used in ALDEC to simulate our design by specifying the inputs into the system (a sketch of one appears below).
• Synthesis is the process of translating a design description to another level of abstraction, i.e., from behaviour to structure. Synthesis is achieved by using a synthesis tool like Foundation Express, which outputs a netlist. It is similar to the compilation of a high-level programming language like C into assembly code.
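
For instance, a minimal test bench for the counter_example entity shown earlier might look like the following. This is a hedged sketch: the testbench name, the 4-bit width and the 20 ns clock period are assumptions, not part of the lab handout.

library IEEE;
use IEEE.std_logic_1164.all;
use IEEE.numeric_std.all;

entity counter_tb is
end counter_tb;

architecture TB of counter_tb is
  signal CLK, RESET, LOAD : std_logic := '0';
  signal DATA, Q : unsigned(3 downto 0) := (others => '0');
begin
  -- unit under test: the counter from the earlier design example
  UUT : entity work.counter_example
    generic map (WIDTH => 4)
    port map (CLK => CLK, RESET => RESET, LOAD => LOAD, DATA => DATA, Q => Q);

  CLK <= not CLK after 10 ns; -- free-running 20 ns clock

  STIM : process
  begin
    RESET <= '1'; -- hold reset for a couple of cycles
    wait for 25 ns;
    RESET <= '0'; -- then let the counter run
    wait for 200 ns;
    wait; -- suspend forever; examine Q in the waveform viewer
  end process;
end TB;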

In this lab, you will implement a behavioral description of a 2-bit counter (00, 01, 10, 11, 00, ...). The counter's output is fed to a 2-4 decoder. This decoder's output is the output of the top-level entity (see picture above). The output of the top-level entity is then fed to four LEDs. The LEDs should correspond as follows:

counter output   L0 L1 L2 L3
reset            0  0  0  0
00               0  0  0  1
01               0  0  1  0
10               0  1  0  0
11               1  0  0  0

The counter and decoder should be written behaviorally. A behavioral-style architecture specifies what a particular system does but provides no information on how the design is implemented (i.e. don't use AND, OR, NAND, ... gates to implement your design). In this lab, you will specify an entity and test it, synthesize your design using FPGA Express, and download it onto an XS40 board using the Xilinx Foundation software.
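
A behavioral 2-4 decoder matching the table above might be sketched as follows. The entity and port names are assumptions (the lab's actual template code is not reproduced here), and an ascending port range is used so that the literal bits read left-to-right as L0..L3:

library IEEE;
use IEEE.std_logic_1164.all;

entity decoder2to4 is
  port (
    SEL : in std_logic_vector(1 downto 0); -- the counter's output
    LED : out std_logic_vector(0 to 3)); -- L0, L1, L2, L3
end decoder2to4;

architecture BEHAVIORAL of decoder2to4 is
begin
  process(SEL)
  begin
    case SEL is
      when "00" => LED <= "0001"; -- only L3 lit
      when "01" => LED <= "0010"; -- only L2 lit
      when "10" => LED <= "0100"; -- only L1 lit
      when others => LED <= "1000"; -- only L0 lit
    end case;
  end process;
end BEHAVIORAL;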

III. Schematic:
On the XS40 board:

• Pin 2 provides power to the other components.
• Pin 52 provides ground to the other components.

No power supply is needed for this lab; you can think of pins 2 and 52 as an "onboard power supply." In addition, the button is used as the clock signal. The onboard clock is 12 MHz, which is too fast for us to see the lights blink. Instead of using a clock divider to slow down the signal, we create our own clock signal using the button (don't forget to change your .ucf file to reflect our pseudo clock).

2. Procedure:-
simulation

1. Complete the code provided. It should describe a 2-bit counter and a 2-4 decoder in VHDL, constructing a top-level structural entity containing these two components.
2. Write a suitable testbench and simulate it to ensure that it is correct (use ALDEC VHDL).

synthesis

1. Wire up the circuit as shown. You may need the circuit diagram or the manual for the XS40 boards.
2. Make the necessary additions to your lights.ucf file to reflect the LED connections.
3. Download your program onto the XS40 board.

Advantages:-
• Greater Functionality

It is possible to achieve greater functionality with a simpler hardware design. The required logic can be stored in memory, and hence the cost of supporting additional features is reduced to the cost of the memory required to store the logic design. This is very useful in the mobile communications domain, where a protocol can easily be modified to a newer protocol, stored in memory, and the hardware then reconfigured to achieve the required functionality. Compelling advantages include increased speed and reduced energy and power consumption. A study reports that, depending on the particular device used, moving critical software loops to reconfigurable hardware results in average energy savings of 35% to 70%, with an average speedup of 3 to 7 times.

• Embedded Characteristics

In general-purpose computing, a common piece of silicon could be configured, after fabrication, to solve any computing task. This meant many applications could share commodity economics for the production of a single IC, and the same IC could be used to solve different problems at different points in time. General-purpose computing meant engineers could program the component to do things which the original IC manufacturers never conceived. Embedded systems developers benefit greatly from reconfigurable computing systems, especially with the introduction of soft cores which can contain one or more instruction processors. [4]

All of these "general-purpose" characteristics are shared by reconfigurable computing. Instead of computing a function by sequencing through a set of operations in time (like a processor), reconfigurable computers compute a function by configuring functional units and wiring them up in space. This allows parallel computation of specific, configured operations, like a custom ASIC, yet the hardware can also be reconfigured. The reconfigurable hardware fabric can be easily and quickly modified from a remote location to upgrade its performance, or modified to perform a completely different function. Hence, the non-recurring engineering (NRE) costs of reconfigurable computing are lower than those of a custom ASIC.

• Lower System Cost

By eliminating the ASIC design, a lower system cost is achieved on a low-volume product. For higher-volume products, the production cost of fixed hardware is actually much lower. In the case of ASIC and general-purpose hardware designs, technical obsolescence drives up the cost of systems. Reconfigurable computing systems are upgradeable, which extends the useful life of the system and reduces lifetime costs.

• Reduced Time to Market

Reduced time-to-market is the final advantage of reconfigurable computing. Since an ASIC is no longer used, a large amount of development effort is eliminated. The logic design remains flexible even after the product is shipped. A design can be sent to market with minimum requirements, and additional features can later be added without any change to the physical device (or system). Thus reconfigurable computing allows an incremental design flow.

These advantages lead reconfigurable computers to serve as powerful tools for many applications. The applications include research and development tools for sophisticated electronic systems such as ASICs and printed circuit boards (PCBs), for which adequate simulation tools do not exist and prototype fabrication is expensive and time consuming. A reconfigurable computer can serve as an affordable, fast, and accurate tool for verifying electronic designs.

Disadvantages:-

Two severe disadvantages of reconfigurable computing can be observed: the time that the chip takes to reconfigure itself for a given task, and the difficulty of programming such chips. Dynamic reconfigurable computing also raises several complex issues: design space, placement, routing, timing, consistency and development tools. Each of these is discussed below.

• Placement Issues

In order to configure new hardware, ample space is required to place it. The component placement issue becomes complex if the component needs to be placed near special resources like built-in memory, I/O pins or DLLs on the FPGA.

• Routing Issues

Existing components have to be connected to the newly reconfigured components. Ports must be available to interface the new components, and the same ports may also have been used under the old configuration. To accomplish this, the components must be oriented in a workable fashion.

• Timing Issues

Newly configured hardware must meet the timing requirements for efficient operation of the circuit. Longer wires between components may affect the timing. Optimal speed should be attainable after dynamically reconfiguring the device. Over-timing or under-timing the newly added design may yield erroneous results.

• Consistency Issues

Static or dynamic reconfiguration of the device should not degrade the computational consistency of the design. This issue becomes critical when the FPGA is partially reconfigured and interfaced with an existing design. Adding new components to the device through the reconfigurable fabric should not erase or alter the existing design in the device (or its memory). There should be safe methods to store the bit stream to the memory.

• Development Tools

Commercial development tools for dynamic reconfigurable computing are still in the development stage. The lack of commercially available tools covering the specification-to-implementation stages of digital design is still a bottleneck. The available tools require enormous human intervention to implement the complete system.

CONTENTS
1. INTRODUCTION

2. VLSI TECHNOLOGY

 ADVANCED TOOLS FOR VLSI DESIGN

 GLOBAL EXPANSION

3. INTEGRATED CIRCUIT

 INTRODUCTION

 INVENTION

 GENERATIONS

 TYPES OF IC

4. MOORE’S LAW

5. VHDL

 DESIGN

 DESIGN EXAMPLES

6. INTRODUCTION TO VHDL SIMULATION AND SYNTHESIS

 INTRODUCTION

 PROCEDURE

7. ADVANTAGES

8. DISADVANTAGES
