
SEARCH ENGINE OPTIMIZATION -TOOLS AND TECHNIQUES

1. INTRODUCTION:

Search engine optimization (SEO) is a well-defined and managed process


that helps to increase the volume and improve the quality of traffic to a web site
from search engines. SEO activity increases the number of visitors to a
Web site by helping it rank high in the search results of a search engine. The higher a
Web site ranks in the results of a search, the greater the chance that the site
will be visited by a user, because users commonly click on the first few
search results. Search Engine Optimization involves
the careful optimization of corporate web sites to effectively increase their
visibility in the major search engines such as Google, Yahoo, AltaVista,
Inktomi and many others.
SEO tactics make a site search-engine friendly, and they make the difference
between a web site that has very little visibility and one that will be seen and
found by millions of people. Major search engines keep their ranking
algorithms a closely guarded secret, a practice which has left the world
guessing and driven research into methodologies for better optimizing sites.
Search engine algorithms change on a continual basis. Specific strategies
detailed in this report may be obsolete within months, but the general
principles will hold true for years to come. The advice in this report is
mostly tailored to the top search engine, Google, which as of January 2010
accounted for roughly 65 to 75% of the search market. Because effective
SEO may require changes to the HTML source code of a site, SEO tactics may
be incorporated into web site development and design. The term "search
engine friendly" may be used to describe web site designs and content
management systems that have been optimized for search engine
exposure.
By relying so much on factors such as keyword density which were
exclusively within a webmaster's control, early search engines suffered from
abuse and ranking manipulation. To provide better results to their users, search
engines had to adapt to ensure their results pages showed the most relevant
search results, rather than unrelated pages stuffed with numerous keywords by
unscrupulous webmasters. Since the success and popularity of a search engine
is determined by its ability to produce the most relevant results to any given

H.V.P.M’s College Of Engineering And Technology, Amravati. Page 1



search, allowing those results to be manipulated would drive users to other search
sources. Search engines responded by developing more complex ranking
algorithms, taking into account additional factors that were more difficult for
webmasters to manipulate.

 History of SEO:
SEO Theory was born in the popular Virtual Promote Gazette
e-newsletter and the associated Search Engine Forums founded by Jim Wilson
(whose Web properties were handed over to his successors after his death
in May 2003). Wilson brought together the first organized community of Web
site developers, marketers, and promoters with the purpose of sharing
information about how Web search works and how it can be used to benefit
the Internet community. Wilson's forums and newsletters often documented new
tools and promotional ideas along with techniques. Within a few years the SEO
community began to branch out into newer forum and e-newsletter ventures.
SEO Theory shifted its emphasis from on-page elements to off-page elements
with the ascendancy of Google, as practitioners acknowledged the necessity of
obtaining links from other documents. Today SEO Theory is less emphasized
by the best-practices community, who generally rely upon well-established
Website marketing and search engine Webmaster guidelines to structure their
Web promotion campaigns. Nonetheless, SEO Theory has made significant
contributions to widely accepted and advocated Website marketing
methodologies.

 Fundamental principle behind SEO:


SEO Theory is founded upon the assumption that, through the review and analysis
of search results, technical papers, patent applications, search engine
Webmaster guidelines, and other authoritative sources, a search engine's
algorithm can be wholly or partly reverse engineered for the effective
production of Web pages that rank well in search results. The value of reverse
engineering search engine algorithms is questioned and challenged by many
best-practices SEOs, including Doug Heil and Jill Whalen. Commonly referred
to as algorithm chasing, search algorithm reverse engineering is often


associated with Black Hat SEO, or deceiving search engines. In addition to
Web document design, and because of the importance placed upon link
analysis by several important search engines (including Google, Ask, Live
Search, and Yahoo), SEO Theory has evolved to include the study and analysis of
linking practices, patterns, and placement. Although its history may be more
strongly identified with the darker side of search engine optimization, SEO
Theory today helps to guide the implementation of best-practices marketing.
Web content providers once manipulated a number of attributes in HTML code
to rank well in search engines; by relying so heavily on those attributes, search
engines suffered abuse and ranking manipulation, and so they began judging
relevance using signals beyond the page itself.

2. SEO TERMINOLOGIES:
2.1 A Search Engine:
Essentially, every search engine consists of three parts:
 A Web Crawler
 An Indexer
 A Query Processor

A Web crawler is a computer program that browses the World Wide Web
in a methodical, automated manner. Other terms for Web crawlers are ants,
automatic indexers, bots, Web spiders, or Web robots. Web crawlers are
mainly used to create a copy of all the visited pages for later processing
by a search engine, which indexes the downloaded pages to provide fast
searches. A crawler starts with a selected set of URLs called seeds, fetches
those pages, extracts the links on them, and continues in this way, with a
selection policy deciding which links to visit. Following is a list of
search engines and their corresponding bots.

 Google – GoogleBot
 MSN – MSNBot
 Yahoo- Slurp
 Fast Search & Transfer - FAST
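The crawl loop described above can be sketched in a few lines. The following is a minimal illustration, not any real bot's implementation: it uses a small in-memory "web" (an invented dictionary) in place of real HTTP fetching and link parsing, and its selection policy is simply breadth-first visiting of every unseen link.

```python
from collections import deque

# A toy "web" standing in for real HTTP fetching: each URL maps to the
# links found on that page. (These page names are invented for the example.)
TOY_WEB = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a", "d"],
    "d": [],
}

def crawl(seeds, web):
    """Breadth-first crawl from the seed URLs.

    Mirrors the crawler loop: fetch a page, record it, extract its
    links, and queue any link not yet visited.
    """
    visited = set()
    frontier = deque(seeds)
    while frontier:
        url = frontier.popleft()
        if url in visited:
            continue
        visited.add(url)                  # "download" the page
        for link in web.get(url, []):     # extract its links
            if link not in visited:
                frontier.append(link)     # selection policy: queue new links
    return visited

print(sorted(crawl(["a"], TOY_WEB)))  # ['a', 'b', 'c', 'd']
```

A real crawler would replace the dictionary lookup with an HTTP request and an HTML parse, and its selection policy would also honour robots.txt and politeness delays.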


Bots give the indexer the full text of the pages they find. These pages are stored in
Google’s index database. This index is sorted alphabetically by search term, with each
index entry storing a list of documents in which the term appears and the location
within the text where it occurs. This data structure allows rapid access to documents
that contain user query terms.
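The index structure just described, with each term mapping to the documents and positions where it occurs, can be sketched as follows. The sample documents are invented, and real indexes are far more elaborate:

```python
def build_index(pages):
    """Build an inverted index: term -> {doc_id: [positions]}.

    Each entry records which documents contain the term and where in
    the text it occurs, enabling fast lookups for query terms.
    """
    index = {}
    for doc_id, text in pages.items():
        for pos, term in enumerate(text.lower().split()):
            index.setdefault(term, {}).setdefault(doc_id, []).append(pos)
    return index

pages = {
    "doc1": "running shoes for sale",
    "doc2": "shoes and boots",
}
index = build_index(pages)
print(index["shoes"])  # {'doc1': [1], 'doc2': [0]}
```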

To improve search performance, Google ignores (doesn’t index) common words


called stop words (such as the, is, on, or, of, how, why, as well as certain single digits
and single letters). Stop words are so common that they do little to narrow a search,
and therefore they can safely be discarded. The indexer also ignores some punctuation
and multiple spaces, as well as converting all letters to lowercase, to improve
performance. Stemming is a feature of most major search engines in which
searches return results for words sharing a particular stem. For example, a
search for the term “shoes” would also return results for pages containing the
word “shoe,” and a user searching for “boat” may get “boating” or
“boats” in the results.
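A sketch of this normalization pipeline, stop-word removal, lowercasing, and stemming, is shown below. The stop-word list and the crude suffix-stripping rule are simplifications for illustration; real engines use full stop-word lists and stemming algorithms such as Porter's.

```python
STOP_WORDS = {"the", "is", "on", "or", "of", "how", "why", "a", "as"}

def simple_stem(term):
    # Crude suffix stripping for illustration only; a real stemmer
    # (e.g. the Porter algorithm) handles many more cases correctly.
    for suffix in ("ing", "es", "s"):
        if term.endswith(suffix) and len(term) > len(suffix) + 2:
            return term[: -len(suffix)]
    return term

def normalize(text):
    """Lowercase the text, drop stop words, and stem what remains."""
    terms = text.lower().split()
    return [simple_stem(t) for t in terms if t not in STOP_WORDS]

# "Boating" and "boats" both reduce to the stem "boat",
# while the stop words are discarded entirely.
print(normalize("How is Boating on the boats"))  # ['boat', 'boat']
```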
The query processor has several parts, including the user interface (search box), the
“engine” that evaluates queries and matches them to relevant documents, and the
results formatter.

Fig. No.1 A Search Engine Processing A Query


2.2 Page Rank:

PageRank is Google's way of assigning a specific value to how popular your
website is. (As noted earlier, this report focuses mostly on Google.) It is based
on the number of "votes" other websites cast for your website: a "vote" is simply
another website placing a link that points to your website. Generally, the more
"votes" or links you have pointing to your website, the higher your PageRank
(PR) will be. PageRank is one of the many factors that Google takes into
account when ranking websites.

Fig.no 2 Page Rank Analysis

In the given figure Page C has a higher PageRank than Page E, even though it has
fewer links to it; the link it has is of a much higher value. A web surfer who chooses a
random link on every page (but with 15% likelihood jumps to a random page on the
whole web) is going to be on Page E for 8.1% of the time. The 15% likelihood of
jumping to an arbitrary page corresponds to a damping factor of 85%. Without
damping, all web surfers would eventually end up on Pages A, B, or C, and all other


pages would have PageRank zero. Page A is assumed to link to all pages in the web,
because it has no outgoing links. PageRank was named after Larry Page.

Formula for calculating PageRank:

Your page’s PR = 0.15 + (0.85 × (the sum of the “shares” of PR from every page that links to it))

0.15 – the lowest PR possible.

0.85 – the damping factor.

“Share” – the PR of a linking page divided by the number of outgoing links on that page.

For example, suppose pages A, B, and C each receive a single link from a page
that has PR 1 and two outgoing links, so each incoming “share” is ½:

Fig.no. 3 Page Rank calculation

PR of A = 0.15 + (0.85 × ½) = 0.575
PR of B = 0.15 + (0.85 × ½) = 0.575
PR of C = 0.15 + (0.85 × ½) = 0.575

In practice this calculation is repeated over the whole link graph until the
values converge.
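The single-pass calculations above can be extended to a full link graph by repeating the formula until the values settle. The following sketch applies the same 0.15/0.85 formula iteratively to a made-up three-page graph in which every page links to both others (a symmetric graph whose values converge to 1.0); it ignores refinements, such as dangling pages, that real implementations must handle.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively apply PR = 0.15 + 0.85 * (sum of incoming shares).

    links maps each page to the list of pages it links to; a linking
    page's "share" is its current PR divided by its outgoing link count.
    """
    pr = {page: 1.0 for page in links}
    for _ in range(iterations):
        pr = {
            page: (1 - damping) + damping * sum(
                pr[src] / len(outs)
                for src, outs in links.items()
                if page in outs
            )
            for page in links
        }
    return pr

# Three pages that all link to each other (a symmetric toy graph).
graph = {"A": ["B", "C"], "B": ["A", "C"], "C": ["A", "B"]}
ranks = pagerank(graph)
print(round(ranks["A"], 3))  # 1.0
```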


2.3 Optimizers:
2.3.1 White Hat SEOs:
They are the search engine optimizers who use legitimate techniques to improve
the ranking of a Web site; they are also known as the Best Practices Community.
Best-practices SEOs contact a few webmasters of sites with similar content and
request a link exchange.
2.3.2 Black Hat SEOs:
They are the SEOs who try to deceive the search engines by using tricks. Sites
developed by black hats are often penalised or banned. Some of the tricks used
are:
 Cloaking / Doorway Pages :
Cloaking is the practice of showing one content page to a search engine and
another content page to actual people. Cloaking is achieved through a variety of
means, and some people argue that there are legitimate reasons for it, but the
best-practices community advocates against cloaking. Google has been criticised
for allegedly permitting certain groups (such as newspapers, academic paper
archives, and SEO forums where Google employees have participated) to engage
in cloaking while banning other sites for the same behaviour. Doorway (also
called presell or gateway) pages are minimal-content pages created solely to
receive traffic from search engines and redirect it to a destination page.
 Meta Stuffing:
Meta tags are analysed by almost all search engines while crawling. Meta elements
are HTML or XHTML elements used to provide structured metadata about a Web
page; such elements must be placed as tags in the head section of an HTML or
XHTML document. Meta elements can be used to specify the page description,
keywords, and any other metadata not provided through the other head elements
and attributes. However, Black Hat SEOs started supplying misleading information
in the following elements of the meta tag:
o Description:
The description attribute provides a concise explanation of a Web page's
content. Almost all search engines recommend keeping it shorter than 155
characters of plain text. Black Hat SEOs supply a description unrelated to
the page's actual content.


o Keyword:

This attribute lists the keywords under which the web page should be found.
However, search engine providers realized that information stored in meta
elements, especially the keywords attribute, was often unreliable and
misleading, and at worst, used to draw users into spam sites.
o Robots:

The robots attribute, supported by several search engines, controls whether


search engine spiders are allowed to index a web page and whether
they should follow links from it. Values such as NOODP and NOYDIR stop
crawlers from fetching titles from directories such as the Open Directory
Project and the Yahoo! directory, and “nofollow” indicates that a particular
link should not be followed by a search engine crawler. Meta tags are not the
best option for preventing search engines from indexing the content of a
website, because a more reliable and efficient method is the use of the Robots.txt
file.
Ex: Google: <META NAME="GOOGLEBOT" CONTENT="NOODP">
Yahoo! <META NAME="Slurp" CONTENT="NOODP">
MSN: <META NAME="msnbot" CONTENT="NOODP">
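For comparison, this is roughly what a Robots.txt file looks like (example.com and the paths shown are placeholders). Placed at the site root, it controls crawler access site-wide rather than page by page:

```text
# robots.txt served from http://www.example.com/robots.txt
User-agent: *
Disallow: /private/
Disallow: /tmp/

# Rules for one specific bot (an empty Disallow permits everything)
User-agent: Googlebot
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```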

o Language:
Language attributes tell search engines what natural language the web site is
written in, as opposed to the coding language.

 Title Tag Stuffing:

The title is the text that appears at the top of the browser window for your
Web page. Stuffing the title tag with repeated keywords is a Black Hat
technique, and it may get your site banned.


 Keyword Stuffing:
Keyword stuffing is when people throw thousands of copies of the same
keyword into their meta tags.

Fig.no. 4 Keyword Stuffing example

For example, the following website is trying to rank well for "tents".
<META NAME="KEYWORDS" CONTENT="tents, TENTS, Tents, tents
tents supplies, tents, tents tent, tent, Tent, TENTS, tents, Tents,tents, TENTS,
Tents, tents tents tent supplies, tents, tents tent, tent, Tent, TENTS, tents,
Tents, tents, TENTS, Tents, tents tents tent supplies, tents, tents tent, tent,
Tent, TENTS, tents, Tents, tents, TENTS, Tents, tents tents tent supplies,
tents, tents tent, tent, Tent, TENTS, tents, Tents tents, TENTS, Tents, tents
tents tent supplies>

This is obviously ridiculous, and Google WILL penalize it, so stay away from
it. A related measure is keyword density, the percentage of the words on your
page that are the keyword; a commonly cited guideline is to stay around 5%.
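Keyword density can be checked with a few lines of code. The sample page text below is invented, and the 5% figure is only the rule of thumb mentioned above, not a published Google threshold:

```python
def keyword_density(text, keyword):
    """Percentage of the words in text that exactly match keyword."""
    words = text.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

page = "weight loss tips for safe weight loss and healthy living"
print(keyword_density(page, "weight"))  # 20.0 (2 of 10 words)
```

By this measure the sample sentence is already well above the 5% guideline for the word "weight".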


 Hidden Text :
Hidden Text is simply text that users can't see when they visit your webpage.

Fig.no. 5 Hidden Text not seen


The figure above looks perfectly normal.

Fig.no. 6 Hidden Text Found


But when we select all the text (Ctrl+A), the trick is revealed. The hidden
text may copy the keywords of the number 1 site for such queries, in an
attempt to boost the page's rankings.

 Alt img tag spamming:


The following is a website that wanted to rank well for "cabbage soup diet".
They inserted a graphic of a cabbage and added an alt image tag to it, so that
when a visitor hovers the mouse over the graphic, a little popup appears. The
figure below shows this image spamming: a popup appears every time the
pointer is brought over the picture, and the alt text is stuffed with "cabbage
soup" and related competitor keywords.

Fig.no. 7 Image Spamming

3. TYPES OF OPTIMIZATION:

Search Engine Optimization is divided into two parts, based upon the
characteristics of the factors used in optimization.


 On-Page Optimization
 Off-page Optimization.

3.1 On-Page Optimization:

It includes the tricks and techniques that are used within the Web page itself
in order to secure a good ranking. The following techniques are used:

3.1.1 Keyword Research:


“Target the wrong keywords and
all your efforts will be in vain.”
One very common mistake many business web site operators and designers
make is to assume that everyone uses their favoured keywords to find specific
types of content. For example, an insurance professional may assume that
people use the word "insurance" to find Web sites like his own, but this is
often not the case. Keyword research is generally conducted with the aid of
tools that draw upon pools of data provided by one or more search engines
for recent user queries. Query data provide a look into how people search for
information on the internet.
Let's create a weight loss related website. Our weight loss related website will
primarily sell a weight loss eBook. Before we start creating and collecting
content for the website we need to do a little keyword research. This is VERY
important and should not be skipped. To do our keyword research we need to
visit a few websites such as

http://www.goodkeywords.com

Download the related software from the site.


Fig.no. 8 SEO Elite tool

The No. 1 entry is "weight loss", which was searched 1,413,194 times in the
Overture search engine last month. The Words column shows the specific keyword
that was searched. If you enter "weight loss", the Good Keywords tool will bring
back the 100 keywords containing "weight loss" that were searched for last month.
The Count column shows how many times each specific keyword was searched for
in the previous month within the Overture.com search engine. Generally, you can
multiply that number by 3 to estimate the number of times the keyword was
searched within Google over the previous month. Here many people make a BIG
mistake, and I'll admit I was one of them when I first began my online
endeavours: they target only the broadest term. If we scroll down, we can find
more specific keyword phrases like "weight loss story", "weight loss picture",
and "safe weight loss", for which the competition is lower.

Some of the other tools are:


http://inventory.overture.com (web based version of goodkeywords.com) cost: free,
http://www.keywordlocator.com one time cost: $87,
http://www.wordtracker.com monthly cost: $53.


3.1.2 Bolding, Italicising, Underlining:


Ideally, put your primary keyword in a heading <h1> tag and your secondary
keyword in an <h2> tag.
For Ex: <h1> Weight Loss </h1>
<h2> Safe Weight Loss </h2>.
Don't overdo it though. If you simply repeat your keyword over and over like this:
Weight loss story about weight loss story that I have a weight loss story and weight
loss story.
Google will immediately see your website as search engine spam and you will not
rank well. So, try to keep in mind that you're creating your website for the eyes of
REAL people.
Make sure to mention your main keyword at the very top left and the very bottom
right hand side of the webpage. A trick often used is to include it in the
copyright information line at the bottom of the website. For our example:
© 2005 copyright www.domain.com a weight loss story.
Properly include <alt> image tags.
Add an <alt> image tag to the very top image of your webpage (this is usually
your website's header graphic), using the text "weight loss story header".
The html code used to add an <alt> image tag would look like this:
<img src="YourGraphic.jpg" width="503" height="93" alt="weight loss story">.
They look very simple options but they can surely improve your rankings.

3.1.3 Force search engine to read your keyword first:


Most of the sites have navigational links on the left hand side. Google starts reading
your web page from top left to right bottom. These sites have structure like this:

Typical layout:

[ Navigational links | Your body text ]


So naturally your links get crawled first and your content afterwards, when you
would really want Google to read your content first. A good solution is to lay
the page out like this:

[ Empty column       | Your body text ]
[ Navigational links ]

Now the search engine reads an empty column first, then your text, and the
navigational links last.
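One common way to implement this layout is an HTML table whose navigation cell is pushed into a second row, so the body text comes earlier in the source order. This is an illustrative sketch of the trick, not a snippet from any particular site:

```html
<table>
  <tr>
    <td></td>                 <!-- empty cell: read first by the crawler -->
    <td rowspan="2">
      Weight loss story... your body text, with your main keyword,
      now appears in the source before the navigation links.
    </td>
  </tr>
  <tr>
    <td>
      <a href="page1.html">Link 1</a><br>
      <a href="page2.html">Link 2</a>
    </td>
  </tr>
</table>
```

Visually the page still shows navigation on the left and text on the right; only the order in the HTML source changes.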

3.1.4 Create Sitemaps, RSS:

A site map (or sitemap) is a list of pages of a web site accessible to crawlers or users.
It can be either a document in any form used as a planning tool for web design, or a
web page that lists the pages on a web site, typically organized in hierarchical fashion.
This helps visitors and search engine bots find pages on the site. While some
developers argue that site index is a more appropriately used term to relay page
function, web visitors are used to seeing each term and generally associate both as one
and the same. However, a site index is often used to mean an A-Z index that provides
access to particular content, while a site map provides a general top-down view of the
overall site contents.

Fig.no. 9 Sitemap of Google


If a sitemap is large, with 100 or more links, Google recommends breaking it
into smaller pages. RSS (Rich Site Summary, also known as Really Simple
Syndication) feeds help other sites distribute your headlines and content.
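Alongside an HTML site map for visitors, you can also submit an XML sitemap to search engines following the sitemaps.org protocol. A minimal example (example.com, the paths, and the dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2010-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/weight-loss-story.html</loc>
    <priority>0.8</priority>
  </url>
</urlset>
```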

3.2 Off- Page Optimization:

Off-page optimization basically consists of all of the ranking factors that are
not located on your Webpage that the search engines look at when ranking a
website.
These include:
 Which websites link to you?
 The number of websites linking to you.

 The Google PageRank of the website linking to you.


 The page title of the website linking to you.
 The anchor text used in the link linking to you.
 The number and type of links linking to the website that's linking to you.
 The number of outbound links on the website that is linking to you.
 The total number of links on the website that is linking to you.
 Whether or not the websites linking to you are deemed by Google to be
authority websites.

3.2.1 Link Relationships:


Suppose we want to create a diet-related Web site that gives information about
a well-balanced diet. Searching Google, we found www.diet-i.com as the
first-ranked Web site. Next, we searched for the number of links to it by
entering its name in Google.
As seen in the figure given below, 461 sites link to diet-i.com. Next we need
to know the PageRanks of the Web pages linking to it. To find their IP
addresses, use the following tool:

http://www.webrankinfo.com/english/tools/class-c-checker.php


Fig.no. 10 Showing number of links to www.diet-i.com


Instead, you can use a software program called SEO Elite to do everything I
just mentioned above. It's not a free tool, but it's well worth the money for
the time it can save you. SEO Elite will analyze ALL of the off-page ranking
factors.

Again, these things include:


 Which websites link to them.
 The number of websites linking to them.
 The Google PageRank of the websites linking to them.
 The page titles of the websites linking to them.
 The anchor text used in the links linking to them.
 The number and type of links linking to the websites that link to them.
 The number of outbound links on the websites linking to them.
 The total number of links on the websites linking to them.
 Whether or not the websites linking to them are deemed
authority websites.


Fig.no. 11 Report after submitting www.diet-i.com to SEO Elite software

While looking at the report view within SEO Elite, you'll notice that www.diet-i.com has
hundreds of backlinks that contain the words "diet information" within their anchor
text. This is a BIG plus for them and something you'll want to duplicate to create your
site related to diet information. If you think about it, it makes sense that Google gives
priority to websites that have links on many IP Addresses rather than many links all
on the same IP Address. This helps eliminate the possibility of people controlling the
search engines. If Google didn't look at IP Addresses, I could simply create 1 website
with thousands of pages and link to another 1 of my websites from all of these pages.
I would then have thousands of links pointing to my website, which would probably
result in a #1 ranking...Similarly, observe the PageRanks, Analysis etc.


3.2.2 Get listed in DMOZ, Yahoo:

DMOZ is the world’s largest directory, used by Google and AOL to create their
own directories; getting placed in it virtually guarantees getting indexed by
Google. Getting listed in the Yahoo! directory costs $299 a year, but it is
still worth it: once you are recognized by the search engines, you won’t have
to pay again.

3.2.3 Submit Articles:

Another way to increase the number of links to your website is by writing
articles related to your topic and submitting them to article directories. In the
footer of each article you write, you should include a "blurb" about yourself and
a LINK back to your website, using your main keyword in the anchor text of
course. This is a
great way to get 1-way links to your website. A 1-way link is when another website
links to you without you linking back to them. Google and other search engines view
1-way links as being much more important than reciprocal links, so the more 1-way
links you can get for your website, the better rankings you will have. A great place to
find a list of many article directories to submit your articles to is the following
website:

http://www.pro-marketing-online.com/submit-articles.html

3.2.4 Link Exchange:

Try to find quality sites whose topics are compatible with your site’s and ask
their webmasters for a link exchange; a link to your site is a vote for you. Just
enter your keyword into Google, get the site names, and contact them to request
a link exchange. Be careful to avoid link farms: if your site has a link from a
banned or penalised site, Google might ban your site as well. In a recent
high-profile case, BMW’s German site was temporarily delisted for engaging in
a form of cloaking.


4 SEO Myths:

 For a site to be included in a search engine, it must be manually


submitted. Most search engines actually find and index sites faster on their
own these days. In the past, manual submission was more important, but that
importance has faded as bots have become more sophisticated. It is actually
better not to submit a site manually, because manual submissions can take 6
to 9 weeks to get indexed, whereas crawlers find sites on their own within a
few days. Unfortunately,
spammers figured out how to create automated bots that bombarded the add
URL form with millions of URLs pointing to commercial propaganda. Google
rejects those URLs submitted through its Add URL form that it suspects are
trying to deceive users by employing tactics such as including hidden text or
links on a page, stuffing a page with irrelevant words, cloaking (aka bait and
switch), using sneaky redirects, creating doorways, domains, or sub-domains
with substantially similar content, sending automated queries to Google, and
linking to bad neighbours. So now the Add URL form also has a test: it
displays some squiggly letters designed to fool automated “letter-guessers”; it
asks you to enter the letters you see — something like an eye-chart test to stop
spambots.
 Using a particular type of webserver will benefit a site’s rankings. Google
clearly states that the type of webserver a website uses does not matter. Other
search engines don’t explicitly state that certain webservers are preferred or
discriminated against, but through testing within the SEO community, this
myth has been proven to be false.
 <META NAME=”robots” CONTENT=”ALL”> is a Necessary Tag. If a
search engine isn’t explicitly told not to index a page through robots.txt or
meta-tags, it will index the page.
 Search engines, especially Google, rely on links more than anything else to
determine rankings. This has never been true. What is true is that many
people, particularly in the search engine optimization community, have
come to rely solely upon links to build their search engine traffic. This is a
very inefficient approach. The vast majority of the several billion queries that
people enter into search engines every month are resolved on the basis of


relevance, which is determined by the content of pages together with link
relationships.
 You lose link weight when you link out to other sites. Link weight (often
called PageRank) is calculated on the basis of link relationships, but some
people have accepted the false argument that a Web site loses link weight
when its pages link out to other sites. In fact, link weight is constantly being
recalculated as search engines find new pages for their indexes. Furthermore,
sites that attempt to "preserve PageRank" by hoarding links are most likely
treated like all other Web pages.


5. CONCLUSION:

A Web presence is the most necessary thing for your Web site in today's world,
and search engines are the most important way users find your site. The SEO
community will continue to evolve as search engines find new ways to index
and promote Web-based content. SEO theory will also continue to drive Black
Hat SEO practices as they react to the constantly changing criteria for
inclusion in search engine databases. But SEO theory should also remain a
viable part of White Hat or Best Practices SEO because it embraces the
holistic approach that White Hats take to web design and promotion.


6. REFERENCES:

 BOOKS:
 Search Engine Optimization Made Easy By Brad Callen.

 WEB SITES:
 http://www.googleguide.com/google_works
 http://en.wikipedia.org/wiki/Sitemap
 http://en.wikipedia.org/wiki/Meta_tag
 http://en.wikipedia.org/wiki/Web_crawler
 www.google.stanford.edu
 www.1stQuery.com
