Module 2 SEO Process
For SEO, the most important starting point is the way your site is built. One of
the first things that attracts a search engine crawler is the actual design of your site: tags, links,
navigational structure, and content are just a few of the elements that catch crawlers’ attention.
It is better to consider SEO before building your website than to retrofit SEO strategies after it is
built.
a. Content Format
If you’re looking to attract search engine traffic, make it easy for search engines to index your
website. Make sure your site design doesn’t present unnecessary obstacles to search engine
spiders. Thus, the first step in the SEO design process is to ensure that your site can be found and
crawled by the search engines.
To rank well in the search engines, your site’s content—that is, the material available to visitors
of your site—should be in HTML text form. Images and Flash files, for example, while crawled by
the search engines, are content types that are more difficult for them to analyze.
Spiders are interested in text, text and more text. They don’t see the graphics, clever animations
and other flashy bells and whistles that web designers routinely use to make sites look pretty. In
fact, over-reliance on some of these things can even stop some spiders in their tracks, preventing
them from indexing your pages at all. Make sure that each page includes relevant text-based
content; avoid flash-only sites and frames, which are difficult for spiders to crawl effectively; and
make sure that every page on your site can be reached via a simple text-based hyperlink.
Images are a file type that the search engines have challenges with “identifying” from a relevance
perspective, as there are only a few text fields available for image files in GIF, JPEG, or PNG format
(namely the filename, title, and alt attribute). While we do strongly recommend accurate labeling
of images in these fields, images alone are usually not enough to earn a website page top rankings
for relevant queries.
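The accurate labeling recommended above can be checked mechanically. Below is a minimal sketch using Python’s standard `html.parser` to flag images a spider cannot “read”; the page snippet and filenames are hypothetical:

```python
from html.parser import HTMLParser

class ImgAltChecker(HTMLParser):
    """Collects <img> tags that lack a descriptive alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):  # attribute absent or empty
                self.missing_alt.append(attrs.get("src", "(no src)"))

# Hypothetical page fragment
page = """
<p>Stamp history</p>
<img src="penny-black.jpg" alt="The Penny Black, the first postage stamp">
<img src="banner.gif">
"""
checker = ImgAltChecker()
checker.feed(page)
print(checker.missing_alt)  # images with no text for engines to index
```

Running a check like this across a site is a quick way to find images that are invisible to the engines from a relevance perspective.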
b. Link Structure
Search engines use links on web pages to help them discover other web pages and websites. For
this reason, we strongly recommend taking the time to build an internal linking structure that
spiders can crawl easily. Don’t make a mistake of hiding their navigation in ways that limit spider
accessibility, thus making them unable to get pages listed in the search engines’ indexes, as
shown in figure below:
Figure: Link Structure and Crawling
In the Figure, Google’s spider has reached Page A and sees links to pages B and E. However, even
though pages C and D might be important pages on the site, the spider has no way to reach them
(or even to know they exist) because no direct, crawlable links point to those pages. Great
content, good keyword targeting, and smart marketing won’t make any difference at all if the
spiders can’t reach those pages in the first place.
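The crawl described in the figure amounts to a breadth-first traversal of the internal link graph: any page with no inbound crawlable path is simply never discovered. A small sketch (the page names mirror the figure; the link graph itself is hypothetical):

```python
from collections import deque

# Hypothetical internal link graph: page -> pages it links to.
# C and D link to each other, but nothing crawlable points at them.
links = {
    "A": ["B", "E"],
    "B": [],
    "E": [],
    "C": ["D"],
    "D": ["C"],
}

def crawlable_pages(start, links):
    """Breadth-first crawl from the entry page, following only known links."""
    seen = {start}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in seen:
                seen.add(target)
                queue.append(target)
    return seen

reached = crawlable_pages("A", links)
orphaned = set(links) - reached
print(sorted(orphaned))  # pages a spider starting at A can never find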
A well-designed site architecture can bring many benefits for both users and search engines. A
logical and properly constructed website architecture can help overcome the challenges faced by
crawlers in understanding your website and bring great benefits in search traffic and usability.
There are two critical principles of a well-designed site architecture: a) usability, or making a site
easy to use; and b) information architecture, or crafting a logical, hierarchical structure for
content.
As shown in figure below, a recipes website can use intelligent architecture to fulfill visitors’
expectations about content and create a positive browsing experience. This structure not only
helps humans navigate a site more easily, but also helps the search engines to see that your
content fits into logical concept groups. You can use this approach to help you rank for
applications of your product in addition to attributes of your product.
Figure: Structured site architecture
Although site architecture accounts for a small part of the algorithms, the engines do make use
of relationships between subjects and give value to content that has been organized in a sensible
fashion. Search engines, through their massive experience with crawling the Web, recognize
patterns in subject architecture and reward sites that embrace an intuitive content flow.
One very strict rule for search friendliness is the creation of flat site architecture. Flat sites require
a minimal number of clicks to access any given page, whereas deep sites create long paths of
links required to access detailed content. Flat sites aren’t just easier for search engines to crawl;
they are also simpler for users, as they limit the number of page visits required to reach a
destination. This reduces the abandonment rate and encourages repeat visits. For nearly
every site with fewer than 10,000 pages, all content should be accessible through a maximum of
four clicks from the home page and/or sitemap page.
That said, flatness should not be forced if it does not make sense for other reasons. When
creating flat sites, be careful not to overload pages with links either: a page with 200 links on it
passes little PageRank to any individual page it links to. While flat site architectures are
desirable, you should not force an architecture to be overly flat if it is not otherwise logical to do so.
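The four-click guideline can be audited with the same kind of breadth-first walk, this time recording each page’s minimum click depth from the home page. A sketch with a hypothetical, deliberately deep site structure:

```python
from collections import deque

def click_depths(home, links):
    """Minimum number of clicks from the home page to each reachable page."""
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

# Hypothetical site: a long chain home -> ... -> spec
links = {
    "home": ["category"],
    "category": ["subcategory"],
    "subcategory": ["product"],
    "product": ["detail"],
    "detail": ["spec"],
}
depths = click_depths("home", links)
too_deep = [page for page, d in depths.items() if d > 4]
print(too_deep)  # pages violating the four-click guideline
```

Here the "spec" page sits five clicks deep; adding a crosslink from a higher-level page (or a sitemap page) would flatten the path.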
2. Selecting the Right Keywords
a. Knowing Keywords
The key to effective SEO is knowing what people looking for your products, services or
information are typing into that little box on the search engine home page. These search terms,
known as keywords or keyword phrases (consisting of two, three or more words), form the
foundation of your SEO efforts. Find the optimum keywords, follow a few basic SEO guidelines, and when the
spiders re-index your site you’ll start to see it rise up the organic search rankings for those
keywords and, with a bit of luck, you’ll notice a corresponding increase in the level of targeted
traffic arriving at your site. Choose the wrong keywords, however, and the best SEO in the world
won’t deliver the results you’re looking for.
Knowing your target audience is a critical component of any marketing campaign – and it’s the
same here. Put yourself in your prospect’s shoes, sitting in front of your favourite search engine
looking for information on the product or service you’re selling. What would you type into the
box?
The most basic test of relevance by the search engines is the number of times the phrase appears
on the page.
However, there are many other factors that can also be applied. In its guidance for Webmasters,
Google states: ‘Google goes far beyond the number of times a term appears on a page and
examines all aspects of the page’s content (and the content of the pages linking to it) to
determine if it’s a good match for your query’.
These other factors include:
• Frequency
• Occurrence in headings
• Occurrence in anchor text of hyperlinks
• Markup such as bold
• Density
• Proximity of phrase to start of document and the gap between individual keywords
• Alternative image text
• Document meta data
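The first two factors, frequency and density, are easy to illustrate. A rough sketch, assuming density is simply the share of words belonging to occurrences of the phrase (real engines weigh far more signals than this, as the list above suggests):

```python
import re

def keyword_stats(text, phrase):
    """Frequency and density of a keyword phrase in page text.
    A rough illustration only; not how any engine actually scores pages."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Count exact, in-order occurrences of the phrase
    count = sum(words[i:i + n] == phrase_words
                for i in range(len(words) - n + 1))
    density = count * n / len(words) if words else 0.0
    return {"frequency": count, "density": round(density, 3)}

page_text = ("Stamp history began in 1840. The history of stamps, and stamp "
             "history in particular, fascinates collectors.")
print(keyword_stats(page_text, "stamp history"))
```

A tool like this is useful for comparing drafts of a page, not for chasing any particular density figure, since the engines apply many other factors alongside raw counts.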
Take these keywords and play around with them. Imagine the various combinations of search
terms your prospects might use to find your site. Type these into the engines and look at the
results. Examine the sites that are ranking highly for your chosen keywords. Analyse them and
try to work out how they’re achieving those rankings.
b. Useful Tools
You can also use a wide range of automated keyword suggestion tools. These tools typically
provide insight into the search traffic volumes for the most popular phrases relating to keyword
phrases you provide.
Other tools on the web can provide you with insight into how your leading competitors are doing
in terms of search engine traffic for particular keywords. Services on sites like SEO ToolSet
(www.seotoolset.com) and Compete (www.compete.com) can provide information on which
keywords are driving traffic to your competitors’ websites from the major search engines, and
which of your competitors’ sites are ranking for which keyword phrases – all of which can inform
the choice of keywords you want to optimize for.
While automated tools are a good guide, don’t underestimate the value of people as a source of
inspiration for keyword selection. Nothing is better at understanding the minds of people
than people themselves. What you believe people will search for and what they actually type into
the search box are often two very different things. Get a group of people together – if possible
representative of your target market – and start brainstorming keywords. The results will
probably surprise you.
best. Phrases like ‘mortgages in Ahmedabad’ or ‘mortgage consultant in Ahmedabad’, on the
other hand, are potentially much less competitive, and generate much lower search volumes, but
are much more valuable to your business, because the people who search on those terms are far
more likely to be interested in the products and services you offer.
In other words, the more general a keyword, the less likely it is that your site will contain what
the searcher is trying to find. Effective SEO isn’t just about generating traffic volume; it’s about
finding that elusive balance between keyword search volume and keyword specificity that drives
the maximum volume of targeted traffic to your site.
‘Your target keywords should always be at least two or more words long’, explained search guru
Danny Sullivan, in a 2007 article for Search Engine Watch (www.searchenginewatch.com).
Usually, too many sites will be relevant for a single word, such as “stamps”. This “competition”
means your odds of success are lower. Don’t waste your time fighting the odds. Pick phrases of
two or more words, and you’ll have a better shot at success. Search users are using more
sophisticated search queries to narrow down the results they get back. These days two, three or
even more words are becoming increasingly common. Exploiting that trend in your choice of
optimization keywords can yield real dividends.
Effective search marketing campaigns tend to put a lot of effort into discovering effective long-
tail terms, particularly for use in sponsored listings (PPC) campaigns.
f. Keyword Analysis
Companies wanting to optimize their websites may use the following guidelines to analyze the
keyword phrases used for their products or services.
Demand analysis: Identifying the popularity of each search term, its relevance to the products
or services (qualified by the ‘intent of the searcher’ indicated by the phrase), and the competition
on it. We recommend using two free Google tools: the Google Keyword Tool and the Google
Traffic Estimator, which are great for giving estimates on the popularity of searches for different
products and brands online.
Other sources for identifying keywords include your market knowledge, competitors’ sites, key
phrases from visitors who arrive at your site (from web analytics), and the internal site search tool.
When performing the keyword analysis, we need to understand the different qualifiers that users
type in so that we can target them in our SEM. For example, users add many different types of
qualifiers to a phrase such as ‘car insurance’.
Performance analysis: This assesses how the company is currently performing for these phrases.
With the right tracking tools and tags, it should be possible to report:
• Average position in natural or paid listings
• Click volume referred from search
• Click quality (conversion rates and, ideally, bounce rates to compare landing page effectiveness)
• Outcomes (sales, registrations or leads)
• Costs (CPC and CPA)
• Profitability (based on cost-of-sale or lifetime value models)
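The click-quality and cost metrics involved in performance analysis are simple ratios. A small sketch with hypothetical figures (in practice these numbers come from your analytics and PPC platforms):

```python
def campaign_metrics(clicks, conversions, cost, revenue_per_conversion):
    """Basic performance-analysis ratios for one keyword phrase.
    All inputs are hypothetical reporting figures."""
    conversion_rate = conversions / clicks if clicks else 0.0
    cpc = cost / clicks if clicks else 0.0            # cost per click
    cpa = cost / conversions if conversions else 0.0  # cost per acquisition
    profit = conversions * revenue_per_conversion - cost
    return {"conversion_rate": conversion_rate, "cpc": cpc,
            "cpa": cpa, "profit": profit}

# Hypothetical figures for one keyword phrase
print(campaign_metrics(clicks=500, conversions=25, cost=250.0,
                       revenue_per_conversion=40.0))
```

Comparing these ratios phrase by phrase is what feeds the gap analysis described next: a phrase with healthy click volume but a poor conversion rate points at a landing-page problem rather than a keyword problem.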
Gap analysis: Identifies for each phrase and product where the biggest potential for
improvement is, so you can target your resources accordingly.
Set goals and select keywords: You should identify the different types of keywords you want to
be visible for. Particularly important are the strategic keywords which are critical to success.
3. Content – the heart and soul of a website
a. Introduction
Content is the single most important thing on your website. Unique, relevant, informative
content is what sets your site apart from the competition. It’s the reason users want to visit you,
why other sites will want to link to you and, of course, why search engines will want to suggest
your site to their users in search results.
Content is important because it directly affects the user experience, and user experience is one
of the elements affecting the ranking of the site. However, it is a little harder
to quantify than other site-ranking elements.
Let us understand how user experience affects search engine ranking. Search engines today are
smarter than they have ever been. They can certainly keep track of which results users click when
they run a search, and those selections feed into the organic ranking of your site.
So let’s say a user searches through the results and clicks a link on the fifth page, and suppose several
other people do so as well. That link on the fifth page is going to show more traffic than links that
are higher in the results, so smart search engines will move that page higher in the rankings. It
may not jump right up to the number one position, but it’s possible for the site to move from the
fifth page of rankings to the second or third. This is part of the equation used when user
experience is taken into consideration.
Another part of that experience might be how quickly the user jumps back to the search page.
Maybe when you click that link on the fifth page, you can tell when you hit the site that it’s not
the page you were looking for (or doesn’t contain the information or product that you were
looking for). You click the back button, and you’re taken back to the page of search results. This
is called bounce, and the rate at which users bounce off your site is an indicator of the usability
of the site in terms of how relevant it is to what users are searching for.
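Bounce rate itself is just the share of visits that ended after a single page view. A minimal sketch (the session data is hypothetical; real figures come from your analytics package):

```python
def bounce_rate(sessions):
    """Share of sessions that viewed only one page before leaving.
    `sessions` is a list of page-view counts, one entry per visit."""
    if not sessions:
        return 0.0
    bounces = sum(1 for pageviews in sessions if pageviews == 1)
    return bounces / len(sessions)

# Hypothetical analytics sample: most visitors left after a single page
visits = [1, 1, 3, 1, 5, 2, 1]
print(f"{bounce_rate(visits):.0%}")
```

A high bounce rate for a page that ranks well for a phrase is a strong hint that the page does not deliver what searchers using that phrase expect.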
In the context of SEO, content means the text on each of your web pages. Content can be blog
posts, whitepapers, news articles, social media posts, or any media that gets the message
across. The content you share should not be promotional in nature, however. For example, if I were
marketing a university, I would provide information for users on how to select a university, navigate
financial aid, and so on, rather than just talk about how great the university is.
A common mistake is to bid on your product name and not the need it is trying to serve. One
example is the product “Monster Spray,” a fanciful product to help parents reassure children
who are afraid of the dark. Parents can use the spray to calm their children’s fears of
monsters under the bed or in the closet. Parents searching for this type of help don’t search for
a particular product, so bidding on “Monster Spray” might not be that effective. Instead, parents
use terms such as “child afraid of the dark.” Increasingly, users are searching for the answers to
questions such as “What can I do if my child is afraid of the dark?”, and the search engines, as
recent Google algorithm upgrades show, are seeking to answer these questions. This
development in answering natural-language queries and taking into account the
context of the search is called “semantic search.”
Ultimately, ranking comes down to just two things: the relevance and authority of your page
content in the context of what the user typed into the search box.
4. Optimize one page at a time
Optimize your site one page at a time. Each of the existing pages on your site will need to be
optimized independently. This is because when a search engine presents results to a user, it’s not
presenting whole sites; it’s presenting the individual pages that, according to its algorithms, best
match a user’s query. That means each individual page on your website gives you an explicit
opportunity to optimize for specific keywords or phrases – and that’s important. Each page in
your website will have different target keywords that reflect the page’s content. For example,
say you have a page about the history of stamps. Then “stamp history” might be your keywords
for that page.
Keywords become the structure for your site, with a page for every topic. Laying these
foundations and allowing them to grow according to what you, your team and your visitors think
is the key to successful opportunities to rank. Isolate the important keywords and phrases in your
particular market, then ensure your site includes individual pages with unique, relevant
content optimized for a small number of (ideally one or two, and no more than three) keyword
phrases. The more individual pages you have, the more opportunities you have to get your
business in front of your prospects in the SERPs – and at the end of the day that’s what SEO is all
about.
6. Meta-description Tag
Meta-tags contain information that is accessible to browsers, search engine spiders and other
programs, but that doesn’t appear on the rendered (visible) page for the user. The meta-description
tag is worth including as part of your SEO.
Depending on the query and the page content, leading search engines will sometimes use the
contents of your meta-description tag as the descriptive ‘snippet’ of text that appears below your
page title in the SERPs. A well-written description for each page can, in theory at least, entice
more users to click through to your page when it’s returned in search results. Having compelling,
informative meta-description tags is something that search engines encourage; it certainly won’t
hurt your rankings, is beneficial to users and may well boost traffic to your site.
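A simple automated check can catch the most common meta-description problems before pages go live. The sketch below assumes a roughly 160-character limit, a common rule of thumb for how much of the snippet engines typically display, not a documented cutoff:

```python
def check_meta_description(description, max_len=160):
    """Flag common meta-description problems.
    The max_len figure is a rule of thumb, not a documented engine limit."""
    problems = []
    if not description.strip():
        problems.append("empty description")
    elif len(description) > max_len:
        problems.append(f"too long ({len(description)} chars); may be truncated")
    return problems

# Hypothetical description for a stamp-history page
desc = ("Learn the history of postage stamps, from the Penny Black to "
        "modern commemoratives.")
print(check_meta_description(desc))  # an empty list means no obvious problems
```

Checks like this only cover length and presence; whether the description is compelling enough to earn the click remains an editorial judgment.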
• For the pages the search engines already know about through their regular spidering, they
use the metadata you supply, such as the last date the content was modified (lastmod
date) and the frequency at which the page is changed (changefreq), to improve how they
crawl your site.
• For the pages they don’t know about, they use the additional URLs you supply to increase
their crawl coverage.
• For URLs that may have duplicates, the engines can use the XML Sitemaps data to help
choose a canonical (preferred) version.
• Verification/registration of XML Sitemaps may indicate positive trust/authority signals.
• The crawling/inclusion benefits of Sitemaps may have second-order positive effects, such
as improved rankings or greater internal link popularity.
• Having a site map registered with Google Webmaster Tools can give you an extra
analytical insight into whether your site is suffering from indexation, crawling, or
duplicate content issues.
Once your XML Sitemap has been accepted and your site has been crawled, keep updating your
XML Sitemap with the search engines whenever you add URLs to your site. You’ll also want to keep
your Sitemap file up to date when you add a large volume of pages or a group of strategically
important pages. There is no need to update the XML Sitemap when simply updating content
on existing URLs, but do update it whenever you add new content, and remove any deleted
pages at that time. Further, search engines periodically re-download the Sitemap, so you
don’t need to resubmit it.
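An XML Sitemap with the lastmod and changefreq fields mentioned above is straightforward to generate. A minimal sketch using Python’s standard `xml.etree` module (the URLs and dates are hypothetical):

```python
import xml.etree.ElementTree as ET

def build_sitemap(pages):
    """Build a minimal XML Sitemap.
    `pages` is a list of (url, lastmod, changefreq) tuples."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod, changefreq in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
        ET.SubElement(url, "changefreq").text = changefreq
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical pages
xml = build_sitemap([
    ("https://www.example.com/", "2024-01-15", "weekly"),
    ("https://www.example.com/stamp-history", "2024-01-10", "monthly"),
])
print(xml)
```

Regenerating the file from your page database whenever URLs are added or removed keeps the Sitemap in step with the site without any manual editing.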