
Module 2: SEO Process

1. Make your site easy to crawl

For SEO, the most important starting point is the way your site is built. One of the first things a
search engine crawler encounters is the actual design of your site. Tags, links, navigational
structure, and content are just a few of the elements that catch crawlers’ attention. And it is
better to consider SEO before building your website than to retrofit SEO strategies after it is
built.

a. Content Format
If you’re looking to attract search engine traffic, make it easy for search engines to index your
website. Make sure your site design doesn’t present unnecessary obstacles to search engine
spiders. Thus, the first step in the SEO design process is to ensure that your site can be found and
crawled by the search engines.
To rank well in the search engines, your site’s content—that is, the material available to visitors
of your site—should be in HTML text form. Images and Flash files, for example, while crawled by
the search engines, are content types that are more difficult for search engines to analyze.
Spiders are interested in text, text and more text. They don’t see the graphics, clever animations
and other flashy bells and whistles that web designers routinely use to make sites look pretty. In
fact, over-reliance on some of these things can even stop some spiders in their tracks, preventing
them from indexing your pages at all. Make sure that each page includes relevant text-based
content; avoid flash-only sites and frames, which are difficult for spiders to crawl effectively; and
make sure that every page on your site can be reached via a simple text-based hyperlink.

Images are a file type that the search engines have challenges “identifying” from a relevance
perspective, as there are minimal text-input fields for image files in GIF, JPEG, or PNG format
(namely the filename, title, and alt attribute). While we do strongly recommend accurate labeling
of images in these fields, images alone are usually not enough to earn a website page top rankings
for relevant queries.
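Since the alt attribute is one of the few text signals an image carries, it is worth auditing pages for images that lack it. Below is a minimal sketch using Python’s standard-library HTML parser; the sample markup and file names are invented for illustration:

```python
from html.parser import HTMLParser

class ImgAltChecker(HTMLParser):
    """Collect the src of every <img> tag that lacks descriptive alt text."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):  # alt attribute missing or empty
                self.missing_alt.append(attrs.get("src", "(no src)"))

html = '<img src="logo.png" alt="Acme Corp logo"><img src="banner.jpg">'
checker = ImgAltChecker()
checker.feed(html)
print(checker.missing_alt)  # → ['banner.jpg']
```

A report like this makes it easy to go back and add the labeling the text recommends.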

b. Link Structure
Search engines use links on web pages to help them discover other web pages and websites. For
this reason, we strongly recommend taking the time to build an internal linking structure that
spiders can crawl easily. Don’t make the mistake of hiding your navigation in ways that limit
spider accessibility, thus preventing your pages from being listed in the search engines’ indexes,
as shown in the figure below:

Figure: Link Structure and Crawling

In the figure, Google’s spider has reached page A and sees links to pages B and E. However, even
though pages C and D might be important pages on the site, the spider has no way to reach them
(or even to know they exist) because no direct, crawlable links point to those pages. Great
content, good keyword targeting, and smart marketing won’t make any difference at all if the
spiders can’t reach those pages in the first place.
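The figure’s scenario can be modeled as a simple graph traversal: a crawler can only index what it can reach by following links. A rough sketch in Python; the page names mirror the figure, and the link graph itself is invented:

```python
from collections import deque

def reachable_pages(links, start):
    """Breadth-first walk of the internal link graph, the way a crawler
    discovers pages: only pages connected by followable links are found."""
    seen = {start}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in seen:
                seen.add(target)
                queue.append(target)
    return seen

# The figure's scenario: the spider lands on A and sees links to B and E,
# but nothing on the reachable pages links to C or D.
links = {"A": ["B", "E"], "B": [], "E": [], "C": ["D"], "D": ["C"]}
print(sorted(reachable_pages(links, "A")))  # → ['A', 'B', 'E']
```

Pages C and D link to each other, yet neither appears in the result: exactly the indexing gap the figure describes.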

c. Overall Site Architecture


Although site architecture—the creation of structure and flow in a website’s topical hierarchy—
is typically the territory of information architects (or is created without assistance from a
company’s internal content team), its impact on search engine rankings, particularly in the long
run, is substantial. It is, therefore, a wise endeavor to follow basic guidelines of search
friendliness.

A well-designed site architecture can bring many benefits for both users and search engines. A
logical and properly constructed website architecture can help overcome the challenges faced by
crawlers in understanding your website and bring great benefits in search traffic and usability.
There are two critical principles of a well-designed site architecture: a) usability, or making a site
easy to use; and b) information architecture, or crafting a logical, hierarchical structure for
content.
As shown in the figure below, a recipes website can use intelligent architecture to fulfill visitors’
expectations about content and create a positive browsing experience. This structure not only
helps humans navigate a site more easily, but also helps the search engines to see that your
content fits into logical concept groups. You can use this approach to help you rank for
applications of your product in addition to attributes of your product.

Figure: Structured site architecture
Although site architecture accounts for a small part of the algorithms, the engines do make use
of relationships between subjects and give value to content that has been organized in a sensible
fashion. Search engines, through their massive experience with crawling the Web, recognize
patterns in subject architecture and reward sites that embrace an intuitive content flow.

One very strict rule for search friendliness is the creation of flat site architecture. Flat sites require
a minimal number of clicks to access any given page, whereas deep sites create long paths of
links required to access detailed content. Flat sites aren’t just easier for search engines to crawl;
they are also simpler for users, as they limit the number of page visits the user requires to reach
his destination. This reduces the abandonment rate and encourages repeat visits. For nearly
every site with fewer than 10,000 pages, all content should be accessible through a maximum of
four clicks from the home page and/or sitemap page.
That said, flatness should not be forced if it does not make sense for other reasons. When
creating flat sites, be careful not to overload pages with links either. A page with 200 links on it
passes little PageRank to any one of those linked pages. While flat site architectures are
desirable, you should not force an architecture to be overly flat if it is not otherwise logical to do
so.

Figure: Deep site structure vs. flat site structure
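The four-click guideline above can be checked mechanically by treating the site as a link graph and measuring the shortest click path from the home page. A rough illustration in Python; the page names and link graph are invented, and the four-click threshold follows the text:

```python
from collections import deque

def click_depths(links, home):
    """Return the minimum number of clicks from the home page to every
    reachable page (breadth-first shortest path in the link graph)."""
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

links = {
    "home": ["category"],
    "category": ["subcategory"],
    "subcategory": ["detail"],
    "detail": ["fine-print"],
}
depths = click_depths(links, "home")
too_deep = [p for p, d in depths.items() if d > 4]  # the four-click guideline
print(depths["fine-print"], too_deep)  # → 4 []
```

Running a check like this over a real crawl quickly surfaces content buried too deep for users and spiders alike.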

2. Selecting right Keywords
a. Knowing Keywords
The key to effective SEO is knowing what people looking for your products, services or
information are typing into that little box on the search engine home page. These search terms,
known as keywords or keyword phrases (which consist of two, three or more words), form the
foundation of your SEO efforts. Find the optimum keywords, follow a few basic SEO guidelines,
and when the
spiders re-index your site you’ll start to see it rise up the organic search rankings for those
keywords and, with a bit of luck, you’ll notice a corresponding increase in the level of targeted
traffic arriving at your site. Choose the wrong keywords, however, and the best SEO in the world
won’t deliver the results you’re looking for.
Knowing your target audience is a critical component of any marketing campaign – and it’s the
same here. Put yourself in your prospect’s shoes, sitting in front of your favourite search engine
looking for information on the product or service you’re selling. What would you type into the
box?

The most basic test of relevance by the search engines is the number of times the phrase appears
on the page.
However, there are many other factors that can also be applied. In its guidance for webmasters,
Google states: ‘Google goes far beyond the number of times a term appears on a page and
examines all aspects of the page’s content (and the content of the pages linking to it) to
determine if it’s a good match for your query’.
These other factors include:
• Frequency
• Occurrence in headings
• Occurrence in anchor text of hyperlinks
• Markup such as bold
• Density
• Proximity of phrase to start of document and the gap between individual keywords
• Alternative image text
• Document meta data
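Frequency and density, the first factors above, are easy to compute for any page. A simple sketch in Python; the sample sentence and keyword phrase are invented for illustration, and real engines weigh many more signals, as the list shows:

```python
import re

def keyword_stats(text, phrase):
    """Count occurrences of a keyword phrase on a page, and its density
    (words belonging to the phrase as a share of all words on the page)."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    density = hits * n / len(words) if words else 0.0
    return hits, round(density, 3)

page = ("Cheap car insurance quotes. Compare cheap car insurance "
        "from leading providers and save on your car insurance.")
print(keyword_stats(page, "car insurance"))  # → (3, 0.353)
```

Numbers like these are only the most basic relevance test; headings, anchor text, markup and proximity all feed into the same judgment.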

Take these keywords and play around with them. Imagine the various combinations of search
terms your prospects might use to find your site. Type these into the engines and look at the
results. Examine the sites that are ranking highly for your chosen keywords. Analyse them and
try to work out how they’re achieving those rankings.

b. Useful Tools
You can also use a wide range of automated keyword suggestion tools. These tools typically
provide insight into the search traffic volumes for the most popular phrases relating to keyword
phrases you provide.

Other tools on the web can provide you with insight into how your leading competitors are doing
in terms of search engine traffic for particular keywords. Services on sites like SEO ToolSet
(www.seotoolset.com) and Compete (www.compete.com) can provide information on which
keywords are driving traffic to your competitors’ websites from the major search engines, and
which of your competitors’ sites are ranking for which keyword phrases – all of which can inform
the choice of keywords you want to optimize for.

While automated tools are a good guide, don’t underestimate the value of people as a source of
inspiration for keyword selection. Use the automated tools to assist, but please remember that,
although automated tools are brilliant, nothing is better at understanding the minds of people
than people themselves. What you believe people will search for and what they actually type into
the search box are often two very different things. Get a group of people together – if possible
representative of your target market – and start brainstorming keywords. The results will
probably surprise you.

c. Make a manageable list of Keywords


Keep your list of keywords to a manageable size. What constitutes a manageable size will depend
on your situation – on how much time, money and resources you have available for your SEO
effort. Remember, there’s nothing wrong with starting small: optimize a few pages for what you
believe are your main keywords, and monitor the results on ranking, traffic and conversion for
those pages. That will give you a solid foundation from which to build your optimization efforts
and your SEO expertise.

d. Avoid using too general keywords


Start by eliminating all of the words or phrases that are much too general. Broad single-word
terms like ‘shoes’, ‘mortgages’, ‘bottles’ or ‘computers’ tend to be both very difficult to rank for
(because they’re high-traffic terms that can apply equally to a huge number of sites across the
net) and also far too generic to drive valuable targeted traffic to your site.
Suppose you’re an independent mortgage consultant based in Ahmedabad, Gujarat, India. If you
choose to optimize a page based on the keyword ‘mortgages’ you’ll find yourself competing with
a raft of mortgage providers, mortgage advisers, mortgage brokers, mortgage consultants,
mortgage industry news sites, etc. from all over the world. Even if your page does make it to those
coveted elevated positions in the SERPs for that keyword, the chances that people searching for
the term ‘mortgages’ will be looking for an independent consultant in Ahmedabad are slim at
best. Phrases like ‘mortgages in Ahmedabad’ or ‘mortgage consultant in Ahmedabad’, on the
other hand, are potentially much less competitive, and generate much lower search volumes, but
are much more valuable to your business, because the people who search on those terms are far
more likely to be interested in the products and services you offer.
In other words, the more general a keyword, the less likely it is that your site will contain what
the searcher is trying to find. Effective SEO isn’t just about generating traffic volume; it’s about
finding that elusive balance between keyword search volume and keyword specificity that drives
the maximum volume of targeted traffic to your site.
‘Your target keywords should always be at least two or more words long’, explained search guru
Danny Sullivan, in a 2007 article for Search Engine Watch (www.searchenginewatch.com).
Usually, too many sites will be relevant for a single word, such as “stamps”. This “competition”
means your odds of success are lower. Don’t waste your time fighting the odds. Pick phrases of
two or more words, and you’ll have a better shot at success. Search users are using more
sophisticated search queries to narrow down the results they get back. These days two, three or
even more words are becoming increasingly common. Exploiting that trend in your choice of
optimization keywords can yield real dividends.

e. Long-tail versus short-tail keywords


Keywords in SEO fall into two broad categories. Short-tail keywords are simple one- or two-word
phrases that are typically very general in nature and attract a large volume of individual search
requests. Long-tail keywords, on the other hand, are more complex queries that contain more
words and are much more specific in nature. Individually they attract a much lower volume of
search traffic than their short-tail counterparts, but cumulatively these long-tail-type queries
account for the lion’s share of internet search traffic.
In any keyword domain there are a small number of highly trafficked keywords or phrases and a
large number of low-trafficked keywords or phrases. Often, the keyword domain approximates
to the right half of a normal curve with the tail of the curve extending to infinity. Low-trafficked
keywords are therefore also known as ‘long-tail’ keywords. The highly trafficked [short-tail]
keywords have the following characteristics: highly competitive, consist of one or two words,
have a high cost per click and may have low conversion rates as they tend to be quite general.
Examples from the accommodation sector might include ‘hotel’, ‘London hotel’ or ‘cheap hotel’.
Low-trafficked [long-tail] keywords are not so competitive, often consist of four, five or more
words, have a lower cost per click and can have a higher conversion rate as they are quite specific
indicating that the searcher is further along the online purchasing cycle. Examples might include
‘cheap city centre hotel Dublin’, ‘stags weekend hotel Temple Bar Dublin’ or ‘business hotel with
gym and spa Wexford’.

Effective search marketing campaigns tend to put a lot of effort into discovering effective long-
tail terms, particularly for use in sponsored listings (PPC) campaigns.

f. Keyword Analysis
A company wanting to optimize its website may use the following guidelines to analyze keyword
phrases used for its products or services.
Demand analysis: Identifying the popularity of each search term, its relevance to the products
or services qualified by the ‘intent of the searcher’ indicated by the phrase, and the competition
on it. We recommend using two free Google tools, the Google Keyword Tool and Google Traffic
Estimator, which are great for giving estimates on the popularity of searches for different
products and brands online.
Other sources for identifying keywords include your market knowledge, competitors’ sites, key
phrases from visitors who arrive at your site (from web analytics), the internal site search tool.
When performing the keyword analysis we need to understand the different qualifiers that users
type in so that we can target them in our SEM. For example, here are eight different types of
keyword qualifiers for ‘car insurance’:

• Comparison/quality – compare car insurance
• Adjective (price/product qualifiers) – cheap car insurance, woman car insurance
• Intended use – high mileage car insurance
• Product type – holiday car insurance
• Vendor or brand – Churchill car insurance
• Location – car insurance uk
• Action request – buy car insurance
• Provider type – car insurance company, car insurance supermarket

Performance analysis: This assesses how the company is currently performing for these phrases.
With the right tracking tools and tags, it should be possible to report average position in natural
or paid listings; click volume referred from search; click quality (conversion rates and ideally
bounce rates to compare landing page effectiveness); outcomes (sales, registrations or leads);
costs (CPC and CPA); and profitability (based on cost of sale or lifetime value models).

Gap analysis: Identifies for each phrase and product where the biggest potential for
improvement is, so you can target your resources accordingly.

Set goals and select keywords: You should identify the different types of keywords you want to
be visible for. Particularly important are the strategic keywords which are critical to success.

3. Content – the heart and soul of a website
a. Introduction
Content is the single most important thing on your website. Unique, relevant, informative
content is what sets your site apart from the competition. It’s the reason users want to visit you,
why other sites will want to link to you and, of course, why search engines will want to suggest
your site to their users in search results.
Content is important because it directly affects the user experience. And user experience is also
one of the elements affecting the ranking of the site. However, it is a little harder to quantify
than other site-ranking elements.
Let us understand how user experience affects search engine ranking. Search engines today are
smarter than they have ever been. They can certainly keep track of what results users click when
they run a search. Those result selections are essential to adding to the organic ranking of your
site. Search engines monitor which sites are actually clicked on by users from the search results.
So let’s say a user searches through the results and clicks a link on the fifth page, and suppose
several other people do so as well. That link on the fifth page is going to show more traffic than links that
are higher in the results, so smart search engines will move that page higher in the rankings. It
may not jump right up to the number one position, but it’s possible for the site to move from the
fifth page of rankings to the second or third. This is part of the equation used when user
experience is taken into consideration.
Another part of that experience might be how quickly the user jumps back to the search page.
Maybe when you click that link on the fifth page, you can tell when you hit the site that it’s not
the page you were looking for (or doesn’t contain the information or product that you were
looking for). You click the back button, and you’re taken back to the page of search results. This
is called bounce, and the rate at which users bounce off your site is an indicator of the usability
of the site in terms of how relevant it is to what users are searching for.
In the context of SEO, content means the text on each of your web pages. Content can be blog
posts, whitepapers, news articles, social media posts, or any medium that gets the message
across. But the content shared should not be advertising in nature. For example, if I were
marketing a university, I would have information for users on how to select a university, navigate
financial aid, and so on and not just talk about how great the university is.
A common mistake is to bid on your product name and not the need it is trying to serve. One
example is the product “Monster Spray,” a fanciful product to help parents calm children who
are afraid of the dark. Parents can use the spray to allay their children’s fears of monsters under
the bed or in the closet. Parents searching for this type of help don’t search for
a particular product. So bidding on Monster Spray might not be that effective. Instead, parents
use terms such as “child afraid of the dark.” Increasingly, users are searching for the answers to
questions such as “What can I do if my child is afraid of the dark?” and the search engines, as
recent Google algorithm upgrades show, are seeking to answer these questions. This
development in terms of trying to answer natural language queries and take into account the
context of the search is called “semantic search.”

b. Content is for audience


When writing content for your site the key thing to remember is that you’re writing it, first and
foremost, for a human audience, not for search engine spiders. Yes, your pages need to be
‘search engine friendly’, but the spiders should always be a secondary consideration: put your
human audience first.
Frankly, if your copy doesn’t engage real, live people when they arrive – address their needs right
from the start – then investing time and resources to attract more search engine traffic is
pointless. If your content doesn’t deliver, visitors will leave as soon as they arrive. Remember, on
the web you don’t have a captive audience. Users are in control – one click and they’re gone.
Your copy needs to be relevant, it has to be interesting, and above all it has to provide the
answers the user is looking for. It needs to do all of this quickly, in a concise, easily scannable
way.
Search engine optimization for sustainable high ranking, therefore, hinges on the production of
great original content that appeals to real, live people. The acid test is whether your visitors will
bookmark a page of your content or tell a friend about it. Think not only about the home page,
but also about other pages within the site.
By creating more valuable content and then showcasing it within your navigation, or grouping
it within a few pages such as a ‘Useful Resources’ page or a more extensive ‘Resource Centre’,
you can encourage more people to link to your content naturally, or approach them and suggest
they link not only to the home page, but directly to the useful tools that have been created.

c. Linking Content and the search engines


Successive generations of search engines have become much ‘smarter’ at interpreting the actual
visible content on a page and judging its relevance to the user. They don’t rely on meta-data to
judge the content of a page; they analyse and interpret the actual content presented to the user,
and they’re getting better at doing it all the time. Since relevant content will also improve your
company’s rankings on search engines, it is even more important to figure out the brand story
and how to convey that story best, not only through keywords but through relevant content.
Search engine ranking algorithms are all about analysing and prioritizing your content. There
are all sorts of criteria that contribute to the process – some known, many guessed at and no
doubt some that we’ll never know. At the end of the day, though, they all combine to measure
just two things: the relevance and authority of your page content in the context of what the user
typed into the search box.

d. Linking Positioning strategy and search


It is important that the keywords on your website home page relate to the search terms targeting
that page. Develop the positioning statement first and then the website or landing page. The
more you know and develop your brand and select relevant keywords, the more likely you will
be found by search engines.

One thing we do know is that it is important for SEO to have relevant keywords on the exact web
page you want to come up in search results, often called a “landing page” or “microsite”.
Therefore, the first step in SEM is deciding who you are, what is relevant to your product or
service, who your customers are, and your basic company strategy. Only then can you know how
your customers search, and knowing what terms they use is critical to doing well in the search
process.

e. Placing keywords in Content


The major questions are where to place the keywords in the content, and when and how often
they need to appear on the page. However, you should not worry about this too much. Focus on
writing compelling copy that addresses the needs of your target audience while keeping your
target keywords for that page in mind, and the search engines will do the rest. If you’re writing
copy about a specific set of keyword phrases, there’s a high probability you’ll use those keyword
phrases and related phrases organically in your writing and will achieve a natural balance. That’s
exactly what search engines are looking for.
Search engines are now more than intelligent enough to understand the semantic relationships
between words and phrases, so don’t try to assist them by engineering particular keyword
densities. Leave it
to their algorithms, and simply enjoy the rewards that their efforts can deliver to you and your
website.
Jill Whalen, CEO and founder of search marketing firm High Rankings, emphasizes that there are
no hard-and-fast rules to follow and there is no magical number of words per page or number of
times to use your phrases in your copy. The important thing is to use your keyword phrases only
when and where it makes sense to do so for the real people reading your pages. Simply sticking
keyword phrases at the top of the page for no apparent reason isn’t going to help, and it just
looks silly.

4. Optimize one page at a time
Optimize your site one page at a time. Each of the existing pages on your site will need to be
optimized independently. This is because when a search engine presents results to a user, it’s not
presenting whole sites; it’s presenting the individual pages that, according to its algorithms, best
match a user’s query. That means each individual page on your website gives you an explicit
opportunity to optimize for specific keywords or phrases – and that’s important. Each page in
your website will have different target keywords that reflect the page’s content. For example,
say you have a page about the history of stamps. Then “stamp history” might be your keywords
for that page.
Keywords become the structure for your site, with a page for every topic. Laying these
foundations and allowing them to grow according to what you, your team and your visitors think
is the key to successful opportunities to rank. Isolate the important keywords and phrases in your
particular market and then ensure your site includes individual pages with unique, relevant
content optimized for a small number of (ideally one or two, and no more than three) keyword
phrases. The more individual pages you have, the more opportunities you have to get your
business in front of your prospects in the SERPs – and at the end of the day that’s what SEO is all
about.

5. Choose your page <title>s carefully


There’s a small but very important HTML tag that lives in the header section of the code on each
of your web pages. It’s called the ‘title’ tag, and the text it contains is what appears in the title
bar at the top of your browser window when you visit a web page. It’s also, crucially, the text
that appears as the ‘clickable’ blue link for a page when it’s presented to users in the SERPs.
That means that what you put in the title tag is incredibly important for the following reasons.
First, the title tag is one of the most important on-page factors used by the search engines to
rank your page. At this stage most, if not all, SEO experts agree that appropriate use of the title
tag is a key factor in ranking well in the SERPs, and advise weaving your primary keyword(s) for a
page into the title tag whenever possible. Just remember not to sacrifice readability for your
human audience. Second, the title is the first glimpse of your content a search user will see. Giving
your pages concise, compelling and informative titles will entice more users to click through to
your page when it appears in search results.
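As a small illustration of how the title text can be pulled out of a page and checked against a target keyword, here is a sketch using Python’s standard-library HTML parser; the page markup and keyword are invented:

```python
from html.parser import HTMLParser

class TitleExtractor(HTMLParser):
    """Pull the <title> text out of a page's <head>."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

page = "<html><head><title>Stamp History | Acme Philately</title></head></html>"
parser = TitleExtractor()
parser.feed(page)
# Does the title carry the page's primary keyword, and is it concise?
print(parser.title, "stamp history" in parser.title.lower())
```

A check like this across all pages quickly finds titles that are missing, duplicated, or stripped of their target keywords.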

6. Meta-description Tag
Meta-tags contain information that is accessible to browsers, search engine spiders and other
programs, but that doesn’t appear on the rendered (visible) page for the user. The meta-description
tag is worth including as part of your SEO.
Depending on the query and the page content, leading search engines will sometimes use the
contents of your meta-description tag as the descriptive ‘snippet’ of text that appears below your
page title in the SERPs. A well-written description for each page can, in theory at least, entice
more users to click through to your page when it’s returned in search results. Having compelling,
informative meta-description tags is something that search engines encourage; it certainly won’t
hurt your rankings, is beneficial to users and may well boost traffic to your site.
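A similar sketch can read the meta-description tag out of each page so you can review its snippet text; the sample markup is invented for illustration:

```python
from html.parser import HTMLParser

class MetaDescription(HTMLParser):
    """Read the content of <meta name="description"> from a page."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "description":
                self.description = attrs.get("content", "")

page = ('<head><meta name="description" content="Independent mortgage '
        'consultant in Ahmedabad offering free initial advice."></head>')
parser = MetaDescription()
parser.feed(page)
print(parser.description)
```

Reviewing these descriptions side by side makes it obvious which pages have a compelling snippet and which are missing one entirely.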

7. Submitting your site URL and sitemap


All of the major search engines offer a free submission process, and submitting your site won’t
hurt. If you want to kick-start the indexing process, by all means go ahead and manually submit
your home page and one or two other important pages.
The other thing you can do that will help search engines to crawl all relevant pages on your
website is to submit an XML sitemap that adheres to the sitemap protocol outlined on
www.sitemaps.org.
Sitemaps are an easy way for webmasters to inform search engines about pages on their sites
that are available for crawling. In its simplest form, a Sitemap is an XML file that lists URLs for a
site along with additional metadata about each URL (when it was last updated, how often it
usually changes, and how important it is, relative to other URLs in the site) so that search engines
can more intelligently crawl the site.
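A minimal Sitemap of this shape can be generated with a few lines of code. Here is a sketch in Python using the standard library; the URLs and metadata values are invented, and the full protocol is described at www.sitemaps.org:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML Sitemap per the sitemaps.org protocol:
    one <url> entry per page, each with <loc>, <lastmod>, <changefreq>."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod, changefreq in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
        ET.SubElement(url, "changefreq").text = changefreq
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    ("https://www.example.com/", "2024-01-15", "weekly"),
    ("https://www.example.com/stamp-history", "2024-01-10", "monthly"),
])
print(sitemap)
```

In practice the Sitemap would be written to a file at the site root and its URL submitted to the engines, as described below.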
Web crawlers usually discover pages from links within the site and from other sites. Sitemaps
supplement this data to allow crawlers that support Sitemaps to pick up all URLs in the Sitemap
and learn about those URLs using the associated metadata. Using the Sitemap protocol does not
guarantee that web pages are included in search engines, but provides hints for web crawlers to
do a better job of crawling your site. The benefits of Sitemaps include the following:

• For the pages the search engines already know about through their regular spidering, they
use the metadata you supply, such as the last date the content was modified (lastmod
date) and the frequency at which the page is changed (changefreq), to improve how they
crawl your site.
• For the pages they don’t know about, they use the additional URLs you supply to increase
their crawl coverage.
• For URLs that may have duplicates, the engines can use the XML Sitemaps data to help
choose a canonical version.
• Verification/registration of XML Sitemaps may indicate positive trust/authority signals.
• The crawling/inclusion benefits of Sitemaps may have second-order positive effects, such
as improved rankings or greater internal link popularity.
• Having a site map registered with Google Webmaster Tools can give you an extra
analytical insight into whether your site is suffering from indexation, crawling, or
duplicate content issues.
Once your XML Sitemap has been accepted and your site has been crawled, keep updating your
XML Sitemap with the search engines whenever you add URLs to your site. You’ll especially want
to keep your Sitemap file up-to-date when you add a large volume of pages or a group of pages
that are strategic. There is no need to update the XML Sitemap when simply updating content on
existing URLs, but do update your Sitemap file whenever you add new pages, and remove any
deleted pages at that time. Further, search engines will periodically re-download the Sitemap, so
you don’t need to resubmit it.

8. Optimization is a continuous process


The ever-changing nature of the search environment means that there’s no end to SEO.
Optimization is a dynamic and iterative process – and if you want sustained results it needs to be
ongoing. You have to measure, monitor and refine continuously, tweaking and tuning your
optimization efforts based on changing conditions in the marketplace, the search engines and
your customers. You have to work hard to find the right blend of targeted keywords for your
particular business, operating within your particular market at the current point in time. You have
to optimize your pages based on those keywords, and deliver compelling, high-impact content.
