Search Engine Optimization: by Udit Khanna
Expert Training Institute
Experience
- 4+ Years – Expert Training Institute
  - Started this venture in 2012
  - Working as a trainer till date
- Working as a Freelancer
  - Expert Seo Training Institute (my site)
  - Delhi Technical Campus
  - Travel Kitty
  - Way 2 Automation
  - Expert Indian Recipes (my site)
  - Get Cakes Online (my site)
  - Expert Digital Marketing (my site)
- Bacati, Inc (2010-2012)
  - As Online Marketing Specialist
- R. S. Components (2009-2010)
  - As a Search Engine Optimiser

Education
- Masters in Business Administration – Dual Specialization in Marketing & Finance, HMRITM, Delhi, 2007-2009
- B.Sc in Hospitality & Hotel Admin – Institute of Hotel Management, BBSR, 2003-2006
- Higher Secondary – Commerce, SDPS, 2001-2003

Skill Set
- SEO – PPC – SMO – HTML – Wordpress
- Photoshop – Dreamweaver – Ms Excel
- Marketing – Lead Generation – Analytics
Search Engine Optimization
TYPES
- White Hat – refers to the use of optimization strategies, techniques and tactics that focus on a human audience and follow search engine guidelines.
- Black Hat – refers to the use of aggressive SEO strategies, techniques and tactics that focus only on search engines, not a human audience, and that usually do not obey search engine guidelines.
- Grey Hat – refers to SEO practice that is riskier than White Hat SEO, but that may or may not result in your site being banned from search engines and their affiliate sites.
How Search Engine [Google] Works?

The web is like an ever-growing library with billions of books and no central filing system. Google uses software known as web crawlers to discover publicly available webpages. Crawlers look at webpages and follow links on those pages, much like you would if you were browsing content on the web. They go from link to link and bring data about those webpages back to Google's servers.

The crawling process begins with a list of web addresses from past crawls and sitemaps provided by website owners. As the crawlers visit these websites, they use links on those sites to discover other pages. The software pays special attention to new sites, changes to existing sites and dead links.

When crawlers find a webpage, Google's systems render the content of the page, just as a browser does. They take note of key signals, from keywords to website freshness, and keep track of it all in the Search index. The Google Search index contains hundreds of billions of webpages and is well over 100,000,000 gigabytes in size.
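As a rough illustration of the "follow links from a seed list" idea above (and not Google's actual crawler, which also renders pages, respects robots.txt, rate limits and much more), here is a minimal Python sketch. The seed URL and page limit are placeholders.

```python
# Minimal illustrative crawler: fetch a page, collect its links, repeat.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def crawl(seed_url, max_pages=10):
    seen, queue = set(), deque([seed_url])
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "ignore")
        except Exception:
            continue  # dead or unreachable link: note it and move on
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            queue.append(urljoin(url, href))  # newly discovered page for a later visit
    return seen

if __name__ == "__main__":
    print(crawl("https://example.com"))  # seed URL is just a placeholder
```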
SEARCH ALGORITHMS

You want the answer, not billions of webpages, so Google's ranking systems sort through the hundreds of billions of webpages in the Search index to give you useful and relevant results in a fraction of a second. These ranking systems are made up of a series of algorithms that analyze what you are looking for and what information to return to you. As Google has evolved Search to make it more useful, it has refined these algorithms to assess searches and results in finer detail.

- Analyzing Your Words: understanding the meaning of your search is crucial to returning good answers.
- Matching Your Search: next, Google looks for webpages with information that matches your query.
- Ranking Useful Pages: for a typical query there are thousands, even millions, of webpages with potentially relevant information, so additional algorithms evaluate how useful these webpages are in order to rank the best pages first.
- Considering Context: information such as your location, past search history and Search settings all help tailor your results to what is most useful and relevant for you in that moment.
- Returning the Best Results: before serving results, Google evaluates how all the relevant information fits together: is there only one topic among the search results, or many? Are there too many pages focusing on one narrow interpretation? The goal is a diverse set of information in formats that are most helpful for your type of search.
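As a toy illustration of the "matching" and "ranking" steps, and emphatically not Google's actual algorithms, the Python sketch below scores a few made-up pages by how often they contain the query terms and sorts them by that score.

```python
# Toy illustration of "match the query, then rank by usefulness".
# This is NOT Google's algorithm, just a term-frequency score over a tiny index.
from collections import Counter

index = {  # pretend these pages were already crawled into the index
    "page1": "learn seo basics seo keywords and links",
    "page2": "buy cheap shoes online shoes shoes shoes",
    "page3": "seo guide how search engines rank pages",
}

def score(query, text):
    words = Counter(text.split())
    return sum(words[term] for term in query.split())

query = "seo rank"
ranked = sorted(index, key=lambda page: score(query, index[page]), reverse=True)
print(ranked)  # pages containing more of the query terms come first
```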
USEFUL INFORMATION

Larry Page once described the perfect search engine as understanding exactly what you mean and giving you back exactly what you want. Google's testing has consistently shown that users want quick answers to their queries, so much of the progress in Search has gone into delivering the most relevant answers faster and in formats that are most helpful to the type of information being sought.
FIGHTING SPAM

Every day, millions of useless spam pages are created. Google fights spam through a combination of computer algorithms and manual review.

Spam sites attempt to game their way to the top of search results through techniques like repeating keywords over and over, buying links that pass PageRank, or putting invisible text on the screen. This is bad for search because relevant websites get buried, and it is bad for legitimate website owners because their sites become harder to find. The good news is that Google's algorithms can detect the vast majority of spam and demote it automatically; the rest is reviewed manually by dedicated teams.

Identifying Spam

Spam sites come in all shapes and sizes. Some are automatically generated gibberish that no human could make sense of; others use much subtler techniques. Google publishes examples of "pure spam", sites using the most aggressive spam techniques, that it has manually identified and removed from appearing in search results.
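One of the spam signals mentioned above, repeating keywords over and over, can be roughly approximated with a keyword-density check. The Python sketch below is purely illustrative; the 10% threshold is an arbitrary assumption, not a published Google limit.

```python
# Rough keyword-density check: flag text where a few words dominate.
from collections import Counter

def keyword_density(text):
    """Return the share of total words taken by the five most common words."""
    words = [w.lower() for w in text.split() if w.isalpha()]
    if not words:
        return {}
    counts = Counter(words)
    return {w: c / len(words) for w, c in counts.most_common(5)}

def looks_stuffed(text, threshold=0.10):
    # 10% is an arbitrary illustration, not an official limit.
    return any(share >= threshold for share in keyword_density(text).values())

sample = "cheap flights cheap flights book cheap flights today cheap flights"
print(keyword_density(sample))
print(looks_stuffed(sample))  # True: "cheap" and "flights" dominate the text
```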
Things we cover: Competitor Analysis

Things we check (a small scripted check for a few of these items follows the list):
- Title Tag
- Headings (H1)
- Discovered Pages
- WWW Resolve
- XML Sitemap
- Meta Description
- Alt Attribute
- Content Quality
- Blocking Factors
- Robots.txt
- Domain Registration
- Mobile Friendliness
- Mobile Font Size
- Custom 404 Page
- Structured Data Markup
- Blog
- Mobile Compatibility
- Mobile Speed
- Load Time
- Analytics
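A few of the items in the list above (title tag, meta description, H1 headings, image alt attributes) can be checked with a small script. The sketch below uses only the Python standard library; the URL is a placeholder and the remaining checks in the list are out of scope.

```python
# Minimal on-page check: title tag, meta description, H1s, images missing alt text.
from html.parser import HTMLParser
from urllib.request import urlopen

class OnPageAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = self.in_h1 = False
        self.title = ""
        self.h1s = []
        self.meta_description = None
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "h1":
            self.in_h1 = True
            self.h1s.append("")
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.meta_description = attrs.get("content")
        elif tag == "img" and not attrs.get("alt"):
            self.images_missing_alt += 1  # missing or empty alt attribute

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
        elif tag == "h1":
            self.in_h1 = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data
        elif self.in_h1 and self.h1s:
            self.h1s[-1] += data

html = urlopen("https://example.com", timeout=5).read().decode("utf-8", "ignore")
audit = OnPageAudit()
audit.feed(html)
print("Title:", audit.title.strip())
print("Meta description:", audit.meta_description)
print("H1 headings:", [h.strip() for h in audit.h1s])
print("Images missing alt:", audit.images_missing_alt)
```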
It's not always about getting visitors to your site, but about getting the right kind of visitors.
Keyword Placement
Google Search Console (formerly Webmaster Tools) lets you:
- Check and set the crawl rate, and view statistics about when GoogleBot accesses a particular site.
- Write and test a robots.txt file, to help discover pages that are accidentally blocked in robots.txt.
- Get a list of links which GoogleBot had difficulty crawling, including the error that GoogleBot received when accessing the URLs in question.
- See which keyword searches on Google led to the site being listed in the SERPs, and the click-through rates of those listings, with extended filter options for devices, search types and date periods.
- Use Rich Cards, a new section added for a better mobile user experience.
In a nutshell, website owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol. The line "User-agent: *" means the section applies to all robots, and "Disallow: /" tells the robot that it should not visit any pages on the site.

A robots.txt file is a file at the root of your site that indicates those parts of your site you don't want accessed by search engine crawlers.

The first thing a search engine spider like Googlebot looks at when it visits a site is the robots.txt file.
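As a minimal sketch of the Robots Exclusion Protocol described above, the Python snippet below parses a small, hypothetical robots.txt with the standard library's robotparser and asks whether particular crawlers may fetch particular URLs.

```python
# Robots Exclusion Protocol sketch: parse robots.txt rules, then ask whether
# a given crawler (user-agent) is allowed to fetch a URL.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: BadBot
Disallow: /

User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("BadBot", "https://example.com/"))             # False: fully blocked
print(parser.can_fetch("Googlebot", "https://example.com/"))          # True
print(parser.can_fetch("Googlebot", "https://example.com/private/x")) # False
```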
XML Sitemap: by placing a properly formatted XML sitemap file on your web server, you enable search engine crawlers (like Googlebot) to find out which pages are present and which have recently changed, and to crawl your site accordingly.
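A minimal sitemap.xml can be produced with the Python standard library; the URLs and lastmod dates below are placeholders.

```python
# Build a minimal XML sitemap (sitemaps.org protocol) for a few URLs.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
pages = [  # placeholder URLs and last-modified dates
    ("https://example.com/", "2024-01-01"),
    ("https://example.com/about", "2024-01-15"),
]

urlset = ET.Element("urlset", xmlns=NS)
for loc, lastmod in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```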
404 (Page Not Found): this is a very common error on the web, and it occurs when you try to visit a page which has either been deleted or has been moved somewhere else.
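One quick way to find such deleted or moved pages is to request each URL and inspect the HTTP status code. The sketch below uses only the Python standard library; the URL list is a placeholder.

```python
# Report URLs that return 404 (page deleted) or cannot be reached at all.
from urllib.error import HTTPError, URLError
from urllib.request import urlopen

urls = ["https://example.com/", "https://example.com/old-page"]  # placeholders

for url in urls:
    try:
        response = urlopen(url, timeout=5)
        print(url, "->", response.status)        # e.g. 200 OK (after any redirects)
    except HTTPError as err:
        print(url, "->", err.code)               # e.g. 404 Not Found
    except URLError as err:
        print(url, "-> unreachable:", err.reason)
```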
Put simply, a degree of control is possible over how information travels from a third-party website to Facebook when a page is shared (or liked, etc.). To make this possible, the information is sent via Open Graph meta tags in the <head> part of the website's HTML.

Other social media sites are also taking advantage of social meta tags. All of the other major platforms, Twitter, LinkedIn and Google+, recognize Open Graph tags. Twitter actually has its own meta tags for Twitter Cards, but if Twitter's robots cannot find any, Twitter uses Open Graph tags instead.

Social media sites are major drivers of the web's traffic, so the ability to harness the power of social meta tags is a vital skill for today's marketers. These tags can have a large effect on conversions and click-through rates.
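The sketch below pulls the Open Graph (og:*) and Twitter Card (twitter:*) meta tags out of a page's <head> and mimics the fallback described above, preferring Twitter tags and falling back to Open Graph. The URL is a placeholder.

```python
# Extract Open Graph (og:*) and Twitter Card (twitter:*) meta tags from a page.
from html.parser import HTMLParser
from urllib.request import urlopen

class SocialTagParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.tags = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        key = attrs.get("property") or attrs.get("name") or ""
        if key.startswith(("og:", "twitter:")):
            self.tags[key] = attrs.get("content")

html = urlopen("https://example.com", timeout=5).read().decode("utf-8", "ignore")
parser = SocialTagParser()
parser.feed(html)

# Prefer Twitter Card tags where present, fall back to Open Graph otherwise.
title = parser.tags.get("twitter:title") or parser.tags.get("og:title")
print(parser.tags)
print("Shared title would be:", title)
```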
RSS (Rich Site Summary; originally RDF Site Summary; often called Really Simple Syndication) uses a family of standard web feed formats to publish frequently updated information: blog entries, news headlines, audio, video. An RSS document (called a "feed") includes full or summarized text plus metadata such as the publishing date and the author's name.
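An RSS 2.0 feed is plain XML, so it can be read with the Python standard library; the feed URL below is a placeholder.

```python
# Read an RSS 2.0 feed and list its items (title + link).
import xml.etree.ElementTree as ET
from urllib.request import urlopen

feed_url = "https://example.com/feed.xml"  # placeholder feed address
xml_data = urlopen(feed_url, timeout=5).read()

root = ET.fromstring(xml_data)
for item in root.iter("item"):          # RSS 2.0: <channel><item>...</item>
    print(item.findtext("title"), "->", item.findtext("link"))
```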
PANDA – Google's Panda update is a search filter introduced in February 2011, meant to stop sites with poor-quality content from working their way into Google's top search results. Panda is updated from time to time; when this happens, sites previously hit may escape if they have made the right changes.

PENGUIN – Google launched the Penguin update in April 2012 to better catch sites deemed to be spamming its search results, in particular those doing so by buying links or obtaining them through link networks designed primarily to boost Google rankings.

PIGEON – the purpose of the Pigeon update is to give preference to local search results, which is useful both for users and for local businesses.
Google's link disavow tool allows publishers to tell Google that they do not want certain links from external sites to be considered as part of Google's system of counting links to rank websites.
Off-page SEO refers to techniques that can be used to improve the position of a website in the search engine results pages (SERPs). Many people associate off-page SEO with link building, but it is not only that. In general, off-page SEO has to do with promotion methods, beyond website design, for the purpose of ranking a website higher in the search results.

Unlike on-page SEO, off-page SEO refers to activities outside the boundaries of the webpage.
Things we cover: Directory Submission
Among all the SEO techniques, article submission can be one of the most successful. Article submission generally refers to the writing of articles that are relevant to your online business and then getting them added to popular article submission directories.
Forum websites are online discussion sites, in other words 'message boards'. Forum posting simply means posting new threads or replying to existing ones in forums in order to get quality inbound links to a website.

Advantages
#1 Forum link building/posting is an SEO technique which helps in building backlinks to your website.
#2 Forum posting is an Internet marketing service which uses forum communities to build inbound links.
#3 Forum posting is an ideal method to quickly build inbound links to your website.
Guest posting is where a writer who owns his or her own blog creates a unique and original post on another blog or site, with a mention of the author, and usually a link to their blog, at the bottom of the article.
Keyword Stuffing

Technical SEO

SEO Tools
- Woorank
- LipperHey
- Seoptimer
- Trifecta
- KW Finder
- SE Cockpit
- Ranks NL
- Web Seo Analytics
- David Naylor
- Live Keyword Density Analysis
- SEO Book
- Add Me Tool
- Rapid Search Metrics
- SEO Centro Tool
- Webconfs Keyword Density Checker
- SEO Chat Keyword Density Tool

Plagiarism Checker
- Duplichecker
- Plagium