
The purpose of this paper is to consider the current state of SEO in 2012.

It studies several of the key drivers that affect ranking signals: past, present and future. We will examine some of the recent changes implemented by Google over the past 12 months, to better understand the impact these are having on modern Search Engine Optimisation.

The explosion of the internet has been the greatest achievement of the past century. It was only natural that a technology would emerge to manage and organise the content of the websites and webpages that make up the visible internet. We all look to Google as the leading search engine today; they quite rightly deserve respect, on a good day. When we consider that Google was created as a small project called BackRub in 1996 by two engineers from Stanford University, it's clear that the engine would draw upon some of the principles of academic referencing to index the web, not to mention the processing power and data modelling that's needed to effectively crawl each website and provide accurate search results. Given the size of the rapidly growing index, Google is still able to crawl, index and rank a website and return a results page within 0.3 seconds. (That's faster than the strike of a diamondback rattlesnake!)

Every webpage is given a small amount of Page Rank according to the original formula by Google founder Larry Page. Page Rank is assigned from Google to websites, and then to webpages, by the use of hyperlinks. Subsequently the associated page becomes authoritative based on the types of links from internal and external sources. This ranking factor helps create an organic web index. Google has turned this into a vote for the website's authority, which has become one of the key ranking signals. Page Rank is an iterative algorithm, which means that it doesn't stay static; it's passed back and forth depending on the amount of links per page (a little like water being carried from page to page, site to site). The Page Rank formula has been documented as follows:

PR(A) = (1 - d) + d (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))

where d is a damping factor (typically around 0.85), T1...Tn are the pages linking to page A, and C(T) is the number of outbound links on page T.
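To make the iterative nature of the formula concrete, here is a minimal sketch in Python; the three-page link graph is invented purely for illustration and is not taken from any source cited here.

# Minimal iterative Page Rank sketch using the formula above.
# The three-page link graph below is purely illustrative.

DAMPING = 0.85
ITERATIONS = 20

# page -> list of pages it links out to
links = {
    "home": ["about", "services"],
    "about": ["home"],
    "services": ["home", "about"],
}

# Start every page with an equal amount of Page Rank.
pr = {page: 1.0 for page in links}

for _ in range(ITERATIONS):
    new_pr = {}
    for page in links:
        # Sum PR(T)/C(T) over every page T that links to this page.
        inbound = sum(
            pr[other] / len(outbound)
            for other, outbound in links.items()
            if page in outbound
        )
        new_pr[page] = (1 - DAMPING) + DAMPING * inbound
    pr = new_pr

for page, score in sorted(pr.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")

Running the loop to convergence shows authority pooling on the most linked-to pages, which is exactly the "water being carried from page to page" behaviour described above.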

Page Rank changes over time and is re-calculated as updates come out and back-link profiles change. It has become a little less important, because over 200 ranking signals have now been developed to rank websites as part of the modern search algorithm in 2012. Google has historically used Page Rank as a crawling signal on a logarithmic 1-10 scale, assigning each website a Page Rank. This scale provides an indication of quality (per page) and an indication of how often the site should be visited by Google's search bots, which update the index of the web. Page Rank can be moved throughout a website to improve the strength of a page by using the newer nofollow link (introduced in 2005) or simply by removing links from a page. This is known as Page Rank sculpting. The nofollow link doesn't improve Page Rank; it is ignored, thereby keeping the link-juice on the page, based on the amount of links per page. The information architecture and information hierarchy created as part of the UX or UI can have a significant effect on this at a design level. By changing the flow of link-juice at a design level, we're able to create a taxonomy that supports the information hierarchy of a website's structure.
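As a rough illustration of the sculpting idea just described, the sketch below (invented page names and figures, reflecting the simple links-per-page model the text describes rather than Google's actual current behaviour) shows how the juice passed per followed link grows as boilerplate links are nofollowed or removed:

# Illustrative sketch of Page Rank sculpting under the simple
# "juice divided by links on the page" model described above.
# All figures and page names are invented for demonstration.

def juice_per_followed_link(page_rank_pool, links, nofollow):
    """Split a page's link-juice across its followed links only.

    Under the sculpting model the text describes, nofollowed
    links are ignored, so each remaining followed link
    receives a larger share of the pool.
    """
    followed = [link for link in links if link not in nofollow]
    if not followed:
        return {}
    share = page_rank_pool / len(followed)
    return {link: share for link in followed}

links = ["products", "about", "terms", "privacy"]

# All links followed: the juice is split four ways.
print(juice_per_followed_link(1000, links, nofollow=set()))

# Boilerplate pages nofollowed: the same juice is split two ways.
print(juice_per_followed_link(1000, links, nofollow={"terms", "privacy"}))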
Another way we achieve this is by interlinking content or navigation through the anchor text of a hyperlink. In recent years we've seen changes to how the authority of internal and external links can affect the positioning of a webpage, due to relevance within the results page. This has been a main cause of SEO practitioners over-optimising the anchor text of hyperlinks, for example:

<a href="http://www.liquid-silver-marketing.co.uk/seo.php">SEO</a>

Anchor text from links within the back-link profile helps to determine the topical relevance and authority of each website. Creating targeted links based on content has always been an important factor for on-site and off-site SEO. Many search practitioners have overused this method, targeting the anchor text of external links too heavily to improve keyword-related rankings. Google responded in 2012 with the link spam penalty, also known as Google Penguin, which analyses the anchor text density of links to rate the quality of the on-site content.

When we consider the nature of the authority of a website we must think of trust. But how do you understand trust within an algorithm that can return a result in 0.3 seconds? Through the quality of the links and the persona adopted as part of a link-building strategy. There will always be a need to create accessible, quality content as the foundation of SEO, but quality online is subjective when we really try to define what quality content means. The accessibility of content can be defined through the areas that we as search professionals can control, such as on-site, page-level SEO, i.e.:

Title tags
Meta descriptions
Keywords
URL structure
Internal linking structures
Valid XHTML / CSS
Server-side hierarchy
Sitemaps
Unique content

Google is still only able to fully index and understand XHTML 1.0; it is unable to fully index technologies such as Flash, Java and Ajax. Our task in Search Engine Optimisation is to look beyond these on-page accessibility improvements and create valuable content that's deserving of rankings in search engines.

Too much Search Engine Optimisation

We in search have always understood the value of exact-match keywords and anchor text to pass link-juice via Page Rank, which helps to improve the relevancy of a webpage. But given the recent changes in Google's algorithm, the exact-match anchor seems to be dying, replaced by the broader measures of link authority mentioned above.
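To see why an over-optimised anchor profile stands out, here is a minimal sketch that measures how concentrated a back-link profile is on a single money keyword; the sample profile and the warning threshold are invented for illustration and are not figures Google has published.

# Minimal sketch: measure how concentrated a back-link profile is
# on exact-match anchors. The sample profile is invented; the point
# is that a natural profile mixes brand, URL and generic anchors.

from collections import Counter

backlink_anchors = [
    "SEO", "SEO", "SEO", "SEO", "SEO", "SEO", "SEO",
    "Liquid Silver Marketing",
    "www.liquid-silver-marketing.co.uk",
    "click here",
]

counts = Counter(backlink_anchors)
total = len(backlink_anchors)

for anchor, count in counts.most_common():
    print(f"{anchor!r}: {count / total:.0%} of profile")

exact_match_share = counts["SEO"] / total
if exact_match_share > 0.5:  # illustrative threshold, not Google's
    print("Warning: anchor profile looks over-optimised")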

It's not surprising really when we consider historical Search Engine Optimisation practice; it was just a case of reading Google's Webmaster Guidelines, having a basic knowledge of XHTML and a simple understanding of the Page Rank formula. Over the years people have been building links that target the anchor text; do this a few times and, great, you're number 1. This low-level SEO practice has caused the search engines a bit of a problem, as poor-quality websites and directories have started to rank in places they shouldn't. Google needed to do something to curb the rapidly growing index, which had been deteriorating in quality for quite some time. The size of the internet is beyond Google's control, but the ranking factors are not. The quality of the returned results can be interpreted as a direct reflection of Google's standards to its users; they need to quality-assure results to a certain extent. In February 2011, Google launched the Panda update, which is algorithmic and is updated nearly every month, so it makes an almost instant change to the way results are returned. It's partly based on scalable machine learning, so it thinks more like a person rather than just a programme or formula.

When we really look at the heart of the over-optimisation problem beyond keywords, the anchor text of links is largely to blame; but without links there would be no internet as we know it! Five years ago it was a simple process: you build X amount of links with Y amount of Page Rank and you gain positive rankings. It's no wonder so many people started to undertake article spinning, directory submissions and blog networks in distant and foreign lands to improve rankings for keywords. For example, if we look at a PR4 site that creates a Page Rank pool of 61,195 with 42 links on the homepage, the transference of link-juice would be roughly 1,441 per link. When an anchor-text density of 72% is used site-wide to increase relevancy, it creates a problem by replicating the theme across the internal pages. It's fine to link pages that have relevancy to the user, but when we cross the line by trying to target the online content too much, by stuffing keywords on a page, it becomes a problem; the on-site information architecture, especially, becomes a mess. For example, breadcrumbs are a better alternative for adding keywords on-page because they add value to the user experience, but we really need to consider how this works with the theme of the page and site, and whether it relates to our content strategy. Every webpage can and should own a keyword, and internal pages should inter-link to support navigation for users.

SEO is so far beyond keywords now, and there is a real penalty in place that can see your website go from hero to zero overnight if we get the off-page / on-site SEO strategy wrong. This does create friction within the SEO field when we consider the competitive nature of competing sites. Link-building is hard and is only getting harder. Every time we in SEO learn of a new search strategy, the rules change or our process is devalued by Google. Our strategies need to be ethical and sustainable, more so than ever before; hence the differences between white-hat and black-hat SEO. I was recently reminded at an SEO event hosted by Webbed Feet UK that "Everyone wants to be Number 1, but there's only one Number 1 spot.
The chances are, the websites that rank above you are doing something well, and better than you." Aaron then went on to explain the diverse range of long-tail keyword research and content marketing techniques that can be used, but the main point remains: we now need to be better online and deserve to rank for our chosen keywords. Good just isn't good enough anymore.
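To make the earlier link-juice arithmetic concrete, here is a minimal sketch using the figures quoted above. Note that a straight division of 61,195 by 42 gives roughly 1,457 per link, close to but not exactly the 1,441 quoted, so the original figure presumably includes an adjustment not described here.

# Link-juice arithmetic from the PR4 example above: a straight
# division of the homepage's Page Rank pool across its outbound
# links, per the simple model described in the text.

homepage_pool = 61_195   # Page Rank pool quoted for the PR4 site
links_on_page = 42

juice_per_link = homepage_pool / links_on_page
print(f"Juice per link: {juice_per_link:,.0f}")  # ~1,457 per link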

Link Analysis: the Google Penguin Update

So how do you fix this over-optimisation problem in search? Google's answer seems to be: devalue the anchor text by reconsidering prominence, quality, environment and relevancy. As previously mentioned, Google Penguin was first seen in April 2012; it seeks to better understand the types of links that have been created for SEO purposes and drops rankings accordingly if a spam percentage threshold is reached. The removal of the penalty is part of a manual review process that can even see a website banned due to questionable link-building strategies. We know that anchor text now considers prominence, which we've always used to an unfair advantage, but contextual analysis driven by n-gram data is also being used. How else, beyond links, could relevancy at a global level within search be defined? (Just an observation of connected studies.)

In terms of prominence, not all links are created equal. Many websites use footer links to improve the theme of the page content, but also to try to pass value to other pages or from site to site. This can also occur based on the location of keywords placed throughout the site as part of the internal linking structure; we often help support the theme by targeting and disseminating the link-juice across the pages, improving the Page Rank flow or, in other words, sculpting Page Rank. Which is fine; we just need to be careful to keep the anchor text density below 65%, as shown in some recent studies. We have recently seen changes to the rankings for content below the fold, which again seems to be based on relevancy. Maybe bounce rates play a part? Although Google says they're not used to calculate rankings, we do know that click-through rate is connected to rankings; how else could a website be found and its quality measured on a search engine results page? Some of this can be seen within Google Webmaster Tools, which has a section that provides us with CTR (click-through rate), an indication of how well the content is doing within the results pages. If anchor text drives rankings from the back-link profile and poor results show up by way of high bounce rates, the impact on search quality is clear to see.

The reading level of content has always been important as we consider the returned results. If we were to query the term "flu" we would expect maybe a Wikipedia article, but if we query the term "influenza" we get medical journals and articles returned. Although these subjects are related, on their own they return completely different results, which has to do with searcher intent and the reading level for the query; thereby serving a result relevant to the user's search, which goes beyond keyword research and is theme-related. Therefore the brand must reflect this content strategy site-wide. Search Engine Optimisation isn't just about working on the site-wide and page-level link-juice metrics; it's about building brands that people want to link to and share, which drives relevancy.
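One crude proxy for the reading level just mentioned is the classic Flesch reading-ease score; the source does not name any particular metric, so this is only an illustrative stand-in, and the naive vowel-group syllable counter below makes the output indicative rather than exact.

# Crude reading-level sketch using the Flesch reading-ease formula.
# The vowel-group syllable counter is a rough heuristic, so scores
# are indicative only. Higher scores mean easier reading.

import re

def count_syllables(word):
    # Count groups of consecutive vowels as one syllable each.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / sentences)
            - 84.6 * (syllables / len(words)))

simple = "Flu spreads fast. Wash your hands. Stay home if you are ill."
dense = ("Influenza transmission dynamics are modulated by antigenic "
         "variability and population-level immunological heterogeneity.")

print(f"simple copy: {flesch_reading_ease(simple):.1f}")
print(f"dense copy:  {flesch_reading_ease(dense):.1f}")

The "flu" style copy scores far higher (easier) than the "influenza" style copy, mirroring the difference in searcher intent the two queries imply.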

It's become more about Inbound Marketing, a term coined by a company called HubSpot. Inbound Marketing seeks to place a greater emphasis on a content strategy that drives results. Most websites can benefit from some form of SEO because it's often overlooked at the development stage; there's a large gap between the professions listed below, which can have an impact on how well a website performs in the search engines:

Graphic design
SEO / SEM / PPC
Content strategy and copy
Web design (XHTML, CSS)
User experience management (UX)
Clients' aspirations (disjointed)
Web development (PHP, ASP, Java, Ajax)
Content management systems (CMS)
Marketing, branding and communications

This goes far beyond the technicalities of SEO: we are all seeking to do different things that are related, so we need to adopt a design process that works together to deliver websites that aren't solely created for search engines. They need to put the user first. When we consider the purpose of search marketing, the objective is to deliver targeted visitors who ultimately complete a conversion, whether that be lead generation, e-commerce, engagement or direct sales. Part of our goal is to deliver our message to the audience in a way that supports this process.

Search, PPC, email and blogging are within the reach of most businesses; our strategies need to consider the wider context of our search marketing in order to deliver meaningful results.

The Future of Ranking Signals

Search engines are always seeking to improve ranking factors. We know that Google changes and refreshes up to 6 times per day, and that they make over 500 changes per year. Every year they make a major update that puts the SEO sector in a spin; as previously mentioned, the Panda and Penguin updates
(prior to these, Google Caffeine and Florida, etc.). These updates are made with good intentions and also provide us with insight and direction as to where SEO is going. If we as search professionals stay within the guidelines, we are going to be fine. SEO has become about ethics: what we could do, what we might get away with and what we should be doing. It's not the same as other forms of marketing, is it? Now other websites and spammers can have a serious effect on our website's performance due to these new updates; Google even changed the guidelines in May 2012 to admit that the concept of negative SEO is a real possibility. Given the penalties that over-optimisation can cause, competitors have become a greater threat. We are given notice of penalties via unnatural link warnings, but often it's too late. There is little we can do if others link to our content; it's a bit worrying when SEO is reverse-engineered!

In 2008 the Chief Editor of Search Engine Land, Danny Sullivan, summarised the evolution of search in four simple phases:

1) Keywords and text
2) Link analysis
3) Integration of vertical results
4) Personalisation

He was right, when we look back in hindsight, but where do we go from here? A recent questionnaire by SEOmoz of various search professionals concluded that the following ranking factors are important:

23% - trust of the link environment and popularity of links to the specific page
20.26% - anchor text used
15.04% - keywords used
6.91% - hosting environment
6.2% - click-through rate from search results traffic
5.3% - social signals


As we can see, trust has become an important factor, and we now know that bad links do damage rankings, as per the Penguin update. The query deserves freshness (QDF) update saw sites with a high level of domain authority reach the top sometimes within hours (newspaper and media sites are a good example of this). The top of the search results page has changed over the past 12 months: we have seen an increase in pages that aren't solely optimised content. A large part of the real estate at the top of the results
page is taken up by extended PPC links, the new Google Knowledge Graph, author bios, news results and Google Places, to name a few. Some say that the diversity of domain-level results has suffered, as some sites with high levels of authority have started to rank and cover the top spots for content that isn't specific enough to the search query.

There's been a lot of speculation about the rise of social media; the truth is that nobody knows how strongly social will impact rankings. Social isn't applicable to every business, and fans, friends and follows are easier to gain than targeted links from trusted sources. It's just not clean enough yet to depend upon as a ranking factor, and it should be treated as such when considering the overall search marketing strategy. Even though Google is heavily investing in its Google Plus network, that is largely because it needs to hold on to its dominance of the search market. Just imagine if Bing partnered with Facebook to provide integrated search to an audience of 900 million; that would hit Google's market share hard!

"The cobbler's children often have no shoes." It's also been said that Google is the rich kid who showed up to the social media party a little late, and as a result nobody wants to speak to him or her! The Google Plus network hasn't grown at the level it should have compared to others over the past 12 months, even though Google has pushed the entire platform with all of its corporate might and integrated technologies to increase memberships, such as Android smartphones, which include Google Plus as an automatic update, and has also pushed the network onto Gmail customers. The +1 button and profile badges, as a Google link-building strategy, haven't even gone mainstream by way of being adopted on websites as a trust signal like Facebook or Twitter. But pictures of authors in the search results have fundamentally altered click-through rates. Once this has been fully adopted, it could become a strong ranking signal. Links from valid author bios with a large social following could be better than traditional links from other websites, due to personalisation. Author Rank may well be the next big thing! But what if search engines became RSS-driven using Author Rank? Wow!

Content is Still King in a post-Panda and Penguin World!

When we look at the nature of content strategy, most SEO practitioners see this as beyond SEO and more of a marketing and branding function. The term "optimisation" needs boundaries, and Google is making that clear: we need quality content. But good content is tough to create; it quite often needs an entire team to build. Most digital practitioners have different views on this, not to mention internal departments. I try to forget how many times I've heard people say, "I don't like that image", "I don't like the way it reads", "That needs to go on the home page", "I want a button there".

The truth is that engagement with content at an organisational level provides the opportunity to help build a brand that everyone signs up to, rather than us in SEO protecting the keyword density and ranking factors of a page or an entire site. We need to build our brand strategy around researched content first, and work with stakeholders within the brand to better understand the brand identity, aspirations and goals that are being sought from an organisational perspective. It's not enough to say: "we just need to be optimised as the number 1 provider within our sector; that way we're seen as market leading". "Real companies do real company stuff" (Wil Reynolds, Seer Interactive). The real stuff that companies do is what we optimise for; that's what creates a communications strategy. Content needs to deserve to rank, and to rank for quality, not for on-page keyword analysis that chases the long tail with a view to gaining lots of little wins. Our content online needs to deliver to the right audience at the right time, and it should help us generate revenue; after all, that's what we're in business for! The content we create needs to add value to the user experience.

For example, 5 years ago in SEO we might have created the snippet below to drive keyword density:

www.fridge-freezer-frost-free-example.com
"Here you will find all of your favourite Fridge Freezer brands, including Hotpoint, LG and Candy. Browse our selection of Fridge Freezers and save with our awesome Fridge Freezer deals! Buying Fridge Freezers is easy and cheap when you shop Fridge Freezers."

Given the recent changes in 2012, this should instead be...

"Not all Fridge Freezers are made equal, but we have a Fridge Freezer for every need. Hotpoint and Candy Fridge Freezers have proven to be reliable, affordable mid-range options, while LG Fridge Freezers are the type of top-of-the-line appliances you might find in a hotel. Budget-conscious shoppers may appreciate the simplicity and affordability of brands like Candy, Bush or Samsung. Chat instantly to a Customer Service Representative if you have any questions. We're here to help! View our range of Fridge Freezers."

The above example is adapted from a recent study on conversion rate optimisation and is focused more towards copy style. Overall it provides a better reason to buy, and if not, it becomes informative enough to support navigational queries; search engines would love it too!

We now live in a world where our messages transfer across the entire digital ecosystem: mobile, video, email, social, podcasts, webinars and localised results.

In Summary

It's tough to create content within a brand that connects with clients across multiple verticals, but we now live in an age where SEO is evolving beyond the technicalities of Page Rank and internal and external links. This is being driven by the changes in the Google ranking factors and the updates that are being refreshed every month. Times are only going to get more challenging, as we've already seen the Panda update refresh up to version 3.9 and become a core part of the algorithm. The future is content-focused material that is shareable across the entire social spectrum online, without the immediate need to over-optimise for traffic and keywords; as the saying goes, "Build it and they will come." As newer channels such as mobile, social, cloud and apps grow and develop, we need to keep ahead of what people need and want by constantly researching and testing our methods. We have a responsibility to our clients to ensure that the changes and recommendations we make are going to be sustainable in the long term, and don't focus on short-term gains for keyword positioning. The future of Search Engine Optimisation is Inbound Marketing.
