Friday, May 08, 2009

SEO, Subdomains, Site Architecture and Sitemaps

By Jeffrey Smith in Featured

Today (with slight hesitation, for fear of giving away too much), I am electing to share an effective SEO method which incorporates the use of sitemaps, subdomains and site architecture. As a result, you will have the capacity to develop robust websites of colossal proportions, using a coherent site and link architecture to zero in on competitive rankings and long-tail keywords alike.

This involves the use of subfolder naming conventions, SEO-friendly titles, relevant semantic descriptions, pretty URLs, subdomains and sitemaps.

This strategy is similar to targeting the roots of a tree (the keywords stemming from a topic) to reach the leaves (top 10 rankings): you give each keyword a value (a page) and then implement an internal link / irrigation system capable of producing its own secondary and tertiary ranking factors as a result of link cultivation.

Sitemaps do not have to play a passive role (just for crawling) when it comes to SEO. In fact, think of a sitemap as a two-way street. On one hand, you can use sitemaps to increase crawl frequency and get more pages into a search engine’s index. On another level, you can use sitemaps as a ranking tool designed to “reverse funnel” ranking factors to the pages that need added link weight to hone in on competitive rankings (much like a powerful pipeline).

To take this tool, long considered passive, and turn it into a very powerful internal link sculpting instrument, you only need to apply a few fundamental protocols.

When you look at a Wikipedia ranking, try looking beyond the topical layer and observe the infrastructure of why and how it got there. The topical layer (the landing page) represents a relevant triangulation of on-page signals: in the title, the keyword / search term is prominent and first, followed by a brief descriptor and the site referral loop (Wikipedia, the free encyclopedia) to round off the title tag / naming convention.

In addition, the keyword is also translated into a URL string on a subdomain [en.wikipedia.org/wiki/keyword] to truly concentrate the ranking factors. The tactful use of subdomains is one SEO method to expand the exact-match domain / URL to target a more relevant keyword, making a domain more specific to a topic.

There is virtually no limit to this on-page SEO tactic, as you can essentially expand the focus of any website to broaden the relevance funnel using the subdomain tactic. This means that, with the right amount of links and content, you can scale the content and links pointing to each page in a website so it functions as the preferred landing page by consolidating internal and external links. The amount of link weight required is known as the threshold or barrier to entry for that keyword, and each keyword has a unique tipping point it must cross before it gains momentum and ranking power.

An example of how Wikipedia employs site architecture for optimal SEO value (sketched in markup after this list) is:

* Topic1.domain.com as the base, which will require a sufficient amount of links to stem.
* Topic1.domain.com/wiki/topic1-keyword (the wiki folder is where the magic happens).
* Topic1 Keyword becomes the first shingle in the title tag.
* Topic1 Keyword becomes the H1 header tag to emphasize relevance.
* Topic1 anchor text from other pages all links to Topic1.domain.com/wiki/topic1-keyword
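
Translated into markup, the pattern above looks roughly like this on the landing page and its supporting pages; the domain, keyword and descriptor are placeholders rather than Wikipedia’s actual source:

<!-- topic1.domain.com/wiki/topic1-keyword : the preferred landing page -->
<title>Topic1 Keyword - Brief Descriptor | Site Name</title>
<h1>Topic1 Keyword</h1>

<!-- on supporting pages, keyword-rich anchor text feeds the landing page -->
<a href="http://topic1.domain.com/wiki/topic1-keyword">Topic1 Keyword</a>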

Yet there is a hidden layer of SEO (the wiki folder) that most do not see, and it is responsible for the prominent rankings the site’s architecture produces.

What I am referring to are the other pages in the subfolder, along with non-indexed pages, responsible for shifting ranking factors; they allow the webmaster to add one more layer of relevance by controlling the anchor text that feeds the main silo / subfolder or landing page.

Naturally this can be implemented on semantics alone, or a simple PHP script will suffice to concentrate ranking factors in your website’s content management system. The only thing you need to maintain buoyancy for hundreds or thousands of pages is a pipeline capable of shifting link weight from page to page; in this instance, the subfolders within the subdomains become the preferred landing pages.
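
To illustrate the idea only (a rough sketch, not the author’s actual script), a minimal PHP snippet along these lines could live in the content management system: a hypothetical keyword cluster maps anchor text variations to one preferred landing page, and each supporting page echoes keyword-rich internal links that shift link weight toward it. The array keys, URL and function name are all assumptions.

<?php
// Hypothetical keyword cluster: several anchor text variations all feed
// one preferred landing page within the silo.
$clusters = array(
    'seo-consulting' => array(
        'landing' => 'http://seo.domain.com/consulting/seo-consulting/',
        'anchors' => array('SEO consulting', 'SEO consultant', 'SEO consulting services'),
    ),
);

// Echo one internal link per anchor variation, all pointing at the landing page.
function print_cluster_links($cluster) {
    foreach ($cluster['anchors'] as $anchor) {
        echo '<a href="' . htmlspecialchars($cluster['landing']) . '">'
            . htmlspecialchars($anchor) . "</a>\n";
    }
}

print_cluster_links($clusters['seo-consulting']);
?>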

In this instance, using the http://en.wikipedia.org subdomain (as an example) for English provides them with the ability to funnel ranking factors from page to page, yet still keep the English version separate from the Spanish version, and so on and so forth.

In the past, the downside of this strategy was that each subdomain is considered its own site. Now, this becomes an asset, as you can essentially determine how you feed your pages and subsections (all based on a keyword) from one to the next. Also, the type of anchor text you use to feed the specific landing pages will determine how they fare in the search engine result pages.

For example, by using custom sitemaps (based on semantic clusters) you can funnel specific anchor text to specific pages to elevate prominence and relevance. For instance, all pages corresponding to a particular keyword could be fed with a second / alternative modifier or qualifying term to promote keyword stemming.

The site:yourdomain.com keyword search operator can provide ideas for semantically themed pages that correspond to a virtual site architecture within a website.

Once you have a list of semantically coherent pages (based on keyword research), you can then nurture them in one place by implementing a primary point of convergence: the sitemap or hub page.

By using robots.txt or the noindex, follow meta tag, you can build sitemaps and landing pages designed to group clusters of concepts, keywords, other landing pages or subjects in one central place, feeding multiple pages from one entry point.
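
For reference, these are the two standard mechanisms named above; the disallowed path is a placeholder. The noindex, follow meta tag keeps a hub page out of the index while still allowing its links to be followed, whereas a robots.txt rule keeps crawlers out of a section altogether:

<!-- on a hub / sitemap page meant to pass link weight but stay out of the index -->
<meta name="robots" content="noindex, follow">

# robots.txt - keep crawlers out of a section entirely (hypothetical path)
User-agent: *
Disallow: /internal-sitemaps/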

Through managing the supporting pages (which all link up to the top-level landing page to transfer their authority) you can sculpt up to 70% of the ranking factors for any given keyword. As a result, the surplus ranking factors begin to spill over and strengthen the domain they are hosted on (which in turn feeds more pages, which rank higher, and so on).

Eventually you have dozens, hundreds or thousands of pages in a site that all have page rank or the ability to pass ranking factors from one page to the next. By their very nature, the individual pages are optimized from the outset, and when combined they represent a ranking juggernaut as each page develops trust rank and authority.

The aggregate ranking factors for each page begin to stem and expand (which means the page can be found for any two or three word combination on that page) whenever a related search query is executed in search engines.

What you have at that point is a website capable of ranking for multiple keywords simultaneously, showcasing the tip of the iceberg (the ideal landing page) built specifically as a consolidation of the keyword / topic, and capable of ranking on a fraction of the links required by a website that does not employ superior / coherent site architecture.

To summarize, internal links fueled by external links to one concentrated point, and then augmented by deep links to the top-level landing page, have the ability to rank on fumes compared to a website that employs less efficient site architecture.

This means that (a) the more topical information you have on a topic the better, (b) you can elect which pages are SEO savvy and appear (as a result of internal linking), and (c) there is virtually no limit to the size or reach of the website’s semantic theme.

Obviously the more concentrated it is, the better (as it will require fewer links to cross the tipping point). You must understand that content and / or external links both produce ranking factors, so it is possible for a website such as this to produce 60-80% of its own ranking factors by default (with more pages gaining strength, page rank and stemming daily).

So, the takeaway for the tactic is:

1) Group landing pages or themes virtually by using sitemaps to stand in as pipelines to funnel link flow.

2) Build it properly from the outset, or consider mapping out a more conducive site architecture and 301 redirecting legacy pages to the new themed and silo’d pages (a redirect sketch follows this list).

3) Group primary keywords along with secondary supporting pages in miniature sitemaps to concentrate on a core group of keyphrase shingles.
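
Point 2 above can be handled at the server level or, as a rough sketch assuming the legacy pages run through PHP, with a one-line permanent redirect; both URLs here are placeholders:

<?php
// Hypothetical example: permanently (301) redirect a legacy page to its new,
// themed and silo'd location.
header('Location: http://seo.domain.com/consulting/seo-consulting/', true, 301);
exit;
?>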

Taking point 3 from above, you could take keywords like SEO consulting, SEO consultant and SEO consulting services and feed them via a virtual sitemap linking them together (regardless of their location in the site architecture).

However, if the pages were in a subfolder (consulting) within a subdomain (SEO.domain.com), for example, and used internal links which all point to the sitemap, and the sitemap to them, each page would share a percentage of the total link flow for that topical range of key phrases and modifiers, essentially exceeding the threshold of relevance on all layers.
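
A minimal sketch of what such a virtual sitemap / hub page might contain (all URLs are hypothetical, and the noindex, follow tag is optional, per the earlier section):

<!-- seo.domain.com/consulting/sitemap.html : hypothetical hub page -->
<meta name="robots" content="noindex, follow">
<ul>
  <li><a href="http://seo.domain.com/consulting/seo-consulting/">SEO consulting</a></li>
  <li><a href="http://seo.domain.com/consulting/seo-consultant/">SEO consultant</a></li>
  <li><a href="http://seo.domain.com/consulting/seo-consulting-services/">SEO consulting services</a></li>
</ul>

Each keyword page would in turn link back to this hub, so the cluster circulates its own link flow.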

Then add deep links (links to each respective landing page) and you have the ability to catapult them all to the top of their respective shingles using a fraction of the links your competition is using.

And how do we know this, you might ask? Let’s just say we have done this before “with stellar results”… The next layer would be to implement a series of RSS feeds based on the same type of infrastructure to publish related content whenever new information was added, linked to or layered, to promote buoyancy for laser-focused keywords and key phrases, yet that is another post in itself…

Jeffrey Smith is an active internet marketing optimization strategist, consultant and the founder of SEO Design Solutions, an SEO company (http://www.seodesignsolutions.com). He has been actively involved in internet marketing since 1995 and brings a wealth of collective experience and fresh marketing strategies to individuals involved in online business.


