Friday, May 15, 2009

What’s Killing the Newspapers?

As ironic as it may be, newspapers are currently topping their own headlines. Well-known newspapers such as The New York Times, Chicago Tribune, and the Los Angeles Times are not only laying off employees but also cutting sections and features from their publications. While the newspaper industry appears to be dying, the news itself is flourishing in other forms.

What’s the reason for all this? Some blame the economy and expect the government to bail out the newspapers. U.S. Senator Benjamin Cardin introduced the Newspaper Revitalization Act to Congress, which would allow newspapers to operate as non-profit organizations if they wanted to. This week, Governor Chris Gregoire of Washington State approved a tax break for newspaper printers and publishers.

Some newspapers blame Google for their struggles, claiming the search engine is stealing their content. Search industry leader Danny Sullivan disagrees. He believes newspapers actually get “special treatment” from Google. Some news publications do not appear in Google News at all, yet many of the complaining newspapers do. These newspapers also receive a tremendous amount of traffic from Google that many other publications would gladly welcome.

Lastly, some even say the newspapers have created their own crisis. Has the newspaper industry embraced the Internet to its full potential? Could they have approached advertising differently, in a way that would have produced better results? Are they monetizing their traffic in the most effective manner?







Oldest Tweeter

Oldest Tweeter talks cuppas and casserole on Twitter at 104


Ivy Bean has become the UK’s oldest Tweeter after signing up for the micro-blogging site at the age of 104.


People can follow the silver surfer’s updates at 'IvyBean104’ on Twitter.com.

Her first posts have included “Looking forward to Deal or No Deal later,” “just having a cuppa,” and “chicken casserole was lovely, going to have a nod now.”

And she is not the only pensioner at Hillside Manor residential home in Bradford, West Yorkshire, to be regularly getting online.

Pat Wright, residence manager, said: “All the residents are taking a leaf out of her book. Four signed up for 'computer college’ while others have joined Facebook, surf the net and enjoy themselves with ten-pin bowling on the games console.”

Mrs Bean was already a keen Facebook user but members of the IT support service the Geek Squad helped the pensioner get bang up to date.

The Geek Squad set her up and gave her some navigation training and top tips on how to manage the social networking phenomenon.

The support group is now challenging the Twittersphere to find out whether there is anyone older than Ivy posting updates on their whereabouts and activities. Make sure you catch her before she logs off for her daily weekday appointment with Noel Edmonds on 'Deal or No Deal’.

“It’s brilliant to help someone as inspirational as Ivy to get started and teach her about Twitter,” said agent Martin Dix.

“She’s quite tech savvy and already signed up to Facebook with 4,800 friends. She shows others that you shouldn’t be frightened of technology. If she was 50 years younger, I’m sure she’d make an ace Agent herself.”

Twitter membership figures are on the rise among adults, and recent figures found that 45- to 54-year-olds are the site’s top demographic.

© Copyright of Telegraph Media Group Limited 2009


Thursday, May 14, 2009

Watching Twitter’s #Fixreplies Firestorm

By Catharine P. Taylor

What a fun morning I’m having on Twitter search, looking for tweets containing the hashtag #fixreplies. Oops, wait a minute … since I logged onto the site, a minute and a half ago, 230 more replies have come in with that hashtag. Oops, make that 276. Now make that 326.

So what is everyone all tied up in their underwear about? The settings change that Twitter (415 tweets as of now) announced on its blog yesterday, saying that people would no longer see @replies (453) of people they don’t follow. This has caused the first Facebook-style Twitter revolt, as users (489) have poured onto the service to complain. The main complaint is that without this option (531), users lose an important resource that tells them who might be interesting to follow, and they’re mad (605). (OK, I’ll stop with that meme, but you get the drift. Smoke is coming out of Twitter’s servers right about now.)

According to Twitter co-founder Biz Stone, Twitter enacted the change because, “based on usage patterns and feedback, we’ve learned most people want to see when someone they follow replies to another person they follow — it’s a good way to stay in the loop. However, receiving one-sided fragments via replies sent to folks you don’t follow in your timeline is undesirable. Today’s update removes this undesirable and confusing option.”

What we’re witnessing here is, once again, that it’s going to become nigh impossible for any of the popular social nets to make changes without involving users first. If you take a close look at Stone’s statement above, what you see is actually a fairly old media response, one assuming that the owner of the media property, in this case, Twitter, knows best: “receiving one-sided fragments via replies sent to folks you don’t follow in your timeline is undesirable.” You can just hear the Twitterati saying, “Undesirable to whom?”



The headline for Stone’s statement is in the same vein. Reading “Small Settings Update,” it assumes that users will also view this as small — but apparently, it’s big. (OK, now we’re at 1,390 new #fixreplies tweets.)

So what are social nets to do? Put everything to a vote? Not always practical, although Facebook was right to do it with its terms of service, which truly was a big change. So, are they to pack their services with so many potential options that traveling through “settings” for any one of them is a day-long excursion? Also not practical. What they do have to do is float changes with users before they make them, and then gauge the volume of the outcry. Something tells me that they would get a more reasoned approach by communicating potential changes before they happen, rather than dealing with the firestorm that inevitably erupts when users feel that something some of them valued has been snatched from them in the night.

In the current situation, Twitter appears to be weighing the outpouring of feedback, which is good. As @adbroad points out, co-founder Evan Williams tweeted the following 10 hours ago: “Reading people’s thoughts on the replies issue. We’re considering alternatives. Thanks for your feedback.”

But for now, the firestorm is raging, out of control (3,431).

Catharine P. Taylor has been covering digital media and advertising for almost 15 years. She currently writes daily about advertising on her blog, Adverganza.com. You can reach her via email at cathyptaylor@gmail.com, follow her on Twitter at cpealet, or friend her on Facebook at Catharine P. Taylor. (mediapost)



How To Write Article Headlines

How To Write Article Headlines: 5 Tips for Clickable Titles

What makes the difference between an under-performing article and one with a drastically higher number of views?

Many times it’s something as simple as an awesome title that pushes an article over the edge from so-so to spectacular!

Here’s why your title is crucial:

Most of the time readers will discover your article through:

a) a Google search
b) looking through an article directory

In both of these instances, your headline is one of the few bits of information the reader sees before they decide to click through to read your entire article.

When a reader is looking through a long list of articles on a directory or on a search engine results page, they are quickly scanning a long list of titles, and the title plays a huge role in which article they decide to read in its entirety.

So, don’t take your titles lightly–really put some thought into them and be willing to do some experimenting to see what types of titles work best for you.



Want headlines that generate more traffic? Try these 5 tactics:

1) Be short and sweet.

Put yourself in the shoes of a reader–when you’re scanning a long list of article titles, you don’t necessarily take the time to read each and every title in full. You’re just glancing over each line, trying to get the gist of what the article offers, and sometimes a very clever, long title can be overlooked simply because it doesn’t scan well.

Shorter titles tend to be more direct and focused, and this also helps search engines determine what your article is about.

Now, this isn’t to say that you shouldn’t ever write long and clever titles–experiment, but be sure to try submitting articles that have short and punchy titles as well.

Then look at your article statistics and see if you can tell a difference in performance between the short titles and the longer ones.

2) Be direct.

Remember, Google does not understand irony, humor or puns. Search engines take things at face value. A more direct and straightforward title can help your article get higher ranking for your keyword terms.

3) Put your most crucial words at the beginning of your headline.

Again, this pays off when people are scanning your titles, but it also helps Google and the other search engines classify your article. By putting your most important words at the beginning of the title (possibly your keywords or variations of your keywords), you are making it easier for readers and Google to determine what your article is about.

Not sure how to make that work?

Here’s an example:

How To Write Article Headlines: 5 Tips for Clickable Titles

The first few words state specifically what the article is about, and the part after the colon gives additional information.

4) Your title should indicate the topic of your article.

Kind of obvious there, but when you become aware of your keywords, you may be tempted to put your keywords in your title even when the keywords are not appropriate for the article.

For example, your keywords may be “New York Dog Walker”, but in order to use those keywords in your title your article would have to be about some aspect of New York dog walkers. If your article is just about dogs or dog walking in general and not specifically about New York, then it wouldn’t be appropriate to include ‘New York’ in your title.

Your title should always describe what your article is about, and sometimes it’s not appropriate to use your keywords in your title.

5) Your title should make readers want to click through and read the entire article.

Remember, you’re writing for human readers, not just search engines. Even if it is appropriate to use your keywords in your title, you’ll want to put some thought into the phrasing of the headline so that it invites/inspires the reader to click the title and read the entire article.

Your title is the first thing a reader sees when they’re introduced to your article, and you can drastically improve your article submission success simply by paying attention to how you phrase your article headlines. Try these 5 tips and then watch your article stats to see which types of titles work best for you.
Carefully write your title, then submit your article to a vast network of targeted publishers. The more places your article is published, the more traffic your website receives. Steve Shaw created the web’s first ever 100% automated article distribution service, SubmitYOURArticle.com, which distributes your articles to hundreds of targeted publishers with the click of a button. For more information go to=> http://www.SubmitYOURArticle.com



Using Social Media to Boost Search Engine Results

By Lauren Hobson

Most of us are well aware that the search engines frequently change their algorithms to improve search results for users (and foil spammers), which can make it challenging for small businesses just to keep up. But as web technology continues to evolve, it also creates new opportunities for small businesses to improve their SEO strategies and boost their rankings as well. Social media (sites like Facebook, Twitter, LinkedIn, Technorati, Digg, etc.) provide an excellent opportunity for small businesses to not only promote their products and services online, but also to gain significant ground in the search engine results.

One of the most critical components to getting top search engine rankings is the number of inbound links and link popularity a web site is able to build. Although there are several existing link building strategies available to small businesses (e.g., press releases, directory submissions, article syndication, etc.), social media can help create additional high-value, on-target inbound links that are essential to achieving top placements in the search engines.

For example, each time you use Twitter to publish a link to new content on your web site, that link gets “planted” on the Twitter page of each person following you, and has the potential to spread even further as your followers share that information with their own network of contacts.

Integrated Social Marketing (ISM)™
If you have properly integrated your social networking profiles together, that same Twitter “tweet” could then be fed via RSS to your Facebook business profile, your corporate blog, your LinkedIn account, and any number of other social sites that you have set up for your business. It’s not a far stretch to imagine the link you broadcast on Twitter could reach dozens, hundreds, or even thousands of other places on the web, all pointing back to your web site! By integrating your social networking profiles with each other, with your web site, and with your existing marketing initiatives, you can easily make one single marketing action (such as a tweet) show up in multiple places online, each containing a new, relevant inbound link to your site.



Quantity AND Quality
In addition to the sheer number of inbound links that are created through social marketing, the value of the links that are created is another important criterion that search engines consider. To be valued by the search engines, inbound links must be from relevant, “quality” web sites, and search engines today give social sites like Facebook and Twitter great value. These sites are highly visible to the search engines, and are constantly taking updates from users. Links tend to be shared according to subject matter, which means the search engines will see them as being relevant and on-target. All of these factors combine to create high-quality inbound links in the eyes of the search engines.

Online Visibility and Branding
Creating visibility for your business and your “brand” is really key when using social media for building links. The power of social media is realized when other users see your links or content, then share that information with their own network of contacts. Simply adding a bunch of links to your social profiles is not enough; you need to have a strong reputation and a brand that users trust so they will feel comfortable sharing your content with others. Brand recognition typically leads to natural link building anyway, which means your inbound links will end up coming from bloggers, colleagues, customers, and other people who are exposed to your links and find them useful enough to share with their own contacts.

The Proof is in the Rankings
A recent example from Website Magazine explained somewhat surprising results when they searched for their publication’s name in Google. As expected, their web site came up as the number one listing on the results page. But what was not expected was the number three listing on the results page was the magazine’s Twitter page. They then performed a number of Google searches for the terms “Chicago Tribune,” “Chicago Public Golf,” and “Daily Career Tips,” all with similar results in Google - the Twitter page for each of these terms came up near the top of the search engine results every time.

The conclusion was that given these results, Google must be giving serious weight to Twitter content, and I happen to agree. The search engines of course keep their ranking algorithms top-secret, so there’s no way to know how much weight (if any) is really given to Twitter or other social media sites. But results like those in the example above are hard to ignore!

A Great Opportunity
Social media is here to stay, and small businesses are beginning to use it to effectively promote their businesses, reach their customers, find new leads, keep customer mindshare, and instantly communicate with customers. But maybe one of the biggest benefits of adding social media to your marketing mix is the creation of high-value, on-target inbound links that can help improve visibility in the search engines and boost your business to the top of the search engine rankings.


Lauren Hobson, President of Five Sparrows, LLC, has more than 16 years of experience in small business technology writing, marketing, and web site design and development. Five Sparrows provides professional web site and marketing services to small businesses and non-profit organizations, giving them access to high-quality services at affordable prices. To read articles or subscribe to Biz Talk, please visit www.FiveSparrows.com/biztalk.htm.



Wednesday, May 13, 2009

How To Use Keywords In Your Article Submissions

By Steve Shaw in Writing

Article marketing has beginner, intermediate and advanced stages to it, so no matter what skill level you’re at, you can still submit articles to drive traffic to your website.

You may have started out simply writing articles on the topic of your website–that is a great start, and you can see excellent results by consistently writing and submitting on-topic articles.

But after you get used to the basics of submitting articles, you may want to challenge yourself and see if you can improve your results. One of the ways you can advance to the next level is to integrate keywords into your article submission campaign.

Google and other search engines look for words of special importance on a web page to help them determine what a website is about.

These words are called “keywords” or “keyword phrases”, and if a website owner knows the types of words/phrases that their target customers are typing into search boxes, then he can be sure to use those keywords in his articles to capitalize on the demand for those search terms in Google.

How do you use keywords in your article submissions?

Great question–it’s actually not that complicated.

1) First, figure out the keywords for your website.

Use a keyword suggestion tool such as WordTracker to create a detailed list of keywords and long-tailed keyword phrases.

A long-tailed keyword phrase is a phrase that is anywhere from 3-5 words long that a search engine customer would use to reach a site such as yours. Usually a basic keyword term for your website is more general and is 2 words long, but there is also merit to targeting longer phrases that potential customers do searches for.

For example, your main keyword phrase may be “chocolate recipe”.

Your long tail phrase may be “chocolate birthday cake recipe”.

When you’re doing your keyword research you’ll make a list of the general 2 word phrases as well as several of the more specific 3-5 word phrases.

2) Write an article around each keyword term.

Now, each keyword term has many possibilities for articles–there isn’t just one article that could be generated off of your keyword term. Try taking each keyword term and writing several articles addressing different aspects of that keyword.

If you have a long list of keyword and long-tailed keywords, that list could keep you busy for a while!

Just go through the list, writing a different article around each keyword phrase. Pretty soon you will have a library of articles covering virtually every topic related to your niche, all pointing readers back to your website.

All of this instruction about keywords comes with a word of caution–there is a good way and a bad way to use keywords in article marketing.

The good way would be to use the keyword phrase to guide your article topic, and use the phrase or variations of the phrase naturally in your article so that the article makes sense to your readers.

The bad way would be to haphazardly spray your article with your keyword term without thinking about how the article will sound to readers or if the article makes sense.

You may have seen articles that were obviously written with the intention of using a particular keyword, where you felt the author was writing for search engines rather than for human readers–that is not the way things should be!

You can write articles that please human readers and search engines–what these two groups are looking for is not at odds with each other.

Google wants to provide search customers with an accurate list of results, ranked with the results most likely to answer the searcher’s question at the top. Google cares about whether an article is reader friendly–it is not just looking for random words on a page.

When you write your articles, you can use your keywords to determine your specific topic and also use the keywords themselves where they sound natural. For best results keep your keyword density to around 2%.

Staying within these guidelines will give your article submissions the best chance of being recognized for that keyword term by search engines, and it will also produce an article that brings value to your reader.
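As a rough way to check the roughly 2% figure mentioned above, keyword density can be computed as matched phrase words over total words. This helper is a sketch of one common way to measure it, not an official formula:

```python
import re

def keyword_density(text, phrase):
    """Rough keyword density: words belonging to matched occurrences
    of the phrase, divided by the total word count, as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Count every position where the full phrase appears in sequence.
    hits = sum(
        words[i:i + n] == phrase_words
        for i in range(len(words) - n + 1)
    )
    total = len(words)
    return 100.0 * hits * n / total if total else 0.0
```

For example, an article of 100 words containing “chocolate recipe” once has a density of 2%, right in the suggested range.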


Use an article distribution service like SubmitYOURArticle.com to magnify the impact of each article - distribute your articles to hundreds of targeted publishers with the click of a button. For more information go to=> http://www.SubmitYOURArticle.com

SEO Guidelines

By Jeffrey Smith in SE Optimization

Search engine optimization, also known by the acronym SEO, comprises multiple facets. SEO is not a linear process but rather a holistic evolution involving intricate layers, steps and cumulative stages which are as delicate as they are demanding to perfect.
However, there are fundamental SEO guidelines one can use to incorporate granular changes that improve the coherence, functionality and visibility of a website by working in tandem with the metrics that search engines deem worthy and reward with a higher relevance / optimization score.

Conversely, if you deliberately or inadvertently neglect any one of the necessary characteristics of fine-tuning, your pages could fall short of their goal: finding the most suitable audience by reaching the most coveted top 10 spots for the content’s primary keywords.

Rather than butchering coherence after the fact in an attempt to make a square peg fit a round hole by editing content, links or the architecture of your website, it is better to start with the SEO goal in mind and build the platform to support it, rather than altering each aspect afterward.


When initiating any SEO campaign, you should give credence to:

Understanding your competition - There is a reason why the top 10 spots are occupied; look for consistencies so you can emulate certain characteristics if your website lacks them.

Determining the Gap - Determining the gap means removing the obstacles between you and your objective. Time is an obvious ranking factor; someone online for 5 years in a niche who has achieved keyword saturation and authority has an easier time maintaining visibility than a new website (which has not yet achieved a suitable reputation through peer review).

Before you simply build links, your site’s traffic and engagement must be commensurate in order to get past algorithmic filters, which can detect things like (1) link clusters from automated link building, (2) the ratio of inbound links to outbound links, (3) engagement time / bounce rate (metrics of satisfaction and relevance) and (4) whether there are other supporting topical areas within the site that concentrate internal links, subjects or landing pages to support more competitive rankings.

Building Internal Authority - Authority is the objective; rankings are merely a side-effect (not the goal). With this in mind, it is more about acquiring a stake in market share that unmistakably positions your website in front of any search which corresponds to any of the terms, keywords or topics covered in your title, content or tags. The more authority a website has, the easier it is to rank for more keywords with less effort.

Gaining Validation from Citation and Peer Review - You can have the greatest website online, but without co-occurrence and other websites referencing your pages, it is merely conjecture. Granted, your website can eventually acquire authority in and of itself, but links from other related sites or websites already ranking for the keywords you are targeting are the fastest way to expedite the process of creating a site that is less dependent on external sources for validation and rankings.

Managing User Expectations - Since no two people think or search alike, you will need an array of landing pages to help direct them to the ultimate conversion objective. The wild card in this equation is the mood of the surfer. Landing pages are all about getting the right person in the right mindset to read the right message. If you can accomplish that with your SEO, there is virtually no limit to increasing user engagement (which is getting them to take the desired action).

Landing pages are your website’s means to an end; they are what keep you in business. With a landing page tailored to a specific array of keywords, the more relevance you create between what a searcher expects and what a searcher discovers, the higher the conversion rate your site will experience.

Landing Page Conversion - The first step in creating a successful online presence is having a page worthy of conversion. Conversion means the page performs a specific function (sign up for a free download, sign up for a newsletter, subscribe to an RSS feed, purchase a product, inquire about a service, etc.).

Instead of hemorrhaging user intent or overwhelming users with too many choices, the more refined and focused your value proposition is, the more likely users are to engage with it. The keys behind landing pages are (1) make clear to the visitor what the VALUE IS TO THEM for engaging the offer, not just to your business, and (2) if you have to read anything twice, or if a 4th grader cannot understand the offer, it’s probably too complex.

SEO delivers traffic, but the strength of your offer is what determines if people shop at your website and proceed to checkout or if they move on and use your site like a doormat for the next search result, which is more honed to their mental map of what they consider a superb offer.

The tasks and responsibilities of an SEO company are simple: (1) fill the gap with relevant content, (2) salvage the existing elements that are conducive to optimization, (3) build off-page reputation through link building and promotion and, most of all, (4) fine-tune the on-page elements that aid conversion until a suitable conversion rate exists.

Anything less from those offering SEO services is just theory. The bottom line is, SEO is about results, not just temporary rankings. So, as long as you stick to fundamental guidelines that depend not on fickle appearances or tricks but on real content and real substance, changes in algorithms are the least of your concerns; it’s only a matter of producing enough content, links or popularity to cross the tipping point.

Jeffrey Smith is an active internet marketing optimization strategist, consultant and the founder of Seo Design Solutions Seo Company http://www.seodesignsolutions.com. He has actively been involved in internet marketing since 1995 and brings a wealth of collective experiences and fresh marketing strategies to individuals involved in online business.



Friday, May 08, 2009

SEO, Subdomains, Site Architecture and Sitemaps

By Jeffrey Smith in Featured

Today, (with slight hesitation in fear of giving away too much) I am electing to share an effective SEO method which incorporates the use of sitemaps, subdomains and site architecture. As a result, you will have the capacity to develop robust websites with colossal proportions using a coherent site and link architecture to virtually zero in on competitive rankings and long-tail keywords alike.

This involves the use of subfolder / naming conventions, SEO friendly titles, relevant semantic descriptions, pretty urls, subdomains and sitemaps.

This strategy is similar to targeting the roots of a tree (the keywords stemming from a topic) to reach the leaves (top 10 rankings) by giving each a value (a page) and then implementing an internal link / irrigation system capable of producing its own secondary and tertiary ranking factors through link cultivation.

Sitemaps do not have to play a passive (just-for-crawling) role in SEO. In fact, think of a sitemap as a two-way street. On one hand, you can use sitemaps to increase crawl frequency and get more pages into a search engine’s index. On another level, you can use sitemaps as a ranking tool designed to “reverse funnel” ranking factors to the pages that need added link weight to hone in on competitive rankings (much like a powerful pipeline).

In order to take this tool which was considered passive and turn it into a very powerful internal link sculpting tool, you only need to apply a few fundamental protocols to implement this tactic.
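On the crawling side of that two-way street, a sitemap in the standard sitemaps.org XML format is simple to generate. This is a minimal sketch; the URLs are placeholders:

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Render a minimal XML sitemap (sitemaps.org protocol) for a list of URLs."""
    entries = "\n".join(
        "  <url><loc>%s</loc></url>" % escape(u) for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries + "\n</urlset>"
    )
```

Submitting the resulting file to the search engines is what drives the increased crawl frequency described above; the “reverse funnel” side comes from which pages you choose to list and link.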

When you look at a Wikipedia ranking, try looking beyond the topical layer and observe the infrastructure of why and how it got there. The topical layer (the landing page) represents a relevant triangulation of on-page relevance: a title with the keyword / search term prominent and first, a brief descriptor, and a site-referral loop (Wikipedia, the free encyclopedia) to round off the title tag / naming convention.

In addition, the keyword is also translated into a URL string on a subdomain [en.wikipedia.org/wiki/keyword] to truly concentrate the ranking factors. The tactful use of subdomains is one SEO method to expand the exact-match domain / URL to encroach on a more relevant keyword, making a domain more specific to a topic.

There is virtually no limit to this on-page SEO tactic, as you can essentially expand the focus of any website to broaden the relevance funnel using the subdomain tactic. This means, with the right amount of links and content, you can scale the content and links pointing to each page in a website so that it functions as the preferred landing page by consolidating internal and external links. This is known as the threshold or barrier to entry for that keyword, and each keyword has a unique tipping point before it gains momentum and ranking power.

An example of how Wikipedia employs site architecture for optimal SEO value is:

* Topic1.domain.com as the base - which will require a sufficient amount of links to stem.
* Topic1.domain.com/wiki/topic1-keyword (the wiki folder is where the magic happens).
* Topic1 Keyword becomes first shingle in the title tag.
* Topic1 Keyword becomes H1 header tag to emphasize relevance.
* Topic1 anchor text from other pages all link to Topic1.domain.com/wiki/topic1-keyword
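The naming convention in the list above can be sketched as a small helper. The domain, title pattern, and function name here are illustrative, not Wikipedia’s actual scheme:

```python
def landing_page(topic, keyword, domain="example.com"):
    """Sketch of the subdomain + /wiki/ convention described above:
    the topic becomes the subdomain, the keyword becomes the URL slug,
    the first shingle of the title tag, and the H1 header."""
    slug = keyword.lower().replace(" ", "-")
    sub = topic.lower().replace(" ", "")
    return {
        "url": "http://%s.%s/wiki/%s" % (sub, domain, slug),
        "title": "%s - %s, the free resource" % (keyword.title(), topic.title()),
        "h1": keyword.title(),
    }
```

Internal anchor text from other pages would then all point at the returned URL, concentrating the ranking factors on that one landing page.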

Yet there is a hidden layer of SEO (the wiki folder) that most do not see, and it is responsible for the prominent rankings the site’s architecture produces.

What I am referring to are the other pages in the subfolder and the non-indexed pages responsible for shifting ranking factors, which allow the webmaster to add one more layer of relevance by controlling the anchor text that feeds the main silo / subfolder or landing page.

Naturally this can be implemented through semantics alone, or a simple PHP script will suffice to concentrate ranking factors in your website's content management system. The only thing you need to maintain buoyancy for hundreds or thousands of pages is a pipeline capable of shifting link weight from page to page; in this instance, the subfolders within the subdomains become the preferred landing pages.
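The "pipeline" idea above, where supporting pages feed a preferred landing page with controlled anchor text, can be sketched as a small helper (the original mentions a PHP script; this is a Python equivalent with invented page names and URLs):

```python
# Hypothetical sketch of the link pipeline: every supporting page in a
# silo emits one internal link, with controlled anchor text, to the
# preferred landing page, consolidating link weight on that page.

def silo_links(landing_url: str, anchor: str, supporting_pages: list[str]) -> list[tuple[str, str]]:
    """Return (source page, HTML link) pairs that feed the landing page."""
    link = f'<a href="{landing_url}">{anchor}</a>'
    return [(page, link) for page in supporting_pages]

links = silo_links(
    "https://topic1.domain.com/wiki/topic1-keyword",
    "topic1 keyword",
    ["/wiki/related-page-1", "/wiki/related-page-2"],
)
for source, html in links:
    print(source, "->", html)
```

Because the anchor text is set in one place, every internal link in the silo reinforces the same keyword rather than drifting across variations.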

In this instance, using the http://en.wikipedia.org subdomain (as an example) for English provides them with the ability to funnel ranking factors from page to page, yet still keep the English version separate from the Spanish version, and so on and so forth.

In the past, the downside of this strategy was that each subdomain was considered its own site. Now this becomes an asset, as you can essentially determine how you feed your pages and subsections (all based on a keyword) from one to the next. In addition, which type of anchor text you use to feed the specific landing pages will determine how they fare in the search engine result pages.

For example, by using custom sitemaps (based on semantic clusters) you can funnel specific anchor text to specific pages to elevate prominence and relevance. All pages corresponding to a particular keyword could be fed with a second / alternative modifier or qualifying term to promote keyword stemming.

The site:yourdomain.com keyword search operator can provide ideas for semantically themed pages that correspond to a virtual site architecture within a website.

Once you have a list of semantically coherent pages (based on keyword research), you can then nurture them in one place by implementing a primary point of convergence: the sitemap or hub page.

By using robots.txt or the noindex, follow meta tag, you can build sitemaps and landing pages designed to group clusters of concepts, keywords, other landing pages or subjects in one central place, where you can feed multiple pages from one entry point.
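For a hub page you want crawled but not indexed, the meta tag is the finer-grained of the two tools mentioned above, since a robots.txt Disallow stops crawling entirely and therefore also stops link discovery. A minimal sketch (the paths are illustrative assumptions):

```html
<!-- In the <head> of a hub/sitemap page: keep the page itself out of the
     index while still letting crawlers follow its links and pass link
     equity to the landing pages it points to. -->
<meta name="robots" content="noindex, follow">

<!-- The robots.txt alternative blocks crawling of the hub pages outright,
     which also prevents the links on them from being followed:

User-agent: *
Disallow: /hub-pages/
-->
```

Which to use depends on whether you want the hub's internal links crawled; for the link-funneling described here, noindex, follow is the usual fit.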

Through managing the supporting pages (which all link up to the top-level landing page to transfer their authority), you can sculpt up to 70% of the ranking factors for any given keyword. As a result, the surplus ranking factors begin to spill over and strengthen the domain they are hosted on (which in turn feeds more pages, which rank higher, and so on).

Eventually you have dozens, hundreds or thousands of pages in a site that all have PageRank or the ability to pass ranking factors from one page to the next. By their very nature, the individual pages are optimized from the onset, and when combined they represent a ranking juggernaut as each page develops trust and authority.

The aggregate ranking factors for each page begin to stem and expand, which means the page can be found for any two- or three-word combination it contains when a related search query is executed in search engines.

What you have at that point is a website capable of ranking for multiple keywords simultaneously, showcasing the tip of the iceberg (the ideal landing page) built specifically as a consolidation of the keyword / topic, and capable of ranking on a fraction of the links required by a website that does not employ superior / coherent site architecture.

To summarize, internal links fueled by external links to one concentrated point, then augmented by deep links to the top-level landing page, have the ability to rank on fumes compared to a website that employs less efficient site architecture.

This means that (a) the more topical information you have on a subject the better, (b) you can elect which pages are SEO savvy and appear (as a result of internal linking), and (c) there is virtually no limit to the size or reach of the website's semantic theme.

Obviously, the more concentrated it is the better, as it will require fewer links to cross the tipping point. You must understand that content and external links both produce ranking factors, so it is possible for a website such as this to produce 60-80% of its own ranking factors by default (with more pages gaining strength, PageRank and stemming daily).

So, the takeaway for the tactic is:

1) Group landing pages or themes virtually by using sitemaps to stand in as pipelines that funnel link flow.

2) Build it properly from the onset, or consider mapping out a more conducive site architecture and 301 redirecting legacy pages to the new themed and siloed pages.

3) Group primary keywords along with secondary supporting pages in miniature sitemaps to concentrate on a core group of keyphrase shingles.

Taking point 3 from above, you could take keywords like SEO consulting, SEO consultant, and SEO consulting services and feed them via a virtual sitemap linking them together (regardless of their location in the site architecture).
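A virtual sitemap of this kind can be sketched as a generated hub page whose anchor text matches each target keyword. The cluster below reuses the keywords named above; the URLs are invented examples, not real pages:

```python
# Hypothetical virtual-sitemap hub: one generated page that links a
# semantic cluster of keyword variants together, regardless of where
# each page lives in the physical site architecture.

CLUSTER = {
    "SEO consulting": "https://seo.domain.com/consulting/seo-consulting",
    "SEO consultant": "https://seo.domain.com/consulting/seo-consultant",
    "SEO consulting services": "https://seo.domain.com/consulting/seo-consulting-services",
}

def virtual_sitemap(cluster: dict[str, str]) -> str:
    """Render a hub page list where each anchor text is the target keyword."""
    items = "\n".join(
        f'  <li><a href="{url}">{keyword}</a></li>'
        for keyword, url in cluster.items()
    )
    return f"<ul>\n{items}\n</ul>"

print(virtual_sitemap(CLUSTER))
```

Each keyword page links back to this hub and the hub links to each page, so the cluster shares link flow regardless of physical folder structure.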

However, if the pages were in a subfolder (consulting) within a subdomain (seo.domain.com), for example, and used internal links which all point to the sitemap (and the sitemap back to them), each page would share a percentage of the total link flow for that topical range of key phrases and modifiers, essentially exceeding the threshold of relevance on all layers.

Then add deep links (links to each respective landing page) and you have the ability to catapult them all to the top of their respective shingles using a fraction of the links your competition is using.

And how do we know this, you might ask? Let's just say we have done this before, "with stellar results". The next layer would be to implement a series of RSS feeds based on the same type of infrastructure to publish related content whenever new information is added, linked to or layered, to promote buoyancy for laser-focused keywords and key phrases, but that is another post in itself.

Jeffrey Smith is an active internet marketing optimization strategist, consultant and the founder of the SEO company SEO Design Solutions, http://www.seodesignsolutions.com. He has been actively involved in internet marketing since 1995 and brings a wealth of collective experience and fresh marketing strategies to individuals involved in online business.



Traditional Media vs. Digital Media

Although we consider ourselves on board and moving with the digital age, there are some areas that are taking longer to evolve. One such area is the transition of traditional media metrics online. Most marketers realize the need to integrate online, but struggle with the actual process.

In many cases, traditional media forms such as television are the easiest option for marketers and advertisers, even if it’s not the best choice. Some may argue that the proper measurement tools are not available, but Erin Hunter of comScore says they do exist. comScore even has a media planning suite for marketers and advertisers that goes beyond traditional measures.

One significant problem with choosing that “easy” form of media is the consumers that could be missed. Many consumers are now fully reliant on the Web. If marketers and advertisers aren’t taking this into consideration, a large part of their demographic is probably being neglected.

What challenges have you encountered in your online efforts? Have you discovered that you were missing out on consumers before you integrated online?




Monday, May 04, 2009

Standing Out In The Digital Age

How many websites do you visit each day? Do you even have an accurate number, or is it more than you could count? A recent Nielsen study revealed that the average Internet user in the US views 115 different websites each day. That's a lot of websites!

Oddly enough, many of us view even more websites than that on a daily basis. What type of websites do you view? Is it commerce, community, news, or blogs? For most, it is probably a combination of all mentioned.

In order to get your business noticed in this digital age, Richard Jalichandra of Technorati says companies have to make their brand accessible. Utilizing 2 or 3 portals isn't good enough. Richard says brands must realize that they have to extend their efforts beyond their comfort zones in order to succeed in the digital age.

Look at the above statistics and then think about your consumption of traditional media sources. How much TV do you watch? Do you get your news from television or a newspaper, or do you turn to the Web to get it? When was the last time you visited a library, or do you simply go to the Internet to find the information you need?

The Web has changed the way we do business and live our everyday life. We should expect this trend to only increase over time. Going back to the point Richard made, this information means that marketers and advertisers have to embrace several digital areas.

Take the Technorati online property BlogCritics.org for example. This freshly redesigned website tries to bridge the gap between journalists and bloggers by setting a high-quality content precedent. The site provides valuable content, but delivers it in a "community" atmosphere. There are countless properties like this on the Web, and businesses simply need to find which ones can be integrated into their business model.





Social Media Done Right

A lot of people like to give their opinions on how to use social media, but Katie Sween's advice is something everyone would likely want to hear. Katie is the Head of Marketing at the rapidly growing company StumbleUpon. She says social media shouldn't be scary at all, but should simply be an extension of other business efforts.

Many social networks were intended to be fun and should still be fun, even though they are now incorporated into business operations. Social media provides many cost-effective opportunities for businesses such as brand-building and reputation management, but websites still need to have quality content.

StumbleUpon comes into the game here since it tries to help the content providers connect with their brands. Katie says StumbleUpon listens and engages with users and also aims to make themselves available to users. The companies and brands that understand this “push and pull” concept are approaching social the right way.

Recently, StumbleUpon announced that it was no longer a part of eBay. The company is now in the hands of its original founders and a few other investors. While under eBay, Katie says the company was able to focus entirely on their product. As a result, they experienced unprecedented growth.

While very thankful for eBay, StumbleUpon is excited to be a start-up again. Katie says the company has more freedom and liberty now that they are independent. To give an example of those new opportunities, StumbleUpon just released a few enhancements to their Web Stumbling function. (Web Stumbling is the act of stumbling without downloading the toolbar.)

With Web Stumbling, StumbleUpon wanted to make stumbling accessible from any computer or browser. StumbleUpon revealed these enhancements to WebProNews:

Fully Personalized Experience - Now you can expect the same high-quality and personalized recommendations that you receive from your downloaded toolbar. And you can access it from any computer, and from any browser… just visit StumbleUpon.com and login to get a personalized stumbling experience.

Web Stumbling syncs with your toolbar activity - Use Web Stumbling to rate, review and share content and all your activities will be saved for you in your profile and will influence your future recommendations. For example, Web Stumbling in Safari or Opera would use ratings given using the Firefox or IE toolbars, and anything you rate will improve your recommendations on any platform.

Enhancements to Sharing - When sharing websites while Web Stumbling, you now have the ability to share with several friends at once, post sites directly to your Facebook profile, and even have conversations about the websites you share with your friends.

For more information, check out Chris Crum’s write-up or visit StumbleUpon.

What are your social media best practices? And how do you see StumbleUpon’s Web Stumbling enhancements supporting your social efforts?