Thursday, July 02, 2009
Why Good Vibrations Create A Better World
At the other end of the scale, at 700-1000, is enlightenment. This is the level of the Great Ones such as Krishna, Buddha and Jesus. It is the peak of evolutionary consciousness in the human realm.
All levels (which could be classed as vibration levels) below 200 are said to be energy draining, and below integrity. These range from Guilt (30) through Grief (75) and Fear (100) up to Pride (175).
People feel positive as they reach the Pride level. However, Pride feels good only in contrast to the lower levels. Pride is defensive and vulnerable because it is dependent upon external conditions, without which it can suddenly revert to a lower level.
At the 200 level, power first appears. Courage (200) is the zone of exploration, accomplishment, fortitude, and determination. People at this level put back into the world as much energy as they take; at the lower levels, populations as well as individuals drain energy from society without reciprocating.
Further levels include Willingness (310), Acceptance (350) and Love (500).
This level is characterized by the development of a Love that is unconditional, unchanging, and permanent. It doesn’t fluctuate – its source isn’t dependent on external factors. Loving is a state of being. This is the level of true happiness.
Interesting facts from the book -
* The experiments behind these concepts and theories were conducted over a 20-year period using a variety of kinesiology tests and examinations.
* According to the book, kinesiology testing is close to 100% accurate every time, reliably revealing Yes, No, True, and False answers.
* Collective Consciousness: These experiments reveal that there is a higher power that connects everything and everyone.
* Everything calibrates at certain levels from weak to strong, including books, food, water, clothes, people, animals, buildings, cars, movies, sports, music, etc.
* 85% of the human race calibrates below the critical level of 200.
* The overall average level of human consciousness stands at 207.
* Human consciousness hovered below the 200 level (at 190) for many centuries before suddenly rising to its present level some time in the mid-1980s. Hence Nostradamus's end-of-the-world predictions may have been averted (he made his predictions at a time when human consciousness was below the 200 level). For the world to stay below 200 over a prolonged period would cause a great imbalance that would undoubtedly lead to the destruction of all humanity.
* The power of the few individuals at the top counterbalances the weakness of the masses.
* 1 individual at level 300 counterbalances 90,000 individuals below level 200
* 1 individual at level 500 counterbalances 750,000 individuals below level 200
* 1 individual at level 700 counterbalances 70 million individuals below level 200
In other words, as a co-creator of the world, if you vibrate at 200 or above you will be helping to raise the consciousness of mankind, and playing a big part in creating a better world for everyone.
----------------------------------------------------------
Get a free Alpha Mind Control mp3, originally created to help soldiers with post-traumatic stress disorder. This powerful audio will help you reach deep levels of alpha brainwaves, helping you to become more creative and aware. It's good for your health too: it can help you sleep better, boost your immune system and make you feel good.
It will also help raise your consciousness levels...
Get your free Alpha Mind Control mp3 here
Friday, May 15, 2009
What’s Killing the Newspapers?
As ironic as it may be, newspapers are currently topping their own headlines. Well-known newspapers such as The New York Times, Chicago Tribune, and the Los Angeles Times are not only downsizing employees, but are also cutting sections and features from their publications. While the newspaper industry appears to be dying, the news itself is actually flourishing in other forms.
What’s the reason for all this? Some blame the economy and expect the government to bail out the newspapers. U.S. Senator Benjamin Cardin introduced the Newspaper Revitalization Act to Congress, which would allow newspapers to operate as non-profit organizations if they wanted to. This week, Governor Chris Gregoire of Washington State approved a tax break for newspaper printers and publishers.
Some newspapers blame Google for their struggles, claiming the search engine is stealing their content. Search industry leader Danny Sullivan disagrees. He believes newspapers actually get "special treatment" from Google: plenty of news publications never appear in Google News at all, yet many of the complaining newspapers do. These newspapers also receive a tremendous amount of traffic from Google that many other publications would readily appreciate.
Lastly, some even say the newspapers have created their own crisis. Has the newspaper industry embraced the Internet to its full potential? Could they have approached advertising in a different way that could have produced better benefits for them? Are they monetizing their traffic in the most effective manner?
Oldest Tweeter
Oldest Tweeter talks cuppas and casserole on Twitter at 104
Ivy Bean has become the UK’s oldest Tweeter after signing up for the micro-blogging site at the age of 104.

People can follow the silver surfer's updates at 'IvyBean104' on Twitter.com
Her first posts have included “Looking forward to Deal or No Deal later,” “just having a cuppa,” and “chicken casserole was lovely, going to have a nod now.”
And she is not the only pensioner at Hillside Manor residential home in Bradford, West Yorkshire, to be regularly getting online.
Pat Wright, the residence manager, said: "All the residents are taking a leaf out of her book. Four signed up for 'computer college' while others have joined Facebook, surf the net and enjoy themselves with ten-pin bowling on the games console."
Mrs Bean was already a keen Facebook user but members of the IT support service the Geek Squad helped the pensioner get bang up to date.
The Geek Squad set her up and gave her some navigation training and top tips on how to manage the social networking phenomenon.
The support group is now challenging the Twittersphere to find out whether there is actually someone older than Ivy posting updates on their whereabouts and activities. Make sure you catch her before she logs off for her daily weekday appointment with Noel Edmonds on 'Deal or No Deal'.
"It's brilliant to help someone as inspirational as Ivy to get started and teach her about Twitter," said agent Martin Dix.
“She’s quite tech savvy and already signed up to Facebook with 4,800 friends. She shows others that you shouldn’t be frightened of technology. If she was 50 years younger, I’m sure she’d make an ace Agent herself.”
Twitter membership figures are on the rise with adults, and recent figures found that 45-54 year-olds are the site's top demographic.
© Copyright of Telegraph Media Group Limited 2009
Thursday, May 14, 2009
Watching Twitter’s #Fixreplies Firestorm
What a fun morning I’m having on Twitter search, looking for tweets containing the hashtag #fixreplies. Oops, wait a minute … since I logged onto the site, a minute and a half ago, 230 more replies have come in with that hashtag. Oops, make that 276. Now make that 326.
So what is everyone all tied up in their underwear about? The settings change that Twitter (415 tweets as of now) announced on its blog yesterday, saying that people would no longer see @replies (453) from people they don't follow. This has caused the first Facebook-style Twitter revolt, as users (489) have poured onto the service to complain. The main complaint is that without this option (531), users lose an important resource that tells them who might be interesting to follow, and they're mad (605). (OK, I'll stop with that meme, but you get the drift. Smoke is coming out of Twitter's servers right about now.)
According to Twitter co-founder Biz Stone, Twitter enacted the change because, “based on usage patterns and feedback, we’ve learned most people want to see when someone they follow replies to another person they follow — it’s a good way to stay in the loop. However, receiving one-sided fragments via replies sent to folks you don’t follow in your timeline is undesirable. Today’s update removes this undesirable and confusing option.”
What we’re witnessing here is, once again, that it’s going to become nigh impossible for any of the popular social nets to make changes without involving users first. If you take a close look at Stone’s statement above, what you see is actually a fairly old media response, one assuming that the owner of the media property, in this case, Twitter, knows best: “receiving one-sided fragments via replies sent to folks you don’t follow in your timeline is undesirable.” You can just hear the Twitterati saying, “Undesirable to whom?”
The headline for Stone's statement is in the same vein. Reading "Small Settings Update," it assumes that users will also view this as small, but apparently, it's big. (OK, now we're at 1,390 new #fixreplies tweets.)
So what are social nets to do? Put everything to a vote? Not always practical, although Facebook was right to do it with its terms of service, which truly was a big change. So, are they to pack their services with so many potential options that traveling through “settings” for any one of them is a day-long excursion? Also not practical. What they do have to do is float changes with users before they make them, and then gauge the volume of the outcry. Something tells me that they would get a more reasoned approach by communicating potential changes before they happen, rather than dealing with the firestorm that inevitably erupts when users feel that something some of them valued has been snatched from them in the night.
In the current situation, Twitter appears to be weighing the outpouring of feedback, which is good. As @adbroad points out, co-founder Evan Williams tweeted the following 10 hours ago: “Reading people’s thoughts on the replies issue. We’re considering alternatives. Thanks for your feedback.”
But for now, the firestorm is raging, out of control (3,431).
Catharine P. Taylor has been covering digital media and advertising for almost 15 years. She currently writes daily about advertising on her blog, Adverganza.com. You can reach her via email at cathyptaylor@gmail.com, follow her on Twitter at cpealet, or friend her on Facebook at Catharine P. Taylor. (mediapost)
How To Write Article Headlines
What makes the difference between an under-performing article and one with a drastically higher number of views?
Many times it’s something as simple as an awesome title that pushes an article over the edge from so-so to spectacular!
Here’s why your title is crucial:
Most of the time readers will discover your article through:
a) a Google search
b) looking through an article directory
In both of these instances, your headline is one of the few bits of information the reader sees before they decide to click through to read your entire article.
When a reader is looking through a long list of articles on a directory or on a search engine results page, they are quickly scanning a long list of titles, and the title plays a huge role in which article they decide to read in its entirety.
So, don’t take your titles lightly–really put some thought into them and be willing to do some experimenting to see what types of titles work best for you.
Want headlines that generate more traffic? Try these 5 tactics:
1) Be short and sweet.
Put yourself in the shoes of a reader–when you’re scanning a long list of article titles, you don’t necessarily take the time to read each and every title in full. You’re just glancing over each line, trying to get the gist of what the article offers, and sometimes a very clever, long title can be overlooked simply because it doesn’t scan well.
Shorter titles tend to be more direct and focused, and this also helps search engines determine what your article is about.
Now, this isn’t to say that you shouldn’t ever write long and clever titles–experiment, but be sure to try submitting articles that have short and punchy titles as well.
Then look at your article statistics and see if you can tell a difference in performance between the short titles and the longer ones.
2) Be direct.
Remember, Google does not understand irony, humor or puns. Search engines take things at face value. A more direct and straightforward title can help your article get higher ranking for your keyword terms.
3) Put your most crucial words at the beginning of your headline.
Again, this pays off when people are scanning your titles, but it also helps Google and the other search engines classify your article. By putting your most important words at the beginning of the title (possibly your keywords or variations of your keywords), you are making it easier for readers and Google to determine what your article is about.
Not sure how to make that work?
Here’s an example:
How To Write Article Headlines: 5 Tips for Clickable Titles
The first few words state specifically what the article is about, and the part after the colon gives additional information.
4) Your title should indicate the topic of your article.
Kind of obvious there, but when you become aware of your keywords, you may be tempted to put your keywords in your title even when the keywords are not appropriate for the article.
For example, your keywords may be “New York Dog Walker”, but in order to use those keywords in your title your article would have to be about some aspect of New York dog walkers. If your article is just about dogs or dog walking in general and not specifically about New York, then it wouldn’t be appropriate to include ‘New York’ in your title.
Your title should always describe what your article is about, and sometimes it’s not appropriate to use your keywords in your title.
5) Your title should make readers want to click through and read the entire article.
Remember, you’re writing for human readers, not just search engines. Even if it is appropriate to use your keywords in your title, you’ll want to put some thought into the phrasing of the headline so that it invites/inspires the reader to click the title and read the entire article.
Your title is the first thing a reader sees when they’re introduced to your article, and you can drastically improve your article submission success simply by paying attention to how you phrase your article headlines. Try these 5 tips and then watch your article stats to see which types of titles work best for you.
Carefully write your title, then submit your article to a vast network of targeted publishers. The more places your article is published, the more traffic your website receives. Steve Shaw created the web’s first ever 100% automated article distribution service, SubmitYOURArticle.com, which distributes your articles to hundreds of targeted publishers with the click of a button. For more information go to=> http://www.SubmitYOURArticle.com
Using Social Media to Boost Search Engine Results
Most of us are well aware that the search engines frequently change their algorithms to improve search results for users (and foil spammers), which can make it challenging for small businesses just to keep up. But as web technology continues to evolve, it also creates new opportunities for small businesses to improve their SEO strategies and boost their rankings as well. Social media (sites like Facebook, Twitter, LinkedIn, Technorati, Digg, etc.) provide an excellent opportunity for small businesses to not only promote their products and services online, but also to gain significant ground in the search engine results.
One of the most critical components to getting top search engine rankings is the number of inbound links and link popularity a web site is able to build. Although there are several existing link building strategies available to small businesses (e.g., press releases, directory submissions, article syndication, etc.), social media can help create additional high-value, on-target inbound links that are essential to achieving top placements in the search engines.
For example, each time you use Twitter to publish a link to new content on your web site, that link gets “planted” on the Twitter page of each person following you, and has the potential to spread even further as your followers share that information with their own network of contacts.
Integrated Social Marketing (ISM)™
If you have properly integrated your social networking profiles together, that same Twitter “tweet” could then be fed via RSS to your Facebook business profile, your corporate blog, your LinkedIn account, and any number of other social sites that you have set up for your business. It’s not a far stretch to imagine the link you broadcast on Twitter could reach dozens, hundreds, or even thousands of other places on the web, all pointing back to your web site! By integrating your social networking profiles with each other, with your web site, and with your existing marketing initiatives, you can easily make one single marketing action (such as a tweet) show up in multiple places online, each containing a new, relevant inbound link to your site.
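To make the RSS mechanics concrete, here is a minimal, hypothetical Python sketch using the feedparser library: it reads a feed of your updates and lists the links that a cross-posting setup would re-publish to your other profiles. The feed URL and account name are illustrative assumptions, not real endpoints.

```python
# Minimal sketch (assumptions noted above): read an RSS feed of your updates
# and list the links a cross-posting setup would re-publish elsewhere.
import feedparser

FEED_URL = "http://twitter.com/statuses/user_timeline/yourbusiness.rss"  # hypothetical

feed = feedparser.parse(FEED_URL)
for entry in feed.entries:
    # Each entry carries the link you broadcast; a syndication script would
    # push this to your blog, Facebook business profile, LinkedIn, etc.
    print(entry.title, "->", entry.link)
```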
Quantity AND Quality
In addition to the sheer number of inbound links that are created through social marketing, the value of the links that are created is another important criterion that search engines consider. To be valued by the search engines, inbound links must be from relevant, “quality” web sites, and search engines today give social sites like Facebook and Twitter great value. These sites are highly visible to the search engines, and are constantly taking updates from users. Links tend to be shared according to subject matter, which means the search engines will see them as being relevant and on-target. All of these factors combine to create high-quality inbound links in the eyes of the search engines.
Online Visibility and Branding
Creating visibility for your business and your “brand” is really key when using social media for building links. The power of social media is realized when other users see your links or content, then share that information with their own network of contacts. Simply adding a bunch of links to your social profiles is not enough; you need to have a strong reputation and a brand that users trust so they will feel comfortable sharing your content with others. Brand recognition typically leads to natural link building anyway, which means your inbound links will end up coming from bloggers, colleagues, customers, and other people who are exposed to your links and find them useful enough to share with their own contacts.
The Proof is in the Rankings
A recent example from Website Magazine explained somewhat surprising results when they searched for their publication’s name in Google. As expected, their web site came up as the number one listing on the results page. But what was not expected was the number three listing on the results page was the magazine’s Twitter page. They then performed a number of Google searches for the terms “Chicago Tribune,” “Chicago Public Golf,” and “Daily Career Tips,” all with similar results in Google - the Twitter page for each of these terms came up near the top of the search engine results every time.
The conclusion was that given these results, Google must be giving serious weight to Twitter content, and I happen to agree. The search engines of course keep their ranking algorithms top-secret, so there’s no way to know how much weight (if any) is really given to Twitter or other social media sites. But results like those in the example above are hard to ignore!
A Great Opportunity
Social media is here to stay, and small businesses are beginning to use it to effectively promote their businesses, reach their customers, find new leads, keep customer mindshare, and instantly communicate with customers. But maybe one of the biggest benefits of adding social media to your marketing mix is the creation of high-value, on-target inbound links that can help improve visibility in the search engines and boost your business to the top of the search engine rankings.
Lauren Hobson, President of Five Sparrows, LLC, has more than 16 years of experience in small business technology writing, marketing, and web site design and development. Five Sparrows provides professional web site and marketing services to small businesses and non-profit organizations, giving them access to high-quality services at affordable prices. To read articles or subscribe to Biz Talk, please visit www.FiveSparrows.com/biztalk.htm.
Wednesday, May 13, 2009
How To Use Keywords In Your Article Submissions
Article marketing has beginner, intermediate and advanced stages to it, so no matter what skill level you’re at, you can still submit articles to drive traffic to your website.
You may have started out simply writing articles on the topic of your website–that is a great start, and you can see excellent results by consistently writing and submitting on-topic articles.
But after you get used to the basics of submitting articles, you may want to challenge yourself and see if you can improve your results. One of the ways you can advance to the next level is to integrate keywords into your article submission campaign.
Google and other search engines look for words of special importance on a web page to help them determine what a website is about.
These words are called “keywords” or “keyword phrases”, and if a website owner knows the types of words/phrases that their target customers are typing into search boxes, then he can be sure to use those keywords in his articles to capitalize on the demand for those search terms in Google.
How do you use keywords in your article submissions?
Great question–it’s actually not that complicated.
1) First, figure out the keywords for your website.
Use a keyword suggestion tool such as WordTracker to create a detailed list of keywords and long-tailed keyword phrases.
A long-tailed keyword phrase is a phrase that is anywhere from 3-5 words long that a search engine customer would use to reach a site such as yours. Usually a basic keyword term for your website is more general and is 2 words long, but there is also merit to targeting longer phrases that potential customers do searches for.
For example, your main keyword phrase may be “chocolate recipe”.
Your long tail phrase may be “chocolate birthday cake recipe”.
When you’re doing your keyword research you’ll make a list of the general 2 word phrases as well as several of the more specific 3-5 word phrases.
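As a rough illustration of that research step (not a substitute for a tool like WordTracker), here is a small Python sketch that expands a general two-word phrase into longer variations; the seed phrase and modifiers are made-up examples:

```python
# Illustrative only: combine a 2-word seed phrase with modifiers to draft
# 3-5 word long-tail candidates, mirroring the "chocolate recipe" example.
seed = "chocolate recipe"
modifiers = ["birthday cake", "easy fudge", "dark truffle", "no bake"]

head, tail = seed.split(" ", 1)  # "chocolate", "recipe"
long_tails = [f"{head} {m} {tail}" for m in modifiers]

for phrase in long_tails:
    print(phrase)  # e.g. "chocolate birthday cake recipe"
```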
2) Write an article around each keyword term.
Now, each keyword term has many possibilities for articles–there isn’t just one article that could be generated off of your keyword term. Try taking each keyword term and writing several articles addressing different aspects of that keyword.
If you have a long list of keyword and long-tailed keywords, that list could keep you busy for a while!
Just go through the list, writing a different article around each keyword phrase. Pretty soon you will have a library of articles covering virtually every topic related to your niche, all pointing readers back to your website.
All of this instruction about keywords comes with a word of caution–there is a good way and a bad way to use keywords in article marketing.
The good way would be to use the keyword phrase to guide your article topic, and use the phrase or variations of the phrase naturally in your article so that the article makes sense to your readers.
The bad way would be to haphazardly spray your article with your keyword term without thinking about how the article will sound to readers or if the article makes sense.
You may have seen articles that were obviously written with the intention of using a particular keyword, where you felt the author was writing for search engines rather than for human readers–that is not the way things should be!
You can write articles that please human readers and search engines–what these two groups are looking for is not at odds with each other.
Google wants to provide search customers with an accurate list of results, ranked in order with the results most likely to answer the searcher's question at the top. Google cares about whether an article is reader friendly–it is not just looking for random words on a page.
When you write your articles, you can use your keywords to determine your specific topic and also use the keywords themselves where they sound natural. For best results keep your keyword density to around 2%.
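Keyword density here simply means the share of the article's words accounted for by the keyword phrase. A minimal Python sketch of the check (the sample article string is a placeholder, and real text would need punctuation handling):

```python
# Rough density check for the ~2% guideline; ignores punctuation for brevity.
def keyword_density(text: str, phrase: str) -> float:
    words = text.lower().split()
    target = phrase.lower().split()
    n = len(target)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
    # Density = words belonging to the phrase, relative to total word count.
    return 100.0 * hits * n / len(words)

article = "chocolate recipe ideas for beginners trying a chocolate recipe at home"
print(f"{keyword_density(article, 'chocolate recipe'):.1f}%")  # -> 36.4%
```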
Staying within these guidelines will give your article submissions the best chance of being recognized for that keyword term by search engines, and it will also produce an article that brings value to your reader.
Use an article distribution service like SubmitYOURArticle.com to magnify the impact of each article - distribute your articles to hundreds of targeted publishers with the click of a button. For more information go to=> http://www.SubmitYOURArticle.com
SEO Guidelines
Search engine optimization, also known by the acronym SEO, comprises multiple facets. SEO is not a linear process, but rather a holistic evolution involving intricate layers, steps and cumulative stages which are as delicate as they are demanding to perfect.
However, there are fundamental SEO guidelines one can use to incorporate granular changes to improve coherence, functionality and visibility of a website by working in tandem with the metrics that search engines deem worthy and therefore reward with a higher relevance / optimization score.
Conversely, if you deliberately or inadvertently neglect any one of the necessary characteristics of fine-tuning, your pages could fall short of their goal, which is to find the most suitable audience by reaching the coveted top 10 spots for the content's primary keywords.
Rather than butchering coherence after the fact, trying to make a square peg fit in a round hole by editing the content, links or architecture of your website, it is better to start with the SEO goal in mind and build the platform to support it, instead of just altering each aspect afterwards.
When initiating any SEO campaign, you should give credence to:
Understanding your competition - There is a reason why the top 10 spots are occupied; look for consistencies so you can emulate certain characteristics if your website lacks them.
Determining the Gap - Determining the gap implies removing the obstacles between you and your objective. Time is the obvious ranking factor; hence, someone online for 5 years in a niche who has achieved keyword saturation and authority has an easier time maintaining visibility compared to a new website (who has not achieved a suitable reputation through peer review).
Before you just build links, your traffic and engagement for the site must be commensurate in order to get past algorithmic filters which can determine things like (1) link clusters from building links in automation (2) the ratio of inbound links to outbound links (3) engagement time / bounce rate factor (which are a metric of satisfaction and relevance) and (4) if there are other supporting topical areas within the site that concentrate internal links, subjects or landing pages to support a more competitive rankings.
Building Internal Authority - Authority is the objective; rankings are merely a side-effect (not the goal). With this in mind, it is more about acquiring a stake in market share that unmistakably positions your website in front of any search which corresponds to any of the terms, keywords or topics covered in your title, content or tags. The more authority a website has, the easier it is to rank for more keywords with less effort.
Gaining Validation from Citation and Peer Review - You can have the greatest website online, but without co-occurrence and other websites referencing your pages, it is merely conjecture. Granted, your website can eventually acquire authority in and of itself, but links from other related sites or websites already ranking for the keywords you are targeting are the fastest way to expedite the process of creating a site that is less dependent on external sources for validation and rankings.
Managing User Expectations - Since no two people think or search alike, you will need an array of landing pages to help direct them to the ultimate conversion objective. The wild card in this equation is the mood of the surfer. Landing pages are all about getting the right person in the right mindset to read the right message. If you can accomplish that with your SEO, there is virtually no limit to increasing user engagement (which is getting them to take the desired action).
Landing pages are your website's means to an end; they are what keep you in business. With a landing page tailored to a specific array of keywords, the more relevance you can create between what a searcher expects and what a searcher discovers, the higher the conversion rate your site will experience.
Landing Page Conversion - The first step in creating a successful online presence is having a page worthy of conversion. Conversion implies that the page performs a specific function (sign up for a free download, sign up for a newsletter, subscribe to an RSS feed, purchase a product, inquire about a service, etc.).
Instead of hemorrhaging user intent or overwhelming users with too many choices, the more refined and focused your value proposition is, the more likely it is that users will engage with it. The keys behind landing pages are (1) make it clear to the visitor what the VALUE IS TO THEM for engaging the offer, not just to your business, and (2) if you have to go back and read anything twice, or if a 4th grader cannot understand the offer, then it's probably too complex.
SEO delivers traffic, but the strength of your offer is what determines if people shop at your website and proceed to checkout or if they move on and use your site like a doormat for the next search result, which is more honed to their mental map of what they consider a superb offer.
The tasks and responsibilities of an SEO company are simple: (1) fill the gap with relevant content, (2) salvage the existing elements that are conducive to optimization, (3) build off-page reputation through link building and promotion and, most of all, (4) fine-tune the on-page elements that aid conversion until a suitable conversion rate exists.
Anything less from those offering SEO services is just theory. The bottom line is, SEO is about results, not just temporary rankings. So, as long as you stick to fundamental guidelines that depend not on fickle appearances or tricks but on real content and real substance, changes in algorithms are the least of your concerns; it's only a matter of producing enough content, links or popularity to cross the tipping point.
Jeffrey Smith is an active internet marketing optimization strategist, consultant and the founder of Seo Design Solutions Seo Company http://www.seodesignsolutions.com. He has actively been involved in internet marketing since 1995 and brings a wealth of collective experiences and fresh marketing strategies to individuals involved in online business.
Friday, May 08, 2009
SEO, Subdomains, Site Architecture and Sitemaps
Today (with slight hesitation, for fear of giving away too much) I am electing to share an effective SEO method which incorporates the use of sitemaps, subdomains and site architecture. As a result, you will have the capacity to develop robust websites of colossal proportions, using a coherent site and link architecture to zero in on competitive rankings and long-tail keywords alike.
This involves the use of subfolder / naming conventions, SEO friendly titles, relevant semantic descriptions, pretty urls, subdomains and sitemaps.
This strategy is similar to targeting the roots of a tree (the keywords stemming from a topic) to reach the leaves (top 10 rankings): give each keyword a value (a page), then implement an internal link / irrigation system capable of producing its own secondary and tertiary ranking factors through link cultivation.
Sitemaps do not have to play a passive (just-for-crawling) role in SEO. In fact, think of a sitemap as a two-way street. On one hand, you can use sitemaps to increase crawl frequency and get more pages into a search engine's index. On another level, you can use sitemaps as a ranking tool designed to "reverse funnel" ranking factors to the pages that need added link weight to hone in on competitive rankings (much like a powerful pipeline).
To turn this once-passive tool into a very powerful internal link sculpting tool, you only need to apply a few fundamental protocols.
When you look at a Wikipedia ranking, try looking beyond the topical layer and observe the infrastructure of why and how it got there. The topical layer (the landing page) represents a relevant triangulation of on-page relevance: a title with the keyword / search term prominent and first, a brief descriptor, and the site-referral loop (Wikipedia, the free encyclopedia) to round off the title tag / naming convention.
In addition, the keyword is also translated into a URL string on a subdomain [en.wikipedia.org/wiki/keyword] to truly concentrate the ranking factors. The tactful use of subdomains is one SEO method to expand the exact-match domain / URL to encroach on a more relevant keyword, making a domain more specific to a topic.
There is virtually no limit to this on-page SEO tactic, as you can essentially expand the focus of any website to broaden the relevance funnel using subdomains. This means, with the right amount of links and content, you can scale the content and links pointing to each page in a website so that it functions as the preferred landing page, by consolidating internal and external links. This is known as the threshold or barrier to entry for that keyword, and each keyword has a unique tipping point before it gains momentum and ranking power.
An example of how Wikipedia employs site architecture for optimal SEO value is:
* Topic1.domain.com as the base - which will require a sufficient amount of links to stem.
* Topic1.domain.com/wiki/topic1-keyword (the wiki folder is where the magic happens).
* Topic1 Keyword becomes first shingle in the title tag.
* Topic1 Keyword becomes H1 header tag to emphasize relevance.
* Topic1 anchor text from other pages all link to Topic1.domain.com/wiki/topic1-keyword
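As a minimal sketch of that naming convention (assumed names only; this is not Wikipedia's actual code), the pieces in the list above could be derived from a topic and keyword like so:

```python
# Derive the subdomain URL, title tag and H1 described in the list above.
# The domain and descriptor are hypothetical placeholders.
def landing_page(topic: str, keyword: str, domain: str = "domain.com"):
    slug = keyword.lower().replace(" ", "-")
    url = f"http://{topic.lower()}.{domain}/wiki/{slug}"
    title = f"{keyword} - brief descriptor | Site Name"  # keyword is the first shingle
    h1 = keyword  # H1 echoes the keyword to emphasize relevance
    return url, title, h1

print(landing_page("Topic1", "Topic1 Keyword"))
# ('http://topic1.domain.com/wiki/topic1-keyword',
#  'Topic1 Keyword - brief descriptor | Site Name', 'Topic1 Keyword')
```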
Yet there is a hidden layer of SEO (the wiki folder) that most do not see, and it is responsible for the prominent rankings that the site's architecture produces.
What I am referring to is the other pages in the subfolder, and the non-indexed pages responsible for shifting ranking factors, which allow the webmaster to add one more layer of relevance by controlling the anchor text that feeds the main silo / subfolder or landing page.
Naturally this can be implemented on semantics alone, or a simple PHP script will suffice to concentrate ranking factors in your website's content management system. The only thing you need to maintain buoyancy for hundreds or thousands of pages is a pipeline capable of shifting link weight from page to page; in this instance the subfolders within the subdomains become the preferred landing pages.
In this instance, using the http://en.wikipedia.org subdomain (as an example) for English provides them with the ability to funnel ranking factors from page to page, yet still keep the English version separate from the Spanish version, and so on and so forth.
In the past, the downside of this strategy was that each subdomain was considered its own site. Now this becomes an asset, as you can essentially determine how you feed your pages and subsections (all based on a keyword) from one to the next. Also, the type of anchor text you use to feed the specific landing pages will determine how they fare in the search engine result pages.
For example, by using custom sitemaps (based on semantic clusters) you can funnel specific anchor text to specific pages to elevate prominence and relevance. For instance, all pages corresponding to a particular keyword could be fed with a second / alternative modifier or qualifying term to promote keyword stemming.
The site:yourdomain.com keyword site operator can provide ideas for semantically themed pages that correspond to a virtual site architecture within a website.
Once you have a list of semantically coherent pages (based on keyword research) you can then nurture them in one place to implement the primary point of convergence (the sitemap) or hub page.
By using robots.txt or the noindex,follow meta tag, you can create sitemaps and landing pages designed to group clusters of concepts, keywords, other landing pages or subjects in one central place, where you can feed multiple pages from one entry point.
Through managing the supporting pages (which all link up to the top level landing page to transfer their authority) you can sculpt up to 70% of the ranking factors for any given keyword. As a result, the transcendental ranking factors begin to spill over and strengthen the domain they are hosted on (which in turn feeds more pages which rank higher, etc.).
Eventually you have dozens, hundreds or thousands of pages in a site that all have page rank or the ability to pass ranking factors from one page to the next. By their very nature the individual pages are optimized from the onset, and when combined they represent a ranking juggernaut as each page develops trust rank and authority.
The aggregate ranking factors for each page begin to stem and expand (which means the page can be found for any two- or three-word combination it contains) whenever a related search query is executed in search engines.
What you have at that point is a website capable of ranking for multiple keywords simultaneously, showcasing the tip of the iceberg (the ideal landing page) built specifically as a consolidation of the keyword / topic, and capable of ranking on a fraction of the links required by a website that does not employ superior, coherent site architecture.
To summarize, internal links fueled by external links to one concentrated point, then augmented by deep links to the top-level landing page, have the ability to rank on fumes compared to a website that employs less efficient site architecture.
This means that (a) the more topical information you have on a topic the better, (b) you can elect which pages are SEO savvy and appear (as a result of internal linking), and (c) there is virtually no limit to the size or reach of the website's semantic theme.
Obviously, the more concentrated it is the better (as it will require fewer links to cross the tipping point). Understand that content and external links both produce ranking factors, so it is possible for a website such as this to produce 60-80% of its own ranking factors by default (with more pages gaining strength, page rank and stemming daily).
So, the takeaway for the tactic is:
1) Group landing pages or themes virtually, using sitemaps to stand in as pipelines that funnel link flow.
2) Build it properly from the onset, or consider mapping out a more conducive site architecture and 301 redirecting legacy pages to the new themed and silo’d pages.
3) Group primary keywords along with secondary supporting pages in miniature sitemaps to concentrate on a core group of keyphrase shingles.
Taking point 3 from above, you could take keywords like SEO consulting, SEO consultant and SEO consulting services, and feed them via a virtual sitemap linking them together (regardless of their location in the site architecture).
However, if the pages were, for example, in a subfolder (consulting) within a subdomain (SEO.domain.com), and used internal links which all point to the sitemap and the sitemap to them, each page would share a percentage of the total link flow for that topical range of key phrases and modifiers, essentially exceeding the threshold of relevance on all layers. A sketch of such a miniature sitemap follows.
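Here is a hedged Python sketch of what that miniature sitemap (hub) page could look like, using the consulting keywords above; the domain, paths and markup are illustrative assumptions, not a prescribed format:

```python
# Illustrative only: generate a miniature HTML sitemap (hub) page whose
# anchor text feeds the cluster of consulting landing pages named above.
# Domain, paths and phrasing are hypothetical.
cluster = [
    ("http://seo.domain.com/consulting/seo-consulting/", "SEO consulting"),
    ("http://seo.domain.com/consulting/seo-consultant/", "SEO consultant"),
    ("http://seo.domain.com/consulting/seo-consulting-services/", "SEO consulting services"),
]

def hub_page(links):
    items = "\n".join(f'  <li><a href="{url}">{anchor}</a></li>'
                      for url, anchor in links)
    # Each landing page would link back to this hub, so link weight and
    # anchor text flow in both directions.
    return f"<ul>\n{items}\n</ul>"

print(hub_page(cluster))
```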
Then add deep links (links to each respective landing page) and you have the ability to catapult them all to the top of their respective shingles using a fraction of the links your competition is using.
And how do we know this, you might ask? Let's just say we have done this before, "with stellar results"… The next layer would be to implement a series of RSS feeds based on the same type of infrastructure, publishing related content whenever new information is added, linked to or layered, to promote buoyancy for laser-focused keywords and key phrases. But that is another post in itself…
Jeffrey Smith is an active internet marketing optimization strategist, consultant and the founder of Seo Design Solutions Seo Company http://www.seodesignsolutions.com. He has actively been involved in internet marketing since 1995 and brings a wealth of collective experiences and fresh marketing strategies to individuals involved in online business.
Traditional Media vs. Digital Media
In many cases, traditional media forms such as television are the easiest option for marketers and advertisers, even if it’s not the best choice. Some may argue that the proper measurement tools are not available, but Erin Hunter of comScore says they do exist. comScore even has a media planning suite for marketers and advertisers that goes beyond traditional measures.
One significant problem with choosing that “easy” form of media is the consumers that could be missed. Many consumers are now fully reliant on the Web. If marketers and advertisers aren’t taking this into consideration, a large part of their demographic is probably being neglected.
What challenges have you encountered in your online efforts? Have you discovered that you were missing out on consumers before you integrated online?
Monday, May 04, 2009
Standing Out In The Digital Age
How many websites do you visit each day? Do you even have an accurate number, or is it more than you could count? A recent Nielsen study found that the average Internet user in the US views 115 different websites each day. That's a lot of websites!
Oddly enough, many of us view even more websites than that on a daily basis. What type of websites do you view? Is it commerce, community, news, or blogs? For most, it is probably a combination of all mentioned.
In order to get your business noticed in this digital age, Richard Jalichandra of Technorati says companies have to make their brand accessible. Utilizing 2 or 3 portals isn't good enough. Richard says brands must realize that they have to extend their efforts beyond their comfort zones in order to succeed in the digital age.
Look at the above statistics and then think about your consumption of traditional media sources. How much TV do you watch? Do you get your news from television or a newspaper, or do you turn to the Web to get it? When was the last time you visited a library, or do you simply go to the Internet to find the information you need?
The Web has changed the way we do business and live our everyday life. We should expect this trend to only increase over time. Going back to the point Richard made, this information means that marketers and advertisers have to embrace several digital areas.
Take the Technorati online property BlogCritics.org, for example. This freshly redesigned website tries to bridge the gap between journalists and bloggers by setting a high-quality content precedent. The site provides valuable content, but delivers it in a "community" atmosphere. There are countless properties like this on the Web, and businesses simply need to find which ones can be integrated into their business model.
Social Media Done Right
A lot of people like to give their opinions on how to use social media, but Katie Sween’s opinion is advice that everyone would likely want to hear. Katie is the Head of Marketing at the rapidly growing company, StumbleUpon. She says social media shouldn’t be scary at all but should simply be an extension of other business efforts.
Many social networks were intended to be fun and should still be fun, even though they are now incorporated into business operations. Social media provides many cost-effective opportunities for businesses such as brand-building and reputation management, but websites still need to have quality content.
StumbleUpon comes into the game here since it tries to help the content providers connect with their brands. Katie says StumbleUpon listens and engages with users and also aims to make themselves available to users. The companies and brands that understand this “push and pull” concept are approaching social the right way.
Recently, StumbleUpon announced that it was no longer a part of eBay. The company is now in the hands of its original founders and a few other investors. While under eBay, Katie says, the company was able to focus entirely on its product. As a result, it experienced unprecedented growth.
While very thankful for eBay, StumbleUpon is excited to be a start-up again. Katie says the company has more freedom and liberty now that they are independent. To give an example of those new opportunities, StumbleUpon just released a few enhancements to their Web Stumbling function. (Web Stumbling is the act of stumbling without downloading the toolbar.)
With Web Stumbling, StumbleUpon wanted to make stumbling accessible from any computer or browser. StumbleUpon revealed these enhancements to WebProNews:
Fully Personalized Experience - Now you can expect the same high-quality and personalized recommendations that you receive from your downloaded toolbar. And you can access it from any computer, and from any browser… just visit StumbleUpon.com and log in to get a personalized stumbling experience.
Web Stumbling syncs with your toolbar activity - Use Web Stumbling to rate, review and share content and all your activities will be saved for you in your profile and will influence your future recommendations. For example, Web Stumbling in Safari or Opera would use ratings given using the Firefox or IE toolbars, and anything you rate will improve your recommendations on any platform.
Enhancements to Sharing - When sharing websites while Web Stumbling, you now have the ability to share with several friends at once, post sites directly to your Facebook profile, and even have conversations about the websites you share with your friends.
For more information, check out Chris Crum’s write-up or visit StumbleUpon.
What are your social media best practices? And how do you see StumbleUpon’s Web Stumbling enhancements supporting your social efforts?
Wednesday, April 15, 2009
Making Your eCommerce Site Convert
As more and more people are shopping online as opposed to shopping at brick and mortar stores, it only makes sense that ecommerce sites would want to do everything they can to ensure that their traffic is converting. In this interview, Khalid Saleh of Invesp Consulting shares the shockingly low number of conversions some ecommerce sites are getting but also shares how they can improve them.
Some ecommerce websites are converting only 1-3 percent of their incoming traffic, which at the low end translates into 1 conversion for every 100 visitors. For comparison, sites like Amazon convert at a 12-14 percent rate.
According to Khalid, conversion optimization is where the real ROI is. There are a few basic assumptions that ecommerce sites need to make about their traffic. The first is that roughly 25 percent of the visitors to your site are probably there by accident. Second, you should understand that approximately another 25 percent of your traffic are offline shoppers simply comparing prices.
You are now left with 50 percent of your traffic to convert. There are a few quick tips for converting such as correcting headlines and images, but Khalid says you have to take a systematic approach to get the conversions that really matter.
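To put rough numbers on those assumptions, here is a quick back-of-the-envelope calculation in Python (all figures illustrative):

```python
# Illustrative math only: 25% accidental visitors plus 25% offline price
# comparers leaves ~50% of traffic that can realistically convert.
visitors = 10_000
addressable = visitors * (1 - 0.25 - 0.25)  # ~5,000 visitors

site_rate = 0.02                 # a typical 1-3% site-wide conversion rate
orders = visitors * site_rate    # 200 orders

# Measured against only the addressable half, the same orders look better:
print(f"{orders:.0f} orders = {orders / addressable:.1%} of addressable visitors")
# -> 200 orders = 4.0% of addressable visitors
```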
Going back to basic marketing, having a thorough knowledge of your target audience is the first step in this approach. Use your market research and translate the data into personas and then determine how they interact with your website.
Here are several common mistakes that ecommerce sites make that prevent conversions:
1. Neglecting to understand what the customer is looking for
2. Leaving shopping cart and shipping costs unclear
3. Assuming customers are "committed" to the shopping transaction
4. Failing to include security seals
5. Insisting shoppers complete forms before the shopping process begins
What are some other mistakes that ecommerce sites make that you have experienced as a shopper? From a marketing standpoint, what advice would you offer to ecommerce sites looking to improve their conversions?
Thursday, April 09, 2009
Content Monetization: What Not To Do
During this time in our economy everyone is trying to figure out a way to both save and make a buck. Jennifer Slegg, also known as JenSense, is well versed in monetizing online content. In this interview, she discusses some of the mistakes many people are making which ultimately prevent them from fully monetizing their content.
Jennifer believes that people are making mistakes because they are in "freak-out" mode as a result of the economy. They are so consumed with reaching a monthly quota in order to pay bills and provide for their families that they are committing serious ad blunders on their sites. Jennifer shares a personal experience in which she visited a blog that had twenty-seven 300×250 ads and only 3 paragraphs of content. Unfortunately, some people think this is effective.
According to Jennifer, a webpage should have a maximum of 3-4 ads on a page, depending upon the rest of the content. In a blog post on a related topic, Jennifer suggests replacing your ads with boxes of the same size but making them an outrageous color. The test is simple: if you find yourself more focused on the colored boxes than on the site's content, you likely need to adjust your ad structure.
Another mistake people are making involves running AdSense on sites where it isn't allowed. If you've ever wondered why Google is showing Public Service Announcements on your site rather than targeted ads, you may need to check the AdSense policies to ensure that your site is following the rules. For example, sites that sell goods such as prescription drugs, school essays, tobacco products, and adult content are not allowed to run AdSense.
To get your ad campaign “right,” Jennifer recommends being creative and innovative. Try adding a border or shadow to your ad units and wrap text around them. Do you have any other creative ideas for integrating ads into content? If so, we would love to hear about your experiences.
Marketing Strategies for Mobile
As mobile devices become more and more popular, the importance of mobile marketing grows as well. Cindy Krum of Rank-Mobile always keeps us updated on what's going on in the mobile market, and this video from SES New York is no exception.
Cindy actively promotes having only one website by making your existing website provide mobile capabilities too. The same goes with marketing strategies. Cindy suggests that businesses integrate their mobile marketing strategy into their existing marketing strategy to get the best results.
The mobile Web is not just the current trend; it’s here to stay. Not only Cindy, but also many others including Dr. Vint Cerf, believe mobile will play a large role in the future. Based upon that information, Cindy believes if you embrace mobile now, you have the opportunity to pull ahead of your competitors.
So how do you integrate mobile into your overall marketing strategy? Here are a few quick tips from Cindy:
- Follow all traditional and local SEO best practices
- Provide info relevant to mobile users
- Submit your site to mobile search engines and directories
- Don't rely on: Embedded images, Objects, Scripts, Frames, Flash, Pop-up Windows, Mouse-over effects
There are many mobile applications for search, which are taking feeds from existing search engines and integrating results into their own. Determine where they are getting their data from and make sure you are ranking well there. For mobile SEO, Cindy suggests coding in XHTML, employing multiple style sheets, and using CSS.
For more information on Cindy’s mobile strategies, check out Rank-Mobile.
Wednesday, April 08, 2009
TwitterHawk, Spam or Not?
Twitter is receiving almost as much attention as Google at industry events. In this interview from SES New York, Jeff Ferguson of Napster talks about TwitterHawk and its marketing power.
TwitterHawk is a new tool for Twitter that is still in the developing stages. It is keyword-based and allows users to create a variety of reply phrases to respond to people with. TwitterHawk usage costs 5 cents per tweet (reply) but can be purchased in advance through a variety of packages.
In an effort to build up a client’s Twitter base, Jeff decided to use the tool for a DJ for iGlobal Radio. He used keywords such as Radiohead, alternative music, Indie-Music, etc. When someone tweeted one of those keywords, they were sent an automated response that said something similar to “If you like Radiohead, you’ll love iGlobal Radio…”
As a result, TwitterHawk helped Jeff's client receive many more Twitter followers and created more interactivity with his followers. On the negative side, a couple of people accused Jeff's client of spam. Jeff responded to one user and found that he didn't like the fact that a robot was responding to him and that it was a paid service.
Jeff countered the argument by explaining that his client wrote his own copy for his responses. Also, his client’s actions were completely human but simply carried out in an automated fashion.
If users only use TwitterHawk and are not personally active on the service, then Jeff says those users could be classified as spammers. His client, however, is active on Twitter and used TwitterHawk to better target his market.
What are your thoughts on TwitterHawk? Do you think it is pure marketing or does it cross the line and create a spam issue?
Will Monetization Models for Social Media Ever Come?
Dave Naylor is never shy to share his opinion on matters and this interview with WebProNews is no exception. As evidenced by several of our videos from SES New York, social media was a hot topic. One area in particular that raised a lot of discussion was monetization of social sites.
According to comScore, YouTube is the second largest search engine after Google. It does not, however, have a solid monetization model. Dave says YouTube users don't care what's going on around them; they simply want to watch a video. YouTube users turn away from advertisements much as they do on television, by either flipping to another channel or leaving the room until their program comes back on.
Twitter is a whole different story altogether. Look at all the applications built around it, such as TweetBerry, TweetDeck, and our very own Twellow. Dave then raises the following question: what if Twitter started charging for usage? How would that change your business strategies on Twitter?
Twitter recently announced that Pro accounts are coming this year but has yet to disclose what they consist of.
Moving right along to Facebook. As discussed in our interview with Tim Kendall, contextual advertising on Facebook is based on the freely submitted data Facebook users provide. On the contrary, Google and the other search engines base their advertising on the searcher’s past history.
Facebook’s model is much more accurate than the search engine’s technique. Most users conduct many searches at work, which does not reflect their actual behavior. Facebook uses the information users readily give to them as part of their preferences.
On a final note, Dave points out that Google needs to incorporate a “business account” for Google accounts since sometimes many people are logged into one person’s account. He suggests giving the account holder the ability to choose who can have access to what.