The Sandbox Effect

The Sandbox Effect causes new websites to rank poorly in Google's search results, even for the least competitive phrases. In effect, a filter is placed on new websites that prevents them from ranking highly for most words or phrases for a certain amount of time.

There has been a lot of debate as to why Google uses the sandbox filter. However, it is believed to be an attempt by Google to discourage website owners who use SEO spam (also called blackhat) techniques to rank highly fast and make a quick buck before Google discovers them. This is largely because Google relies so heavily on link popularity to rank websites in its search engine.

There is no way to know for sure if a new website is in the Sandbox, but the best indicator is this: your website has a good amount of quality backlinks, quality content, and good rankings on other search engines such as Yahoo!, MSN, and Ask Jeeves, yet it is nowhere in Google's search results for your keywords.

It commonly takes up to six months for a new website to get out of the Sandbox and earn a high ranking in Google's search results, but this varies from site to site. It also depends on how competitive your new website's market and keywords are.

There is no known way of shortening the amount of time a new website spends in the Sandbox. But some new websites seem to get out faster by tagging through social bookmarking sites. Some even buy links from blogs or sites with high PR and get crawled by search engine spiders earlier than expected.

Better Search Engine Rankings with RSS

RSS is the latest trend in online publishing. But what exactly is RSS? RSS, or Really Simple Syndication, is an XML-based file format used by publishers to make their content available to others in a form that can be generally understood.

RSS allows publishers to "syndicate" their content through the distribution of lists of hyperlinks.
It has actually been around for a while, but with the introduction of spam filters and online blogging, it is fast becoming the choice of ezine publishers who want to get their message across to their subscribers.

However, not much attention has been given to the advantages RSS provides for search engine optimization.

Search Engines Love RSS

Many SEO experts believe that sites optimized around themes, or niches, where all pages correspond to a particular subject or set of keywords, rank better in the search engines.
For example, if your website is designed to sell tennis rackets, your entire site content would be focused around tennis and tennis rackets. Search engines like Google seem to prefer tightly-themed pages.

But where does RSS figure in all this?

RSS feeds, usually sourced from newsfeeds or blogs, often correspond to a particular theme or niche. By using highly targeted RSS feeds, you can enhance your site's content without having to write a single line on your own. It's like having your own content writer - writing theme-based articles for you - for free!

How can RSS improve Search Engine Rankings?

There are three powerful reasons why content from RSS Feeds is irresistible bait for search engine spiders.

1. RSS Feeds Provide Instant Themed Content
There are several publishers of RSS feeds that are specific to a particular theme. Since the feed is highly targeted, it can contain several keywords that you want to rank highly for. Adding these keywords to your pages helps Google tag your site as one with relevant content.

2. RSS Feeds Provide Fresh, Updated Content
RSS feeds from large publishers are updated at specific intervals. When the publisher adds a new article to the feed, the oldest article is dropped. These changes are immediately reflected on your pages that carry the RSS feed as well, so you have fresh, relevant content for your visitors every hour or day.

3. RSS Feeds Result in More Frequent Spidering
One thing I never anticipated would happen as a result of adding an RSS feed to my site was that the Googlebot visited my site almost daily. To the Googlebot, my page that had the RSS feed incorporated into it was as good as a page that was being updated daily, and in its judgement, was a page that was worth visiting daily.

What this means is that your site will be indexed more frequently by the Googlebot, so any new pages you add will be picked up much faster than your competitors'.

How does this benefit you as a marketer?

Well, for example, let's say a top Internet marketer comes out with a new product that you review and write up a little article on, and that your competitors do the same.

Google generally tends to index pages at the start of the month, and if you miss that update, you will probably need to wait until the next month to even see your entry. But since your site has RSS feeds, it now gets indexed more frequently, so the chances of getting your page indexed quickly are much higher.

This gives you an advantage over the competition, as your review will show up sooner in the search results than theirs. Imagine what an entire month's advantage could do to your affiliate sales!

Why Javascript Feeds Are Not Effective

Some sites offer JavaScript code that generates content sourced from RSS feeds for your site. These are of absolutely no value in terms of search engine rankings, as the Googlebot cannot read JavaScript, so the content is not interpreted as part of your page.

What you need is code that parses the RSS feed and renders it as HTML content that's part of your page. This is achieved using server-side scripting languages like PHP or ASP. Feel free to visit my article on embedding RSS feeds in your websites.
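The article suggests PHP or ASP for this; purely as an illustration of the same idea, here is a minimal Python sketch (the sample feed and function name are my own invention) that parses an RSS 2.0 feed and renders its items as HTML that becomes part of your page:

```python
import xml.etree.ElementTree as ET

# Sample feed; in practice you would fetch the publisher's feed URL
# (e.g. with urllib.request) each time the page is generated.
RSS_SAMPLE = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Tennis News</title>
    <item>
      <title>Choosing a Tennis Racket</title>
      <link>http://example.com/rackets</link>
      <description>How to pick the right racket.</description>
    </item>
  </channel>
</rss>"""

def rss_to_html(rss_xml, max_items=5):
    """Parse an RSS 2.0 feed and render its items as an HTML list."""
    root = ET.fromstring(rss_xml)
    parts = ["<ul>"]
    for item in root.findall("./channel/item")[:max_items]:
        title = item.findtext("title", "")
        link = item.findtext("link", "#")
        desc = item.findtext("description", "")
        parts.append('<li><a href="{}">{}</a> - {}</li>'.format(link, title, desc))
    parts.append("</ul>")
    return "\n".join(parts)

print(rss_to_html(RSS_SAMPLE))
```

Because the feed is rendered server-side into plain HTML, the spider sees the keywords as ordinary page content, which is exactly the point being made above.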

So in conclusion, besides optimizing on page and off page factors, adding RSS feeds to your pages should be an important part of your strategy to boost your search engine rankings.

Search Engine Marketing 101 For Corporate Sites

When most people want to find something on the web, they use a search engine. Millions of searches are conducted every day on the major search engines and many others. Some of those people are looking for your website. So how do you capture people searching for what your site has to offer? Through techniques called search engine marketing (SEM).

This tutorial provides foundational information for anyone looking to implement search engine marketing. It will also help you understand how search engines work, what SEM is, and how it can help you get traffic.

What is a Search Engine?

All search engines start with a "search box", which is sometimes the main focus of the site and sometimes just one feature of a larger portal. Type in your search phrase and click the "search" button, and the search engine will return a search engine results page (SERP). To generate SERPs, the search engine compares your search phrase with the information it has about various websites and pages in its database and ranks them using a "relevance" algorithm.

Search Engine Classes

Targeted audience, number of visitors, quality of search, and professionalism determine a search engine's class. Each search engine typically targets a specific audience based on interest and location. World-class search engines look very professional, include virtually the entire web in their database, and return highly relevant search results quickly.

Most of us are familiar with the major general search engines. A general search engine includes all types of websites and as such targets a general audience. There are also lesser-known 2nd-tier general search engines. The primary difference is that 2nd-tier engines are less well known and generate significantly less traffic.

There are also several non-general, or targeted, search engines that limit the types of websites they include in their database. Targeted search engines typically limit by location, by industry or content type, or both. Most large metro areas have local search engines that list local businesses and other sites of interest to people in that area. Some are general and some are industry-specific, such as engines specifically listing restaurants or art galleries.

Many other targeted search engines list sites from any location but only if they contain specific types of content. Most webmasters are familiar with webmaster-tools search engines, and there are niche SEs for practically any industry and interest.

Search Engine Models

There are two fundamentally different types of search engine back ends: site directories and spidering search engines. Site directory databases are built by a person manually inputting data about websites. Most directories include a site's URL, title, and description in their database. Some directories include more information, such as keywords, owner's name, visitor rankings, and so on. Some directories allow you to control your website's information yourself; others rely on editors who write the information to conform to the directory's standards.

It is important to note that most directories include directory listings as an alternative to the search box for finding websites. A directory listing uses hierarchical groupings, from general to specific, to categorize a site.

Spidering search engines take a very different approach. They automate the updating of their database by using robots to continually read web pages. A search engine robot/spider/crawler acts much like a web browser, except that instead of a human looking at the pages, the robot parses each page and adds its content to the engine's database.
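To make the robot's job concrete, here is a minimal Python sketch (the page content is made up for illustration) of what a spider does with a single fetched page: collect the outgoing links to crawl next and the visible text to index.

```python
from html.parser import HTMLParser

class PageSpider(HTMLParser):
    """Collects what a crawler wants from one page: the outgoing
    links (to crawl next) and the visible text (to index)."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.text = []

    def handle_starttag(self, tag, attrs):
        # Remember every hyperlink target for the crawl queue.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        # Keep non-empty text nodes as indexable content.
        if data.strip():
            self.text.append(data.strip())

# A made-up page standing in for a document the robot just fetched.
page = ('<html><body><h1>Tennis Rackets</h1>'
        '<p>Reviews and tips.</p>'
        '<a href="/reviews">Our reviews</a></body></html>')

spider = PageSpider()
spider.feed(page)
print(spider.links)  # links queued for future crawling
print(spider.text)   # content added to the index
```

A real spider repeats this loop over every link it discovers, which is why links from already-indexed pages eventually get your site found, as the "Getting In" section below describes.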

Many of the larger search engines have both a directory and a spidering search engine and allow visitors to select which they want to search. Note that many search engines do not have their own search technology and contract services from elsewhere. For example, Google's spider SE is their own, but their directory comes from the Open Directory; additionally, several other engines use Google's spider SE for their results.

There are a few other search engine models of interest. Some search engines combine results from several other engines. There are also search engines that add extra information to searches, such as Amazon's search engine, which uses Google's backend but adds data from its search bar regarding traffic to the site.

Getting In

One of the most important things to understand about the SE database models is how to get into their databases and keep your listing updated. With a search directory, a submission needs to be made providing the directory all the information needed for the listing. It is generally recommended that this be done by hand, either by you or by a person familiar with directory submissions. There are many submission tools that advertise that they automate the submission process. They may be fine for smaller directories, but for the major directories, manual submissions are worth the time.

Not all search directories are free; many charge a one-time or annual review fee. Many of the free search directories have little quality control, and for free directories you may have to submit your site several times before being accepted.

There are three different methods for getting into spidering search engines: free site submission, paid inclusion, and links from other sites. Virtually all spidering SEs offer free site submission. For most, you simply enter your URL into a form and submit it. Paid inclusion is normally not difficult either, apart from the credit card payment. For free site submission there is no service guarantee: the SE may send a spider to your site in the next few weeks, months, or never. With paid inclusion you typically get a guarantee that the page you submitted will be included within a short amount of time. The other standard way to get included is to have links to your website from pages that are already in the SE's database. The SE spiders are always crawling the web and will eventually follow those links to find your site.

Once you are in a search engine database, you might change your site and need the search engine to update its records. Each directory handles this differently; generally there will be a form for you to submit a change request. Spidering search engines will eventually find the changes and update their database automatically.

Getting High Rankings

Getting into a search engine database is only the first step. Without other factors you will not rank in the top positions, a prerequisite for quality traffic. So how do you get top positions? You can pay for placement with sponsored links, which are covered in the next section. To place well in the free, organic SERPs, you will need to perform search engine optimization.

Search engine optimization is one of the most complicated aspects of web development. Each search engine uses a different algorithm based on hundreds of factors, constantly changes it, and carefully guards it as a trade secret. Thus no one outside the search engines' employ knows with 100% certainty the perfect way to optimize a site. However, many practitioners called search engine optimizers have studied the art and derived a set of techniques with a track record of success.

In general, there are two areas to focus on for top rankings: on-page factors and linking. On-page factors means placing your target keywords in the right places in your site's content. The structure of, and technologies used on, your website also play a role in on-page factors. Linking refers to how other websites link to yours and how your site links internally.

Search Engines' Marketing Offerings

Search engines in the early days of the web were focused solely on serving the visiting searcher. They worked to capture as much of the web as possible in their database and provide fast, relevant searches. Many early website owners learned to reverse engineer the relevancy algorithms and to make their sites "search engine friendly" to get top rankings. They were the first search engine optimizers, manipulating the search engine's natural or organic SERPs as a means of generating free web traffic.

Oftentimes these optimized sites compromised the integrity of the SERPs and lowered their quality for the searcher. Search engines fought, and continue to fight, to maintain the quality of their results. Eventually, the search engines embraced the fact that they are an important means of marketing websites. Today most search engines offer an array of tools that balance website owners' need to market against maintaining quality for the searcher.

You can generally break search engine marketing tools into free and for-pay. Realize that these classifications are from the search engine's point of view; effort and expense are required to set up and maintain any search engine marketing campaign.

Organic rankings are still one of the most important ways to drive quality traffic. Search engines now seek to reward ethical, high-quality websites with top rankings and to remove inappropriate "spam" websites. While organic rankings can produce continual free traffic, achieving optimum results takes time from an experienced individual. Additionally, organic placement offers no guarantees; it generally takes months to get listed, and results can be unpredictable once listed.

Some search engines offer services that add more control to your organic campaign. Most of these services will list / update your site faster or will guarantee that all essential content is listed. For integrity reasons, no major search engine offers higher organic rankings for a fee.

If you need top rankings quickly, pay-per-positioning (PPP) is the most popular way to go. PPP rankings appear in the normal organic SERPs but are usually designated as "sponsored listings". PPP listings use a bidding process to rank sites: if you are the top bidder, i.e. willing to pay the most per click on a given phrase, you get top placement; the 2nd-highest bidder is second, the next is third, and so on. While most PPP works on this model, some search engines offer modifications, such as Google's AdWords, where bid price and click-through rate are both factors in positioning.
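The bidding logic just described can be sketched in a few lines of Python (the site names, bid amounts, and click-through rates are invented for illustration):

```python
# Invented example bids (dollars per click) for one keyword phrase.
bids = {"site-a.com": 0.75, "site-b.com": 1.20, "site-c.com": 0.40}

# Classic PPP: pure bid order - the highest bid takes the top slot.
ppp_ranking = sorted(bids, key=bids.get, reverse=True)
print(ppp_ranking)  # ['site-b.com', 'site-a.com', 'site-c.com']

# AdWords-style variant: bid weighted by click-through rate, so a
# cheaper ad that gets clicked more often can outrank a pricier one.
ctr = {"site-a.com": 0.08, "site-b.com": 0.02, "site-c.com": 0.10}
adwords_ranking = sorted(bids, key=lambda s: bids[s] * ctr[s], reverse=True)
print(adwords_ranking)  # ['site-a.com', 'site-c.com', 'site-b.com']
```

Notice how the big spender site-b.com wins under pure bidding but drops to last under the click-weighted model, which is exactly why engines adopted it: ads nobody clicks earn the engine nothing.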

Search engines have many other marketing tools as well, such as search-specific banner ads, listings on affiliate sites, and more.

Getting Started

The majority of websites have sub-optimal search engine marketing. Most have no effective SEM at all and continually miss out on valuable leads. Many others are too aggressive, wasting money on low-value traffic or harming their site's functionality through over-optimization. Too many sites are even paying money and receiving no results because they trusted unethical or inexperienced search engine optimizers.

All SEM campaigns should start with a strategic evaluation of SEM opportunities based on return on investment (ROI). You need to assess how much each lead is worth for each keyword phrase and determine which SEM tools will achieve the best ROI for the phrase.

You also have to decide how much you want to do in-house versus retaining an expert. A qualified expert will typically produce better results faster, but the high expense may destroy the ROI. Often it is best to work with an expert as a team: the expert develops the strategy, and internal staff perform the implementation and ongoing management.

Google Adwords Tool

The Google AdWords Keyword Tool is a free tool offered by Google to help its users estimate keyword search volume and advertiser competition for their target search terms. The tool uses a relative scale to represent which search terms are more popular than others – what you get is a general idea of which terms get plenty of searches and which terms don’t.

The Google Keywords Tool also provides a picture of advertiser competition on a similar, relative scale. This is very useful in driving Pay per Click traffic to your website. The Google AdWords Keyword Tool can be a very useful starting step to help you select keywords for your campaign.

However, because you are not seeing actual numbers, you cannot use the Google tool to predict search volume accurately. But it is an excellent resource for generating keyword lists, as it is based on the terms people actually use to search on Google. And since Google has the largest user base of any of the keyword research tools' data sources, you get a lot of depth when researching keywords and building lists.

Part III: Do Follow Blogs

As I mentioned in my two previous posts, Do Follow Blogs - What Are They? and Part II: Do Follow Blogs, I am going to post the third batch of do-follow blogs for everyone to use in their link-building efforts. As I have said before, these blogs will surely boost your traffic and help ensure some green in your PR bar on the next Google update.

But once again, spamming is not allowed. You won't get the desired backlink if you spam, and you might even be banned from the site. So make use of these blogs, post relevant comments, be friendly, and join the discussion. Good luck!


Till the next update - I hope you have already started posting to them.

Do you like my post? Feel free to subscribe to my feeds.

Link Building Basics

Links play a major role in search engine rankings. They provide the popularity that you seek and the traffic that you need. There are a lot of ways to acquire them, but before you do, you need to know everything about links so you can avoid the unnecessary ones - the links you don't need for your site.

4 Important Things that you need to know about links
· Authority, trust and relevance of the link
· How search engines evaluate links
· Different types of links
· The value of link popularity

Trust - the measure of how well search engines "trust" a particular website to provide accurate and reliable information. How do search engines measure trust? Only the search engines know. But trust is usually the name given to a set of metrics that appear to influence one's search engine rankings.

Key factors for Trust
· Links coming from authority sites and popular sites.
· Link popularity growth patterns and ageing
· Links coming from the same niche
· Power links from different niches with relevant link text – denoting the site as a topical authority

Search engines use time-based indicators and link quality to establish website trust. This trust in turn helps boost the site's search engine rankings. The best way to take advantage of this approach is to:

· Gather your best links as early as possible (remember that link age is a factor)
· Adopt a natural link building pattern; as they say, slowly but surely.
· Balance your link building by actively seeking "power links".

Authority - like trust, authority is a term that refers to a site's reputation on a particular topic. Search engines use "authority" to measure a website's reputation within a niche. A site may be considered an authority on one topic yet not rank at all on others.

Key factors for Authority:
· Quality links from the same niche
· Links coming from different niches but using relevant link text
· Link popularity

When trust and authority are combined, a domain can easily rank for search terms within its own niche. It can also dominate or rank on low-competition terms from other niches with no problem.

A good example is a site like Wikipedia. Such domains have built sufficient trust and authority that a new page on them can rank well on the major search engines.

Relevance - simply put, a relevant link is one coming from the same niche. Search engines treat such sites as a community or niche: a group of websites that interlink with each other and carry a similar topic or theme.

Links coming from these sites carry more weight because they offer a better chance of evaluating the information than links from outside the niche.

Link Evaluation
Search engines consider the following as primary factors in evaluating the value of a link:

· Anchor text
· Relevance of the linking site/page
· Authority of the linking site/page
· Link popularity/PageRank of the linking site/page
· Different types of links
· Link and site age
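Purely as an illustration, the factors above could be combined into a toy scoring function like the one below. The weights are entirely invented - the real algorithms and their weightings are trade secrets - but the sketch shows how several qualitative signals reduce to a single comparable value:

```python
def link_value(anchor_text_relevant, page_relevant, page_authority,
               page_popularity, link_age_years):
    """Toy link-evaluation score. The weights are invented for
    illustration; real engines combine hundreds of secret signals."""
    score = 0.0
    score += 2.0 if anchor_text_relevant else 0.0  # anchor text match
    score += 1.5 if page_relevant else 0.0         # relevance of linking page
    score += page_authority                        # authority, scaled 0..1
    score += page_popularity                       # popularity/PageRank, 0..1
    score += min(link_age_years, 5) * 0.2          # older links weigh more, capped
    return score

# A relevant, aged link from a strong page scores far higher
# than an off-topic link from a weak one.
print(link_value(True, True, 0.75, 0.5, 5))     # 5.75
print(link_value(False, False, 0.25, 0.25, 0))  # 0.5
```

The takeaway matches the article's advice: one relevant, aged link from an authoritative page can be worth many weak off-topic links.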

Next time I will discuss the last two items: the value of link popularity and the different types of links.

Do you like my post? Feel free to subscribe to my feeds

Part II: Do Follow Blogs

As I mentioned in my previous post, Do Follow Blogs - What Are They?, I am going to post the next batch of do-follow blogs for everyone to use in their link-building efforts. These blogs will surely boost your traffic and help ensure some green in your PR bar on the next Google update.

But I would like to remind you once again that spamming is not allowed. You won't get the desired backlink if you spam, and you might even be banned from the site. So make use of these blogs, post relevant comments, be friendly, and join the discussion. Good luck!



Watch out for the next batch!

Do you like my post? Feel free to subscribe to my feeds.

Free Dot Edu Backlinks

Blog posting is one of the best ways to gain traffic and site popularity. In most cases, webmasters look for high-PR blogs in their niche where they can post and obtain that precious one-way link.

Getting high-PR dot-edu (.edu) backlinks will surely make every webmaster's dream come true. These precious one-way links give a site a better chance of getting green in its PR bar on the next Google update.

But the problem with these links is that they are hard to find, especially for new webmasters. That is why I decided to write this article: to help every new webmaster find these precious .edu blogs and start posting on them. This article will show you how to generate tons of free one-way backlinks from .edu blogs.

But before we start, keep in mind that posting on these blogs requires proper etiquette: avoid spamming. Post high-quality comments or you will end up not getting that precious backlink. Read the post, understand it, then post relevant information pertaining to the article you are commenting on.

Steps in Getting that precious .edu link:

Use Google, and make your search query look like this:

site:.edu inurl:blog

This query tells Google to return a list of .edu sites that have blogs. But it would be better to get results listing only blogs that allow us to comment on them. There are lots of blogs that are closed or that don't allow anyone to post. So we need to improve our query:

site:.edu inurl:blog "post a comment"

Will that query give us the blogs we are looking for? Not quite. We still need more filtering: we need to eliminate blogs that require a login to post and those with comments closed. We don't need those blogs, so we refine the query to filter them out too. Here is our more refined query:

site:.edu inurl:blog "post a comment" -"comments closed" -"you must be logged in"

This query returns the .edu blogs we are looking for. We can now start commenting on them and earn quality backlinks. There is also a good chance of getting good traffic when posting quality comments on those blogs, as well as attracting smart readers interested in your topic.

We can get more specific results by adding one or two more words to the search to find the exact blogs we need. For instance, if we are looking for blogs about web design, all we need to do is add that phrase to our search, and we can start posting comments and getting excellent backlinks from those blogs. So our final query looks like this:

site:.edu inurl:blog "post a comment" -"comments closed" -"you must be logged in" "web design"
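The step-by-step refinement above can be composed with a small helper function - a Python sketch of my own, which also assumes the no-space form of Google's operators (`site:.edu`, `inurl:blog`), which is how they work most reliably:

```python
def edu_blog_query(topic=None):
    """Compose the refined .edu blog-hunting query built up above."""
    parts = [
        'site:.edu',                  # only .edu domains
        'inurl:blog',                 # only URLs containing "blog"
        '"post a comment"',           # pages that accept comments
        '-"comments closed"',         # exclude closed threads
        '-"you must be logged in"',   # exclude login-only blogs
    ]
    if topic:
        parts.append('"{}"'.format(topic))  # optional niche filter
    return " ".join(parts)

print(edu_blog_query("web design"))
```

Swap in any niche ("photography", "gardening", and so on) to hunt for commentable .edu blogs on that topic.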

Well, that's it - enjoy! I hope you'll be able to get quality .edu backlinks and that green you have been waiting for.

Do you like my post? Feel free to subscribe to my rss feeds.