How To Ensure The Search Engines Find Your Website

One of the most fundamental aspects of search engine optimization (SEO) is ensuring that the pages within your website are as accessible as possible to the search engines. It's not only the homepage of a website that can be indexed, but also the internal pages within a site's structure. The internal pages of a site often contain important content such as products, services or general information, and therefore can be uniquely optimized for related terms. As a result, easy access to these pages is vital.

There are many do's and don'ts involved in ensuring all of your pages can be found by search engines. However, it is important to first establish how the search engines find and index web pages.

Search engines use "robots" (also known as "bots" or "spiders") to find content on the web for inclusion in their index. A robot is a computer program that follows the hyperlinks on a web page, a process known as "crawling". When a robot finds a document it adds the contents to the search engine's index, then follows the next links it can find and continues the cycle of crawling and indexing. With this in mind, it becomes apparent that the navigational structure of a website is important in getting as many pages as possible indexed.
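As a rough sketch of the crawling step described above, the following Python snippet (standard library only; the page markup and URLs are invented for illustration) extracts every followable href from a page the way a robot would before moving on to crawl them:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag - the links a robot can follow."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the current page's URL
                    self.links.append(urljoin(self.base_url, value))

page = '<a href="/products.html">Products</a> <a href="about.html">About</a>'
parser = LinkExtractor("http://example.com/")
parser.feed(page)
print(parser.links)
# ['http://example.com/products.html', 'http://example.com/about.html']
```

A real crawler would then fetch each discovered URL and repeat the process, which is exactly the loop of crawling and indexing described above.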

When planning the navigational structure of your site, consider the hierarchy of content. Search engines judge what they feel are the most important pages of a site when considering rankings, and a page's position in the site structure can influence this. The homepage is generally considered the most important page of a site - it is the top-level document and usually attracts the most inbound links. From here, search engine robots can normally reach pages that are within three clicks of the homepage. Therefore, your most important pages should be one click away, the next most important two clicks away, and so forth.

The next thing to consider is how to link the pages together. Search engine robots can only follow generic HTML href links, meaning Flash links, JavaScript links, dropdown menus and submit buttons will all be inaccessible to robots. Links with query strings that have two or more parameters are also typically ignored, so be aware of this if you run a dynamically generated website.

The best links to use from an SEO perspective are generic HTML text links, as not only can they be followed by robots but the text contained in the anchor can also be used to describe the destination page - an optimization plus point. Image links are also acceptable but the ability to describe the destination page is diminished, as the alt attribute is not given as much ranking weight as anchor text.

The most natural way to organize content on a website is to categorize it. Break down your products, services or information into related categories and then structure this so that the most important aspects are linked to from the homepage. If you have a vast amount of information for each category then again you will want to narrow your content down further. This could involve having articles on a similar topic, different types of product for sale, or content that can be broken down geographically. Categorization is natural optimization - the further you break down your information the more content you can provide and the more niche key phrases there are that can be targeted.

If you are still concerned that your important pages may not get indexed, then you can consider adding a sitemap to your website. A sitemap can be best described as an index page - it is a list of links to all of the pages within a site contained on one page. If you link to a sitemap from your homepage then it gives a robot easy access to all of the pages within your site. Just remember - robots typically can't follow more than 100 links from one page, so if your site is larger than this you may want to consider spreading your sitemap across several pages.
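Splitting an oversized sitemap can be sketched in a few lines of Python. This is a hypothetical illustration: the 100-link limit is the guideline from the paragraph above, and the 250-page site and URL pattern are invented:

```python
def paginate_sitemap(urls, per_page=100):
    """Split a flat list of page URLs into sitemap pages of at most
    `per_page` links, since robots may not follow more than ~100 links
    from a single page."""
    return [urls[i:i + per_page] for i in range(0, len(urls), per_page)]

# A hypothetical 250-page site needs its sitemap spread across 3 pages.
all_pages = [f"http://example.com/page{n}.html" for n in range(1, 251)]
sitemap_pages = paginate_sitemap(all_pages)
print(len(sitemap_pages))      # 3 sitemap pages
print(len(sitemap_pages[0]))   # 100 links on the first
print(len(sitemap_pages[-1]))  # 50 links on the last
```

Each chunk would then be rendered as its own sitemap page, with the pages linked together so a robot can step through all of them.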

There are many considerations to make when optimizing your site for search engines, and making your pages accessible to search engine robots should be the first step of your optimization process. Following the advice above will help you make your entire site accessible and aid you in gaining multiple rankings and extra traffic.
About the Author: Craig Broadbent is Search Engine Optimization Executive for UK-based internet marketing company, WebEvents Ltd. Clients of WebEvents benefit from a range of services designed to maximize ROI from internet marketing activities. To find out more, visit
Search Engine Ranking: Anchor Text is Key

This anchor text experiment is a must-try...

If you have a web site, try this experiment when you have some spare time. Pick a nonsense phrase, like "bed happy meatball" or anything equally silly. Make sure it's something very unlikely to appear on a web page anywhere, and make sure it's a phrase (not a single word). Also, make sure it does not appear on your own site.

Next, get a few friends or co-workers with web sites of their own to post a link to your site using that exact phrase (without the quotation marks) as the anchor text. What's anchor text, you ask? It's simply the word or words that form the clickable part of a link.
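To make the definition concrete, here is a small Python sketch (standard library only; the phrase is the nonsense example from above) that pulls the anchor text out of a link much as a search engine parser might:

```python
from html.parser import HTMLParser

class AnchorText(HTMLParser):
    """Records the visible text inside each <a>...</a> element -
    the 'clickable part' of the link that search engines weigh."""
    def __init__(self):
        super().__init__()
        self.in_anchor = False
        self.anchors = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_anchor = True
            self.anchors.append("")

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_anchor = False

    def handle_data(self, data):
        if self.in_anchor:
            # Accumulate the text between the opening and closing tags
            self.anchors[-1] += data

html = '<a href="http://example.com/">bed happy meatball</a>'
p = AnchorText()
p.feed(html)
print(p.anchors)  # ['bed happy meatball']
```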

Now, wait a while. Make a note to yourself to check your web site's ranking in the results for a search of your chosen nonsense phrase at a major search engine in a month or two. Unless you picked a phrase that actually appears on other sites, you'll find that your site is #1! Moreover, that's in spite of the fact that the chosen phrase does not appear anywhere on your actual web site. Think about that.

An infamous, large-scale example of this same test involved the phrase "miserable failure." Some enterprising bloggers got together a few years ago and decided to link lots of sites to the official White House biography page for President George W. Bush. The goal, of course, was to make that page show up in the #1 position whenever unsuspecting (or in this case, many suspecting) searchers typed in that phrase. It worked. (Side note: at last check, Michael Moore - the famous filmmaker and Bush detractor - was in the #2 position at Google for this same search.) Again, keep in mind that the phrase "miserable failure" does not appear anywhere on either man's web site.

Is there a point? OK, so why bother with this seemingly asinine experiment (that's actually been dubbed 'Googlebombing')? Ahh, Grasshopper, for the lesson it imparts. Which is? Well, it points to the power of anchor text in determining search engine ranking. And it has definite relevance to your activities as a webmaster.
Many of your fellow site owners - including a lot of them who run sites in direct competition with yours - have never heard of anchor text. Some of you reading this may be unfamiliar with it. But, as should be clear now, anchor text plays a major role in search engine ranking positions.

In basic terms, it works like this...
Search engines rely on links to help them ascertain both the theme of a given web site and its popularity. Knowing that, consider two scenarios. In the first, your site has built up a lot of links pointing to it, and each one has your domain name as the clickable part of the link (the anchor text). Let's say your domain name is your company's name, and you sell baking supplies. OK, great - now your site will show up in the #1 position at the search engines whenever anyone searches for your domain name! Hmm. Think that one through. If people already know your domain name, why would they need a search engine to find it?

In the second scenario, you have lots of links pointing to your bakery site, but instead of the domain name as the anchor text, you wisely chose a phrase that lots of people search for, like 'baking supplies.' Easy question: which would you prefer - being #1 at Google when people search for your domain name, or being #1 when people search for 'baking supplies'? This is why the anchor text you choose for the links you build is so important.

A Plan of Action
Now, here's a simple plan of action to improve your site's link situation and search engine ranking going forward from this day...

Step 1 - Research Keywords
A great service is provided by the folks at Wordtracker. They catalog search activity at the major engines, and then make those numbers available to the general public. You simply type in a word or phrase related to your site's theme, and Wordtracker shows you the number of times that entry is being searched for at the major search engines. Cool, huh? The service will also give you a list of related terms, so you can look for other important search words to target.

Step 2 - Pick a Few and Get Some Links
Compile a list of several search terms that are most closely related to your site's theme and that get searched for often. It's up to you, of course, but you should pick those phrases that get a few hundred to several thousand searches. These will be the terms you use in the anchor text of the inbound links you build from now on. Doing so will really increase your site's search engine traffic - once all your new links begin to boost your rankings.

Nothing Else Changes
Now, just carry on with your usual link building activities: reciprocal links, one-way links from directories and article distributions, etc. The only change is to make sure you choose a phrase from your list to use as the clickable part of the link you ask for (the anchor text). If you rotate your choices, your site will move up in the rankings for each phrase. The only downside is that you'll be getting fewer links per phrase, so it may take longer for any single phrase to rank high.
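The rotation idea can be sketched in a few lines of Python. The site names and phrases below are invented placeholders, not recommendations:

```python
from itertools import cycle

def assign_anchor_text(link_requests, phrases):
    """Rotate through the researched phrases so each new inbound link
    request uses the next phrase on the list."""
    rotation = cycle(phrases)
    return [(site, next(rotation)) for site in link_requests]

# Hypothetical phrase list from keyword research, and sites to ask for links.
phrases = ["baking supplies", "cake decorating tools", "bread pans"]
sites = ["site-a.example", "site-b.example", "site-c.example", "site-d.example"]
plan = assign_anchor_text(sites, phrases)
print(plan)  # the fourth link wraps back around to "baking supplies"
```

Rotating this way spreads your links across all target phrases, which is exactly the trade-off described above: broader coverage, but fewer links per phrase.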

Keep in mind that the phrases you pick will be popular, unlike those in the examples that began this article. To score high rankings, you'll need to be diligent and get lots of links. Never stop! Over time, this strategy will really help your site's traffic, but it does take time. As the old proverb goes: "All good things come to those who wait."

About the Author: John Schwartz is the owner of - specializing in professionally written web site content and articles. Our goal is to help clients increase web site traffic through links from related sites and higher search engine rankings.
An Easy Guide To Building Links to Your Website

Link building is vital for a good placement in any search engine, yet most website owners are unaware of this or don't really understand how it actually works.
Apart from good keyword research and the right meta tags, link building is an absolute must for the success of any website, and it is an ongoing process - so make sure you put some time aside for it on a weekly basis, or pay someone with experience to do it for you. Without it, your website will struggle to attract any meaningful traffic.

There are four different types of linking involved, and each one is as important as the others.

If they are done correctly, they will go a long way towards earning you a good ranking in the search engines.

They are: Reciprocal Links, One-Way Links, Multi-Site Links and Directory Listings.

Reciprocal Linking
Quite simply - this is a link from your site to another site, and they link back to you. A bit of advice here: only link to sites that are relevant to yours. If you are selling cars, only link to car websites - a link to a health website will do you no good, and trust me, I have seen hundreds of website owners make this mistake. What's important here is not the number of links you have, but the quality and relevancy of the sites you are linking to. Be selective, and also take a good look at the sites linking to the site you want to link to. It's really no good if the car site you want to link to has 40 reciprocal links from online pharmacies.

How to do this:
1. Pay an SEO company to do it for you (can be quite expensive).
2. Buy them.
3. Use link exchange sites.
4. Search for them on your own.

Important: Do not add too many reciprocal links too quickly - build them up gradually, otherwise you will be penalised by the search engines, as rapid growth will be seen as unnatural. A good way to start is no more than 10 to 20 in one month. As your site gets older you can start adding more.

One-Way Links
This is what the search engines call a natural link, and these links are given far more ranking weight than a reciprocal link. The easiest way to get them is to write articles on what you are selling and then submit them to article directories with a link back to your website. Website owners are always looking for content for their sites, and article directories are the easiest way for them to get content without having to write it themselves. There are hundreds of these directories around, and the more you submit to, the quicker you will build up your one-way links. It is also less time-consuming than reciprocal linking and you will get far better results.

Multi-Site Links
For this to be effective you need at least three or four websites to be involved. It is also seen as a natural link by the search engines, and it can be quite difficult to arrange if you only have one website - although it can be done; you will just need to find three other websites that are interested.

How it works is this:
Your site links to site B, site B links to site C, site C links to site D, and site D links back to you. This way all the sites get an inbound link without any of them creating a reciprocal link.
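A quick Python sketch of this ring (the site names are placeholders) confirms the point: every site receives an inbound link, yet no pair of sites links back and forth:

```python
def has_reciprocal_pair(links):
    """Return True if any two sites link directly to each other -
    the pattern a multi-site ring is meant to avoid."""
    link_set = set(links)
    return any((b, a) in link_set for a, b in link_set)

# A -> B -> C -> D -> A: four one-way links, no reciprocal pairs.
ring = [("A", "B"), ("B", "C"), ("C", "D"), ("D", "A")]
print(has_reciprocal_pair(ring))                 # False
print(has_reciprocal_pair(ring + [("B", "A")]))  # True once B links back to A
```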

Directory Listings
This is also seen as a natural link, as most directories don't require a link back. It's very simple: all you need to do is submit your site to as many directories as possible. There are thousands of them on the Internet, so all it requires is a little bit of time and hard work. If you make it your goal to submit to one a day, the process won't become tiresome.
Are Reciprocal Links Dead?

If the current indications are correct we may be looking at the end of reciprocal linking as a method of building rank and link popularity, at least as far as Google is concerned.

The latest 'Google Dance', nicknamed 'Jagger', has caused major concern among those suffering a loss of position in the top ranks of the search engine's listings. So we decided to take a close look at what is happening and see what we could learn.

We have a few small websites that have a limited number of links. These sites are used mostly for research and testing of our primary business in Web Analytics. By analyzing these sites, we were able to quickly get an idea of what is happening in Google's Jagger Update, which is still in progress at the time of this writing.

By using our web analytics tools, we were able to look at the history of visits by the bots and the links to these small sites. We had to go back as far as January in order to build a picture of Google's actions. Our software also allows us to look at all links from the search engines, not just those shown by the 'link:' search operator. Google only reports some of the links to your site, not all of them.

Here is what we have seen:
Like many other sites, we noticed a sharp drop in rank for our test sites around the first of July. They lost about 40% of their previous link popularity and moved down sharply in rank. Also, duplicate links from a single site disappeared: Google now showed only one link from each linking site.

As Jagger started, unlike many others we have seen complaining about Google's actions and timing, our sites stayed rather stable. Evidently they had already suffered their major losses. However, there was a small increase in the number of links. This caught our attention. We had expected that, like many others, we would experience further disruptions to our link structure.

But when we examined these links, we were surprised to see that not one of them had been listed with Google a few weeks earlier. Not one. Our research showed that these links had been live in Google's archive, but none had shown up publicly before now. It appeared that there was some sort of 'aging' process taking place, but this may just be coincidental. It is more likely that older links disappeared because the host site was lost in the shuffle and our links no longer appeared 'relevant'.

The other thing we noticed was that not one of these new links was listed on our reciprocal links pages. In other words, all reciprocal links had vanished. We think this is because Google is downgrading or eliminating reciprocal links as a measure of popularity. This does make sense, actually. Reciprocal links are a method of falsifying popularity - sort of a cheap way of buying a link, if you want to think of it that way.

If your web sites have suffered from the latest 'dance', you may want to take a look at the type and source of your links. If they are mostly from link exchanges, you are probably looking at the reason for your move down the list on the search engines.

During the second week of the Jagger Update, a few of our reciprocal links did come back up. However, we also noticed that these were from places where we had highly relevant content. They came from articles where we discussed our area of expertise: Web Analytics, or from forums where we had relevant threads. So we feel that these links came back because of content, not linking.

The other group that came back up was one-way inbound text links, regardless of the originating web site. These links also had strong relevance to our web analytics business. In other words, they contained keywords and/or phrases related to our site and its business.

This research has us now re-evaluating our linking strategy. We urge others to do the same. We are now concentrating only on building strong one-way inbound links. We are focusing on publicity, articles, directories, and other direct methods of building our image and consumer awareness.
In addition, we are also looking for associated but non-competing firms - web developers, search engine marketers, SEOs, web site owners and designers - to partner with us to build direct business relationships and the resulting inbound links. This strategy may not be the fastest method of building links, but we feel it is rock solid and within the spirit of good business practices. The best thing is that it is search engine independent.

We will no longer worry about chasing (or beating) the search engines and their ever changing algorithms. That is a fool's game we are sure to lose.
Instead, we will focus on building rock solid links and popularity with the group that counts: our customers. By focusing on beating our competition and providing a top quality product, plenty of educational information and relevant content, we are sure to move up and stay at the top of the search engine rankings.

It's something to think about.
About the Author: Will Moore is a web analytics specialist with over 20 years of hardware, software and web development experience. He has sat on ANSI and ISO standards committees, been a speaker at major technical conferences in the US, Europe, China and Singapore, and has written numerous articles on various technical subjects. Visit Web Stats Gold at for more articles and information.
The Real Secret to Understanding Web Statistics

Understanding what your visitors do on your site is crucial information. If your visitors proceed to purchase a product but then a large majority leaves the site when they get to a specific page in the order process, you need to know about it. It could be that this page is confusing or hard to use. Fixing it could increase your sales by 200%. This is just an example; there are many reasons why you want a detailed analysis of your site visitors.

Most website hosting services offer a stats package that you can study. If you're not sure where this is, call up your hosting service and ask them. Statistics are a vital part of tracking your marketing progress. If you don't have access to website statistics get a package that can help you in this area. Do not get a counter that simply shows how many visitors you've had. You'll be missing out on vital information that can help strengthen weaknesses in your site.

A good website hosting service offers traffic logs that provide an invaluable insight into the traffic being referred to a web site from various sources such as search engines, directories and other links. Unfortunately, traffic tracking provided by web hosting services is often in the form of raw traffic log files or other cryptic, difficult-to-understand formats. These log files are basically text files that describe actions on the site. It is practically impossible to read the raw log files directly and understand what your visitors are doing. If you do not have the patience to go through these huge traffic logs, opting for a traffic-logging package would be a good idea.
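To show what a logging package has to do with those raw text files, here is a sketch in Python that parses one line of the widely used Common Log Format into named fields (the sample log line is invented):

```python
import re

# Common Log Format: host ident user [time] "request" status bytes
LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<bytes>\S+)'
)

def parse_line(line):
    """Turn one raw access-log line into a dictionary of named fields."""
    match = LOG_PATTERN.match(line)
    return match.groupdict() if match else None

line = ('192.0.2.1 - - [10/Oct/2005:13:55:36 -0700] '
        '"GET /order/checkout.html HTTP/1.0" 200 2326')
entry = parse_line(line)
print(entry["path"], entry["status"])  # /order/checkout.html 200
```

A full analysis package repeats this for millions of lines and then aggregates the fields into the charts and tables described below.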

Basically, two options are available to you and these are: using a log analysis package or subscribing to a remotely hosted traffic logging service. A remotely hosted traffic logging service may be easy to use and is generally the cheaper option of the two. WebTrends Live and HitsLink are two good, remotely hosted, traffic-monitoring services worth considering. However, WebTrends Live is a more complicated system and is suitable for larger ecommerce websites. "SuperStats" is another recommended traffic logging service.

These services do not use your log files. Typically a small section of code is placed on any page you want to track. When the page is viewed, information is stored on the remote server and available in real time to view in charts and tables form. Log analysis packages are typically expensive to buy and complex to set up. Apart from commercial packages there are also some free log analysis packages available, such as Analog.

A good traffic logging service would provide statistics pertaining to the following:
- How many people visit your site?
- Where are they from?
- How are visitors finding your site?
- What traffic is coming from search engines, links from other sites, and other sources?
- What keyword search phrases are they using to find your site?
- What pages are frequented the most - what information are visitors most interested in?
- How do visitors navigate within your web site?

Knowing the answers to these and other fundamental questions is essential for making informed decisions that maximize the return on investment (ROI) of your web site. The most important aspect of tracking visitors to your website is analyzing all the statistics you get from your tracking software. The three main statistics that show your overall progress are hits, visitors and page views. A hit is recorded whenever any picture or page loads from your server into a visitor's browser. Hits, however, can be very misleading, and are largely irrelevant as a measure of your website's performance.

The statistic that is probably the most important for a website is page views per visitor. This gives you a good indication of two things: first, how many people are coming to your site, and secondly, how long they are staying. If you have 250 visitors and 300 page views, you can figure that most visitors view one page on your site and then leave. Generally, if you're not getting 2 page views per visitor, you should consider upgrading your site's content so your visitors will stay around longer.
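The calculation from the example above can be written as a tiny Python helper; the 2-views threshold is the article's rule of thumb, not a universal standard:

```python
def pages_per_visitor(page_views, visitors):
    """Page views divided by visitors: a rough measure of how long
    people stay and how 'sticky' the site is."""
    return page_views / visitors if visitors else 0.0

# The example from the text: 250 visitors, 300 page views -
# most visitors view one page and leave.
ratio = pages_per_visitor(300, 250)
print(round(ratio, 2))  # 1.2
print(ratio >= 2.0)     # False: by the rule of thumb, upgrade the content
```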

If you see the number of visitors increasing as well as the number of page views per visitor increasing, then keep up the good work! Always look at this stat as an overall barometer of how your site design is going and whether your marketing campaigns are taking hold. Another good stat to look for is unique visitors. Once a person has visited your site, they are not counted again in the unique visitors category if they return. This is a good way to track new visitors to your website.

Page views are a good indication of how "sticky" your website is. A good statistic to keep is Page Views divided by the number of Visitors you have. This statistic will give you a good idea if your content is interesting and if your visitors are staying on your site for a long time and surfing.

Some people are intimidated by web traffic statistics (mostly because of the sheer volume of data available), but they shouldn't be. While there are many highly specialized statistics that can be used for more in-depth web traffic analysis, the above areas alone can provide invaluable information on your visitors and your website performance. Remember- this data is available for a reason. It's up to you to use it.
About the Author: Alden Smith is an award winning author who has been marketing on the internet for over 7 years. His site,, is loaded with articles and information for the beginning blogger and internet marketer.
Using New Content to Build Links

Sometimes, link building is more than just searching out sites to request links from. Sometimes you have to get creative in how you build links.
In this article, we look at another way of building links that doesn't really require you to go out and search for relevant sites to request links from.
The web is growing at a phenomenal rate. Technorati, a popular blog search and syndication site estimates that the blogosphere alone doubles in size every 5 months. As of the end of July 2005, Technorati was tracking over 14.2 million weblogs, and over 1.3 billion links.

Who knows how much the rest of the web grows? I would bet that while it doesn't double every 5 months its rate of growth is pretty impressive.
It is because of this growth in the web that other forms of link building become somewhat easier. I am talking about building links through content creation and publishing.

Chances are you are reading this article on the Text Link Brokers blog, or one of a number of syndication partners who agree to republish the article with links intact.
Through such syndication, you could come across this article through a variety of high profile websites on the web. In addition, these high profile sites are industry specific. This means that any links I embed into this article (which is then syndicated) will ultimately point back to this site on important key phrases.

Think about this, for the time it takes me to write this article, I could have built as many as 2 dozen high quality, keyword rich links back to both the main site as well as the blog site. Normally, for me to build 2 dozen high quality links for one of my clients I'd have to start with a list of about 500 somewhat related sites, filtering out those that are lower quality and submitting to 50 to 100 sites in hopes of achieving those 20 or so links.

And it has been my experience that I would be lucky to achieve 5 to 10 links from that initial list of 500 sites. All this would have taken me about 5 or 6 hours, even longer if I hadn't used a few tools to help gather those links. Yet here I am, with much less effort, able to achieve almost the same number of links.

That's the nice thing about content - it can do so many things for your site:
- A growing site helps encourage search engine crawlers to visit repeatedly.
- A growing site has more pages, which have the potential to rank for other phrases.
- A growing site offers more entry points to searchers.
- A growing site offers more opportunities for others to link to it.
- A growing site can help positively influence link popularity (if internal navigation is coded properly).
- And more...

There are many other great reasons for starting an ongoing content development program. Aside from the link building opportunities, you can also begin to develop your online reputation as an expert in your field.

Further, as visitors do searches on search engines, there is a greater opportunity for your content to appear for those searches, helping to build your brand.
If you take your content development program a step further and syndicate your content to a wider audience (via blog pings and so on), you can reach even more people, potentially building even more links and allowing your name (and brand) to reach beyond the "traditional" web.

For example, when I do a search for my name, I find myself in traditional organic SERPs but also on sites like Google News, as well as most of the main blog search engines. This is because this site, and others I write for, are syndicated. Plus, those sites that I mentioned earlier (the syndication partners) are also syndicated.
So my articles appear numerous times for the same search. This helps build my reputation online. Not only does my name appear throughout the web, but articles like these also get picked up by even more sources. Ones that perhaps didn't read this blog, or one of the syndication partners, but they may have found it on Bloglines, Technorati or any of the other large blog search engines.

Then, the article gets picked up by even more sources, in its entirety, links and all. So the number of new links I've created has now jumped from the original two in this article, to a couple dozen on our syndication partners, to who knows how many. It's interesting to see where articles get picked up. I've found myself quoted in PDFs belonging to universities, and on foreign sites where I've been translated into Korean, Chinese and even Russian. And, you guessed it, the links remain intact.
That's because these articles aren't like news - they last much longer than a press release which, while gaining huge exposure for 2 or 3 days, quickly disappears.
The articles last "forever" because they continue to be circulated by various sites who find them in searches, and either copy them or link to them. Then their sites get syndicated and found by others who then also link or copy the article.

You may begin to see that this type of linking can go on almost forever, because what I'm writing here isn't necessarily newsworthy, but it is an article that people will find useful for months and years to come (I hope). As it becomes more and more established on the web (and more entrenched, because of the number of high quality related links already pointing to it) it begins to take on a life of its own.

And the more articles which I write for this site which appear like this, the better it is for the site. So, what is the downside to this plan?
The only one, really, is that you have to be able to write. And not just scribble your ideas down, but make them intelligible and easy to read.
This is what takes the practice. But I can tell you that while you may (and likely will) labor for hours over your first few articles, over time they do get easier.
So much so that you will begin composing them in your sleep, or while you are waiting for your bus, or any place else where you have "down time".

If you are concerned that a massive and costly link building campaign is your only option to increasing your online visibility think again. Sometimes something as simple as an "I was thinking" article can drive dozens of new relevant links to your site.

About the Author: Rob Sullivan - SEO Specialist and Internet Marketing Consultant.
Branding With Search Marketing

Branding is a major goal for marketers around the world, and companies typically allocate big bucks for this objective. But branding was always hard to measure, that is until the web became a commercial medium. Online advertising started with banner ads, and it didn't take long before marketers realized search engine listings drive large volumes of targeted traffic to websites. Search traffic is golden because it doesn't interrupt consumer behavior. Users are actively seeking information and want to be driven to its source. As they view the search listings for their query, these text descriptions function as ads that produce awareness.

Search Branding Studies
In 2001, NPD Group examined the effectiveness of three types of search engine ads: search listings, banner ads, and the tile ads to the right of search listings. The search editorial listings were read and clicked upon significantly more often than banners or tile ads, and they also produced more sales. Conclusion: Search listings provide more brand awareness than any other ads in a search environment.

In 2004, Interactive Advertising Bureau (IAB) and Nielsen/NetRatings explored branding produced by search listing text ads versus the contextual ads in the right column, focusing on four branding attributes (unaided brand awareness, aided ad awareness, familiarity and brand image associations). Conclusion: Branding in search listings is stronger than contextual ad branding, particularly when the brand holds the top position in the results page.

The Importance of Branding
Your brand is what identifies your business to consumers. It resides in the hearts and minds of your customers and prospects as the sum total of their experiences with, and perceptions of, your company. Good branding ensures loyal customers, and your existing customer relationships are the key to profitability. So it's no wonder that branding is a major marketing goal.

SEMPO research on business marketing goals shows that most companies place "increase brand awareness" at the top of their list. Other goals include "selling products/services online", "generating leads", "increasing traffic", "generating leads for distributors", and "providing information/education".

The Branding Component of Search Marketing
Up until now, the major goal of search engine marketing (SEM) has been to drive targeted traffic to your site for lead generation and online or offline conversions. However, it’s now evident that during the search process, another valuable advertising goal is achieved -- that of branding.
Some SEM firms don’t hype branding because of their focus on driving qualified traffic to produce leads and sales. However, at Bruce Clay we feel the branding aspect of SEM is too important to be overlooked.

How Search Branding Works
When indexing your site, most search engines will use your website’s Title and Description, or the information therein, to create the text link that appears in the SERPs (search engine results pages). This link, and the brief description of your site that follows, function as an “ad” when users view your search engine listing.
Every time your listing shows prominently in the SERPs, branding takes place. You can achieve better branding when you ensure that all your ad elements encourage maximum awareness upon click-through to your landing page. There are two major factors of importance in an SEM branding campaign: your branding message on the results page (Title/Description or Paid Text Ad) and your call to action on the landing page.
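As a sketch of where these elements live, both are ordinary tags in the page's head section; the domain and wording below are invented placeholders:

```html
<head>
  <!-- Becomes the clickable headline of your search listing -->
  <title>Acme Widgets - Custom Widgets Shipped Worldwide</title>
  <!-- Frequently used as the descriptive text beneath the headline -->
  <meta name="description"
        content="Acme designs and ships custom widgets in 48 hours. Free quotes.">
</head>
```

Writing these two tags around your target phrases is what turns an organic listing into a coherent ad.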

Coordinating Critical Ad Elements
Your Title and Description Tags are critical elements in a professional search engine optimization (SEO) campaign. When your website architecture, linking and content are properly optimized, these elements will help bring you to prominent positioning in the SERPs.

With paid search ads, a professional SEM firm will research and identify strategic key phrases, write the text ad, develop your bidding strategy, monitor bids, and track and fine-tune changes. Here, too, there’s a Title and Description that shows in the SERPs.

Your landing page is an important ad element for both SEO and Paid Search campaigns. Copy and creative should be strategically composed as an extension of your “search ad” on the SERPs, with the landing page focused solely on the desired action. Ideally, these marketing elements should be prepared by SEM pros with advertising expertise.

Measuring Search Branding
How do you measure the effects of branding in a search engine marketing campaign? A web analytics program can do more than simply report statistics. These systems for compiling data will analyze your web logs to effectively manage your SEM campaign and measure your brand effectiveness. Below are some of the data points that can be used to measure the brand impact of an SEM campaign.

- Average Time On Site: The longer your visitors browse your site upon arrival from a search engine, the better chance you have for future conversions.
- Page Views Per Visitor: The more pages your prospects visit and read, the greater the odds of communicating your marketing message. This contributes to brand awareness.
- Path Views to Registration or Subscription Sign-Ups: This is the same as the visitor giving you permission to form a business relationship. It starts a dialog and allows you to continue building your brand, moving the visitor closer to conversion.
- White Paper Downloads: The more interest shown in your products/services, the more branding takes place, and the closer the user moves to conversion.
- Navigation Report: This shows where visitors go next, such as prospects clicking through to distributor or retailer sites. When search listings result in a click to a seller site, there is a likelihood of purchase and proof of branding.
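As a rough sketch of how the first two data points fall out of raw analytics records, the visit tuples below are invented for illustration:

```python
# Hypothetical visit records: (visitor_id, seconds_on_site, pages_viewed)
visits = [
    ("v1", 210, 4),
    ("v2", 95, 2),
    ("v3", 340, 7),
]

def average_time_on_site(visits):
    """Mean seconds spent on the site per visit."""
    return sum(seconds for _, seconds, _ in visits) / len(visits)

def page_views_per_visitor(visits):
    """Mean pages viewed per visit."""
    return sum(pages for _, _, pages in visits) / len(visits)

print(average_time_on_site(visits))    # 215.0
print(page_views_per_visitor(visits))  # about 4.33
```

A real analytics package computes these same averages from parsed web logs rather than hand-entered tuples.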

Maximize Your Branding Efforts
With such value in your search listings, it’s wise to extend your branding efforts through SEM campaigns. The coordination of SEM and advertising expertise ensures that all critical ad elements work together for both conversions and the elevation of your brand.
About the Author: Paul J. Bruemmer
Paul J. Bruemmer has provided search engine marketing expertise and consulting services to prominent American businesses since 1995. As Principal Business Analyst for Bruce Clay, Inc., he is responsible for strategizing and implementing business development activities. Paul is a well-known industry columnist, having written articles for ClickZ, Search Engine Guide, Pandia, MarketingProfs, iMediaConnection, and SitePoint. He has also been a featured speaker at the Search Engine Strategies Conference and at eComXpo.
Search Engine Marketing Research Findings

This research surveys business firms' views on search engine marketing. It offers valuable implications for both businesses and search engine marketing firms.
The research collected findings from 121 Hong Kong trading firms involved in foreign trade, via telephone interviews. A separate focus group was also held with 8 marketing managers and 1 search engine optimization (SEO) specialist. We believe the implications also apply to other geographical areas.
We have extracted part of the results to present in this article.

1. Employed SEO service or Exercised SEO Before?
103 respondents (85%) reported that they had neither employed an SEO service nor performed any search engine optimization themselves. The remaining 18 (15%) had pursued some form of search engine marketing, ranging from search engine submission to search engine optimization.

2. Importance and Concerns of Search Engine Marketing/Search Engine Optimization
A rating scale of 1 to 5 was employed, with 5 meaning "very important" and 1 meaning "very unimportant". Respondents reported that they perceived search engine marketing as a potentially cost-effective way to win new overseas customers (rating 4.7). Respondents also weighed the following factors when deciding whether to use a search engine marketing firm: 1. Pricing (4.6), 2. Deliverables (4.8), 3. Guarantee clause (4.0), 4. Ranking against rival firms (4.7), 5. Possibility of reaching a Top 10 ranking (4.1), 6. Chance of increasing response rate (4.8)

3. Paid for Performance Search Engine Marketing
This question included an explanation of the meaning of paid-for-performance search engine marketing, covering both pay-per-click and paid-inclusion plans.
Again, a rating scale of 1-5 was employed, with 5 meaning "very important" and 1 "very unimportant".
The respondents expressed concern about the potentially high cost (4.8), fraudulent clicks (4.9), delivery of the message to the right people (4.7), and the time required to manage the campaign (4.2).

4. Understanding of Search Engine Optimization
This question asked about their knowledge of search engine optimization techniques. Respondents were given 3 choices: 1. Yes, 2. No, 3. Don't know.
31 respondents perceived that the major task of search engine optimization was adding keywords to the Title and Meta tags, while another 62 were unsure. 108 respondents did not know that link building was an important factor that could help improve search engine rankings. Interestingly, 72 respondents believed that "submit to 1000 search engines" programs could increase their website traffic. 101 respondents did not realize that content analysis and improvement measures, such as keyword density and keyword proximity, played a role in search engine optimization.
In general, most respondents believed that they did not possess sufficient knowledge on search engine optimization.

5. Focus Group
Another 8 marketing managers were invited to a focus group interview. 4 of them had experience employing SEO firms, and the other 4 had considered using SEO services but ultimately had not taken action. An SEO specialist acted as moderator for the interview.

In the interview, they generally agreed that search engine marketing could help them generate business leads. Those who had not used an SEO service explained that they abandoned the idea because they could not reach a decision. First, they were not familiar with how good search engine rankings are achieved. Second, the pricing schemes offered by different SEO firms varied greatly, from US$19.99 to US$5,000 up-front, with some firms also requiring a US$500 to US$5,000 monthly service fee. Moreover, the SEO firms could not clearly state the deliverables clients would receive, and they also refused to clearly explain the methodologies they would use. These facts made it very difficult for the interviewees to persuade their bosses to use an SEO service.

In addition, the interviewees feared that SEO firms would require many website changes, and that some changes, like stuffing keywords into Alt tags or putting too much text and too few graphics on web pages, would make the site look unprofessional.

Another point made by the interviewees was very interesting. Among those who had employed SEO firms, some felt they were being kept in the dark because they did not know what the SEO firms were actually doing! When they scrutinized the changes made, they found that the firms had merely written some keyword phrases into the Alt and Title tags and repeated them in the Meta tags. The firms then simply ran reports in the subsequent months and asked them to wait for results. In the end, they got no extra traffic from the keyword phrases suggested, while the SEO firms claimed to have achieved more than what was written in the guarantee clause (e.g. 20 Top 20 rankings in major search engines). Only 1 interviewee was satisfied with the chosen SEO firm. This was because they knew the deliverables, got the desired number of top rankings, and received a piece of link exchange software so they could continue managing link building themselves after the project ended. The SEO firm also provided suggestions for maintaining rankings afterwards.
Participants also voiced that they wanted to know their search engine rankings relative to rival firms.

6. Implications to Business Firms

In general, there was a consensus that search engine marketing could help generate new business leads. However, it is not a simple activity. Most business firms lacked the knowledge and were tempted to use an SEO firm; however, the quality of SEO firms varied. To select a good SEO service, business firms should clearly define their objectives. For example, some firms might simply want a better ranking than their rivals, or a better conversion rate. It is then suggested that firms talk with the candidate SEO providers and try to determine which is more knowledgeable. Moreover, firms should discuss with the selected SEO firm to reach agreement on the deliverables and the project schedule.

The research findings show that many business firms have yet to optimize their websites. That means if you optimize your website first, you could gain an advantage over your rivals in terms of search engine exposure.

7. Implications to SEO firms
SEO firms should spend time clearly explaining to clients the SEO techniques and methodologies they will use. Some SEO firms may fear exposing their "top secret" techniques. However, you cannot win business if your clients do not understand the SEO process and fail to persuade their bosses. In addition, SEO firms should apply optimization techniques with the usability of the website in mind. Do not make the site look silly or unprofessional. Also, clients generally would not mind your use of optimization software such as WebPosition Gold, and they do want to know how you will serve them.

SEO firms should help clients devise a measurement matrix for the campaign. This helps the decision maker make judgments and explain them to his or her colleagues and boss. Clients will feel more comfortable doing business with you.
Moreover, SEO firms should define clearly what will be delivered to clients and on what schedule, e.g. how many ranking reports, keyword research results, etc. In addition, SEO firms should contact clients frequently and report progress. Clients will then understand that you are indeed working for them, and you can help set or adjust their expectations.
Finally, SEO firms should offer assistance to clients who want to end the relationship and maintain rankings themselves. The idea may look ridiculous, but clients will be pleased by such an offer and will probably recommend you to other firms. After all, a client who stops working with you may do so for many reasons other than your performance. Why not give them another "candy" and maintain a good relationship?
About the Author: Jimsun Lui -
Jimsun Lui, from Agog Digital Marketing, offers search engine optimization services covering both English and Chinese search engines. He also owns a Chinese website promotion portal and is developing an advanced e-commerce shopping cart software package.
Future Of Web Design Is Content Management

Web development has greatly increased in popularity over the last 5 years. Many new design concepts, code standards, and technology advances have appeared in a short amount of time. With them has come greater knowledge of, and demand for, better, more independent and functional web design packages.
More and more we are starting to see a shift in consumer demand for the increasingly popular website content management system.
Most website owners are typical business entrepreneurs who don't have the time to chase down their web design company for minor updates that usually cost an arm and a leg. Updates for websites are becoming more and more necessary. It's now a reality, and a trend, that in order to make something happen with your website online, you need to stay on top of things and create new content to keep visitors coming back.
As entrepreneurs, we all get new creative ideas almost every day on how to improve our products or services. Without the ability to update our own websites, those fresh new ideas may not become a reality for a long time.

Website Content Management Systems Are The Future
This is somewhat of a call out to all web design companies. If you cannot offer content management to your clients, you may be left in the dust within a few years. The more affordable content management becomes, the more in demand it will be. Without giving your prospects this crucial option, you may lose a great chunk of your potential clients to the next web design company that has a fully automated system that states: "all the consumer has to do is login and get started."
Granted, there will always be a need for web designers; that much is an understatement. But with the option of content management, you can decrease the amount of work that goes into each project and concentrate more on marketing your business and its services.

Content Management Gives The Consumer The Freedom They Need
Without granting so much freedom that the website could end up looking bad, there is a high demand for the ability to update a website when needed, not just when convenient. People like to have power over managing their own company, and content management gives them the freedom they need to expand on their own terms, without extra costs.

Here Are Typical Features Of A "CMS":
- Add/remove/edit pages.
- Update content within each page.
- Add images where needed.
- Update contact information.
- Show updated listings (i.e. Real estate listings, Mortgage rates).
- Add new tips on their industry every day (the spawn of blogging).
- Plus many extra features not listed here.
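A toy sketch of the page-management core of such a system (the class and method names are invented; a real CMS persists pages to a database and renders them through templates):

```python
# Minimal in-memory content store illustrating the add/edit/remove cycle.
class SimpleCMS:
    def __init__(self):
        self.pages = {}  # page name -> page content

    def add_page(self, name, content=""):
        self.pages[name] = content

    def update_page(self, name, content):
        if name not in self.pages:
            raise KeyError(f"no such page: {name}")
        self.pages[name] = content

    def remove_page(self, name):
        self.pages.pop(name, None)

cms = SimpleCMS()
cms.add_page("rates", "Current mortgage rate: 5.9%")     # show updated listings
cms.update_page("rates", "Current mortgage rate: 6.1%")  # owner edits anytime
cms.remove_page("rates")
```

The point of the interface is exactly the freedom described above: the site owner logs in and changes content without calling the design company.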

Take The Real Estate Industry For Example:
In the last 2 years, "Real Estate Content Management Systems" have been popping up everywhere we look. I can recall reviewing over 50 websites that offer this style of service. And why not! Real estate agents as a whole spend a great deal of money marketing themselves. Just in the last couple of years, real estate agents have seen more value in marketing online than through regular print media. Many real estate agents I know would rather spend $4,000 on a website than spend $4,000 getting listed in the local telephone book.

In Conclusion:
If you offer web design services and don't yet have a Content Management System (CMS) available to your clients, this might be the time to consider this ever more popular service for your company. You won't regret putting in the effort to develop and market your own system; there is a shifting demand for this ever-popular freedom online.
About the Author: Martin Lemieux -
Martin Lemieux is the president of the Smartads Advertising Network. Smartads is here to help small to large companies grow online and offline. Visit the Smartads Network today! International:
Site Defacements

A valid fear every webmaster faces is the defacement of their site. According to the Computer Security Institute (CSI) 2005 Computer Crime and Security Survey, web site defacements are the "fastest-growing" area of incident. A check of Zone-H seems to validate the finding, with a display of over 750 site defacements for a single date (8/15/2005).

To address defacements, it is first important to understand how defacements occur and what can be done to prevent them. Generally, sites can be vulnerable due to undisclosed vulnerabilities in vendor software, a missing security patch, misconfiguration, and/or bad site programming. Any of these vulnerabilities could permit an attacker to gain access that would allow defacement.

While not much can be done about undisclosed vendor vulnerabilities, the other causes are correctable. When vendor security patches are released, install them quickly: many attackers reverse engineer a patch to discover the vulnerability being addressed, and it is not uncommon to find exploit code published on the internet within 48 hours of a patch's release.

Verify your server and site configurations. Specific areas of concern are normally FTP upload rights, site publishing rights, server login privileges, open ports and passwords. Delete or seriously restrict the ability of people to anonymously upload files. Check for the use of default passwords and for ones that can be easily guessed. Double-check your system's open ports and the publishing rights of your web server software. Numerous companies offer free products or free initial vulnerability scans that can confirm your system settings; the search term "free vulnerability scanning" will yield dozens of companies and products.
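The open-port check, for instance, reduces to comparing what is listening against what is approved. A minimal sketch, with illustrative port numbers (a real audit would gather the open-port list from a scanner):

```python
# Flag listening ports that are not on an approved list.
# Both sets of port numbers here are illustrative.
ALLOWED_PORTS = {80, 443}

def unexpected_ports(open_ports, allowed=ALLOWED_PORTS):
    """Return the set of open ports that should not be open."""
    return set(open_ports) - set(allowed)

# An FTP (21) or database (3306) port left open gets flagged:
print(sorted(unexpected_ports({21, 80, 443, 3306})))  # [21, 3306]
```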

Check your site code to verify that errors and unintended data are handled correctly. Regardless of what a visitor does, input should be validated and all errors should return a graceful message. A few areas to check: are your pages vulnerable to buffer overruns when incorrect data is entered; are your pages vulnerable to SQL or script injection; do your error messages reveal sensitive information such as connection strings, passwords, or system information?
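For the SQL injection point, the standard defense is a parameterized query. A small sketch using Python's built-in sqlite3 module, with an invented table:

```python
import sqlite3

# In-memory table standing in for a real user database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

visitor_input = "alice' OR '1'='1"  # a classic injection attempt

# Unsafe: building the SQL by string concatenation would let this
# input rewrite the query. Safe: a parameterized query treats the
# input purely as data, never as SQL.
rows = conn.execute(
    "SELECT name FROM users WHERE name = ?", (visitor_input,)
).fetchall()
print(rows)  # [] - the injection attempt matches no real user
```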

Establish a schedule and process to monitor system changes, configurations, and code. While researching this article, I noticed a Zone-H posting that a Microsoft United Kingdom site was defaced. While the attacker did not publish how the attack was executed, it is safe to assume configuration played a large role. Software features change with each patch applied, mistakes happen and code changes.

The CSI report points out that the dollar losses caused by web site defacements are actually very low in relation to losses suffered from viruses and the theft of proprietary information. The report goes on to state that "losses (such as the lost future sales due to negative media coverage following a breach)" were largely not represented in the cost figures. I believe that most victims of site defacements will agree that the embarrassment far outweighs the dollar loss suffered.

When considering defacement strategies, web site monitoring services should also be considered. Many monitoring services offer the ability to check for the existence of keywords or for page changes. While monitoring services will not prevent defacements, site monitoring will at least alert you to the event, hopefully before you suffer negative media coverage.
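The page-change check such services perform can be sketched as a hash comparison against a stored baseline (fetching is left out here; a real monitor would download the page on a schedule):

```python
import hashlib

def page_fingerprint(html: str) -> str:
    """Hash the page contents so any change is detectable."""
    return hashlib.sha256(html.encode("utf-8")).hexdigest()

ORIGINAL = "<html><body>Welcome to our store</body></html>"
baseline = page_fingerprint(ORIGINAL)

def page_changed(current_html: str, baseline: str) -> bool:
    """True if the page no longer matches the recorded baseline."""
    return page_fingerprint(current_html) != baseline

print(page_changed(ORIGINAL, baseline))                              # False
print(page_changed("<html><body>Defaced!</body></html>", baseline))  # True
```

Pairing a check like this with an alert (e-mail, pager) is essentially what the commercial monitoring services provide.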
About the Author: Lew Newlin -
Search Engine Wars

Quality Searches Vs Quantity

It is no secret that Google and Yahoo are in a continuous battle to win our hearts and get everyone to convert, but is converting someone really a matter of quantity or of quality?

Let's take a look at some top key searches and compare them across some of the search engines online. For each search result, I will outline a few things:

1) Search Engine
2) Number of results found
3) Quality & content of the top 10 sites
4) What you find going beyond the first 10 pages

Each section will be ranked out of 10 points for quality (information taken on August 26, 2005).

Starting with my all-time favorite search term: "INTERNET MARKETING"

Google:
- 99,000,000 results
- 10/10 on quality
- 10/10 Past 10 pages still delivers top quality results

Yahoo:
- 281,000,000 results
- 7/10 on quality. There's no reason to list a "Hotel Marketing Firm" & "Building websites". The #1 spot was reserved for Yahoo marketing.
- 9/10 Past 10 pages, it is very generic for business, not specific.

MSN:
- 93,661,176 results
- 10/10 on quality
- 10/10 Past 10 pages still delivers high quality results. I am surprised and give MSN two thumbs up for their attention to detail.

Moving on to the search term "BUSINESS NEWS"

Google:
- 627,000,000 results
- 10/10 on quality
- 10/10 Beyond 10 pages delivers high quality & local news centers.

Yahoo:
- 1,260,000,000 results (wow)
- 10/10 on quality
- 9/10 Beyond 10 pages. There are still some sites that should never be there.

MSN:
- 381,631,054 results
- 8/10 on quality - Some aren't related at all, and brand new sites as well.
- 10/10 beyond 10 pages. Their results seem to tighten up and get better.

Let's now take some more "local" search results to see deeper and more targeted results...

Moving on to the search term "WASHINGTON UNIVERSITIES".
I hope to find the "University of Washington" come up #1 or thereabouts.

Google:
- 22,700,000 results
- 10/10 on quality (the University of Washington is #1)
- 10/10 beyond 10 pages. The results still deliver "university" topics.

Yahoo:
- 143,000,000 results
- 9/10 on quality - The top spot is reserved for Yahoo on Washington University in St. Louis. There seems to be a strong battle going on here over which university to list.
- 10/10 beyond 10 pages. Content is specific and relevant.

MSN:
- 34,442,536 results
- 9/10 on quality - Again, another battle going on; the expected university site sits at #10, tilting on the verge of page 2.
- 10/10 beyond 10 pages. All related to Washington & university living

Let's now get even more specific than that... For this search term I will be using something very local that I can relate to and give a better analysis.

Moving on to the search term "HAMILTON ONTARIO CANADA".

Google:
- 3,700,000 results
- 10/10 on quality - Great job
- 10/10 beyond 10 pages - Anything goes, but it is all directly related to Hamilton.

Yahoo:
- 14,900,000 results
- 10/10 on quality - Almost the same as Google, but a couple of different choices.
- 9/10 beyond 10 pages - Some results get there by keyword stuffing their pages.

MSN Local *New:
- 1,280,405 results
- 9/10 on quality - Getting some random placements & keyword stuffing
- 10/10 beyond 10 pages - Many local companies well listed in the results.

Let's now take a look at these results as a total. Out of a possible score of 80 points, here are the total scores for each:

Google - 80 points
MSN - 76 points
Yahoo - 73 points

Total search results delivered:

Google - 752,400,000
Yahoo - 1,698,900,000
MSN - 511,015,171
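As a quick arithmetic check, the point totals and result counts can be tallied from the per-term figures above (assuming each term's three result blocks are Google, Yahoo and MSN, in that order):

```python
# Per-term quality scores and result counts for each engine, in the
# order the four search terms were reviewed.
scores = {
    "Google": [10, 10, 10, 10, 10, 10, 10, 10],
    "Yahoo":  [7, 9, 10, 9, 9, 10, 10, 9],
    "MSN":    [10, 10, 8, 10, 9, 10, 9, 10],
}
results = {
    "Google": [99_000_000, 627_000_000, 22_700_000, 3_700_000],
    "Yahoo":  [281_000_000, 1_260_000_000, 143_000_000, 14_900_000],
    "MSN":    [93_661_176, 381_631_054, 34_442_536, 1_280_405],
}
for engine in scores:
    print(engine, sum(scores[engine]), sum(results[engine]))
# Google 80 752400000
# Yahoo 73 1698900000
# MSN 76 511015171
```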

And people wonder why Google is the king of search?! But wait: without my even noticing, as I totaled the final point standings, MSN came up in second place! In my study, I tried to be as neutral as possible.

How is it that MSN & Google return half or less the number of results Yahoo shows, yet outrank Yahoo in quality?

In conclusion:

Even though Yahoo delivered a report stating that it has over 19 billion items in its index, what does that matter when you're still trying to figure out how to deliver all that content? MSN & Google seem to know exactly what to do with their results; they don't seem "off the wall" at all. If Yahoo is to step up and be crowned the search king, I really think they need to refine the quantity of search results they return to match their quality.

Yahoo has major potential to overtake Google, but they still have a lot of work ahead of them, and by the time they figure things out, Google may have stepped up its game even further and wowed us all on some other search plateau.

About the Author: Martin Lemieux -
Martin Lemieux is the president of the Smartads Advertising Network. Smartads is here to help small to large companies grow online and offline. Visit the Smartads Network today! International:

Google SiteMaps and You


Last week, we looked at the recent news that Microsoft had decided to embrace RSS in a big way in its upcoming releases of Internet Explorer and Windows "Longhorn", and determined that this was a Good Thing. This week, we're taking a look at implementing Google Sitemaps, a similar technology developed by Google in order to help you define your site more effectively to the search-engine behemoth.

This is not a ticket to a higher Google ranking (at least not that we know about); but it is a useful tool that lets you apply RSS-like control to your website's interactions with the Googlebot.

RSS (Really Simple Syndication) is the current heavyweight of so-called "disruptive technologies" (loosely defined as those that have the effect, if not developed with the intention, of changing the way we use technology in general) and its use is skyrocketing among content providers looking for a way to get their content in front of more eyes and ears. But RSS originally stood for Rich Site Summary, a standard way of cataloging your site's content for third-party aggregators.

Google Sitemaps have a similar function, in that they are an XML-based way to describe website content in a standard, predictable way; but they differ in that Sitemaps are intended for the Googlebot's eyes only, rather than for any third-party. Think of them as an automated way to make sure Google knows about your site's content (please note, however, that Google does not guarantee inclusion of your content based solely on the presence of a Sitemap file).

This sounds like a very specific undertaking, but the importance of Google in getting your site's content noticed simply cannot be overstated. And with Google's expanding reach into more and more areas of Web content presentation, chances are that the information your Sitemap provides will eventually find some use you haven't yet thought of. That's what disruptive technology is all about, and Google has become one of the more innovative champions of such technological advances.

Where To Start:

The first thing you should do as a website developer is create a Google Account for yourself or your company. This will allow you to do other things besides access the Sitemaps infrastructure; but we'll leave that for another day. Create the account and then proceed to the Sitemaps area.
Once you've logged in, you'll see the sparse Sitemaps interface. Don't be fooled, however, because like the simple interface to its search engine, this one hides quite a bit of information regarding the creation and use of Sitemaps, presenting it in digestible bites as you walk through the process.

There's probably more there than you need to know at this point, provided you don't have a huge site with a need for multiple Sitemaps and so on. But if you do have such a site, the information is there for creating truly complex Sitemaps and Sitemap Indices referencing many Sitemaps, and you can familiarize yourself with that as needed. For now, we'll concentrate on what's required to establish a Sitemap for our site at Cafe ID.

Like creating RSS feeds, creating a Google Sitemap is as simple as putting together an XML file at the root level of your site that describes the site according to the instructions that Google has laid out. You can use any text editor for this purpose, but some editors do a better job of helping you create properly formatted XML files. We heartily recommend two that cost money, BBEdit on Mac OS X and Macromedia's HomeSite on Windows, but there are excellent free alternatives out there; and when it comes to text editors, personal preferences take on an almost religious importance, so we won't proselytize about that here.

The Googlebot recognizes several Sitemap formats, ranging from a simple list of URLs to Sitemaps already created using something called the "Open Archive Initiative protocol for metadata harvesting", a format apparently popular with library collections. The OAI protocol is an advanced XML specification that you don't need to worry about if you don't already understand it. We recommend the intermediate XML format over the simple URL list because of the additional information you can associate with each constituent URL of your site.

If you do want to just get started quickly, simply create a text file that looks like this:
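A minimal stand-in for that file, using placeholder example.com URLs, is simply one fully-qualified URL per line:

```
http://www.example.com/
http://www.example.com/products.html
http://www.example.com/contact.html
```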

making sure that the file in question does not include embedded newline characters within a URL and uses the UTF-8 text encoding (check your text editor settings). Also, your Sitemap may not contain more than 50,000 URLs, and all URLs must be fully-formed, since they will be used directly during the Googlebot's crawl.

Getting Fancy:

The more advanced format isn't much more difficult to create and lets you specify additional information about each URL. The protocol is described fully in Google's Sitemaps documentation and is too detailed to cover completely here. Your finished file will look something like this, except (hopefully) with more URLs specified:
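As a sketch, with a single placeholder example.com URL (the xmlns value shown is the schema Google documented for early Sitemap releases; check the protocol description for the current header):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2005-08-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```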

Your Sitemap's location dictates what URLs can be included in it. A Sitemap placed at the root level of your site can specify any URL on that site, while a Sitemap placed in a subdirectory can only include URLs within that subdirectory.

You can take as full or as little advantage as you like of the various additional XML tags available in this format. Each <url> entry needs to include at least the <loc> specification, but need not include the other three (<lastmod>, <changefreq> and <priority>), and all <url> entries in a Sitemap file must be encapsulated within the <urlset> tag. We recommend using at least the <lastmod> tag and the <changefreq> flag to let the Googlebot know how often it should check your site for updated content. Be sure to change the date, and maybe even the time, specified in the <lastmod> tag any time you actually update your site.

One more caveat is that your URL specifications must be XML-encoded, similarly to the way they're encoded under RSS. The details are spelled out in the protocol documentation, but essentially what you're doing is converting a URL like (for example) http://www.example.com/view?dist=10&sort>2 to look like this: http://www.example.com/view?dist=10&amp;sort&gt;2 (Note the substitution of the entities &amp; and &gt; for the "&" and ">" symbols.)
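If you build your Sitemap with a script, the standard Python library can perform this encoding for you; a small sketch with a hypothetical URL:

```python
# xml.sax.saxutils.escape substitutes XML entities for "&", "<" and ">".
from xml.sax.saxutils import escape

url = "http://www.example.com/view?dist=10&sort>2"
encoded = escape(url)
print(encoded)  # http://www.example.com/view?dist=10&amp;sort&gt;2
```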

Done. Now What Do I Do With It?:

You're almost home. Upload the Sitemap file you created to your server, then submit the URL of the file using your Google Sitemaps account. You don't need to use the account, but doing so will allow you to keep track of what you've uploaded.
You're welcome to compress your Sitemap file using gzip, typically found on Mac OS X, Linux and BSD (a normal PC zip file won't work, although you can certainly find a third-party gzip program for your Windows box). Click the "Add Your First Sitemap" link on the main Sitemaps page after you've logged into your Google Sitemaps account, and that's all there is to it!

You can use your Sitemaps account to keep track of and receive diagnostic information about your Sitemap submissions. You don't need to create a Sitemaps account, however, and if you already have a Google account for receiving Alerts, for accessing the Web Developer APIs and so on, your existing account will work as a Sitemaps account automatically.

Google has already played a significant role in shifting the paradigm of discovering the Web from following links to searching, and the company shows no signs of slowing down. Subscribing may well be the next paradigm, given the flexibility of the protocols that put content syndication in the hands of mere mortals, and getting your content cataloged in these formats should be among your first priorities. Web browsers and operating systems are adjusting quickly to this new paradigm, and you should be too.

About the Author: Trevor Bauknight is a web designer and writer with over 15 years of experience on the Internet. He specializes in the creation and maintenance of business and personal identity online and can be reached at Stop by for a free tryout of the revolutionary SiteBuildingSystem and check out our Flash-based website and IMAP e-mail hosting solutions, complete with live support.


7 Search Engine Optimization Mistakes and Solutions

For many websites, webmasters discover that the major source of traffic is search engines. Therefore, they are keen on gaining top search engine placements through search engine optimization. Based on our several years of SEO experience, we point out some common mistakes and shed some light on how to correct them.

1. Cannot Get Indexed by Search Engines, Really?
A garment ERP software solution provider came to me with a question: "I established my website a year ago and hired an SEO company to submit it to the search engines. However, my website can only be found when I type my domain name into the search query box. The SEO company told me my domain got banned. What can I do?"

I told them they had been fooled by that SEO company. If you can find your website by typing your domain name into the search box, your website has not been banned. Interestingly, it was the SEO company's own domain that could no longer be found in Google; theirs was the domain that got banned.
To help them find the real problem, I observed the following in the search engines:

a. When typing their domain name into the search query box, their site is displayed in the results, but instead of displaying the Title Tag, the search engine displays their domain name only.

b. Only 1 page is being indexed.
What's the problem then?

Taking a closer look at their code, I found that at the top of their webpage they heavily use JavaScript to present their heading, company name, visible content and website menu. Many search engines have difficulty reading JavaScript code.

The solution: put that JavaScript in external .js files and leave the webpages with plain HTML code, since search engines usually have problems crawling JavaScript. And since their whole site navigation menu is written in JavaScript, they should also build a sitemap page using plain HTML links so that search engines can reach all of their pages.
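As a sketch, instead of generating the menu with inline JavaScript, a page can reference an external script and carry a plain-HTML navigation block that robots can follow (the file names here are hypothetical):

<script type="text/javascript" src="menu.js"></script>
<!-- plain HTML links the robots can crawl -->
<a href="products.html">Products</a>
<a href="services.html">Services</a>
<a href="sitemap.html">Site Map</a>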

One more tip: when you cannot get indexed by search engines, do not blindly believe someone who says your website got banned. First look at your website structure and see whether your code is clumsy and makes it hard for search engines to "read" your website. The problem created by JavaScript is not uncommon nowadays. Even a very big Asian market research firm commits this error; they are now starting to rectify it.

2. Link Building
In many forums, some people are strongly opposed to link exchange, claiming that it is evil, that it hurts your search engine rankings and that it can even get you banned.
I personally do not agree. Evidence shows that many top sites do link exchanges and hold top rankings across all major search engines. In addition, link exchange has long been used for traffic building by smaller websites, and I do not see why search engines would punish people for promoting websites with this simple, long-established method. Indeed, link exchange becomes a problem only when you manipulate link text purely to trick search engines, exchange links with link farms, or buy hundreds of domain names and cross-link them with your site.

To avoid link exchange hurting you:
-Use a wide variety of link text
-Seek link exchanges with sites of a similar theme
-Emphasize how much traffic you can get from your link partners instead of search engine ranking

Some webmasters know the importance of link exchange but think it consumes too much of their time, and finally give up doing it. A solution is to use link management software; this kind of software saves you time on link checking and link page updates.

3. Use of Flash Intro
Web design companies try hard to persuade you to use a Flash introduction as your homepage. You may think, "Wow! The Flash animation is very appealing and it makes my site look more attractive." However, do you know that it can hurt your website's search engine rankings?

Let me explain:
Search engines analyze websites based on text. Flash, unluckily, is not text. According to the "How to Design Website Guideline" of Siuchu Suga, search engines are not able to read content presented in Flash; they treat Flash as an embedded object or graphic only. If you use a Flash intro as your homepage, you will never get good rankings.

In addition, many Flash intros do not offer additional, meaningful content to visitors. Ask your visitors: how many of them are really interested in watching your Flash intro before going straight to your website content? To decide whether to use Flash, first ask yourself whether the Flash intro is really useful and offers additional information to your visitors. Secondly, instead of a Flash homepage, consider making a Flash header accompanied by text content on your homepage.

4. Hidden Text and Meta Tags
Webmasters understand that keyword density is a way to improve search engine ranking, and some use a technique called "keyword stuffing": they stuff keywords into their webpages repeatedly, e.g. "keyword 1, keyword 2, .... keyword 1, keyword 2...." Everyone knows this does not make sense to visitors, so those "clever" webmasters make the text invisible, for example by making the text color identical to the background color.
Unluckily, this trick no longer works. Search engines are able to detect it and penalize websites using this nonsense technique. If you have adopted this method and do not want your site to be penalized, remove that text immediately.

Okay, some webmasters only stuff keywords into the Meta keywords tag, an area intended for keywords. I am sorry to say that stuffing keywords into the Meta keywords tag is also bad practice; you should repeat a keyword three times at most. Nowadays, search engines place little or even no emphasis on the Meta keywords tag, so what is the point of risking a penalty by stuffing keywords there?

5. Use of Dynamic Pages
Many websites use a content management system (CMS) to generate their webpages. Because dynamic pages are easier to produce during website development, CMSs generally use them. However, search engines have difficulty spidering and understanding them. To solve the problem, consider using mod_rewrite if you run an Apache server, or find a CMS that can generate static HTML pages.
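As a sketch, an Apache mod_rewrite rule (placed in an .htaccess file; the file and parameter names here are hypothetical) can map a static-looking URL onto the dynamic script that actually serves it:

RewriteEngine On
# Serve /products/123.html from the dynamic script product.php?id=123
RewriteRule ^products/([0-9]+)\.html$ product.php?id=$1 [L]

Search engine robots then see only the clean, query-string-free URLs.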

6. Be a Pagerank Monster
Many search engine marketers are too focused on their website's PageRank. Every day they talk about and check PageRank, and they perform link building based on PageRank alone. In fact, PageRank is only one factor Google uses to determine search engine ranking, and it does not affect your rankings in other search engines like Yahoo and MSN. If you are too concerned about PageRank, you will end up ignoring other important optimization criteria. Therefore, repeatedly remind yourself that PageRank is only one of many factors, and you cannot work on it alone.

7. Believing Too Much in the Sandbox
Some webmasters propose that a new website is put into a "sandbox" by Google so that it cannot get high rankings for highly competitive keywords. Even when their website gets no rankings after a year, they still believe it is the sandbox effect. Some even say Yahoo has a sandbox, MSN has a sandbox, and so on.

Google's patent information gives no clue that such a sandbox exists. In my experience, if your site is new and cannot rank highly for highly competitive keywords, it is simply because other websites are more established in the search engine world: they have more inbound links, more content, and so on. Therefore, webmasters should not focus on finding a way out of the sandbox. Instead, put more effort into link building and content optimization, and eventually you will see your website's rankings rise.
About the Author: By Jimsun Lui, who is working in Search Engine Optimization division of Agog Digital Marketing Strategy Limited. The company assists clients in both Chinese and English search engine optimization
Key search engine optimization strategies

Smart internet marketing should include lots of different marketing methods but it should start with good search engine optimization strategies.

What are the best SEO strategies?
First of all, keep in mind that search engine optimization is a process. You won't achieve it overnight. Search engine algorithms are all different and can be quite complex comprising many different elements.

While it's not feasible to delve into the bowels of search engine algorithms, there are two basic optimization factors that play key roles in optimizing your site. They are typically referred to as on-page optimization and off-page optimization.

On-page optimization refers to the elements of your web page that you can optimize yourself. Off-page optimization refers to the elements of linking and how your link partner sites link to you.

Let's discuss each of them.

On-page optimization (website strategies):
- Each page of your site should have its own descriptive title tag, description tag and keywords. The keywords tag is seldom used nowadays; in any case, each page should be optimized for a maximum of three keywords, preferably one keyword per page.
- Be sure your title tag contains your best keyword or keyword phrase.
- Use header tags (h1, h2...) in the body of your page text.
  Search engines look for structure and organization. Using header tags indicates good outline form.
- Bold your keyword or keyword phrase once. It's also a good idea to italicize and underline them once as well.
- Use keyword alt tags to describe the images used on your page. Since search engines don't index images, alt tags allow the search engines to recognize all your content. Be sure not to stuff them with lots of useless information: use your best keyword or keyword phrase and keep it short.

- Include quality, informative content on each page and keep your content fresh and updated.
- Pepper each page of your site with your keywords. Use your keywords in the first paragraph and the last paragraph, and sprinkle them in between. Keep your keyword saturation at about 1-2%.
- Cross link each of your site's pages by placing a navigation bar on each page. Be sure the title of each page reflects what the page is about. For example, if a page is about "jewelry beads," the page name should be "jewelry-beads.html" or "jewelry_beads.html" and all the links to that page should be named 'jewelry beads' and set up as:

<a href="jewelry-beads.html">jewelry beads</a>
<a href="jewelry_beads.html">jewelry beads</a>

- Create a site map. Site maps act like a table of contents. They can help search engines find, crawl and index all the pages in a website, ensuring that no page is left behind.
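Putting several of the tips above together, a hypothetical optimized page (the "jewelry beads" keyword and file names are illustrative examples, not prescriptions) might be structured like this:

<html>
<head>
  <title>Jewelry Beads - Handmade Glass Jewelry Beads</title>
  <meta name="description" content="Handmade glass jewelry beads for necklaces and bracelets.">
</head>
<body>
  <h1>Jewelry Beads</h1>
  <p>Our <b>jewelry beads</b> are handmade from glass...</p>
  <img src="beads.jpg" alt="jewelry beads">
  <a href="jewelry-beads.html">jewelry beads</a>
</body>
</html>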

Off page optimization (linking strategies):
Linking plays a large part in the optimization of your site, so don't just link to any site. You should adopt a methodology for obtaining useful, quality links. Here are a few tips:

- Keep the majority of your links to sites that have a similar theme to your site. Be sure the site complements yours; it should be one that you would use and recommend.
- Links should be slow and gradual. A sudden burst of links could indicate spam to a search engine.
- The higher the page rank of the page your link is on, the better. This doesn't mean you shouldn't link to pages with a low page rank; however, the higher the page rank, the better it is for you.
- The title of the page your link is on should be something other than 'links.html'. If possible, try to ensure it has some relevance to your site.
- When exchanging links, be sure to use your best keyword in the anchor text.
About the Author: Elizabeth McGee has spent 20 years in the service and support industry. She has moved her expertise to the world wide web helping businesses find trusted Marketing Tools, enhance customer service, build confidence and increase sales. Get her Free Tips newsletter at:
Guide To Getting Linked

Getting links from other websites pointing to your site can sometimes seem very hard and time consuming. There are several ways to get people interested in your site; you've just got to be creative and think in terms of putting yourself in the other webmaster's shoes: figure out what the advantages are for them in being linked to you!

One thing you often hear is that to attain a high PageRank through Google (which, let's face it, is the Big Daddy of search engines) you have to link to other high-PR sites. This is not necessarily true; no one is 100% certain how PR affects a link exchange. Link to as many non-link-farm sites similar to your site as you possibly can, because the more inbound links point to your site, the more the search engines will regard your site as an important one in the business or opportunity you are promoting or selling.

So don't look down on sites that have a low PR; everyone at one point or another has a zero PR. Just because someone has a low PR doesn't mean they don't get monster traffic to their site, whether through pay-per-click advertising or other means, so no one should turn their nose up at anyone who doesn't have a great PR.

If you are a new webmaster who has just started a new website, sell the fact that you're new and that you're going to focus on putting quality CONTENT on your site, because job number one, as far as Google is concerned, is to have relevant content on your site relating to your business or opportunities. Making a business grow and succeed depends largely on the webmaster's ability to make his or her site relevant to the search engine spiders, which look every day for great content.

Another idea is to pitch what you are advertising to a webmaster, showing that the product or opportunity is viable and makes your website worth linking to. Whether through testimonials or by having them sign up as an affiliate to see results for themselves, showing how much you believe in your opportunity can demonstrate to a webmaster that you are committed to making your site everything it can possibly be, thus possibly giving their website more exposure.

Even with the hassles that go along with linking to other sites, it really is a great way to show that your site has importance. Being link partners with others in your online business makes for long-lasting business relationships and support for those who are getting started. There's room for everyone to succeed online, and helping others along the way is as important to the growth of online business as anything else. Here's to your success!
About the Author: Erich Winnecke, Jr. is a webmaster who authors articles and his website is geared for people who are interested in finding an online work at home opportunity or starting Home Based Business