Tuesday, July 29, 2008

Overture PPC Ads by Yahoo




Overture introduced pay-per-click advertising to the world. Overture has since been acquired by Yahoo! and now operates as Yahoo!'s search marketing arm. With the first ever PPC program, Overture showed search engines the way to make money. Today Overture provides paid search listings on many of the top search engines and internet portals, including AlltheWeb, AltaVista, Yahoo!, MSN and many other portals, search engines and web sites. Overture's top-of-page placement, and the fact that you have plenty of room for descriptive copy, gives your ad writer more power to generate click-throughs.

Overture PPC Ads System:


Unlike Google, Overture has defined minimums for every single aspect of advertising. The minimum cost per click is $0.10, and they charge $50 to open an account. It takes about a week to set up and activate an account on Overture, and you must spend a minimum of $20 per month to advertise there.

Ad positioning in Overture is based purely on bid amount. Overture does not consider click-through rate when positioning your ad on the SERP. The fact that Overture ignores click volume when ranking ads creates an opportunity for some advertisers: in theory, some unqualified clicks can be discouraged without causing the ad to be deactivated.
More recently, however, Overture has started to deactivate ads with unusually poor click-through, although its click-through threshold is lower than Google's.

Why go for the Overture PPC program:

A PPC campaign is all about words and keywords; they are the only tool you have to attract your customer. More words are generally better: the more you can explain in the ad, the higher the conversion rate you can expect.

And this is where Overture has an edge over Google. Overture allows titles up to 40 characters and descriptions up to 190 characters, while Google allows 25 characters in the title and 70 characters in the description. The longer copy in an Overture ad lets advertisers qualify their clicks more carefully, which can eventually improve conversion rates from Overture clicks.
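
To see how tight those limits are in practice, here is a tiny Python sketch that checks ad copy against the character limits quoted above (the sample ad text and the function name are made up for illustration):

# Character limits as described above (Overture vs. Google AdWords).
LIMITS = {
    "overture": {"title": 40, "description": 190},
    "google": {"title": 25, "description": 70},
}

def check_ad(network, title, description):
    """Return a list of problems with the ad copy for the given network."""
    limits = LIMITS[network]
    problems = []
    if len(title) > limits["title"]:
        problems.append("title is %d chars, limit is %d" % (len(title), limits["title"]))
    if len(description) > limits["description"]:
        problems.append("description is %d chars, limit is %d" % (len(description), limits["description"]))
    return problems

# Example with made-up ad copy: an empty list means the copy fits.
print(check_ad("google", "Affordable SEO Packages", "Full on-page and off-page optimization from Rs.100k."))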

Overture's network includes Yahoo!, MSN, CNN, AltaVista and InfoSpace, which means you might reach up to 80% of all Internet users. This helps with brand awareness.
However, Overture uses a monthly budget system, which can create a negative impression among advertisers since Google offers daily budgets. On Overture, if your monthly budget is exhausted before the 10th of the month, your ad won't appear for the remaining days, or until you refill your account.

You can place exact bids and know right away what your rank will be. If you want to appear in Yahoo's top three results, you must hold one of the top three bid positions. There is also an auto-bidding feature that lets Overture handle your bids automatically.

Overture offers a keyword search volume reporting tool that helps find keywords that are actually used. Jimtools.com offers a combined volume/price comparison service that is very handy.

Overture PPC System and Tips:

In Google AdWords, ads are added automatically, whereas at Overture human editors review your keywords and your site. You may bid on a search term only if the web site has substantial content that clearly reflects that term. Overture also asks advertisers to rewrite ads that are not receiving a proper click-through rate.
The landing page must also match the query phrase. Overture makes reports available for all of your approved phrases.

They may or may not allow you to bid on misspelled words, and they tend to reject three- or four-word phrases that don't have sufficient search volume. Although they have no minimum click-through requirement, you could end up spending money repeatedly on losing phrases. On the other hand, some phrases may have a very low click-through rate yet still produce good sales. That is in contrast to the Google AdWords program, where you may see good sales from a 0.04% click-through rate and then watch your ad get deleted because it does not meet the minimum click-through rate, or CTR.

Overture will reject superlatives in your titles or descriptions, which tends to make listings read as matter-of-fact. There is no keyword exclusion option, which means your site will come up in irrelevant searches. Titles cannot exceed 40 characters and descriptions cannot exceed 190 characters, and ads display on one line at Yahoo!.

While advertising on Overture you must make sure the site is not framed, as you can't link directly to inner pages. If you bid on several related phrases, your ad might appear multiple times on the same page. Also make your landing page match the promise you made in the advert: if you are offering a free white paper, don't ask visitors to subscribe to your site, or else mention that in the ad. If you don't offer detailed information and easy navigation, your conversion rate will go down.

Thursday, July 24, 2008

Other PPC (Pay Per Click) Advertising Resources



With the growth PPC advertising has seen in the last few years, additional services have appeared to help advertisers simplify the management of different PPC ad programs.
There are other PPC advertising programs available today, such as Business.com, Enhance Interactive, Kanoodle, eSpotting, ePilot and Search123, and Overture has programs for many different countries.

Shopping Search Engines

Shopping search engines have become popular destinations for consumers. Most of them, with the exception of Froogle, charge a price per click-through from their search function. Yahoo's Product Submit charges from 20 cents to 50 cents per click. You'll have to create and upload a list of your products, called a product feed. If you sell products, this is one of the most valuable places to sell.

Shopping.com also has a merchant program, charging anywhere from fifteen cents to a dollar per click, and it offers good exposure to product shoppers. BizRate offers a variation on the fixed bidding scheme of the other shopping search engines by allowing merchants to bid on top rankings.

In Europe, the Kelkoo comparison shopping search engine has a strong market share, which is why it was purchased by Yahoo!. You can bid on top placement for your products and appear across Europe, from Sweden to Italy.

PPC Bid Management

If you use all of the available pay per click advertising programs, it can become a headache to manage your bids without spending all of your time. There are services that can add greater functionality for your AdWords and Overture campaigns, allowing you to do things they don't offer.

Two companies lead the way in PPC bid management programs. Atlas OnePoint provides an all-in-one interface for managing PPC bids. Formerly called GoToast, they offer a free 14-day trial of their service, which includes bid management and campaign optimization.

BidRank is another useful automated pay-per-click search engine management tool that takes the pain out of PPC bid management. BidRank will check and change your PPC bids according to your preferred ranking. You can set target ranks and bid maximums based on time of day and/or day of week: the primary settings are used inside the times you have specified, and the secondary settings take effect outside of those times.
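
The primary/secondary idea is easy to picture in code. The sketch below is not BidRank's actual interface, just a rough Python illustration of switching between two made-up sets of target rank and bid cap by time of day:

from datetime import time

# Hypothetical primary/secondary settings, as described above: the primary
# target rank and bid cap apply inside business hours, the secondary outside.
PRIMARY = {"target_rank": 3, "max_bid": 0.60, "start": time(9, 0), "end": time(18, 0)}
SECONDARY = {"target_rank": 6, "max_bid": 0.35}

def settings_for(now):
    """Pick the bid settings that apply at the given time of day."""
    if PRIMARY["start"] <= now <= PRIMARY["end"]:
        return PRIMARY
    return SECONDARY

print(settings_for(time(11, 30)))   # business hours -> primary settings
print(settings_for(time(23, 0)))    # overnight -> secondary settings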

Other Useful PPC Advertising Resources


Clicktracks offers a PPC tracking and reporting feature for any PPC campaign. It can even help you calculate important e-metrics such as return on cost and return on keyword phrase for each search engine, and you can compare your returns on paid versus organic search engine listings.

Friday, July 18, 2008

Bidding Strategies - The Key to Advertising on Google



By now we know that bidding decides the fate of your PPC campaign, so let's see which aspects should be considered when bidding on a keyword or key phrase. There are several PPC bidding strategies to apply. Each has its merits and, in some cases, may be more effective with one PPC search engine or with one set of terms. A single strategy is not enough: Google and Yahoo! run quite different PPC programs, so the bidding strategies have to differ as well.

The bid amount ultimately depends on how much you are willing to pay per click. If you don't know this value, it is better to stop thinking about PPC advertising until you do. It could be based on an industry rule of thumb or calculated from internal factors such as profit margins.

For example, let's suppose you're bidding on the keyword phrase "search marketing" but do not know your max CPC. One way to estimate a max CPC involves taking the top 5 bids on Overture and computing the average. The current bids are: $0.51, $0.50, $0.33, $0.32, $0.31. The average is 39 cents. Use that as your max CPC to begin with.
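
In code, that starting estimate is just an average. A quick Python sketch using the five bids quoted above:

# Top 5 Overture bids for "search marketing", as quoted above.
top_bids = [0.51, 0.50, 0.33, 0.32, 0.31]

# Use their average as a starting max CPC.
starting_max_cpc = sum(top_bids) / len(top_bids)
print(round(starting_max_cpc, 2))   # 0.39, i.e. about 39 cents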

A reverse calculation is very effective for determining how much you can spend per click: you work out the amount to invest based on the revenue you generate from clicks. Past experience, market understanding and proper research will certainly help you calculate your CPC. Suppose you sell an SEO package for Rs.100k and your profit margin is 20%, leaving Rs.20k of profit on each package. Assume also that your conversion rate will be 1%, so for every 100 visitors from a PPC ad you expect 1 sale. If you have Rs.5k of ad spend to spread over those 100 visitors, you have Rs.50 to spend per click.

You can also decide by working from your overall online marketing budget. If you are willing to spend 10% of revenue on the website, your total ad spend is Rs.10k per sale. With the 1% conversion rate calculated above, that is 100 clicks per sale, so you can spend Rs.100 per click. As your campaign progresses and you determine your actual conversion rate, adjust the CPC accordingly.
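
Both worked examples above boil down to the same reverse calculation, sketched here in Python (the figures are the ones from the text; the function name is my own):

def max_cpc(sale_price, conversion_rate, ad_spend_share):
    """Work backwards from revenue to an affordable cost per click.

    ad_spend_share is the fraction of the sale price you are willing to
    put back into advertising for each sale (Rs.5k and Rs.10k per Rs.100k
    sale in the examples above)."""
    ad_spend_per_sale = sale_price * ad_spend_share
    clicks_per_sale = 1 / conversion_rate      # 1% conversion -> 100 clicks per sale
    return ad_spend_per_sale / clicks_per_sale

print(max_cpc(100000, 0.01, 0.05))   # Rs.50 per click
print(max_cpc(100000, 0.01, 0.10))   # Rs.100 per click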

You need to use different bidding strategies for Overture and Google because their programs are different: Google considers past performance and the click-through rate of the campaign, whereas Overture only considers your bid amount. For Google, use the Overture bids as your starting point in the short term and reduce the bids over the long term if your CTR is high enough.

Bidding for a position other than number one often gives you more CTR for the money, and a higher ROI, since top positions are very expensive. Think of searchers' behaviour: they don't have one specific query in mind, they try the different combinations of keywords they can think of. Bidding also tends to produce big gaps between bid amounts, again because of the race for the top position. Consider a scenario where bidding starts at Rs.10: someone bids Rs.11, someone else Rs.12, but at some point an aggressive bidder will jump to Rs.30 to take the top position without any fear of competition. The gap between Rs.12 and Rs.30 is what you can exploit.
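
If you pull the current bids for a term, spotting such a gap is simple. A rough Python sketch with made-up bid amounts (the Rs.10-Rs.30 scenario above); the threshold is arbitrary:

# Hypothetical competitor bids for one keyword, sorted from highest to lowest.
bids = [30, 12, 11, 10]

# Look for a large gap: bidding just above the lower side of the gap buys a
# good position without joining the race for the expensive top spot.
for higher, lower in zip(bids, bids[1:]):
    if higher - lower >= 5:                      # "big gap" threshold, arbitrary
        print("Bid about", lower + 1, "to sit just above", lower,
              "and well below the", higher, "top bid")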

If you are concerned only about the first position, then initially bid higher to achieve it, which is very easy to do on Overture, and then maintain the top spot by constantly monitoring the bidding. On generic keywords it is very difficult to monitor the campaign constantly because they are popular, high-traffic terms, but you can do it on specific keywords, which are comparatively less competitive and convert better.

If you are bidding on very specific keywords with little competition and low traffic, one option is to simply position the advert and rely on the visitors who do arrive. This strategy can work because someone who searches on such a specific query is keenly looking for that particular information.


Sometimes you bid relative to your direct competitors' offerings and listings. If you find a direct competitor at position 3 and you have a better offering for this particular search query, bid just above that competitor, but not necessarily for the top position, engaging the searcher's attention with a compelling ad. Terms in this category fall into Quadrants 1 and 2 depending on how compelling the offer is once the searcher has landed on your web site. This bidding strategy works well for price- and feature-competitive offerings.

Wednesday, July 16, 2008

What is Bid Management Strategy in Google AdWords?




The most important step before bidding in any PPC program is understanding the market value of your keywords, and the best way to learn it is Overture. Since there are several bids in play for any keyword, take the average of the top bid amounts to estimate the market value of your keywords. If you can afford the market value you derive, use it; otherwise, use your max CPC. That max CPC can be set for an entire ad group or for a specific keyword phrase.

Track the ad carefully for a few days. Assuming the bid is high enough and generates sufficient traffic, you should have a good idea of the CTR within a few days. If the CTR is good (at least 2%), lower the CPC and see where your ad falls in the search results. If the CTR is sufficient, lowering the CPC should not result in your ad dropping many positions.

Then use trial and error with different bids: run a query and observe the position, drop the bid by some amount and check the position again, and keep going until you remain in the top 3-4 or whatever position you want. If your ad's CTR is very good (better than 7%), you will likely be able to cut your CPC in half without a noticeable drop in ranking.
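
That trial-and-error loop can be written down roughly as below. The helper that reports the ad's position is purely hypothetical, a stand-in for manually running the query and noting where the ad appears; it is not a real AdWords call:

def tune_bid(cpc_cents, get_position, desired_position=4, step=5, floor=10):
    """Lower the CPC (in cents) step by step until the ad would fall below the
    desired position.  get_position is a hypothetical stand-in for checking
    the search results by hand."""
    while cpc_cents - step >= floor:
        if get_position(cpc_cents - step) > desired_position:
            break                      # one more drop would cost us the spot
        cpc_cents -= step
    return cpc_cents

# Toy example: pretend the ad holds position 3 at 30 cents or more.
print(tune_bid(50, lambda cents: 3 if cents >= 30 else 6))   # -> 30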

If your ad group has many keyword phrases and there's a divergence in CTR, consider creating multiple ad groups. The more tightly focused your ad group is, the lower your CPC will ultimately become as you weed out poorly performing keyword phrases. Adding negative keywords to each Google ad group will also help increase the CTR and thereby allow you to reduce your CPC.

Do You Know About Performance Metrics?

The main aspect of PPC advertising is not exposure, but clicks and sales conversions. The click-through rate is defined as the percentage of times a paid search ad is clicked on out of the total number of paid search ad views within a given period of time.

Click-through Rate (CTR) = Click-throughs (i.e. Total Visitors) / Impressions

Website conversion is defined as the percentage of users who visit your website and complete your primary objective (i.e. purchased a product) out of the total number of users who visit your website in a given period of time.

Website Conversion (sales conversion) = Sales / Click-throughs (i.e. Total Visitors)
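
Both formulas are trivial to compute; a short Python sketch with made-up numbers:

def click_through_rate(clicks, impressions):
    """CTR = click-throughs / impressions."""
    return clicks / impressions

def website_conversion(sales, clicks):
    """Conversion = sales / click-throughs (total visitors)."""
    return sales / clicks

# Example with made-up numbers: 10,000 impressions, 200 clicks, 4 sales.
print(click_through_rate(200, 10000))   # 0.02 -> a 2% CTR
print(website_conversion(4, 200))       # 0.02 -> a 2% conversion rate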

So what role does each play in understanding the effectiveness of a paid search campaign?

Standard practice among advertisers is to concentrate on writing ads that achieve a high click-through rate to send more visitor traffic to their website. Unfortunately this general assumption, “more traffic equals greater positive results”, is flawed.
Consider this. Which click-through rate is better?

* A 20% click-through rate for a paid search ad that achieves zero sales (0% website conversion).
OR

* A 0.2% click-through rate for a paid search ad that achieves 10 sales (10% website conversion).

The answer is obvious. The click-through rate, especially for newly setup PPC campaigns, is relative – it is the website conversion rate resulting from visitors clicking through a particular paid search ad that defines success or failure.

Successful paid search advertisers take a different approach. They start with the end in mind by asking, “what primary objective do I want a visitor to complete on my website?” and then they work backwards. They identify the type of visitor and buying behavior that will most likely result in a completed action (i.e. sale, registration, etc.)

In addition, they perceive their ads as automated salespeople who “qualify” visitors. Regardless of high or low click-through rates, the focus is on generating a positive return from the advertising dollars spent.

For instance, let’s review two different ads. Ask yourself, which ad best qualifies visitors?

A. Pride Scooters Low prices and huge selection of scooters and other mobility equipment.

B. Pride Scooters From $1850 while stocks last. Houston, Texas, USA.

If you selected B. you are correct.

Ad B. qualifies visitors based on their buying behaviors and customer type most likely to purchase a Pride Scooter from the business’ website.

First, the ad states a price point (i.e. from $1850) to attract visitors seeking the website’s premium product while disqualifying ones seeking discounted or lower-priced scooters. A user researching scooters does not have to click-through the ad to find out a general price range.

Second, the ad targets a geographic region since the majority of people who buy scooters demand an actual test ride. If the company is located in Houston, Texas then users from other locations will not feel compelled to click-through the ad. (Ideally a geographically-targeted PPC campaign like using Google Adwords Regional-targeting works best in this situation).

In essence, ad B.’s goal is to pay “per click” for only visitors most likely to purchase their product. This ad attempts to “filter” unqualified visitors thereby increasing the return on investment per click-through.

Ad A. instead spends money on attracting and generating click-throughs from all visitors and relies on the website to filter qualified versus unqualified ones. This is not a wise economical approach especially if no “visitor exit strategies” are pursued.
Last, successful paid search advertisers rely on testing different ads to determine which appeal generates the best website conversion for a particular keyword. They rely on actual visitor feedback to help them determine which appeals are most effective. Once a positive return is achieved then focus is shifted to increasing the click-through rate for the best converting keywords so more sales can be realized.


So, are you spending money to bring just anybody to your website, or visitors ready to buy from you? Think about it: is your paid search advertising generating positive financial results for your website?


Wednesday, July 02, 2008

Targeting Usage Demographics to Increase Paid Search Conversions

Targeting the campaign to the proper audience is very important for the conversion rate.
Website conversion is when a visitor takes action on your website after clicking through your ad. It is important because it produces financial results for your web business and generates a return on your advertising spend (ROI).

On the internet, and for PPC ads in particular, we can target the audience geographically and demographically. Targeting through the ads themselves is another part. First we should study the demographic profile of each search engine's users.

A user demographic profile explains why visitors choose one search engine over another: it could be functionality, relevance or many other factors that make a user perceive and prefer one engine over the rest. If you research the user demographics of Google and Overture, you can craft your message accordingly and increase your ROI.

To study that, you must know what the Google AdWords and Overture networks consist of. Though there are many sites and search engines in these networks, only a few of them are well known and widely used.

Below are the primary search engine usage demographics to consider when developing your PPC strategy:
1. Gender: Male versus Female

In the search engine world it is often said that 'men are from Google and women are from Yahoo! and MSN', and it seems to be largely true.
A May 2004 study by Hitwise showed that "55% of women prefer MSN Search while a majority of men favor Google Search". Yahoo! Search was split evenly on gender, with a greater focus on people aged 18-34.

A 2004 MarketingSherpa study indicated that MSN's user profile consisted of time-limited, married females who searched less frequently yet performed more e-commerce searches, while Google Search was favored by professional males who performed more news, media, entertainment and education searches with less intent to purchase.

For AOL and Ask Jeeves, AOL is favored by women with less buying intent than MSN Search while Ask Jeeves is preferred by children.

Furthermore, an April 2004 iProspect study uncovered that, “women found paid ads to be more relevant than men did when searching across Google, Yahoo!, MSN and AOL.”
These statistics are startling when considering their influence on your PPC strategy since women represent roughly 75% of major household purchases and as stated in a Women.com study, control 80% of all purchasing decisions.

2. Relevancy: Paid versus Organic Listings

Another usage demographic to consider for your PPC strategy is “perceived relevancy” of paid versus organic listings. Ads perceived as having greater relevancy lead to higher website conversions.

The iProspect study referenced earlier also discovered that “Internet users are more likely to click on an organic search link on Google, and a paid search result on MSN.” Organic listings on Yahoo! were considered 61% more relevant than paid listings while AOL was split 50/50.

3. Age: Young versus Adult versus Seniors

A third usage demographic to review is age. Preferences among the top five search engines are fairly mixed across age groups: Yahoo! is a strong favorite with 18-34 year olds, while MSN and AOL have a stronger preference among the 35-55+ age group. As stated earlier, Ask Jeeves is favored by teens and adolescents, whose buying power within American households is growing, as noted in a recent BusinessWeek research study.

Conclusion:

Google and Overture offer the best PPC programs; always prefer them.
Google has the greatest reach, but the conversion rate on Overture is higher.
Run both programs simultaneously.
Ad copy, keywords and landing page are all equally important.
Consider customer demographics and psychographics while writing copy.
Use relevant qualifiers to get more targeted traffic.
Usage data generated from your website is the best market research.
Use keyword-level tracking systems to determine which PPC search engine generates the most cost-effective and best-converting visitors.

Wednesday, June 25, 2008

Advanced SEO - Dynamic Page Optimization

Dynamic page optimization:

As the Internet user base started to grow, website owners began making their sites more and more attractive and user friendly. The most important thing to keep in mind about a dynamic site is that each webpage is not a separate file; it is created when a user performs some activity.

Let's see what exactly a dynamic site is. Unlike a normal HTML website, where the content of static pages doesn't change unless you actually code the changes into your HTML file (open the file, edit the content, save it, and upload it to the server), a dynamic web page is a template that displays specific information in response to queries. All search engine spiders can index static web pages. A dynamic site is connected to a database, and the response is generated from that database. Such sites are easy for the webmaster to update: since everything is driven by the database, a change in the database is reflected on all pages. That is much simpler than with normal HTML pages, where you need to change the desired content on each and every page.

For the marketer, creating new pages or updating existing ones means either adjusting information in the database or, when it comes to the site's visual presentation, adjusting one or a few template pages. Of course this is how a site should be built, but the problem starts when these beautiful, content-rich sites fail to rank well in search engines.

The problem lies in that very advantage. As noted above, a dynamic page is generated in response to a query. Users send queries through the site's search function, or the queries are already coded into links on the page. But a search engine spider doesn't know how to use your search function, or what questions to ask. Dynamic scripts often need certain information before they can return the page content: cookie data, a session id, or a query string are common requirements. Spiders usually stop indexing a dynamic site because they can't answer those questions.

Search engines believe in content, not in flashy elements on your web site. Search engine crawlers are programmed to read text only; they ignore flashy elements such as pictures, frames and video, treat them as empty space and move on. Some search engines may not even be able to locate a dynamic page easily. But if we make web sites only search engine friendly and not user friendly, we will most likely end up losing visitors. This presents a big problem for marketers who have done very well with their rankings using static pages but who wish to switch to a dynamic site.

This is why SEOs came up with advanced techniques to optimize dynamic pages. Here are a few methods you can use.

Methods to make search engine spider Dynamic Pages:

1. Use of software – Various software tools are available that will remove the "?" in the query string and replace it with "/", thereby allowing search engine spiders to index the dynamic content.

Example -
http://www.my-online-store.com/books.asp?id=1190 will change to
http://www.my-online-store.com/books/1190.

The latter, being a static-looking URL, can easily be indexed by search engine spiders.
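
The transformation itself is just string work. Here is a hedged Python sketch of the books.asp example above; on a real server this job is done by a rewrite module or script, not by application code like this:

import re

def rewrite_url(url):
    """Turn /books.asp?id=1190 into /books/1190, as in the example above."""
    return re.sub(r'/(\w+)\.asp\?id=(\d+)', r'/\1/\2', url)

print(rewrite_url("http://www.my-online-store.com/books.asp?id=1190"))
# -> http://www.my-online-store.com/books/1190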

2. Use of CGI/Perl scripts - One of the easiest ways to get your dynamic site indexed by search engines is to use CGI/Perl scripts. Path_Info or Script_Name is a variable in a dynamic application that contains the complete URL address (including the query string information). To fix the problem, you need to write a script that pulls out all the information before the query string and sets the rest of the information equal to a variable. You can then use this variable in your URL address.

Example - http://www.my-online-store.com/books.asp?id=1190

When you are using CGI/Perl scripts, the query part of the dynamic URL is assigned a variable.

So, in the above example, "?id=1190" is assigned a variable, say "A". The dynamic URL http://www.my-online-store.com/books.asp?id=1190
will change to http://www.my-online-store.com/books/A through the CGI/Perl script, and the latter can easily be indexed by the search engines.

3. Re-configuring your web servers -

(i) Apache Server - Apache has a rewrite module (mod_rewrite) that enables you to turn URLs containing query strings into URLs that search engines can index. This module however, isn't installed with Apache software by default, so you need to check with your web hosting company for installation.

(ii) Cold Fusion - You'll need to reconfigure Cold Fusion on your server so that the "?" in a query string is replaced with a '/' and the value is passed along in the URL.

4. Creation of a Static Page linked to an array of dynamic Pages -

This approach is very effective, especially if you are the owner of a small online store selling a few products online. Just create a static page linking to all your dynamic pages. Optimize this static page for search engine rankings. Include a link title for all the product categories, place appropriate "alt" tag for the product images along with product description containing highly popular keywords relevant to your business (You can conduct keyword research for your site through http://www.wordtracker.com ). Submit this static page along with all the dynamic pages in various search engines, conforming to the search engine submission guidelines.

Technical methods for dynamic pages of any site

There are a few technical aspects that need to be considered when optimizing dynamic websites.

Let's start with .htaccess and mod_rewrite. These are the two concepts you will have to master to understand how to cloak search-engine-unfriendly URLs. Keep in mind that these two components live on the Apache server; for IIS servers there are equivalents, as can be seen later in this article.

So starting from the basics

.htaccess File:

An .htaccess file is just a plain text file. It has one directive per line, like this:
RewriteEngine on

The "RewriteEngine" portion is the directive and "on" is a parameter that describes what "RewriteEngine" should do


The .htaccess file usually lives in the root directory of a site and allows each site to configure how Apache delivers its content. Its directives apply to the entire site, but subdirectories can contain their own .htaccess files, each applying to that subdirectory and everything below it. You could have a different .htaccess in every subdirectory and make each one behave a little differently.

Mod_rewrite:

Mod_rewrite is a redirect directive applied to the requested object on an Apache server. Its typical format looks like this:

Options +FollowSymLinks
RewriteEngine on
RewriteRule ^url1\.html$ url2.html [R=301,L]

Let's look at this a little more closely. The first directive instructs Apache to follow symbolic links within the site. Symbolic links are "abbreviated nicknames" for things within the site and are usually disabled by default. Since mod_rewrite relies on them, we must turn them on.

The "RewriteEngine on" directive does exactly what it says. Mod_rewrite is normally disabled by default and this directive enables the processing of subsequent mod_rewrite directive.

In this example, we have a caret at the beginning of the pattern and a dollar sign at the end. These are regex (regular expression) special characters called anchors. The caret tells regex to begin looking for a match with the character that immediately follows it, in this case a "u". The dollar sign anchor tells regex that this is the end of the string we want to match.

In this simple example, "url1\.html" and "^url1\.html$" are interchangeable expressions and match the same string. However, "url1\.html" matches any string containing "url1.html" anywhere in the URL ("aurl1.html", for example), whereas "^url1\.html$" matches only a string that is exactly equal to "url1.html". In more complex redirects, anchors (and other special regex characters) are often essential.
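
You can see the same anchor behaviour with any regex engine; here is a quick check using Python's re module (mod_rewrite's pattern syntax behaves the same way for this example):

import re

pattern_loose    = r'url1\.html'      # matches anywhere in the string
pattern_anchored = r'^url1\.html$'    # matches only the exact string

print(bool(re.search(pattern_loose, "url1.html")))      # True
print(bool(re.search(pattern_loose, "aurl1.html")))     # True  (substring match)
print(bool(re.search(pattern_anchored, "url1.html")))   # True
print(bool(re.search(pattern_anchored, "aurl1.html")))  # False (anchors reject it)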

Once the requested URL matches the pattern, the rule tells Apache to serve 'url2.html' in its place.

In our example, we also have an "[R=301,L]". These are called flags in mod_rewrite and they're optional parameters. "R=301" instructs Apache to return a 301 status code with the delivered page and, when not included as in [R,L], defaults to 302. Unlike mod_alias, mod_rewrite can return any status code that is specified in the 300-400 range and it REQUIRES the square brackets surrounding the flag, as in this example.


The "L" flag tells Apache that this is the last rule that it needs to process. It's not required in this simple example but, as the rules grow in complexity, it will become very useful.

The Apache docs for mod_rewrite are at http://httpd.apache.org/docs/mod/mod_rewrite.html

& some examples can be found at

http://httpd.apache.org/docs/misc/rewriteguide.html .


Now suppose we rename or delete url1.html and then request it again: mod_rewrite can redirect from the non-existent URL (url1.html) to an existing one. This is essentially how we cloak dynamic pages. The first URL can be the dynamic page that we want replaced by the static-looking 'url2.html'. That is how cloaking works on the Apache server; there are other methods available, but this remains the most popular and reliable.

IIS Server Redirects:

As long as one uses one of the mod_rewrite cousins for IIS (IISRewrite, ISAPI_Rewrite), the method is mostly the same for IIS as it is for Apache. However, where the rules are inserted depends on which software is being used (obviously not into httpd.conf or .htaccess). The rule generation remains pretty much the same either way.

The most used product in this category is ISAPI_Rewrite. For more information consult http://www.isapirewrite.com/ . The site has a free download version of the code and a paid version for USD 69.

For IIS rewrite functionality, Qwerksoft's IISRewrite remains the most popular alternative (http://www.qwerksoft.com/products/iisrewrite/). Again, a basic free download or a USD 99 purchase option exists.

However, user experience suggests that the ISAPI_Rewrite product outperforms the others thanks to its ease of configuration and a bunch of other little extras. One of the biggest benefits of ISAPI_Rewrite is that you don't have to restart IIS each time you make a change to the .ini file. In other words, once ISAPI_Rewrite is installed, you can keep the .ini file in the root folder so changes can be made as you go along, if necessary, without a restart.

These products also support shared hosting, so the hosting provider can be convinced to buy and install them. Some other products in this category are:

http://www.pstruh.cz/help/urlrepl/library.htm (free ISAPI filter)

http://www.motobit.com/help/url-replacer-rewriter/iis-mod-rewrite.asp

Also if you are using .NET platform, this works for free:

http://msdn.microsoft.com/msdnmag/issues/02/08/HTTPFilters/

Sunday, June 22, 2008

Dynamic URL Rewrites:

Dynamic pages are roadblocks to high search engine positioning, especially those that end in "?" or "&". In a dynamic site, variables are passed to the URL and the page is generated dynamically, often from information stored in a database, as is the case with many e-commerce sites. Normal .html pages are static: they are hard-coded, their information does not change, and there are no "?" or "&" characters in the URL.


URL rewrites are programming techniques that allow the returned URL to be more search engine friendly by removing the question mark (?) and ampersand (&) from the returned URL found in the location or address bar. This enables the search engines to index the page without having variables or session id's interlaced into the URL.

Pages with dynamic URLs are present in several engines, notably Google and AltaVista, even though publicly AltaVista claims their spider does not crawl dynamic URLs. To a spider a "?" represents a sea of endless possibilities - some pages can automatically generate a potentially massive number of URLs, trapping the spider in a virtually infinite loop.

As a general rule, search engines will not properly index documents that:

  • contain a "?" or "&"

  • End in the following document types: .cfm, .asp, .shtml, .php, .stm, .jsp, .cgi, .pl

  • Could potentially generate a large number of URLs.

In these cases, where the page must stay dynamic, it is possible to clean up the query strings. URL rewriting generally cleans up the '?', '&' and '+' symbols in URLs, replacing them with more user-friendly characters. Check out the following URL: http://www.yourdomain.com/shop.php?cat_id=1&item_id=2

This dynamic URL can be converted into: http://www.yourdomain.com/shoppinglist/apparels/shirts

This makes the page look static when in fact it is dynamic. URL rewriting needs some serious strategy and planning. A few rule-based tools are available for it; the best known are mod_rewrite for Apache and ISAPI_Rewrite for IIS. Mod_rewrite can be used to solve all sorts of URL-based problems and provides all the functions you need to manipulate URLs, but because of its complex rule-based matching engine it is hard to learn. Once you understand the basic idea, however, you can master all of its features. ISAPI_Rewrite is a powerful URL manipulation engine based on regular expressions. It acts much like Apache's mod_rewrite but is designed specifically for Microsoft Internet Information Server (IIS). ISAPI_Rewrite is an ISAPI filter written in pure C/C++, so it is extremely fast, and it gives you the freedom to go beyond standard URL schemes and develop your own.


There are two types of URL rewrites. Both make URLs search engine friendly, but the advanced rewrite goes a step further by working keywords into the URL.


URL Without Rewriting

http://www.yourdomain.com/shop.php?cat_id=1&item_id=2

The above URL indicates to the database that the returned information should be from the category with id equal to 1 and the item id equal to 2. This works fine for the system because it understands the variables. Many search engines however do not understand this form of URL.


Simple URL Rewrite

http://www.yourdomain.com/shop/1/2.html

A simple URL rewrite takes the URL and modifies it so that it appears without the question mark (?) and ampersand (&). This enables all search engines to index all of your pages, but it still falls short in some important areas.


Advanced URL Rewrite

http://www.yourdomain.com/oranges/mandarin_oranges.html

The advanced URL rewrite enables your URLs to include your keywords. The URL is another place search engines look for important information about your pages, and being able to include keywords in it helps elevate your page toward the top of the search engine result pages.
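
One way to picture an advanced rewrite is a lookup from database ids to keyword slugs. The Python sketch below is only an illustration with hypothetical tables, not how any particular shopping cart actually does it:

# Hypothetical lookup tables mapping database ids to keyword-rich slugs.
CATEGORY_SLUGS = {1: "oranges"}
ITEM_SLUGS = {2: "mandarin_oranges"}

def keyword_url(cat_id, item_id):
    """Map shop.php?cat_id=1&item_id=2 onto /oranges/mandarin_oranges.html."""
    return "/%s/%s.html" % (CATEGORY_SLUGS[cat_id], ITEM_SLUGS[item_id])

print(keyword_url(1, 2))   # -> /oranges/mandarin_oranges.html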

URLs can be cleaned server-side using a web server extension that implements content negotiation, such as mod_negotiation for Apache or PageXchanger for IIS. However, getting a filter that can do the content negotiation is only half of the job. The underlying URLs present in HTML or other files must have their file extensions removed in order to realize the abstraction and security benefits of content negotiation. Removing the file extensions in source code is easy enough using search and replace in a web editor like Dreamweaver MX or HomeSite. Some tools like w3compiler also are being developed to improve page preparation for negotiation and transmission. One word of assurance: don't jump to the conclusion that your files won't be named page.html anymore. Remember that, on your server, the precious extensions are safe and sound. Content negotiation only means that the extensions disappear from source code, markup, and typed URLs.

To avoid complications, consider creating static pages whenever possible, perhaps using the database to update the pages, not to generate them on the fly.

Saturday, June 21, 2008

What is the difference between Cloaking and Doorway Pages?

Cloaking:

As search engine optimization evolved and search engines became more and more intelligent, webmasters came up with many techniques to rank their sites. Cloaking is one of them. It is difficult and time consuming to make a web site both user friendly and search engine friendly, so webmasters came up with the idea of cloaking: delivering one page to the search engine for indexing while serving an entirely different page to everyone else. Cloaking is the process of serving different versions of a page based upon identifiable information about the user; often, pages are chosen based upon agent name and/or IP address (ISP host).

There is no clear consensus on whether cloaking is ethical or unethical, but it is tricking spiders, and any attempt to trick a search engine is considered spam. Hence the technique is not regularly practiced. A simple way to see whether a web page is using cloaking is to look at the cache: Google shows a Cached link next to almost every search result, and the cache shows the page as the search engine indexed it. If the page you see in the SERPs differs from the cached version, there is a possibility that the website is using cloaking.

As we all know, people want to make web sites user centric. They want their site to be beautiful, attractive and interactive enough to engage visitors. Certainly this enhances the user experience, but it does not serve the optimization purpose, so to optimize such a site webmasters use cloaking. A few of the factors that push a webmaster toward cloaking are explained below.

Use of flash/splash/ Videos:

The plain-HTML days are gone and Flash is in. Many sites are built entirely in Flash, which is a complete no-no for search engines: no plain text, just Flash. The common solution is to create simple HTML text documents for search engines and Flash pages for visitors. Google has only recently started to index Flash, and the rest of the search engines don't.

Websites containing Images:

Many sites are full of pictures and images, often with large image galleries, so the proportion of images far outweighs the text. Obviously there is little chance such sites will rank high on the SERPs, so cloaking is the first thing that comes to mind for optimizing these pages.

HTML Coding:

Often there is far more HTML code than text, which again does not suit search engine optimization: there has to be a substantial amount of text relative to the HTML. In this case, rather than recoding the entire website, many webmasters see cloaking as the best option.



Now that you know why, it's time to find out how. Cloaking is done by modifying a file called .htaccess. Apache has a module called "mod_rewrite", and with the help of this module in the .htaccess file you can apply a cloaking technique to your web pages.

Webmasters gather search engines' IP addresses (e.g. 203.0.113.45) or user agents (e.g. Googlebot). If the mod_rewrite module detects that an IP address or user agent belongs to a search engine, it delivers a web page that is specially designed for search engines. If the IP doesn't belong to any spider, it assumes a regular visitor and delivers the normal web page.
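
The decision logic behind this is simple to sketch. The Python below only shows the detection step, with placeholder spider names and documentation-range IP addresses; real setups do this with mod_rewrite rules in .htaccess as described above:

KNOWN_SPIDER_AGENTS = ("Googlebot", "Slurp", "msnbot")       # illustrative names
KNOWN_SPIDER_IPS = {"203.0.113.45"}                          # placeholder address

def page_for(user_agent, ip):
    """Return which version of the page to serve - the detection logic only."""
    if ip in KNOWN_SPIDER_IPS or any(bot in user_agent for bot in KNOWN_SPIDER_AGENTS):
        return "spider_version.html"
    return "visitor_version.html"

print(page_for("Mozilla/5.0 (compatible; Googlebot/2.1)", "203.0.113.10"))  # spider_version.html
print(page_for("Mozilla/5.0 (Windows NT 6.1)", "198.51.100.7"))             # visitor_version.html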

There are 5 types of cloaking:
User Agent Cloaking (UA Cloaking)
IP Agent Cloaking (IP Cloaking)
IP and User Agent Cloaking (IPUA Cloaking).
Referral based cloaking.
Session based cloaking.

All five have unique applications and purposes, yet all five can fit nicely within one program.

User agent cloaking is good for taking care of specific agents:
WAP/WML pages for the cell phone crowd.
ActiveX for the IE crowd.
Quality CSS for the Mozilla and Opera crowd.
A nice black screen for the WebTV crowd.
Specialty content for particular agents (e.g. NoSmartTags, Googlebot noarchive).
There is no sense in sending out JS, Java or Flash that a user can't actually run.

IP address cloaking is good for taking care of demographic groups:
Language selection for various countries.
Advertising delivery based on geo data.
Pages built for broadband users.
Low-impact pages for overseas users.
Time-of-day determination and custom content based on that plus geo data (news, sports, weather, etc.).
Specific targeting of demographic groups such as AOL, MindSpring, et al.

IP and agent cloaking is good for a combination of the above:
Custom content for AOL users on WAP phones.
Ads based upon geo data and user agent support.
The possibilities for targeting are almost endless; you'll run out of ways to use it before you run out of IPs and agents to serve.
Indexability is another benefit: just getting your site fully indexed can be a challenge in some environments (Flash, Shockwave).

Referrer-based cloaking bases delivery on specific referral strings. It is good for things like breaking out of other sites' frames (about.com, Ask Jeeves, and the Google image cache) and preventing unwanted hotlinking of your graphics.

Session-based cloaking: sites that use session tracking (from an IP or cookies) can do incredible things with content. We've all seen session cloaking in action on dynamic sites where custom content was generated for us.

The internet has just scratched the surface here. Cloaking is the gatekeeper that serves your site in its best light and protects your custom code from prying eyes.

Search engine cloaking is just one aspect of a much bigger picture. This is why search engines can't even consider banning cloaking. It is so widespread and pervasive, they'd have to delete 1/4th of the domains in their indexes - those would be the best sites they have listed.

Any time you hear a search engine talking about banning cloaking, listen very closely and remember: if they'd tell a bold-faced lie about something so pervasive, what are they doing with the really important stuff? They can't be trusted, nor can those out here carrying their water.

With the assault of rogue spiders most sites are under, the growing trend of framing, and agents that threaten your hrefs (SmartTags), I think cloaking has a very bright future. The majority of the top 2000 sites on the net use some form of the above styles of cloaking (including ALL the major search engines).


Doorway Pages:

Just like cloaked pages, doorway pages are created especially for search engines; the difference is that these are 'gateway' or 'bridge' pages, built to do well for particular phrases and programmed to be visible only to specific search engine spiders. They are also known as portal pages, jump pages, gateway pages and entry pages. Doorway pages are built specifically to draw search engine visitors to your web site; they are standalone pages designed only to act as doorways to your site. Doorway pages are a very bad idea for several reasons, though many SEO firms use them routinely.

Doorway pages have acquired something of a bad reputation due to the frequent use (and abuse) of doorways in spamming the search engines. The most flagrant abuses include mass production of machine-generated pages with only minor variations, sometimes using re-direction or cloaking so the visitor does not see the actual page requested. Doorways used in this manner add to the clutter that search engines and Web searchers must contend with.

The purpose behind building doorway pages is simply to trick search engines into higher rankings, so doorway pages are considered an unethical SEO practice. The fact is that doorway pages don't do a very good job of generating traffic, even when they are done by "experts." Many users simply hit their back buttons when presented with a doorway page. Still, many SEO firms count those first visits and report them to their clients as successes, even though very few of those visitors ever go on to the product pages.

There are various ways to deliver doorway pages. Let's check them one by one.

Low Tech Delivery:

When webmasters create and submit a page targeted toward a particular phrase, it is called low tech delivery. Sometimes webmasters create such pages for specific search engines as well. The problem is that the user doesn't arrive at the desired page, and a visitor who lands on a non-informative page is unlikely to navigate any further.

In such cases the 'meta refresh tag' plays a vital role. It is an HTML tag that automatically refreshes the page after a defined time; the meta refresh used here has a zero-second delay, so the user most likely won't see the optimized content before being sent elsewhere. These META tags are also a red flag to search engines that something may be wrong with the page. Because jump pages manipulate results and clutter indexes with redundant text, they are banned by search engines.

Nowadays search engines don't accept meta refresh tags. To get around that, some webmasters submit a page and then swap it on the server with the "real" page once a position has been achieved.

This is "code-swapping," which is also sometimes done to keep others from learning exactly how the page ranked well. It's also called "bait-and-switch." The downside is that a search engine may revisit at any time, and if it indexes the "real" page, the position may drop.


There is another problem with these pages: because they are targeted at key phrases, they can be very generic in nature, so they are easily copied and used on other sites. And since they are copied, the fear of being banned is always there.

Agent Delivery:

The next step up is to deliver a doorway page that only the search engine sees. Each search engine reports an "agent" name, just as each browser does. An agent is a browser or any other piece of software that can approach web servers and browse their content, for example Microsoft Internet Explorer, Netscape or a search engine spider.

The advantage to agent name delivery is that you can send the search engine to a tailored page yet direct users to the actual content you want them to see. This eliminates the entire "bridge" problem altogether. It also has the added benefit of "cloaking" your code from prying eyes.

But there is still a problem: someone can telnet to your web server and report their agent name as being from a particular search engine, and then they see exactly what you are delivering. Additionally, some search engines may not always report the exact same agent name, specifically to help keep people honest.

IP Delivery / Page Cloaking:

Time for one more step up. Instead of delivering by agent name, you can also deliver pages to the search engines by IP address, assuming you've compiled a list of them and maintain it. IP delivery is a technique for presenting different content depending on the IP address of the client.

Everyone and everything that accesses a site reports an IP address, which is often resolved into a host name. For example, I might come into a site while connected to AOL, which in turn reports an IP of 199.204.222.123. The web server may resolve the IP address into an address: ww-tb03.proxy.aol.com, for example.

Friday, June 20, 2008

What is Search Engine Spam?

Search engine spamming is the unethical practice of optimizing a site to rank high on the SERPs by tricking the search engines, using tactics such as repetitive keywords, hidden text and hidden links. All the search engines penalize websites that use spam. Since time immemorial --or at least since the Internet first began-- webmasters have been using these stratagems to dupe search engines into giving irrelevant pages high placement.

Each search engine's objective is to produce the most relevant results to its visitors. Producing the most relevant results for any particular search query is the determining factor of being a popular search engine. Every search engine measures relevancy according to its own algorithm, thereby producing a different set of results. Search engine spam occurs if anybody tries to artificially influence a search engine's basis of calculating relevancy.

Each of the major search engines provides specific guidelines describing what webmasters should and should not do to their web pages in order to achieve a better search engine ranking, though that has not always been the case.

There are sixteen tactics overall that are considered search engine spam. These techniques are:

*Keywords unrelated to site
*Redirects
*Keyword stuffing
*Mirror/duplicate content
*Tiny Text
*Doorway pages
*Link Farms
*Cloaking
*Keyword stacking
*Gibberish
*Hidden text
*Domain Spam
*Hidden links
*Mini/micro-sites
*Page Swapping (bait & switch)
*Typo spam and cyber squatting

Not to be confused with the canned, processed meat, spam is the use of redundant or unethical techniques to improve search engine placement. Fortunately or unfortunately --depending on your point of view-- search engines are quickly catching on. Some won't index pages believed to contain spam; others will still index, but will rank the pages lower, while others still will ban a site altogether. Of course, not all search engines take a hard-line on spam. Tricks that are perfectly acceptable on one search engine may be considered spam by another.

Thursday, June 19, 2008

Spamming Techniques Overviews

Invisible Text: Hiding keywords by using the same color font and background is one of the oldest tricks in the spammers' book. These days, it's also one of the most easily detected by search engines.

Keyword Stuffing: Repeating keywords over and over again, usually at the bottom of the page ("tailing"), in tiny font or within meta tags or other hidden tags.
Unrelated Keywords: Never use popular keywords that do not apply to your site's content. You might be able to trick a few people searching for such words into clicking on your link, but they will quickly leave your site when they see you have no information on the topic they were originally searching for. If you have a site about medical science and your keywords include "Shahrukh Khan" and "Britney Spears", those would be considered unrelated keywords.

Hidden Tags: The use of keywords in hidden HTML tags like comment tags, style tags, http-equiv tags, hidden value tags, alt tags, font tags, author tags, option tags, noframes tags (on sites not using frames).

Duplicate Sites: Content duplication is also considered search engine spamming. Sometimes people copy the content and simply name the site differently, but search engines can detect this easily and mark it as spam. Don't duplicate a web page or doorway page, give the copies different names, and submit them all; mirror pages are regarded as spam by all search engines and directories.


Link Farms: Link farm is a network of pages on one or more Web sites, heavily cross-linked with each other, with the sole intention of improving the search engine ranking of those pages and sites.

Many search engines consider the use of link farms or reciprocal link generators as spam. Several search engines are known to kick out sites that participate in any link exchange program that artificially boosts link popularity.

Links can be used to deliver both types of search engine spam, i.e. both content spam and meta spam.

Link content spam

When a link exists on a page A to page B only to affect the hub component of page A or the authority component of page B, that is an example of content spam on page A. Page B is not spamming at all. Page A should receive a spam penalty. Without further evidence, page B should not receive a penalty.

Link meta spam
When the anchor text or title text of a link either mis-describes the link target, or describes the link target using incoherent language, that is an example of link meta spam.

Repetitive Submitting: Each search engine has its own limits on how many pages can be submitted and how often. Do not submit the same page more than once a month to the same search engine, and don't submit too many pages each day. Never submit doorways to directories.

Redirects: Do not list sites using URL redirects such as welcome.to, i.am and go.to. The complete site should be hosted on the same domain as the entry page; an exception may be made for sites that include a remotely hosted chat or message board, as long as the bulk of the site is hosted on its own domain. Redirects were not developed for spam, but they have become a popular spamming technique.

There are many ways of redirecting from one web page to another, for example HTTP 300 series redirect response codes, HTTP 400 series error vectors, META REFRESH tags and JavaScript redirects. As noted earlier, these are used to move a visitor from one page to another instantly. In this case the page made for the search engine is spam: everything on it is an example of either content spam or meta spam.

Tiny Text and Alt Text Spamming: Tiny text consists of placing keywords and phrases in the tiniest text imaginable all over your site; most people can't see them, but spiders can. Alt text spamming is stuffing the alt text tags (for images) with unrelated keywords or phrases.

Doorway Pages: Doorways are pages optimized only for search engine spiders in order to attract more spiders, thus more users. Usually optimized for just one word or phrase and only meant for spiders, not users.

Content Spam: Different URLs can deliver the same content (content duplication), and the same URL can deliver different content. Both HTML and HTTP support this, and hence spamming is possible. For example, IMG support and ALT text within HTML mean that image-enabled visitors to a URL will see different content than visitors who, for various reasons, cannot view images. Whether the ability to deliver spam results in the delivery of spam is largely a matter of knowledge and ethics.

Agent-based Spam: Agent-based delivery is not spam in itself. It becomes spam when it is used to identify search engine robots by user agent and deliver unique content to those robots. Since that content is created only for search engines and is not visible to users, it is always spam.

IP Spam: Identifying search engine robots by host name or IP address and delivering unique content to those robots is also considered spamming. As with agent based spam, the technique becomes spam when unique content is delivered only to search engines and never to ordinary visitors.

No Content: If a site does not contain any unique and relevant content to offer visitors, search engines can consider it spam. On that note, illegal content, duplicate content and sites consisting largely of affiliate links are also considered to be of low value to search engine relevancy.

Meta Spam: Meta data is data that describes a resource. Meta spam is data that mis-describes a resource or describes a resource incoherently in order to manipulate a search engine's relevancy calculations.

Think again about the ALT tag. Not only does it provide content for an HTML resource, it also provides a description of an image resource. In this descriptive capacity, to mis-describe an image or to describe it incoherently is meta spam. Perhaps the best examples of meta spam at present can be found in the <head> section of HTML pages. Remember, though, it is only spam if it is done purely for search engine relevancy gain.

Meta spam is more abstract than content spam. Rather than discuss it in abstract terms, we will take some examples from HTML and XML/RDF in order to illustrate meta spam and where it differs from and crosses with content spam.

Generally, anything within the <head> section of an HTML document, or anything within the <body> section that describes another resource, can be subverted to deliver meta spam.
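For instance, a <head> section like the following sketch (the terms are invented), written purely for relevancy gain, is meta spam because the title and META tags mis-describe the page:

  <head>
    <title>Widgets, Widgets, Widgets - Cheap Widgets, Free Widgets, Best Widgets</title>
    <!-- Description and keywords repeat unrelated hot search terms instead of describing the page -->
    <meta name="description" content="cheap flights, free downloads, loans, widgets widgets widgets">
    <meta name="keywords" content="cheap flights, free downloads, loans, celebrity news, widgets">
  </head>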


To make sure that you are not spamming, you need to check a few things. First and foremost, ask whether your content is genuinely valuable to your customers and visitors. Any trick used purely to attract more visitors will help you only for a short time, if at all. Try to build websites according to users' tastes and preferences. Always remember that Internet users are information seekers and want fresh content all the time, so think and build your site as if there were no search engines. Avoid automated pages: Google and many other search engines do not index auto-generated pages.

Inktomi does accept information pages into their free index and into their paid inclusion programs. For example, if a site contains PDF documents, and you create an information page in HTML with an abstract of each PDF document, that HTML page is acceptable to Inktomi.

How to report Search Engine Spam:

Since spamming practices are constantly evolving, it is important to know what the major search engines specifically say about spam and what practices are definitely not allowed if you would like to rank in top-tier search engines. Plus, every ethical SEO should know how to properly report any spam that they see so the search engines can correct their algorithm accordingly.
How Google Defines Spam

As part of their Webmaster Guidelines, Google outlines techniques to use to help Google locate, index and rank your website. They also specifically state that the following techniques may lead them to remove your site from the Google index:
Hidden text or hidden links.
Cloaking or sneaky redirects.
Automated queries to Google.
Pages loaded with irrelevant keywords.
Multiple pages, subdomains, or domains with substantially duplicate content.
"Doorway" pages created just for search engines, or other "cookie cutter" approaches such as affiliate programs with little or no original content.

However, you should keep in mind that these aren't the only practices Google disapproves of. Generally, Google doesn't like their results manipulated by deceptive practices. Their recommendation for webmasters is:

Webmasters who spend their energies upholding the spirit of the basic principles listed above will provide a much better user experience and subsequently enjoy better ranking than those who spend their time looking for loopholes they can exploit.

To combat common search engine spam practices employed by rogue SEOs, Google has also posted a list of practices that should raise a red flag when you are looking for a search engine optimizer. According to Google, feel free to walk away from an SEO who:

owns shadow domains
puts links to their other clients on doorway pages
offers to sell keywords in the address bar
doesn't distinguish between actual search results and ads that appear in search results
guarantees ranking, but only on obscure, long keyword phrases you would get anyway
operates with multiple aliases or falsified WHOIS info
gets traffic from "fake" search engines, spyware, or scumware
has had domains removed from Google's index or is not itself listed in Google
How to Report Spam to Google

Google has a form that allows you to report spam, or you can e-mail Google at spamreport@google.com. Note that Google rarely removes websites from the engine manually. Instead, it tweaks the search engine algorithm and its spam detection software to try to eliminate the spam technique that is clogging up the engines.

How Yahoo! Defines Spam

NOTE: Altavista, All the Web and Inktomi are all owned by Yahoo!, so the Yahoo! spam policies and webmaster guidelines also apply to these search engines.

According to Yahoo!, search engine spam is webpages “that are considered unwanted and appear in search results with the intent to deceive or attract clicks, with little regard for relevance or overall quality of the user experience.” Officially, Yahoo! does not want to index sites with:

Text that is hidden from the user
Misuse of competitor names/products
Pages that have substantially the same content as other pages
Multiple sites offering the same content
Pages in great quantity, which are automatically generated or of little value
Pages dedicated to redirecting the user to another page
Pages that give the search engine different content than what the end-user sees
Pages built primarily for search engines
Pages that use excessive pop-ups, interfering with user navigation
Pages that use methods to artificially inflate search engine ranking
Sites with numerous, unnecessary virtual hostnames
Excessive cross-linking with sites to inflate a site's apparent popularity
Pages that harm the accuracy, diversity, or relevance of search results
Pages that seem deceptive, fraudulent, or provide a poor user experience
How to Report Spam to Yahoo!

If you find a site that is spamming in Yahoo!, you can report the spam through a form on their website.

NOTE: In addition to reporting spam, you can also report copyright violations to Yahoo!. To request that they remove any content published in violation of copyright protection, e-mail them at copyright@yahoo-inc.com.

How Teoma / Ask Jeeves Defines Spam
One of the most definitive sources of the Teoma / Ask Jeeves spam policy is on their Site Submission Terms page. Among the techniques that will keep you from being ranked are:
Having deceptive text
Having duplicate content
Having metadata that does not accurately describe the content of a web page
Including off-topic or excessive keywords
Fabricating pages to lead users to other web pages
Showing different content than the spidered pages to users
Using intentionally misleading links
Using self linking referencing patterns
Misusing affiliate or referral programs

How to Report Spam to Teoma / Ask Jeeves
To report search engine spam to Ask Jeeves or Teoma, e-mail them at jeeves@askjeeves.com.
How MSN Defines Spam
MSN Search has recently added content guidelines to their website, explicitly stating that the MSNBot will see the following techniques as search engine spam:
Stuffing pages with irrelevant keywords in order to increase a page’s keyword density, including ALT tag stuffing.
Using hidden text or links.
Using techniques such as creating link farms to artificially increase the number of links to your page.

Also, in an e-mail announcing the second preview release of the new MSN Search, Microsoft mentioned cloaking and hosting duplicate content on multiple domains as practices that will lead to your site being penalized or removed from the MSN Search index.
How to Report Spam to MSN
To report search engine spam to MSN, use the form on their website.
Have you seen any search engine spam lately? Instead of submitting spam reports to each engine, you can also simply submit a spam report through SEOToolSet™.

Even those who are spamming right now and think they are getting away with it should keep one thing in mind: when competitors check out your site (and they do), they will see that it is spam and they may choose to report you. Once you have been reported to a search engine, you are likely to be penalized in its results for using that spam technique.

Tuesday, June 17, 2008

Advanced SEO For Frames Site


HTML frames allow authors to present documents in multiple views, which may be independent windows or subwindows. Multiple views offer designers a way to keep certain information visible, while other views are scrolled or replaced. For example, within the same window, one frame might display a static banner, a second a navigation menu, and a third the main document that can be scrolled through or replaced by navigating in the second frame.


The layout of a framed page could be like the example below.
A framed page like the one in the example is actually made up of four separate pages: a frameset page and three content pages. The frameset page tells the browser how big each frame should be, where it should be placed and which page should be loaded into it. If the browser or search engine can't display frames, or is configured not to, it will render the contents of the NOFRAMES element.
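Here is a minimal sketch of such a frameset page, with a banner frame across the top, a navigation menu on the left and a main content area on the right; the file names are invented:

  <html>
  <head>
  <title>Example Widgets - Home</title>
  </head>
  <frameset rows="15%,85%">
    <!-- Static banner across the top -->
    <frame src="banner.html" name="banner">
    <frameset cols="20%,80%">
      <!-- Navigation menu on the left, main content on the right -->
      <frame src="menu.html" name="menu">
      <frame src="content.html" name="content">
    </frameset>
    <noframes>
      <body>
        <!-- This is all a frames-incapable browser or spider will see -->
        <p>This site uses frames.</p>
      </body>
    </noframes>
  </frameset>
  </html>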

The homepage or index page of a framed site is the document which contains the frameset and as you can see from the HTML above there is very little in the way of content for the search engines to read and index. What is needed is for more information to be added to the NOFRAMES element.

The best way of achieving this is to add a complete web page within the NOFRAMES tag including appropriate keyword rich headings and text. A navigation menu should also be included to provide links to all internal areas of your website. This will allow the search engines to index all areas of your website and improve accessibility for those using a browser or alternate device that does not support frames or has frames support disabled.
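As a sketch (the headings, text and page names are invented), a NOFRAMES element built this way might look like the following; the point is that it contains real, keyword-rich content and crawlable links rather than a one-line warning:

  <noframes>
  <body>
    <h1>Example Widgets - Handmade Widgets and Widget Repair</h1>
    <p>Example Widgets supplies handmade brass widgets and offers a full widget repair service. Browse our product range or contact us for a quote.</p>
    <!-- Navigation links so spiders and frames-incapable browsers can reach every section -->
    <ul>
      <li><a href="products.html">Widget Products</a></li>
      <li><a href="repair.html">Widget Repair Service</a></li>
      <li><a href="contact.html">Contact Us</a></li>
    </ul>
  </body>
  </noframes>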

Placing nothing but a long list of keywords will not help your search engine position and may even be harmful.

Every web page has a unique makeup and location which is easily definable, except with frames. Frames display multiple pages within the same page, and while they can make site navigation simple, they do not show the current page's address. If you have an interesting article deep within your site, using frames makes it hard for me to link to it, and if you force me to link to your home page instead, I am probably not going to link to you at all.

You can get around frames by having a site map on the home page that links to all the framed pages, but even if these pages rank well they will probably lack good navigation, since the framework that contained them is not presented with them in the search results.

There is an HTML tag called the NOFRAMES tag, which, when used properly, gives the search engine spiders the information they need to index your page correctly. I believe it was designed to give frames-incapable browsers — early versions of browsers that cannot read or interpret the FRAMESET tags — the ability to "see" the information on a framed site.

Unfortunately, too many sites that utilize this NOFRAMES tag put the following words into it: "You are using a browser that does not support frames. Update your browser now to view this page." It might as well say, "We are putting the kiss of death on our Web site and have no interest in being found in the search engines for relevant keywords regarding our site! Thanks for not visiting our site because you couldn't find it!"

What happens when you do the above is that the engines will read your TITLE and META tags (if you even included them) plus that message telling the visitor their browser is frames-incapable, and that is all they will index for your site.

Try a search at AltaVista for the following: "does not support frames" and guess what? 260,882 pages are found! Nearly all of them are framed sites that used those words in their NOFRAMES tag. I bet that the circular-saw maker whose site is ranked number 1 for those keywords doesn't have a clue that he has put the kiss of death on his Web site! I also bet his site is nowhere to be found under the keyword "circular saws." (It isn't.)

If you want to have a framed site for whatever reason, then for goodness' sake, use your NOFRAMES tag properly! The proper usage of this tag is to take the complete HTML code from your inner page and copy it into the NOFRAMES tag.

The above information takes care of your front page. However, there are other issues having to do with getting the rest of your pages indexed properly when you use a framed site.
Most Web designers use frames for ease of navigation. That is, they have a left-hand frame with a static navigational bar or buttons that never change. When someone clicks on a button on the left, the frame to the right brings up the new page accordingly. Because of this type of design, there are usually no navigational links on any of the inner, framed pages.
Why is this bad? It's bad because you could (and should) optimize these inner pages to rank high in the search engines. But if you do, and someone searching in the engines finds them, they will be what I call orphaned pages.

I'm sure you've come across these at one time or another in your searches: a page that has a bit of information about what you were searching for but offers no way to get to the rest of the site!
Savvy Internet users might look at the URL and try finding the root directory, but most users don't have a clue about doing that. It's too bad for the site owner, who just lost some potential eyeballs — or worse, a potential customer.
If you use a framed design, it is absolutely imperative to place navigational links on all your inner pages. At the very least, include a button that links back to your home page. However, I would recommend that you have links to all your major category pages, as this will help the search engine spiders visit all the pages, index them all, and rank them high!
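As a final sketch (the page names are invented), even a simple navigation block like the one below, placed on every inner page, solves the orphaned-page problem; target="_top" makes each link replace the whole frameset rather than load inside a single frame:

  <!-- Minimal navigation block for an inner framed page -->
  <p>
    <a href="index.html" target="_top">Home</a> |
    <a href="products.html" target="_top">Products</a> |
    <a href="services.html" target="_top">Services</a> |
    <a href="contact.html" target="_top">Contact</a>
  </p>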