Web Marketing Today

How to Get Higher in the Search Engines: The Science of “Gateway” Pages

Every week I get a number of e-mails that say, “My site has been up for six months, but you can hardly find me in the search engines. What can I do?” Sound familiar?

Only Seven Count

At present only Yahoo (technically a directory) and six search engines (that send robot “spiders” roaming the Web looking for unsuspecting sites to index) bring significant traffic. The six are:

  • Lycos
  • AltaVista
  • Excite
  • HotBot
  • WebCrawler
  • Infoseek

While your site’s ranking on Yahoo is harder to control (and that’s outside the scope of this article), you can dramatically affect your ranking on the Big Six. Submitting your Web pages to these is easy with a free tool such as All 4 One Submission Machine (http://www.all4one.com/all4submit/). But submitting is very different from positioning, and ranking is what we’re concerned with here.

And what I have to tell you is pretty confusing.

The case of Nannies Plus

Last February we launched a client’s site, Nannies Plus, a nanny referral agency with a national reputation (http://www.nanniesplus.com). Last week, the company’s owner called me in despair. “Nobody is finding us in the search engines. Nearly all our referrals come through A NaniNet (an organization that places banner ads on Yahoo and other portal sites).”

While we don’t provide search engine placement for the general public, we take an interest in the success of our website design clients. I was intrigued. Using the search word “Nannies,” our client ranked in the top four on four of the Big Six search engines. We had done pretty well there. But using the search word “Nanny,” Nannies Plus fell dramatically, and “nanny” is the word parents usually enter to find a nanny! Here were the rankings of http://www.nanniesplus.com on September 29, 1998:

                      Search Word
  Search Engine    “Nannies”    “Nanny”
  AltaVista            162        200+
  Excite                 1          29
  HotBot                 4           9
  Infoseek               1         105
  Lycos                 65         242
  WebCrawler             2          25

This kind of data tells us something. It appears (notice the careful language I’m using) that Excite, HotBot, Infoseek, and WebCrawler are computing rankings in a similar manner. It also raises all sorts of questions. For starters, what in the world happened to AltaVista and Lycos?

Looking at Titles and META Tags

Maybe the answer is found in the META tags, I thought. (If you need a basic course on META tags, you can find several references on Yahoo under “META tags.”) Here are the key elements of the Nannies Plus home page:

URL: http://www.nanniesplus.com/

<TITLE>Nannies Plus Nanny Referral and Placement Service</TITLE>

<META NAME="DESCRIPTION" CONTENT="Nanny referral agency specializing in live-in, college-educated American nannies (not foreign au pairs) who provide childcare for families throughout the U.S.">

<META NAME="KEYWORDS" CONTENT="Nanny, Caregivers, Nannies, governess, Childcare, agency, child care, agencies, daycare, placement service, In-home, referral, parents, parenting, working mothers, Nannies Plus, working women, elite, university, live-in, career, American, job opportunities, child development, college graduates, positions, elementary education, early childhood education, development, teachers, students, travel, nurses, preschool, families, family, toddlers, children, babysitting, baby-sitting, newborns, infants, education, babies, employment opportunities, jobs">

Notice that the word “Nanny” appears in the title. That’s important for ranking higher under “Nanny.” But notice that “Nanny” appears only in capitalized form in the keywords and description. Does that make a difference? And what about AltaVista and Lycos? These are hard questions to answer without lots of research.

Understanding how search engines work

To find some answers I went to the most visible search engine tracker, Danny Sullivan of Mecklermedia’s Search Engine Watch (http://www.searchenginewatch.com). On his site you’ll find a wealth of information, notably a detailed Search Engine Features Chart. Here you learn important tidbits such as:

  • Excite, HotBot, Infoseek, and Lycos don’t support HTML frames (unless, of course, you’re very careful to provide links to your site between the <NOFRAMES> tags).
  • META tags raise the ranking of a site only in HotBot and Infoseek.
  • ALT tags under images are indexed by AltaVista, Infoseek, and Lycos.
  • Comments are indexed by HotBot.
  • Some engines are case sensitive while others are not.
  • Nearly all (except Infoseek) skip common words like “web.”
  • Infoseek and Lycos can index the root or stem of words. “Swim” might also find “swims” or “swimming.” (See the sketch just below this list.)
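
To see how much difference those last three behaviors can make, here’s a small hypothetical sketch (in Python, and not any engine’s real code) of keyword matching with and without case folding, stop words, and crude stemming. It’s the difference that decides whether a page full of “nannies” matches a search for “nanny.”

import re

STOP_WORDS = {"web", "the", "and", "a"}   # illustrative list of skipped common words

def crude_stem(word):
    # A deliberately crude stemmer, for illustration only.
    if word.endswith("ies"):
        return word[:-3] + "y"    # "nannies" -> "nanny"
    if word.endswith("ing"):
        return word[:-3]          # "swimming" -> "swimm" (close enough for a sketch)
    if word.endswith("s"):
        return word[:-1]          # "swims" -> "swim"
    return word

def page_matches(page_text, query, case_sensitive=False, stemming=True):
    # Tokenize the page, optionally folding case and stemming, then test the query.
    norm = (lambda w: w) if case_sensitive else str.lower
    terms = {crude_stem(norm(w)) if stemming else norm(w)
             for w in re.findall(r"[A-Za-z]+", page_text)
             if norm(w) not in STOP_WORDS}
    q = norm(query)
    return (crude_stem(q) if stemming else q) in terms

sample = "Nannies Plus places college-educated American nannies with families."
print(page_matches(sample, "nanny", case_sensitive=False, stemming=True))   # True
print(page_matches(sample, "nanny", case_sensitive=True,  stemming=False))  # False

The same page matches under one set of rules and misses under the other, which is exactly the kind of inconsistency the rankings above suggest.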

Search engines are also getting smarter about attempts to trick them with repetitious lists of keywords, a practice called “search engine spamming.”

  • Invisible text (e.g., white words against a white background) is penalized as spam by all except Excite.
  • Tiny text could be considered spam by Lycos, AltaVista, HotBot, and WebCrawler.
  • META refresh tags that automatically redirect a visitor from a bridge page to the main website are treated as spam by AltaVista and Infoseek.

Search engines may also be looking for an “ideal” ratio of search words to the total number of words as a check against spamming.
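
The ratio itself is simple arithmetic. Here’s a rough sketch of the kind of count involved; the tag-stripping is crude and the function is purely illustrative, not how any particular engine measures it.

import re

def keyword_density(html, keyword):
    # Strip HTML tags crudely, then compute keyword occurrences / total visible words.
    text = re.sub(r"<[^>]+>", " ", html)
    words = re.findall(r"[A-Za-z]+", text.lower())
    return words.count(keyword.lower()) / len(words) if words else 0.0

sample = "<HTML><BODY>Nannies Plus places nannies with families nationwide.</BODY></HTML>"
print(f"{keyword_density(sample, 'nannies'):.1%}")   # 28.6% (2 of 7 visible words)

The 11% and 2.6% figures mentioned later in this article came from the same sort of count, done by hand.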

Looking in the rear view mirror

The real problem, however, is that search engines are constantly changing the algorithms by which they rank sites. What is true today may not be true next week. (Danny Sullivan’s observations above were nearly two months old when I read them.) Until recently, one search engine offered instant registration of Web pages, so you could tweak your pages, check the ranking, and repeat until you had pushed the ranking as high as it would go. But that ended a few weeks ago. Paul Bruemmer, a search engine optimization professional for the past two years with Web-Ignite’s ClientDirect (http://www.clientdirect.com/), puts it this way: “Optimizing Web pages for search engines is like driving a car with a blacked-out windshield; you can only look in the rear view mirror to see where you’ve been.”

Study the search engines

The only way to figure it out is to study the search engines in detail. So I began by analyzing the Web pages that appeared highest in several of the search engines. I counted the times “nannies” appeared visibly in each Web page and compared that to the total number of words. I gave up after several hours of number crunching.

But the process convinced me that this is extremely complex. Since so many factors are involved, you can’t focus on just one thing. Here is what I observed:

  • Nearly all the sites that scored high in Excite and HotBot had META tags that contained the search word, while only one of the top five “nannies” listings in AltaVista even used META tags. Does AltaVista penalize for META tags?
  • Nannies Plus had a high ratio of the word “nannies” compared to total words (11%), which may explain why it was so high on four of the Big Six for “nannies.” Nannies Plus had a much lower visible ratio for “nanny” (only 2.6%).
  • AltaVista’s listing makes no sense to me. Perhaps Nannies Plus was penalized for a high word ratio. Maybe Nannies Plus is perceived as an evil word spammer. But when I examined word characteristics of AltaVista’s top five listings, I concluded that AltaVista doesn’t seem to help much in ranking relevant documents higher.

My research convinces me that you can’t just get in the top listings with good META tags (though that helps). You need to design special versions of your Web pages to suit the idiosyncrasies of each major search engine. And to do that intelligently, you need help.

Registering vs. positioning

There’s a big difference between registering a site with the search engines initially, and positioning it high on the list. And there’s a big cost difference, too. ClientDirect (http://www.clientdirect.com/) charges $500 to $4,000 per month for search word positioning, plus a per visitor fee, placing them at the higher end of this kind of service.

High End Techniques

I asked Paul Bruemmer what kinds of techniques ClientDirect uses. This is their approach:

  1. Assess whether it is possible to increase search engine rank for a particular word or phrase.  For example, words related to Internet service providers and Internet advertising are extremely competitive. The struggle for some words is so cut-throat that to increase a site’s ranking becomes prohibitively expensive. But many industries have little sophistication regarding search engine optimization, making it possible to raise a site’s ranking significantly.
  2. Prepare a set of gateway pages for each search word or search phrase the client requests.  For example, a client might want to appear high for the phrases “sports apparel,” “golf shoes,” and “golf clothing.” Each phrase gets its own separate set of specially designed gateway pages (also called “doorway” or “bridging” pages). The set for a particular search phrase will have a separate page for each search engine. “It’s a myth that one strategy applies to everything,” says Bruemmer. “Each engine is its own book of physics. You really must write a Web page for each engine if you’re going to do it justice.”
  3. Host gateway pages on high speed servers. While the client’s website is hosted elsewhere, all the gateway pages are on ClientDirect’s servers for full logfile access and tracking of visitors.
  4. Write specialized software to feed search engine spiders and redirect surfers. Some search engines, most notably Infoseek, penalize Web pages that have automatic referrals to other sites using META refresh tags. ClientDirect has carefully programmed its high speed servers to instantly identify search engine “spiders” by IP number and deliver to them only the information on the gateway pages. Regular visitors are instantly routed to the client’s own website. (A rough sketch of this general approach appears after this list.)
  5. Secure several domains for each client.  ClientDirect registers domains that are variations of the client’s main domain name. Each domain is used for a different purpose. For example, Excite allows only 25 Web pages from any given domain name, so one domain is devoted to Excite. Another domain is used for Lycos and the others, with another for research and development, and still another on hold for future use.
  6. Constantly monitor search engine ranking.  Referral logs are continually analyzed to determine any changes in a search engine’s behavior. Rankings on the various search engines are constantly monitored to catch any erosion in a site’s position.
  7. Make changes in the gateway pages as needed to maintain or improve the site’s ranking on a particular search engine.
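
To make step 4 concrete, here’s a minimal sketch of the general idea, not ClientDirect’s actual software: a tiny Web server that checks the requester’s IP address against a hypothetical list of known spider addresses, serves the keyword-tuned gateway page to spiders, and immediately redirects everyone else to the client’s real site.

# Sketch only: the spider IP list and the gateway HTML below are hypothetical.
from http.server import BaseHTTPRequestHandler, HTTPServer

KNOWN_SPIDER_IPS = {"204.123.2.54", "208.146.26.10"}   # hypothetical spider addresses
CLIENT_SITE = "http://www.nanniesplus.com/"            # where human visitors should go

GATEWAY_HTML = b"""<HTML><HEAD>
<TITLE>Nanny Referral and Placement</TITLE>
<META NAME="DESCRIPTION" CONTENT="Nanny referral agency placing college-educated nannies.">
</HEAD><BODY>Nanny referral: <A HREF="http://www.nanniesplus.com/">Nannies Plus</A></BODY></HTML>"""

class GatewayHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.client_address[0] in KNOWN_SPIDER_IPS:
            # A spider gets the keyword-tuned gateway page itself.
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(GATEWAY_HTML)
        else:
            # A human visitor is immediately redirected to the client's real site.
            self.send_response(302)
            self.send_header("Location", CLIENT_SITE)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), GatewayHandler).serve_forever()

In practice a firm doing this would have to maintain and update that IP list constantly; hard-coding two addresses here is purely illustrative.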

Cost factors

When you understand the complexity and client-focus of this kind of operation, the high price tag starts to make sense. “We put man hours and a programming team on a particular client’s objective,” says ClientDirect’s Bruemmer. “Some companies claim to offer this technology and ability at incredibly low rates in the 25 cents per visitor range. We’ve found that we can’t do it for that price. But we do offer actual performance for the client.” ClientDirect’s rates are closer to $2 per visitor, a rate beyond many current Internet business models.

Rerouting controversy

ClientDirect’s method of automatically rerouting visitors to the client’s website isn’t without its critics. Fredrick Marckini of Response Direct (http://www.responsedirect.com) considers it a way to break the “rules” set up by search engines and trick them. Marckini prefers to put gateway pages on his clients’ own domain and server.

But Bruemmer responds, “ClientDirect uses approved search engine algorithms. We feed the engine good food, we do not trick the engine. We do not deceive the searcher, since when we create a gateway page for a particular business the searchers find that business’s Web pages. We provided the engine with exactly what the engine wants, and according to its rules.”

Bruemmer also sees keeping the gateway pages on ClientDirect’s server as a way of protecting his clients’ rankings. Since rerouting by ClientDirect’s servers is immediate, a client’s competitors never see the code that is bringing the high rankings, and so can’t steal it to place on their own sites. Hiding the code in this way maintains the client’s competitive advantage.

A Lower Priced Approach

Since I’m studying search engine positioning, a piece of e-mail spam that would normally get deleted gets a mouse click instead, and I learn about a company that focuses on top 10 placement with “a special promotion until October 2 on three key words or phrases.” All results are guaranteed, the phone recording says; your money is returned if we don’t meet our promises. One year is regularly $2,000, but until October 2, only $999.

I talk to the owner, Dave Warren, and ask about the actual guarantee. Top position on the major search engines is defined, at a minimum, as a position in the top 30 on at least three search engines, for three key words or key phrases. “If we don’t think we can achieve top ranking for a search word or phrase,” says Warren, “we don’t take the job.”

This company hosts all the gateway pages on its own server. Each gateway page has an HTML hyperlink to the client’s website. “If someone is interested in the subject, they’ll usually click on the link,” Warren assures me. Rather than securing several domains for every client, Warren’s company uses its own generic domains for the referral pages, sharing these domain names among a number of clients’ gateway pages.
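
For the curious, here’s a hedged sketch of what such a bare-bones gateway page might look like: tuned to a single search phrase and linking to the client with an ordinary hyperlink rather than an automatic META refresh. Every value plugged into the template is a placeholder, not Warren’s actual markup.

# Assemble a minimal, hypothetical gateway page for one search phrase.
GATEWAY_TEMPLATE = """<HTML>
<HEAD>
<TITLE>{phrase} - {client_name}</TITLE>
<META NAME="DESCRIPTION" CONTENT="{description}">
<META NAME="KEYWORDS" CONTENT="{keywords}">
</HEAD>
<BODY>
<H1>{phrase}</H1>
<P>{description}</P>
<P><A HREF="{client_url}">Visit {client_name}</A></P>
</BODY>
</HTML>"""

page = GATEWAY_TEMPLATE.format(
    phrase="Nanny Referral",
    client_name="Nannies Plus",
    client_url="http://www.nanniesplus.com/",
    description="Nanny referral agency placing live-in, college-educated American nannies.",
    keywords="nanny, nannies, nanny referral, childcare, nanny agency",
)
print(page)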

Comparing to banner ads

What does the small businessperson do in the face of such sophisticated systems? After a little despair and some groaning you begin to look at the alternatives.

We’ve been lulled into believing that we could set up Web businesses with little or no advertising, by just listing our sites with our friends the search engines. It just isn’t so any longer. Perhaps we should look at the cost of gateway page doctoring as an advertising expense, pure and simple. Instead of banner ads, you’re “purchasing” links viewed by people who presumably are searching for just your sort of product or service.

$500 per month may seem like a lot of money until you think about what it brings: a constant flow of traffic if you’re in the top ranking. Banner ads, for the same monthly budget, will bring you 14,300 page views and perhaps 150 click-throughs (at an average $35 CPM and a 1% click-through rate). Finely-tuned search engine gateway pages can potentially bring you much more.
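
Those banner-ad figures are easy to check; the $35 CPM and 1% click-through rate are the assumptions stated above, not measured numbers.

budget = 500              # dollars per month
cpm = 35                  # dollars per thousand banner impressions
click_through_rate = 0.01 # 1% of impressions become clicks

page_views = budget / cpm * 1000                     # about 14,286 impressions
click_throughs = page_views * click_through_rate     # about 143 visitors
print(f"{page_views:,.0f} page views, roughly {click_throughs:.0f} click-throughs")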

Do-It-Yourselfers

With the extensive time investment necessary in studying search engines and their continual changes, Bruemmer doesn’t think the average small business person has time to tweak search pages regularly and run a business: “Positioning clearly becomes an issue that needs to be subcontracted.” At the very least, he says, “You’re better off hiring an employee to do this.”

And what resource materials would you give that employee? One resource is Fredrick Marckini’s book Achieving TOP 10 Rankings in Internet Search Engines (http://www.trafficbuilding.com/aboutsr.htm). But, as Bruemmer observes, “If you’ve had time to write it down and publish it, it’s old.” You might want to supplement this with fresh updates provided by free e-mail subscriptions to Search Engine Watch (http://www.searchenginewatch.com) and MarketPosition (http://www.webposition.com/newsletter.htm).

What software is available? Bruemmer pauses before answering. “The best position monitoring software is WebPosition.” A beta version of WebPosition Gold is being tested, and Bruemmer acknowledges that it is the best positioning tool available, “but even that is a year behind the times and it’s beta,” he says. I can see his mind comparing static software to the state-of-the-art intelligence his team at ClientDirect is uncovering weekly. “Until a webmaster puts 8 to 10 days into actually using the software, it’s virtually worthless,” he says, “like having a car without a driver.” Nevertheless, if used conscientiously, WebPosition Gold (now in beta) will get you quite a way down the road. Its help screens teach you the ins and outs of search engine positioning, and its future KnowledgeBase updates promise to keep the software up-to-date in preparing search-engine-specific gateway pages.

An interim solution

From one standpoint, the whole science of developing finely-tuned gateway pages that lead to “real” websites seems pretty artificial. It’s an exercise in outsmarting search engines — understanding their current “rules” and providing them with a slimmed-down Web page of exactly what they’re looking for. Gateway pages are an artifact of the times, and it wouldn’t surprise me if they prove to be only a temporary phenomenon.

But for now, they seem a necessary element in the marketing mix. And the costs of keeping them up-to-date should be chalked up to general marketing costs — costs site owners ignore at their own peril.


Dr. Ralph F. Wilson