The art of SEO copywriting for search spiders and indexes must be carefully considered, whether web site copywriting is outsourced or produced in-house, since text content is the major item on search engine checklists.
Keyword density analysis

Although there is no magic bullet when it comes to optimizing your web site content, starting with accurate, objective data is key to success. One of the factors used by search engine ranking algorithms is keyword density.
In other words, if your site claims to sell widgets in California, this should be confirmed by many occurrences of those words, or whatever your own keywords may be, throughout all indexed pages.
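As a rough illustration of what such a measurement looks like, the sketch below counts how often a phrase appears per 100 words of page text. This is one common way to compute keyword density; the sample page and figures are hypothetical, and real tools apply their own variations.

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Occurrences of `phrase` per 100 words of `text`
    (one common way to measure keyword density)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    # Count phrase occurrences in the normalized word stream.
    hits = len(re.findall(re.escape(phrase.lower()), " ".join(words)))
    return 100.0 * hits / len(words)

page = "We sell widgets in California. Our California widgets ship fast."
print(keyword_density(page, "widgets"))             # 2 hits in 10 words -> 20.0
print(keyword_density(page, "california widgets"))  # 1 hit in 10 words -> 10.0
```

Running the same measurement across all indexed pages shows at a glance whether your chosen phrases actually dominate your copy.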
Keyword efficiency assessment

Once you have carefully chosen a set of keywords, conduct a global study of all search phrases. Using a tool like Wordtracker, you can see how many times people type those keywords into major search engines and how many competing pages come up in the resulting listings.
This is an essential step, as the results are sometimes surprising. This process allows you to find keyword niches and to discard keywords that draw few searches while facing many competing pages.
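One commonly cited way to rank such niches, popularized by Wordtracker, is the Keyword Effectiveness Index (KEI), which rewards phrases with many searches and few competing pages. The sketch below uses that formula with hypothetical search and competition figures:

```python
def kei(monthly_searches: int, competing_pages: int) -> float:
    """Keyword Effectiveness Index: searches squared divided by
    competing pages. Higher values suggest a better niche."""
    if competing_pages == 0:
        return float("inf")
    return monthly_searches ** 2 / competing_pages

# Hypothetical data: (monthly searches, competing pages)
candidates = {
    "widgets": (40_000, 9_000_000),
    "california widgets": (1_200, 2_000),
}
ranked = sorted(candidates, key=lambda k: kei(*candidates[k]), reverse=True)
print(ranked)  # the niche phrase outranks the broad, crowded one
```

Here the broad term draws far more searches, yet the niche phrase scores higher because competition is so much thinner, which is exactly the kind of surprise this assessment step is meant to surface.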
Search “spiders” (also called “robots,” “bots,” or “crawlers”) are the automatic processes that examine a given domain name or URL to index it or update its listing in a search engine. The spider “crawls” all linked pages, and sometimes any page not specifically exempted from indexing.
To prevent sensitive or outdated directories or pages from being crawled, a web site needs a file called “robots.txt” at the root of its structure. Without it, search spiders may index all of your web pages, including irrelevant ones or those not intended for public viewing. However, because of the way they were conceived, “robots.txt” files cannot indicate which files to index, only those NOT to be indexed.
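A minimal “robots.txt” illustrating that exclusion-only design might look like the following; the directory names here are hypothetical placeholders, and the file only tells compliant spiders what to skip:

```text
# robots.txt, served from the web site root, e.g. https://example.com/robots.txt
# The paths below are hypothetical; substitute your own directories.

# Rules for all spiders:
User-agent: *
Disallow: /admin/
Disallow: /drafts/
Disallow: /cgi-bin/

# Rules can also target a single spider by name:
User-agent: Googlebot
Disallow: /print-versions/
```

Note that “Disallow” is a request, not an enforcement mechanism: well-behaved spiders honor it, but it does not secure private content.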
As spiders review your site, they will store only relevant content in their database. So they look for specific descriptive information and data. Most spiders look for some or all of the following:
How spiders weigh each of these parameters depends on the search engines themselves, each of which uses its own algorithms and rules. There is no choice but to stay up to date with search spider technologies and specifics. One thing is clear, however: aligning all these variables around a single target is a major factor. An SEO software tool like SeoSamba can dramatically reduce the initial optimization and the ongoing work, because our SEO technology helps you align all SEO variables, namely meta tags, URLs, intra-site linking, text headings, and page titles, toward your key phrase marketing target. And thanks to our smart SEO technology, SeoSamba creates automated entries that you can further refine and enhance to save time and money.
If a spider detects discrepancies between those items, such as prominent keywords that never appear in the body text, it may disregard your keywords and reference your pages based on whatever words or linked text are repeated instead, significantly hurting your chances of ranking high in search listings.
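The alignment check described above can be sketched with Python's standard-library HTML parser: extract the title, meta description, headings, and body text, then report which of those elements fail to mention a target phrase. The sample page and the `misaligned` helper are illustrative assumptions, not how any particular spider works internally.

```python
from html.parser import HTMLParser

class AlignmentChecker(HTMLParser):
    """Collect the page elements spiders compare: title, meta
    description, headings, and the remaining body text."""
    def __init__(self):
        super().__init__()
        self.parts = {"title": "", "meta": "", "headings": "", "body": ""}
        self._current = "body"  # bucket where handle_data() appends text

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._current = "title"
        elif tag in ("h1", "h2", "h3"):
            self._current = "headings"
        elif tag == "meta":
            a = dict(attrs)
            if a.get("name") == "description":
                self.parts["meta"] += " " + a.get("content", "")

    def handle_endtag(self, tag):
        if tag in ("title", "h1", "h2", "h3"):
            self._current = "body"

    def handle_data(self, data):
        self.parts[self._current] += " " + data

def misaligned(html: str, phrase: str) -> list:
    """Return the page elements where `phrase` never appears."""
    checker = AlignmentChecker()
    checker.feed(html)
    return [name for name, text in checker.parts.items()
            if phrase.lower() not in text.lower()]

html_doc = """<html><head><title>California Widgets</title>
<meta name="description" content="Buy widgets in California"></head>
<body><h1>Widgets</h1><p>We ship widgets nationwide.</p></body></html>"""
print(misaligned(html_doc, "california widgets"))
```

On this sample page the full phrase appears only in the title, so the meta description, headings, and body all come back as misaligned, exactly the kind of discrepancy that can cost you rankings.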
Is creating doorway pages a viable alternative?

A target page, otherwise called a doorway page, is simply a single page optimized for one keyword or one phrase for one search engine. You have probably seen target pages before without ever realizing it. If you have come across a page that has a logo and says “please enter here” or something similar, you have likely come across a target page optimized to help you find a company’s main site.
Target pages may be designed for one or all of the following purposes:
Theoretically, if one of your main keywords were "free online auction," multiple doorway pages should be created for that one term for various search engines. Each page must be specifically optimized for that phrase for a given search engine. Each page must be different as each search engine uses different criteria to determine their rankings.
What will then happen if you don't create original content for each page? You will end up with numerous near-identical pages as far as search engines can tell. In other words, you will create competition for yourself, dilute inbound traffic, and most likely make search engines wary of your guerilla-marketing tactics.
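Search engines detect this kind of near-duplication with content-similarity measures. One textbook technique, used here purely as an illustration with hypothetical page snippets, compares the word n-grams (“shingles”) of two pages and computes their Jaccard similarity:

```python
def shingles(text: str, n: int = 3) -> set:
    """Word n-grams ('shingles') used to fingerprint page content."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: str, b: str) -> float:
    """Similarity of two pages' shingle sets: 1.0 means identical."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Two hypothetical doorway pages differing by a single word:
page_a = "bid on free online auction listings every day with no fees"
page_b = "bid on free online auction listings every day with zero fees"
print(round(jaccard(page_a, page_b), 2))  # -> 0.64: far closer than two genuinely distinct pages
```

Cosmetic one-word edits leave most shingles shared, so cookie-cutter doorway pages score as near-duplicates no matter how many you generate.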
To prevent this problem, some online providers create a gateway hosted on their server that redirects and counts search engine hits while letting known spiders through. Dubbed “doorway page cloaking,” this technique only makes matters worse, because you are binding search engine traffic to a third party on which you will depend solely from then on.
That is why doorway pages are steadily losing popularity, and providers of these services have already made their way onto search engine spam lists. It is also why you should consider only a far more involved creation process than generating cookie-cutter, software-based HTML pages. If you need a multiple entry-page strategy, SeoSamba recommends using original content along with an appropriate linking strategy.