Minggu, 16 November 2008

Choosing keywords

Initially choosing keywords

Choosing keywords should be your first step when constructing a site. You should have the keyword list available to incorporate into your site text before you start composing it. To define your site keywords, you should first use the keyword research services available on the Web. Sites such as www.wordtracker.com and inventory.overture.com are good starting places for English-language sites. Note that the data they provide may sometimes differ significantly from the keywords that are actually best for your site. You should also note that the Google search engine does not publish information about the frequency of search queries.

After you have defined your approximate list of initial keywords, you can analyze your competitors' sites and try to find out what keywords they are using. You may discover some further relevant keywords that are suitable for your own site.

Frequent and rare keywords

There are two distinct strategies – optimize for a small number of highly popular keywords or optimize for a large number of less popular words. In practice, both strategies are often combined.

The disadvantage of keywords that attract frequent queries is that the competition rate is high for them. It is often not possible for a new site to get anywhere near the top of search result listings for these queries.

For keywords associated with rare queries, it is often sufficient just to mention the necessary word combination on a web page or to perform minimum text optimization. Under certain circumstances, rare queries can supply quite a large amount of search traffic.

The aim of most commercial sites is to sell some product or service or to make money in some way from their visitors. This should be kept in mind during your seo (search engine optimization) work and keyword selection. If you are optimizing a commercial site then you should try to attract targeted visitors (those who are ready to pay for the offered product or service) to your site rather than concentrating on sheer numbers of visitors.

Example. The query “monitor” is much more popular and competitive than the query “monitor Samsung 710N” (the exact name of the model). However, the second query is much more valuable for a seller of monitors. It is also easier to get traffic from it because its competition rate is low; there are not many other sites owned by sellers of Samsung 710N monitors. This example highlights another possible difference between frequent and rare search queries that should be taken into account – rare search queries may provide you with fewer visitors overall, but more targeted visitors.

Evaluating the competition rates of search queries

When you have finalized your keywords list, you should identify the core keywords for which you will optimize your pages. A suggested technique for this follows.

Rare queries are discarded at once (for the time being). In the previous section, we described the usefulness of such rare queries but they do not require special optimization. They are likely to occur naturally in your website text.

As a rule, the competition rate is very high for the most popular phrases. This is why you need to get a realistic idea of the competitiveness of your site. To evaluate the competition rate you should estimate a number of parameters for the first 10 sites displayed in search results:
- The average PageRank of the pages in the search results.
- The average number of links to these sites. Check this using a variety of search engines.
Additional parameters:
- The number of pages on the Internet that contain the particular search term, i.e. the total number of search results for that search term.
- The number of pages on the Internet that contain exact matches to the keyword phrase. The search for the phrase is bracketed by quotation marks to obtain this number.
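As a rough sketch of this bookkeeping (all the figures below are invented for illustration), you can average the parameters you collect by hand for the top 10 results:

```python
# Hypothetical values noted down for the top 10 results of one query
top10_pagerank = [6, 5, 7, 5, 6, 4, 5, 6, 5, 5]
top10_inbound_links = [1200, 800, 2500, 650, 900, 400, 700, 1100, 500, 850]

# Average PageRank and average inbound-link count of the competition
avg_pr = sum(top10_pagerank) / len(top10_pagerank)
avg_links = sum(top10_inbound_links) / len(top10_inbound_links)
print(avg_pr, avg_links)  # 5.4 960.0
```

Comparing these averages against your own site's figures gives you the rough competitiveness estimate described above.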

These additional parameters allow you to indirectly evaluate how difficult it will be to get your site near the top of the list for this particular phrase. As well as the parameters described, you can also check the number of sites present in your search results in the main directories, such as DMOZ and Yahoo.

The analysis of the parameters mentioned above and their comparison with those of your own site will allow you to predict with reasonable certainty the chances of getting your site to the top of the list for a particular phrase.

Having evaluated the competition rate for all of your keyword phrases, you can now select a number of moderately popular key phrases with an acceptable competition rate, which you can use to promote and optimize your site.

Refining your keyword phrases

As mentioned above, search engine services often give inaccurate keyword information. This means that it is unusual to obtain an optimum set of site keywords at your first attempt. After your site is up and running and you have carried out some initial promotion, you can obtain additional keyword statistics, which will facilitate some fine-tuning. For example, you will be able to obtain the search results rating of your site for particular phrases and you will also have the number of visits to your site for these phrases.

With this information, you can clearly define the good and bad keyword phrases. Often there is no need to wait until your site gets near the top of all search engines for the phrases you are evaluating – one or two search engines are enough.

Example. Suppose your site occupies first place in the Yahoo search engine for a particular phrase, but is not yet listed in MSN or Google search results for this phrase. If you know the percentage of visits to your site from various search engines (for instance, Google – 70%, Yahoo – 20%, MSN – 10%), you can predict the approximate amount of traffic for this phrase from the other search engines and decide whether it is suitable.
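This extrapolation is simple arithmetic. A minimal sketch (the engine shares and the visit count are hypothetical):

```python
def estimate_traffic(known_visits, known_share, shares):
    """Scale the visits observed from one engine up to the others,
    assuming your overall per-engine traffic shares hold for this phrase."""
    total = known_visits / known_share
    return {engine: round(total * share) for engine, share in shares.items()}

# Hypothetical shares: Google 70%, Yahoo 20%, MSN 10%
shares = {"Google": 0.70, "Yahoo": 0.20, "MSN": 0.10}

# Suppose 40 visits arrived from Yahoo for the phrase being evaluated
print(estimate_traffic(40, 0.20, shares))  # {'Google': 140, 'Yahoo': 40, 'MSN': 20}
```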

As well as detecting bad phrases, you may find some new good ones. For example, you may see that a keyword phrase you did not optimize your site for brings useful traffic despite the fact that your site is on the second or third page in search results for this phrase.

Using these methods, you will arrive at a new refined set of keyword phrases. You should now start reconstructing your site: Change the text to include more of the good phrases, create new pages for new phrases, etc.

You can repeat this seo exercise several times and, after a while, you will have an optimum set of key phrases for your site and considerably increased search traffic.
Here are some more tips. According to statistics, the main page accounts for 30-50% of all search traffic. It has the highest visibility in search engines and the largest number of inbound links. That is why you should optimize the main page of your site to match the most popular and competitive queries. Each other page should be optimized for one or two main word combinations and, possibly, for a number of rare queries. This will increase the chances of the page getting to the top of search engine lists for particular phrases.

Indexing a site

Before a site appears in search results, a search engine must index it. An indexed site will have been visited and analyzed by a search robot, with the relevant information saved in the search engine database. If a page is present in the search engine index, it can be displayed in search results; otherwise, the search engine knows nothing about it and cannot display information from the page.

Most average sized sites (with dozens to hundreds of pages) are usually indexed correctly by search engines. However, you should remember the following points when constructing your site. There are two ways to allow a search engine to learn about a new site:

- Submit the address of the site manually using a form associated with the search engine, if available. In this case, you are the one who informs the search engine about the new site and its address goes into the queue for indexing. Only the main page of the site needs to be added; the search robot will find the rest of the pages by following links.

- Let the search robot find the site on its own. If there is at least one inbound link to your resource from other indexed resources, the search robot will soon visit and index your site. In most cases, this method is recommended. Get some inbound links to your site and just wait until the robot visits it. This may actually be quicker than manually adding it to the submission queue. Indexing a site typically takes from a few days to two weeks depending on the search engine. The Google search engine is the quickest of the bunch.

Try to make your site friendly to search robots by following these rules:

- Try to make any page of your site reachable from the main page in not more than three mouse clicks. If the structure of the site does not allow you to do this, create a so-called site map that will allow this rule to be observed.

- Do not make common mistakes. Session identifiers make indexing more difficult. If you use script navigation, make sure you duplicate these links with regular ones because search engines cannot read scripts (see more details about these and other mistakes in section 2.3).

- Remember that search engines index no more than the first 100-200 KB of text on a page. Hence, the following rule – do not use pages with text larger than 100 KB if you want them to be indexed completely.

You can manage the behavior of search robots using the file robots.txt. This file allows you to explicitly permit or forbid them to index particular pages on your site.
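For example, a minimal robots.txt (placed at the site root; the directory names here are hypothetical) that forbids all robots to index two directories while leaving the rest of the site open:

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /stats/
```

An empty `Disallow:` line would instead permit everything, and a specific `User-agent:` line lets you address one robot individually.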

The databases of search engines are constantly being updated; records in them may change, disappear and reappear. That is why the number of indexed pages on your site may sometimes vary. One of the most common reasons for a page to disappear from an index is server unavailability, meaning that the search robot could not access it at the time it was attempting to index the site. Once the server is available again, the site should eventually reappear in the index.

You should note that the more inbound links your site has, the more quickly it gets re-indexed. You can track the process of indexing your site by analyzing server log files where all visits of search robots are logged. We will give details of seo software that allows you to track such visits in a later section.
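As a minimal sketch of such tracking (the sample log lines and the list of user-agent signatures are illustrative, not exhaustive), you can scan an access log for known robot user agents:

```python
ROBOT_SIGNATURES = ["Googlebot", "Slurp", "msnbot"]  # illustrative list

def count_robot_visits(log_lines):
    """Count visits per search robot by matching user-agent substrings."""
    counts = {sig: 0 for sig in ROBOT_SIGNATURES}
    for line in log_lines:
        for sig in ROBOT_SIGNATURES:
            if sig in line:
                counts[sig] += 1
    return counts

sample_log = [
    '66.249.65.1 - - "GET / HTTP/1.1" 200 "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '72.30.61.7 - - "GET /page.html HTTP/1.0" 200 "Yahoo! Slurp"',
]
print(count_robot_visits(sample_log))  # {'Googlebot': 1, 'Slurp': 1, 'msnbot': 0}
```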

Selasa, 14 Oktober 2008

External ranking factors - Increasing link popularity

Submitting to general purpose directories

On the Internet, many directories contain links to other network resources grouped by topic. The process of adding your site information to them is called submission. Such directories can be paid or free of charge, and they may or may not require a backlink from your site. The number of visitors to these directories is not large, so they will not send a significant number to your site. However, search engines count links from these directories and this may enhance your site's placement in search results.

Important! Only those directories that publish a direct link to your site are worthwhile from a seo point of view. Script-driven directories are almost useless. This point deserves a more detailed explanation. There are two methods for publishing a link. A direct link is published as a standard HTML construction («A href=...», etc.). Alternatively, links can be published with the help of various scripts, redirects and so on. Search engines understand only those links that are specified directly in HTML code. That is why the seo value of a directory that does not publish a direct link to your site is close to zero.

You should not submit your site to FFA (free-for-all) directories. Such directories automatically publish links related to any search topic and are ignored by search engines. The only thing an FFA directory entry will give you is an increase in spam sent to your published e-mail address. Actually, this is the main purpose of FFA directories.

Be wary of promises from various programs and seo services that submit your resource to hundreds of thousands of search engines and directories. There are no more than a hundred or so genuinely useful directories on the Net – this is the number to take seriously, and professional seo submission services work with this number of directories. If a seo service promises submissions to enormous numbers of resources, it simply means that the submission database mainly consists of FFA archives and other useless resources. Give preference to manual or semiautomatic seo submission; do not rely completely on automatic processes. Submitting sites under human control is generally much more efficient than fully automatic submission.

The value of submitting a site to paid directories, or of publishing a backlink in exchange, should be considered individually for each directory. In most cases, it does not make much sense, but there may be exceptions. Submitting sites to directories rarely has a dramatic effect on site traffic, but it slightly increases the visibility of your site to search engines. This useful seo option is available to everyone and does not require a lot of time or expense, so do not overlook it when promoting your project.

DMOZ directory

The DMOZ directory (www.dmoz.org), or the Open Directory Project, is the largest directory on the Internet. There are many copies of the main DMOZ site, so if you submit your site to the DMOZ directory, you will get a valuable link from the directory itself as well as dozens of additional links from related resources. This means that the DMOZ directory is of great value to a seo aware webmaster.

It is not easy to get your site into the DMOZ directory; there is an element of luck involved. Your site may appear in the directory a few minutes after it has been submitted, or it may take months to appear. If you submitted your site details correctly and in the appropriate category, it should eventually appear. If it does not appear after a reasonable time, you can try contacting the editor of your category with a question about your request (the DMOZ site gives you such an opportunity). Of course, there are no guarantees, but it may help. DMOZ directory submissions are free of charge for all sites, including commercial ones.

Here are my final recommendations regarding site submissions to DMOZ. Read all site requirements, descriptions, etc. to avoid violating the submission rules; such a violation will most likely result in a refusal to consider your request. Remember, presence in the DMOZ directory is desirable, but not obligatory. Do not despair if you fail to get into this directory; it is possible to reach top positions in search results without it – many sites do.

Link exchange

The essence of link exchanges is that you use a special page to publish links to other sites and get similar backlinks from them. Search engines do not like link exchanges because, in many cases, they distort search results and do not provide anything useful to Internet users. However, it is still an effective way to increase link popularity if you observe several simple rules:

- Exchange links with sites that are related by topic. Exchanging links with unrelated sites is ineffective and unpopular.
- Before exchanging, make sure that your link will be published on a “good” page. This means that the page must have a reasonable PageRank (3-4 or higher is recommended), it must be available for indexing by search engines, the link must be direct, the total number of links on the page must not exceed 50, and so on.
- Do not create large link directories on your site. The idea of such a directory seems attractive because it gives you an opportunity to exchange links with many sites on various topics: you will have a topic category for each listed site. However, when optimizing your site you are looking for link quality rather than quantity, and there are some potential pitfalls. No seo aware webmaster will publish a quality link to you if he receives a worthless link from your directory “link farm” in return. Generally, the PageRank of pages from such directories leaves a lot to be desired, and search engines do not like these directories at all. There have even been cases where sites were banned for using such directories.
- Use a separate page on the site for link exchanges. It must have a reasonable PageRank, it must be indexed by search engines, and so on. Do not publish more than 50 links on one page (otherwise search engines may fail to take some of them into account). This will help you to find other seo aware partners for link exchanges.
- Search engines try to track mutual links. That is why you should, if possible, publish backlinks on a domain/site other than the one you are trying to promote. The best variant is when you promote the resource site1.com and publish backlinks on the resource site2.com.
- Exchange links with caution. Webmasters who are not quite honest will often remove your links from their resources after a while. Check your backlinks from time to time.
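Checking backlinks can be partly automated. A naive sketch (the URLs are hypothetical, and a real check should use an HTML parser, since partner pages may vary quoting and markup):

```python
def backlink_present(partner_html, my_url):
    """Naive check: does the partner page still contain a direct
    HTML link to my_url? Substring matching only; a parser is safer."""
    return 'href="%s"' % my_url in partner_html

partner_page = '<p>Partners: <a href="http://site1.com">Car sales</a></p>'
print(backlink_present(partner_page, "http://site1.com"))  # True
print(backlink_present(partner_page, "http://site2.com"))  # False
```

In practice you would fetch each partner page periodically and report the exchanges whose links have quietly disappeared.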

Press releases, news feeds, thematic resources

This section is about site marketing rather than pure seo. There are many information resources and news feeds that publish press releases and news on various topics. Such sites can supply you with direct visitors and also increase your site's popularity. If you do not find it easy to create a press release or a piece of news, hire copywriters – they will help you find or create something newsworthy.

Look for resources that deal with topics similar to your own site's. You may find many Internet projects that are not in direct competition with you, but which share the same topic as your site. Try to approach the site owners; it is quite possible that they will be glad to publish information about your project.

One final tip for obtaining inbound links – try to create slight variations in the inbound link text. If all inbound links to your site have exactly the same link text and there are many of them, search engines may flag this as a spam attempt and penalize your site.

External ranking factors II

Relevance of referring pages

As well as link text, search engines also take into account the overall information content of each referring page. Example: suppose we are using seo to promote a car sales resource. In this case, a link from a site about car repairs will have much more importance than a similar link from a site about gardening. The first link is published on a resource with a similar topic, so it will be more important for search engines.

Google PageRank – theoretical basics

The Google company was the first to patent a system of taking inbound links into account. The algorithm was named PageRank. In this section, we will describe this algorithm and how it can influence search result ranking.

PageRank is estimated separately for each web page and is determined by the PageRank (citation) of the other pages referring to it. It is a kind of “virtuous circle.” The main task is to find the criterion that determines page importance. In the case of PageRank, it is the possible frequency of visits to a page.

I shall now describe how a user's behavior when following links to surf the network is modeled. It is assumed that the user starts viewing sites from some random page. Then he or she follows links to other web resources. There is always a possibility that the user may leave a site without following any outbound link and start viewing documents from another random page. The PageRank algorithm estimates the probability of this event as 0.15 at each step. The probability that our user continues surfing by following one of the links available on the current page is therefore 0.85, assuming that all links are equal. If he or she continues surfing indefinitely, popular pages will be visited many more times than less popular pages.

The PageRank of a specified web page is thus defined as the probability that a user may visit that page. It follows that the sum of probabilities over all existing web pages is exactly one, because the user is assumed to be visiting exactly one Internet page at any given moment. Since it is not always convenient to work with these probabilities, the PageRank can be mathematically transformed into a more easily understood number for viewing. For instance, we are used to seeing a PageRank number between zero and ten on the Google Toolbar.

According to the ranking model described above:
- Each page on the Net (even if there are no inbound links to it) initially has a PageRank greater than zero, although it will be very small. There is a tiny chance that a user may accidentally navigate to it.
- Each page that has outbound links distributes part of its PageRank to the referenced pages. The PageRank contributed to these linked-to pages is inversely proportional to the total number of links on the linked-from page – the more links it has, the lower the PageRank allocated to each linked-to page.
- A “damping factor” is applied to this process so that the total distributed PageRank is reduced by 15%. This is equivalent to the probability, described above, that the user will not visit any of the linked-to pages but will navigate to an unrelated website.

Let us now see how this PageRank process might influence the ranking of search results. We say “might” because the pure PageRank algorithm just described has not been used in the Google algorithm for quite a while now; we will discuss a more current and sophisticated version shortly. There is nothing difficult about the PageRank influence – after the search engine finds a number of relevant documents (using internal text criteria), they can be sorted according to PageRank, since it is logical to suppose that a document with a larger number of high-quality inbound links contains the most valuable information. Thus, the PageRank algorithm "pushes up" those documents that are most popular outside the search engine as well.
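The random-surfer model described above can be sketched as a small power-iteration computation. This is a toy illustration (the three-page graph is invented), not Google's actual implementation:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank over a dict mapping each page to the pages it links to."""
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}                 # start from a uniform distribution
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}  # the 15% "random jump"
        for p, outs in links.items():
            if outs:
                share = damping * pr[p] / len(outs)  # rank split among outbound links
                for q in outs:
                    new[q] += share
            else:                                    # dangling page: spread rank evenly
                for q in pages:
                    new[q] += damping * pr[p] / n
        pr = new
    return pr

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
# The ranks sum to 1; page C, with two inbound links, outranks page B
```

Note how each iteration preserves the total probability of 1 across all pages, matching the probabilistic interpretation given above.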

Google PageRank – practical use

Currently, PageRank is not used directly in the Google algorithm. This is to be expected since pure PageRank characterizes only the number and the quality of inbound links to a site, but it completely ignores the text of links and the information content of referring pages. These factors are important in page ranking and they are taken into account in later versions of the algorithm. It is thought that the current Google ranking algorithm ranks pages according to thematic PageRank. In other words, it emphasizes the importance of links from pages with content related by similar topics or themes. The exact details of this algorithm are known only to Google developers.

You can determine the PageRank value for any web page with the help of the Google ToolBar that shows a PageRank value within the range from 0 to 10. It should be noted that the Google ToolBar does not show the exact PageRank probability value, but the PageRank range a particular site is in. Each range (from 0 to 10) is defined according to a logarithmic scale.

Here is an example. Each page has a real PageRank value known only to Google. To derive the displayed PageRank range for their ToolBar, they use a logarithmic scale, as shown in this table:

Real PR        ToolBar PR
1-10           1
10-100         2
100-1,000      3
1,000-10,000   4
This shows that the PageRank ranges displayed on the Google ToolBar are not all equal. It is easy, for example, to increase PageRank from one to two, while it is much more difficult to increase it from six to seven.
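Assuming, purely for illustration, a base-10 logarithmic scale like the table above (the real base and thresholds are known only to Google), the mapping might look like:

```python
import math

def toolbar_pr(real_pr, base=10):
    """Map a hypothetical 'real' PageRank onto the 0-10 ToolBar scale,
    assuming a logarithmic relationship. The base is a guess."""
    if real_pr < 1:
        return 0
    return min(10, int(math.log(real_pr, base)) + 1)

print(toolbar_pr(5))     # 1
print(toolbar_pr(50))    # 2
print(toolbar_pr(5000))  # 4
```

Under such a scale, each ToolBar step requires roughly ten times more "real" PageRank than the previous one, which is why moving from six to seven is so much harder than moving from one to two.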

In practice, PageRank is mainly used for two purposes:

1. A quick check of a site's popularity.

PageRank does not give exact information about referring pages, but it allows you to quickly and easily get a feel for a site's popularity level and to follow trends that may result from your seo work. You can use the following rule-of-thumb measures for English-language sites: PR 4-5 is typical for most sites of average popularity; PR 6 indicates a very popular site; PR 7 is almost unreachable for a regular webmaster, and you should congratulate yourself if you manage to achieve it; PR 8, 9 and 10 can only be achieved by the sites of large companies such as Microsoft and Google. PageRank is also useful when exchanging links and in similar situations: you can compare the quality of the pages offered in the exchange with pages from your own site to decide whether the exchange should be accepted.

2. Evaluation of the competitiveness level for a search query is a vital part of seo work.

Although PageRank is not used directly in the ranking algorithms, it allows you to indirectly evaluate relative site competitiveness for a particular query. For example, if the search engine displays sites with PageRank 6-7 in the top search results, a site with PageRank 4 is not likely to get to the top of the results list using the same search query. It is important to recognize that the PageRank values displayed on the Google ToolBar are recalculated only occasionally (every few months) so the Google ToolBar displays somewhat outdated information. This means that the Google search engine tracks changes in inbound links much faster than these changes are reflected on the Google ToolBar.

External ranking factors I

Why inbound links to sites are taken into account

As you can see from the previous section, many factors influencing the ranking process are under the control of webmasters. If these were the only factors, it would be impossible for search engines to distinguish between a genuine high-quality document and a page created specifically to achieve high search ranking but containing no useful information. For this reason, an analysis of the inbound links to the page being evaluated is one of the key factors in page ranking. This is the one factor that is not controlled by the site owner.

It makes sense to assume that interesting sites will have more inbound links, because owners of other sites on the Internet will tend to publish links to a site if they think it is a worthwhile resource. The search engine uses this inbound link criterion in its evaluation of document significance.

Therefore, two main factors influence how pages are stored by the search engine and sorted for display in search results:
- Relevance, as described in the previous section on internal ranking factors.
- The number and quality of inbound links, also known as link citation, link popularity or citation index. This is described in the next section.

Link importance (citation index, link popularity)

You can easily see that simply counting the number of inbound links does not give us enough information to evaluate a site. It is obvious that a link from www.microsoft.com should mean much more than a link from some homepage like www.hostingcompany.com/~myhomepage.html. You have to take into account link importance as well as number of links. Search engines use the notion of citation index to evaluate the number and quality of inbound links to a site. Citation index is a numeric estimate of the popularity of a resource expressed as an absolute value representing page importance. Each search engine uses its own algorithms to estimate a page citation index. As a rule, these values are not published. As well as the absolute citation index value, a scaled citation index is sometimes used. This relative value indicates the popularity of a page relative to the popularity of other pages on the Internet. You will find a detailed description of citation indexes and the algorithms used for their estimation in the next sections.

Link text (anchor text)

The link text of any inbound site link is vitally important in search result ranking. The anchor (or link) text is the text between the HTML tags «A» and «/A» and is displayed as the text that you click in a browser to go to a new page. If the link text contains appropriate keywords, the search engine regards it as an additional and highly significant recommendation that the site actually contains valuable information relevant to the search query.
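For example (the URL is hypothetical), compare an anchor that wastes this signal with one that carries the keywords:

```html
<!-- Weak: the anchor text tells the search engine nothing -->
<a href="http://example.com/monitors/">click here</a>

<!-- Better: the anchor text contains the relevant keywords -->
<a href="http://example.com/monitors/">Samsung 710N LCD monitor</a>
```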

Selasa, 02 September 2008

II. Site structure

Number of pages

The general seo rule is: the more, the better. Increasing the number of pages on your website increases the visibility of the site to search engines. Also, if new information is being constantly added to the site, search engines consider this as development and expansion of the site. This may give additional advantages in ranking. You should periodically publish more information on your site – news, press releases, articles, useful tips, etc.

Navigation menu

As a rule, any site has a navigation menu. Use keywords in menu links; this will give additional seo significance to the pages the links refer to.

Keywords in page names

Some seo experts consider that using keywords in the name of an HTML page file may have a positive effect on its search result position.

Avoid subdirectories

If there are not too many pages on your site (up to a couple of dozen), it is best to place them all in the root directory of your site. Search engines consider such pages to be more important than ones in subdirectories.

One page – one keyword phrase

For maximum seo try to optimize each page for its own keyword phrase. Sometimes you can choose two or three related phrases, but you should certainly not try to optimize a page for 5-10 phrases at once. Such phrases would probably produce no effect on page rank.

Seo and the Main page

Optimize the main page of your site (domain name, index.html) for the word combinations that are most important. This page is the most likely to get to the top of search engine lists. My seo observations suggest that the main page may account for up to 30-40% of the total search traffic for some sites.

III. Common seo mistakes

Graphic header

Very often sites are designed with a graphic header. Often, we see an image of the company logo occupying the full page width. Do not do it! The upper part of a page is a very valuable place where you should insert your most important keywords for best seo. In the case of a graphic image, that prime position is wasted since search engines cannot make use of images. Sometimes you may come across completely absurd situations: the header contains text information, but to make its appearance more attractive, it is created in the form of an image. The text in it cannot be indexed by search engines and so it will not contribute toward the page rank. If you must present a logo, the best way is to use a hybrid approach – place the graphic logo at the top of each page and size it so that it does not occupy the entire width, then use a text header to make up the rest of the width.

Graphic navigation menu

The situation is similar to the previous one – internal links on your site should contain keywords, which will give an additional advantage in seo ranking. If your navigation menu consists of graphic elements to make it more attractive, search engines will not be able to index the text of its links. If it is not possible to avoid using a graphic menu, at least remember to specify correct ALT attributes for all images.
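A minimal example (the file names are invented): a graphic menu item whose link text is recoverable by search engines through the ALT attribute:

```html
<a href="products.html"><img src="btn_products.gif"
   alt="LCD monitors and accessories"></a>
```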

Script navigation

Sometimes scripts are used for site navigation. As an seo worker, you should understand that search engines cannot read or execute scripts. Thus, a link specified with the help of a script will not be available to the search engine, the search robot will not follow it and so parts of your site will not be indexed. If you use site navigation scripts then you must provide regular HTML duplicates to make them visible to everyone – your human visitors and the search robots.

Session identifier

Some sites use session identifiers. This means that each visitor gets a unique parameter (&session_id=) when he or she arrives at the site. This ID is added to the address of each page visited on the site. Session IDs help site owners to collect useful statistics, including information about visitors' behavior. However, from the point of view of a search robot, a page with a new address is a brand new page. This means that, each time the search robot comes to such a site, it will get a new session identifier and will consider the pages as new ones whenever it visits them.

Search engines do have algorithms for consolidating mirrors and pages with the same content. Sites with session IDs should, therefore, be recognized and indexed correctly. However, it is difficult to index such sites and sometimes they may be indexed incorrectly, which has an adverse effect on seo page ranking. If you are interested in seo for your site, I recommend that you avoid session identifiers if possible.
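One practical workaround is to keep session IDs out of the URLs that robots see, or to normalize incoming URLs to a canonical form. A minimal sketch in Python (the parameter name `session_id` is just an example here; real sites use various names):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def strip_session_id(url, param="session_id"):
    """Remove a session-identifier query parameter so that the same
    page always maps to one canonical URL."""
    parts = urlparse(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != param]
    return urlunparse(parts._replace(query=urlencode(query)))

print(strip_session_id("http://example.com/page.html?session_id=abc123&lang=en"))
# -> http://example.com/page.html?lang=en
```

A site can apply this kind of canonicalization before emitting links, so every visit by a search robot sees the same stable address.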


Redirects

Redirects make site analysis more difficult for search robots, with resulting adverse effects on seo. Do not use redirects unless there is a clear reason for doing so.

Hidden text, a deceptive seo method

The last two issues are not really mistakes but deliberate attempts to deceive search engines using illicit seo methods. Hidden text (when the text color coincides with the background color, for example) allows site owners to cram a page with their desired keywords without affecting page logic or visual layout. Such text is invisible to human visitors but will be seen by search robots. The use of such deceptive optimization methods may result in banning of the site. It could be excluded from the index (database) of the search engine.

One-pixel links, seo deception

This is another deceptive seo technique. Search engines consider the use of tiny, almost invisible, graphic image links just one pixel wide and high as an attempt at deception, which may lead to a site ban.

Internal ranking factors I

Several factors influence the position of a site in the search results. They can be divided into external and internal ranking factors. Internal ranking factors are those that are controlled by seo aware website owners (text, layout, etc.) and will be described next.

I. Web page layout factors relevant to seo

Amount of text on a page

A page consisting of just a few sentences is less likely to get to the top of a search engine list. Search engines favor sites that have a high information content. Generally, you should try to increase the text content of your site in the interest of seo. The optimum page size is 500-3000 words (or 2000 to 20,000 characters).

Search engine visibility increases with the amount of page text because longer pages are more likely to match occasional and rare search queries. This factor can sometimes bring in a large number of visitors.

Number of keywords on a page

Keywords must be used at least three to four times in the page text. The upper limit depends on the overall page size – the larger the page, the more keyword repetitions can be made. Keyword phrases (word combinations consisting of several keywords) are worth a separate mention. The best seo results are observed when a keyword phrase is used several times in the text with all keywords in the phrase arranged in exactly the same order. In addition, all of the words from the phrase should be used separately several times in the remaining text. There should also be some difference (dispersion) in the number of entries for each of these repeated words.

Let us take an example. Suppose we optimize a page for the phrase “seo software” (one of our seo keywords for this site). It would be good to use the phrase “seo software” in the text 10 times, the word “seo” 7 times elsewhere in the text and the word “software” 5 times. The numbers here are for illustration only, but they show the general seo idea quite well.
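Counts like these are easy to check automatically on a draft page. A rough sketch in Python (whole-word, case-insensitive matching; note that the counts for the individual words include their occurrences inside the full phrase):

```python
import re

def keyword_counts(text, phrase):
    """Count the full keyword phrase and each of its words (whole-word,
    case-insensitive). Word counts include occurrences inside the phrase."""
    text_l = text.lower()
    counts = {phrase: len(re.findall(r"\b" + re.escape(phrase.lower()) + r"\b", text_l))}
    for word in phrase.lower().split():
        counts[word] = len(re.findall(r"\b" + re.escape(word) + r"\b", text_l))
    return counts

sample = "Good seo software makes seo easier; choose seo software that fits."
print(keyword_counts(sample, "seo software"))
```

Running a check like this over each page makes it simple to keep the phrase and its separate words near your target counts.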

Keyword density and seo

Keyword page density is a measure of the relative frequency of the word in the text expressed as a percentage. For example, if a specific word is used 5 times on a page containing 100 words, the keyword density is 5%. If the density of a keyword is too low, the search engine will not pay much attention to it. If the density is too high, the search engine may activate its spam filter. If this happens, the page will be penalized and its position in search listings will be deliberately lowered.

The optimum value for keyword density is 5-7%. In the case of keyword phrases, you should calculate the total density of each of the individual keywords comprising the phrases to make sure it is within the specified limits. In practice, a keyword density of more than 7-8% does not seem to have any negative seo consequences. However, it is not necessary and can reduce the legibility of the content from a user’s viewpoint.
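The calculation itself is straightforward. A sketch in Python for the single-word case (a naive whitespace split; real tools also strip punctuation and HTML markup first):

```python
def keyword_density(text, keyword):
    """Occurrences of a single keyword divided by total words, as a percentage.
    Naive: splits on whitespace and ignores punctuation and HTML markup."""
    words = text.lower().split()
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

page = "seo " * 5 + "filler " * 95   # 100 words, 5 of them the keyword
print(keyword_density(page, "seo"))  # -> 5.0
```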

Location of keywords on a page

A very short rule for seo experts – the closer a keyword or keyword phrase is to the beginning of a document, the more significant it becomes for the search engine.

Text format and seo

Search engines pay special attention to page text that is highlighted or given special formatting. We recommend:

- Use keywords in headings. Headings are text highlighted with the «H» HTML tags. The «h1» and «h2» tags are most effective. Currently, the use of CSS allows you to redefine the appearance of text highlighted with these tags, so «H» tags are used less often nowadays, but they are still very important in seo work.
- Highlight keywords with bold fonts. Do not highlight the entire text! Just highlight each keyword two or three times on the page. Use the «strong» tag for highlighting instead of the more traditional «B» bold tag.

«TITLE» tag
This is one of the most important tags for search engines. Make use of this fact in your seo work. Keywords must be used in the TITLE tag. The link to your site that is normally displayed in search results will contain text derived from the TITLE tag. It functions as a sort of virtual business card for your pages. Often, the TITLE tag text is the first information about your website that the user sees. This is why it should not only contain keywords, but also be informative and attractive. You want the searcher to be tempted to click on your listed link and navigate to your website. As a rule, 50-80 characters from the TITLE tag are displayed in search results and so you should limit the size of the title to this length.
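When titles are generated automatically, a length check keeps them inside that display window. A sketch in Python (the 70-character limit is an illustrative middle value within the 50-80 range, not an official search engine figure):

```python
def fit_title(title, limit=70):
    """Return the title unchanged if it fits, otherwise trim it at the
    last word boundary before the limit and add an ellipsis."""
    if len(title) <= limit:
        return title
    return title[:limit].rsplit(" ", 1)[0] + "..."

long_title = "Professional seo services, website promotion and search engine optimization articles"
print(fit_title(long_title))
```

Trimming at a word boundary keeps the displayed snippet readable instead of cutting a keyword in half.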

Keywords in links

A simple seo rule – use keywords in the text of page links that refer to other pages on your site and to any external Internet resources. Keywords in such links can slightly enhance page rank.

«ALT» attributes in images
Any page image has a special optional attribute known as "alternative text," specified using the HTML «ALT» attribute of the image tag. This text is displayed if the browser fails to download the image or if image display is disabled in the browser. Search engines save the value of image ALT attributes when they parse (index) pages, but generally do not use it to rank search results.

Currently, the Google search engine takes into account text in the ALT attributes of those images that are links to other pages. The ALT attributes of other images are ignored. There is no information regarding other search engines, but we can assume that the situation is similar. We consider that keywords can and should be used in ALT attributes, but this practice is not vital for seo purposes.

Description Meta tag

This is used to specify page descriptions. It does not influence the seo ranking process but it is very important. A lot of search engines (including the largest one – Google) display information from this tag in their search results if this tag is present on a page and if its content matches the content of the page and the search query.

Experience has shown that a high position in search results does not always guarantee large numbers of visitors. For example, if your competitors' search result description is more attractive than the one for your site then search engine users may choose their resource instead of yours. That is why it is important that your Description Meta tag text be brief, but informative and attractive. It must also contain keywords appropriate to the page.

Keywords Meta tag

This Meta tag was initially used to specify keywords for pages but it is hardly ever used by search engines now. It is often ignored in seo projects. However, it would be advisable to specify this tag just in case there is a revival in its use. The following rule must be observed for this tag: only keywords actually used in the page text must be added to it.

Senin, 11 Agustus 2008

Common search engine principles

To understand seo you need to be aware of the architecture of search engines. They all contain the following main components:

Spider - a browser-like program that downloads web pages.
Crawler – a program that automatically follows all of the links on each web page.
Indexer - a program that analyzes web pages downloaded by the spider and the crawler.
Database – storage for downloaded and processed pages.
Results engine – extracts search results from the database.

Web server – a server that is responsible for interaction between the user and the other search engine components.

Specific implementations of search mechanisms may differ. For example, the Spider+Crawler+Indexer component group might be implemented as a single program that downloads web pages, analyzes them and then uses their links to find new resources. However, the components listed are inherent to all search engines and the seo principles are the same.

Spider.
This program downloads web pages just like a web browser. The difference is that a browser displays the information presented on each page (text, graphics, etc.) while a spider does not have any visual components and works directly with the underlying HTML code of the page. You may already know that there is an option in standard web browsers to view source HTML code.

Crawler.
This program finds all links on each page. Its task is to determine where the spider should go, either by evaluating the links or according to a predefined list of addresses. The crawler follows these links and tries to find documents not already known to the search engine.

Indexer.
This component parses each page and analyzes its various elements, such as text, headers, structural or stylistic features, special HTML tags, etc.

Database.
This is the storage area for the data that the search engine downloads and analyzes. Sometimes it is called the index of the search engine.

Results Engine.
The results engine ranks pages. It determines which pages best match a user's query and in what order the pages should be listed. This is done according to the ranking algorithms of the search engine.

It follows that page rank is a valuable and interesting property and any seo specialist is most interested in it when trying to improve his site search results. In this article, we will discuss the seo factors that influence page rank in some detail.

Web server.
The search engine web server usually hosts an HTML page with an input field where the user can specify the search query he or she is interested in. The web server is also responsible for displaying search results to the user in the form of an HTML page.

History of search engines

In the early days of Internet development, its users were a privileged minority and the amount of available information was relatively small. Access was mainly restricted to employees of various universities and laboratories who used it to access scientific information. In those days, the problem of finding information on the Internet was not nearly as critical as it is now.

Site directories were one of the first methods used to facilitate access to information resources on the network. Links to these resources were grouped by topic. Yahoo, opened in April 1994, was the first project of this kind. As the number of sites in the Yahoo directory inexorably increased, the developers of Yahoo made the directory searchable. Of course, it was not a search engine in its true form because searching was limited to those resources whose listings were put into the directory. It did not actively seek out resources, and the concept of seo was yet to arrive.

Such link directories have been used extensively in the past, but nowadays they have lost much of their popularity. The reason is simple – even modern directories with lots of resources only provide information on a tiny fraction of the Internet. For example, the largest directory on the network is currently DMOZ (the Open Directory Project). It contains information on about five million resources. Compare this with the Google search engine database containing more than eight billion documents.

The WebCrawler project started in 1994 and was the first full-featured search engine. The Lycos and AltaVista search engines appeared in 1995, and for many years AltaVista was the major player in this field. In 1997, Sergey Brin and Larry Page created Google as a research project at Stanford University. Google is now the most popular search engine in the world. Currently, there are three leading international search engines – Google, Yahoo and MSN Search. They each have their own databases and search algorithms.
Many other search engines use results originating from these three major search engines and the same seo expertise can be applied to all of them. For example, the AOL search engine (search.aol.com) uses the Google database while AltaVista, Lycos and AllTheWeb all use the Yahoo database.

Senin, 28 April 2008

All About Title Tags

What Is a Title Tag?

The title tag has been — and probably will always be — one of the most important factors in achieving high search engine rankings.

In fact, fixing just the title tags of your pages can often generate quick and appreciable differences to your rankings. And because the words in the title tag are what appear in the clickable link on the search engine results page (SERP), changing them may result in more clickthroughs.

Search Engines and Title Tags

Title tags are definitely one of the “big three” as far as the algorithmic weight given to them by search engines; they are equally as important as your visible text copy and the links pointing to your pages — perhaps even more so.

Do Company Names Belong in the Title Tag?

This is one of the most common questions asked about titles. The answer is a resounding YES! I’ve found that it’s fine to place your company name in the title, and *gasp*, even to place it at the beginning of the tag! In fact, if your company is already a well-known brand, I’d say that it’s essential. Even if you’re not a well-known brand yet, chances are you’d like to eventually be one. The title tag gives you a great opportunity to further this cause.

This doesn’t mean that you should put *just* your company name in the title tag. Even the most well-known brands will benefit from a good descriptive phrase or two added, as it will serve to enhance your brand as well as your search engine rankings. The people who already know your company and seek it out by name will be able to find you in the engines, and so will those who have never heard of you but seek the products or services you sell.

Title Tags Should Contain Specific Keyword Phrases

For example, if your company is "Johnson and Smith Inc.," a tax accounting firm in Texas, you shouldn’t place only the words "Johnson and Smith Inc." in your title tag, but instead use something like "Johnson and Smith Inc. Tax Accountants in Texas."

As a Texas tax accountant, you would want your company’s site to appear in the search engine results for searches on phrases such as "Texas tax accountants" and "CPAs in Texas." (Be sure to do your keyword research to find the best phrases!) You would need to be even more specific if you prefer to work with people only in the Dallas area. In that case, use keywords such as "Dallas tax accountants" in your site’s title tags.

Using our Dallas accountant’s example, you might create a title tag as follows:

Johnson and Smith Tax Accountants in Dallas

or you might try something like this:

Johnson and Smith Dallas CPAs

However, there’s more than enough space in the title tag to include both of these important keyword phrases. (I like to use about 10-12 words in my title tags.)

One way to do it would be like this:

Johnson and Smith - Dallas Tax Accountants - CPAs in Dallas, TX

I’ve always liked the method of separating phrases with a hyphen; however, in today’s competitive marketplace, how your listing appears in the SERPs is a critical aspect of your SEO campaign. After all, if you have high search engine rankings but your targeted buyers aren’t clicking through, it won’t do you much good.

These days I try to write compelling titles as opposed to simply factual ones, if I can. But it also depends on the page, the type of business, the targeted keyword phrases, and many other factors. There’s nothing wrong with the title tag in my above example. If you were looking for a tax accountant in Dallas and saw that listing at Google, you’d probably click on it.

Still, you could make it a readable sentence like this:

Johnson and Smith are Tax Accountants and CPAs in Dallas, TX

I’m not as thrilled with that one because I had to remove the exact phrase "Dallas Tax Accountants," as it wouldn’t read as well if it said:

Johnson and Smith are Dallas Tax Accountants and CPAs in Dallas, TX

It sounds redundant that way, as if it were written only for the search engines.

In the end, it’s really a personal preference. Don’t make yourself crazy trying to create the perfect title tag, as there’s just no such thing. Most likely, either of my examples would work fine. The best thing to do would be to test different ones and see which rank higher and which convert better. It may very well be that the second version doesn’t rank as well, but gets clicked on more, effectively making up the difference.

Use Your Visible Text Copy As Your Guide

I prefer not to create my title tags until the copy on the page has been written and optimized. I need to see how the copywriter integrated the keyword phrases into the text to know where to begin. If you’ve done a good job with your writing (or better yet, hired a professional SEO copywriter), you should find all the information you need right there on your page. Simply choose the most relevant keyword phrases that the copy was based on, and write a compelling title tag accordingly. If you’re having trouble with this and can’t seem to get a handle on what the most important phrases are for any given page, you probably need to rewrite the copy.

I recommend that you *don’t* use an exact sentence pulled from your copy as your title tag. It’s much better to have a unique sentence or a compelling string of words in this tag. This is why you have to watch out for certain development tools. Some content management systems (CMS) and blog software such as WordPress automatically generate the title tag from information you provide elsewhere. In WordPress, for example, the default is to use your blog name, plus whatever you named the page. The problem is that this same info is also used as the headline, plus in the navigational link to the page. Depending on your setup, it could also be the URL for that page. Very rarely would you want all those to be the same.

The good news is that most of today’s CMS and blog software have workarounds so that you can customize your title tags. For WordPress, I recommend installing the "SEO Title Tag" plug-in developed by Stephan Spencer. It works like a charm on all my WordPress sites.

The Art of SEO

As much as Google *pretends* to like SEOs by inviting us to parties at the Googleplex and posting on SEO forums, the bottom line is that they don’t like us — or rather, they don’t like what we do. Google wants to find the best, most relevant sites for the search query at hand all by themselves. Perhaps someday they will actually be able to do that, but for now, they still need our help, whether they like it or not.

Unfortunately, unscrupulous SEOs have given Google good reasons not to like us. Because of search engine spammers, Google is constantly changing their ranking criteria and is always on the lookout for the telltale signs of SEO on any given site. It’s not a huge stretch to say that they may even downgrade the sites that they believe have been SEO’d.

If you think that having your keyword phrases “in all the right places for SEO” is a good thing, think again! You’re essentially telling Google, “Hey look…my site has been SEO’d!” To which they reply, “Thanks so much for letting us know… ZAP … see ya later!” Doesn’t matter if your site is the most relevant (in your mind) to the search query. Doesn’t matter that you’ve placed your keyword phrases strategically throughout the site.

Stuff that worked like a charm for many people in the early years of SEO may actually hurt rather than help now. As to what might trigger an SEO “red flag,” my guess is that it’s a combination of things. Like, if you have a certain number of traditional SEO factors on any given page, those may set off some Google warning bells (otherwise known as a spam filter).

Some of the traditional SEO formulaic elements that you may have been taught to use include putting the keyword phrase:

* in the domain name
* in the file name
* in the Title tag
* in the Meta description tag
* in the Meta keyword tag
* in the image alt attributes
* in an H1 (or any H) tag
* as the first words on the page
* in bold and/or italics or a different color
* multiple times in the first paragraph or twice on the page
* in the copy in every single spot on the page where it might possibly make sense to use it, and
* in all the hyperlinks pointing to a page.

If you put the same keyword phrase in many of those spots, you might very well trigger a spam filter. Since it’s difficult to determine how many and which combinations of those things might trigger the filter, the best advice I can give you is to do your SEO without any particular formula in mind.

That’s how I’ve always done it and it’s always worked because every site is unique and has different SEO needs.

Unfortunately, it’s difficult to describe this type of SEO to others, as people are always looking for the magic formula. For as long as I’ve been doing SEO (over 12 years now), I’ve had it in the back of my mind that I wouldn’t want to tip off the engines that my sites were SEO’d. This is one of the reasons I’ve never used keyword-rich domain names or file names. That’s probably the most obvious SEO thing you can do.

The most important aspect to being a good SEO is creativity. You shouldn’t worry too much about the specifics of putting keyword phrases here and there, and again over there. Not every page needs an H1 heading with keyword phrases in it. If your page isn’t designed to use H1 headings, you don’t need to change it to use one just for SEO purposes. And many images don’t really and truly make sense with a keyword phrase in their alt attribute (alt tag). Don’t force one to be there just for the search engines.

Most importantly for Google (and for your users), when it comes to your page copy and how you use your visible keyword phrases, less is definitely more. Please don’t read my Nitty-gritty report and then put the same keyword phrase in every single available spot on your page that you can find. My report is supposed to help you think about a few places you may have missed because you weren’t thinking about being descriptive when you originally wrote the copy. You can definitely have too much of a good thing.

A first paragraph on a page that has, say, 4 sentences, should not have 10 instances of your keyword phrase. It will look and sound dumb. I know that I have stressed this in my conference presentations and in our High Rankings seminars, but no matter how many times I say this, people don’t quite grasp the importance of working this way. If your copy reads poorly to a human, and does not come across as natural professional copywriting, the search engines won’t like it either.

When you do SEO, you don’t follow a guidebook. Think like a search engineer and consider all the possible things they might have to combat both now and in the future. Always optimize for 3 or 4 or even up to 5 phrases, and spread them out throughout the entire page. Never, ever, ever think that it’s the first paragraph that matters and stuff ‘em all in there. There should be an equal distribution throughout the entire page, and you should never use the phrases so much that you hear them constantly when you read it.

If you’ve done it right, an everyday user should not have any idea that a page has been SEO’d. A trained SEO should be able to spot what your keyword phrases are, but it shouldn’t be glaringly obvious. Last, but not least, hire a professional copywriter to work on the important pages of your site. This is the best investment you can make for your site and your business. Even if you don’t want to hire an SEO, you absolutely MUST hire a professional copywriter. You need someone who really and truly understands target audiences and how to speak to them about the benefits of what you offer. You can easily teach someone like that the SEO writing part.

Hope this helps to give you some ideas on how you might get out of formula-SEO mode and start doing more creative SEO. More than ever, SEO is much more of an art than a science. The science is only a small portion of it.

Minggu, 09 Maret 2008

What Is Web Analytics and How Does It Help a Website Owner?

One of my clients asked how to read an analytics report: what is this, and what is that? Those questions prompted me to write a few notes about analytics reports that should be beneficial to my clients and the readers of this blog.


Web analytics is a series of reports of quantitative indicators of visitor behavior on a website. It tracks visitors' movements and the reasons for their visits. In other words, it provides a clear report of your website's performance and trends. From these reports and trends, the website owner can create and implement more effective programs to increase sales.

Web analytics data is collected in two basic ways:

1. from the server's logfiles, and
2. by tagging each webpage with a JavaScript snippet.

A third web analytics method is a combination of the two, whereby more relevant data can be produced than is possible with either method alone.

Web Analytics Terms:

Number of visits means how many visitors landed on any page of your website from any source. In the case of server log files, several files will be logged for each visit, while the page-tagging script only counts the page as a whole as seen by the visitor. In either case, the web analytics data will clearly identify whether a visitor is new or has come before.

Page views means something different for the two web analytics methods. While the tagging script counts each whole page as one request, the logfile records multiple hits (one for each file, including images, .js and .css) within a single page view.

Hits denotes requests for files from the server and is recorded only in the logfile.

Unique visitors means new visitors; logfiles record this as well.

Other metrics recorded include:

- the length of time a visitor spends viewing the website (visit duration),
- the keyword phrase used to arrive at the website,
- the unique IP address, and therefore the country from which the visit originated, and
- the types of files requested, etc.

Data transfer to and from the server is always recorded in server's logfile with clockwork precision. And these files can be easily viewed in web analytics programs such as Webalizer, Awstats, etc. which analyze raw logfile data and portray valuable visitor information in easy-to-follow graphics.

What is the difference between logfile analysis and page tagging?

1. Logfile analysis is usually already available on the server. Page tagging is an outsourced option, which means that visitors' data is captured by the provider's remote server.
2. Page-tagging reports can usually be viewed only on the provider's website; Google Analytics and ClickTracks are examples of page-tagging web analytics.
3. Since page tagging requires JavaScript to run on every webpage, there is always a possibility that some visitors' browsers will not allow the script to run. Logfiles have no such issue.
4. Logfiles record the transfer of all files, including images and scripts, so certain parameters such as hits and page views are not as accurate as with page-tagging web analytics.
5. Logfiles record visits by search engine robots, while page tagging does not.
6. Logfile web analytics record failed visits too. Page tagging counts a request only when a webpage is successfully displayed.
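The logfile side of this comparison is easy to inspect by hand: each line of a server access log in the Common Log Format describes one hit. A minimal parser sketch in Python (field layout per the Common Log Format; the sample line is invented for illustration):

```python
import re

# Common Log Format: host ident authuser [timestamp] "request" status bytes
LOG_LINE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) (?P<size>\S+)'
)

def parse_log_line(line):
    """Return the fields of one logfile hit, or None if the line is malformed."""
    m = LOG_LINE.match(line)
    return m.groupdict() if m else None

hit = parse_log_line(
    '203.0.113.7 - - [10/Feb/2008:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326'
)
# Every matched line is a "hit"; counting only page paths (e.g. .html)
# rather than images and scripts approximates page views.
```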

Web analytics is a form of visitor feedback that is available at all times. These reports are a great source for analyzing the relative strengths and weaknesses of your website. From them, you can find out which pages are visited most and which keywords are most relevant to your website.

Please remember, no one can guarantee the number one position on any search engine as far as search engine optimization is concerned. However, when experienced professionals do their best, follow proven procedures and maintain the required quality and quantity of work, there is every chance of bringing your website to a top position on a search engine. For more information, visit our quality search engine optimization blog.

Minggu, 10 Februari 2008

Guidelines For Effective Video Production:

The nature of advertising and public relations has witnessed massive change in the last few decades. With audio-visual presentations, persuasive marketing has become easier. As a result, marketing strategies have evolved and, to be precise, the marketing mix has taken on a new dimension. Video production became an essential tool for product promotion, and at present it is almost indispensable. Advertising and public relations agencies across the world have focused more on creating television commercials, documentaries, audio-visual presentations, etc. Both audio and video involve a high level of creativity, and broadcast scripts are written by skilled scriptwriters who can create scripts to the specifications of corporate or individual clients. Besides video production, many agencies also employ corporate speechwriters who write scripts about different domains on the basis of extensive research.
Different individuals play specific roles in video production, so it is a good idea to have an overview of the steps and the responsibilities of the creative people who make audio-visual commercials, documentaries and presentations. Photographers and camera operators shoot the clips and footage that need to be incorporated into a project, often following specifications and guidelines given by the director. Video editing is an intricate process; the production control room (PCR) is where video editing is usually done. The camera control unit helps monitor and adjust the footage to make a proper video. The vision mixer helps link scenes, and the audio mixer facilitates better use of background music. Special sound engineers are employed for better sound control.
Few important factors that need to be considered during video production are as follows:
- Target group
- Length of the presentation/documentary/commercial
- Style of audio-visual presentation
- Personality reflection
- Time of use of the video
Nowadays, automatic editing control units are in use and plenty of software is available to help with digital editing. Video production is now highly technology-dependent and less complicated than it used to be. In many cases, for proper video production, a clear comparison with other video presentations and commercials is necessary. The entire process is done through teamwork and co-ordination among different people. A few popular types of video productions are made for the following purposes:
- Informational
- Marketing
- Documentaries
- Training
- Recruitment
- Fund raising
- Medical videos
- Multimedia

For the purpose of corporate video production and corporate speechwriting, it is prudent to select a reputable advertising or public relations agency. The reputable ones use the latest equipment and offer top-notch products. It is always advisable to aim for good video quality. Reputable video production companies employ skilled and experienced professionals. Scripting for documentaries is indeed important, and a wrong script can ruin the entire production. There are plenty of online resources that help you learn more about video production.

The Incredibles in Google Trends

Have you heard about Google Trends? Probably not - it is not very well known among Google users.

Google Trends is a tool that shows how often a keyword is searched for in Google. The keyword must have a minimum search volume; otherwise the answer you receive will be: "Your terms... do not have enough search volume to show graphs." It also gives you the option to compare searches for different keywords: just enter the keywords in the search box separated by commas. You can see the trend history, choose a region or a city, and see in which cities a term was searched for most.

Hot Trends shows you what people are searching for. It is amazing to see that in the US yesterday, November 22nd, 2007 (though of course you can change the date if you would like to see another day), people were very, very interested in... 'national dog show'. 'circuit city' was in 4th position - of course, it was Thanksgiving!!! - 'how to curve a turkey' was 9th and 'how long to cook a turkey' was 100th (I thought everyone knew that). After you select a term you can see the web results and the news items in which the term appeared.

The top 100 is funny (and unpredictable most of the time - I can't believe what people are searching for!), but the tool itself is serious. Google Trends is really useful for SEO experts.

Senin, 04 Februari 2008

How Businesses Fight for the Best Online Places

It is no longer enough to launch a website and wait for revenues. How can we be sure it is correctly positioned among billions of aggressive competitors?

Due to the amazing rate of development of the Internet and the continuous attempts of many companies to keep up with it, the degree to which companies depend on their position on the World Wide Web has become impressive. Many businesses depend primarily on their relationships with websites and search engines. While several years ago everything came down to shiny premises and a good reputation, search engine optimization and internet relations have now become essential features of every successful modern business.

The Internet is a medium with enormous potential, but there are also billions of people and businesses willing to take advantage of it. With the great number of websites live on the Internet, the task of getting one of them noticed often seems hopeless. Many businesses have realized this and now offer solutions to help companies and people cut the time and effort needed to maximize their position, though not always successfully. For instance, Ficstar Software, Inc. (ficstar.com), a successful Toronto-based provider of powerful web data extraction and data mining solutions, is now offering companies easier solutions for all their web-related activities, including those intended to improve a firm's or website's standing and popularity within this vast medium.

Two of the company's recent products are particularly worth paying attention to, because they go beyond traditional data mining solutions and offer the something extra needed to keep a business's contacts updated and make them work for it. These are the Ficstar Website Keyword Monitor and the Search Engine Ranking Tracker.

The Ficstar Website Keyword Monitor is a highly efficient solution which can be configured to monitor target sites for any content updates and modifications of text, metatags, keywords, page layouts, dates and times of posted content. The updated content can be retrieved in virtually any format, depending on customers' needs or preferences. The Website Keyword Monitor uses pre-defined keywords and phrases to mirror and scan entire sites, and the digital files stored on them. Any page with the appropriate keyword combinations, or related forum threads or blogs, will be dynamically tracked for changes. This solution allows companies to monitor competition, news or field articles, any market data, or prevent unauthorized use of trademarks or any other violations - in short, anything a company needs to know about what is going on in the World Wide Web that concerns them.

And while this solution ensures you know what is going on around you, the Search Engine Ranking Tracker makes sure others know you are out there, working, and successful. Now that many companies rely on search engine optimization (SEO) to help them increase visibility, improve branding, and boost sales, it is essential to monitor one's own performance. Customer surveys are not as easy on the Internet as they are with real-life products and traditional services. A trouble-free and customized tool proves very helpful and efficient in meeting this need. With Ficstar's Search Engine Ranking Tracker, companies can automatically track rankings in all major search engines and directories, as well as positions in pay-per-click campaigns, thus easily gathering all the information necessary for an efficient business.

Services such as the Website Keyword Monitor and the Search Engine Ranking Tracker may now seem like surprising solutions that make life easier. Given the fast rate of development of the Internet, the ever-increasing number of websites, and the growing competition among them, they will certainly soon become a must for every business that wants its website to find its rightful place on the web and keep it.

Selasa, 15 Januari 2008

Manual Website Submission To Free Directories

"How to submit my site to web directories?" - is the question I would like to answer in this article.

As you know, link popularity is an important factor for high search engine rankings. For a new site, submission to general or specialized web directories is a good way to build links and therefore increase your site's search engine rankings. Of course, do not expect miracles; directory submission is only the starting point in link building.

Prepare your website listing from the beginning. Most directories ask for a title, description, keywords, contact name, phone and email, and business address. Write a few different descriptions and titles containing the keywords you would like to rank for. Keep in mind that every directory is different, even if they look the same! Most of them have strict submission guidelines; read them very carefully and follow them. In some high-quality directories you will not have a second chance. Make sure that the site's title and description are unique and respect the conditions imposed by the directory's guidelines. Some directories require an email address on the same domain as the site's URL, so keep this in mind when you decide which email address to use. I suggest you create a special email address for directory submission, because you will receive a lot of confirmation emails (and maybe some spam too). Choose the most appropriate category for your site. Do not resubmit the site if the directory's guidelines forbid it - your listing could be deleted.

Increase Return Traffic to Strengthen SEO Services

When businesses get into search engine marketing, their main objective is driving new visitors to their website. Almost all optimisation strategies devised by webmasters are centered around the same objective, and this is possibly the greatest mistake they can make. It is true that new visitors pep up the traffic influx of a website; but is it wise to neglect the importance of old visitors?

Apart from catering to new users, webmasters have to formulate strategies to bring back old users in the form of return traffic. All web users have certain favourite websites that they access frequently for daily doses of information and entertainment related to their fields of interest. If webmasters can get their sites to this level, it will definitely spell success for their SEO campaigns.

Web analytics is one service that can be extremely useful in driving return users to your site. It'll help you analyse prevalent online trends and customer behaviour. The log files of your website will also provide a good insight into visitor reactions, thus helping you identify pages that are retaining or repelling users.
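To make this concrete, here is a minimal Python sketch (the function name and the three-line sample log are invented for illustration) of how you might mine a standard combined-format access log for return visitors - IP addresses that show up on more than one day:

```python
from collections import Counter

def count_return_visitors(log_lines):
    """Find visitors (by IP) who appear on more than one day - a rough
    proxy for return traffic in a combined-format access log."""
    seen = set()          # (ip, date) pairs already counted
    visits = Counter()    # number of distinct days each IP appeared
    for line in log_lines:
        parts = line.split()
        if len(parts) < 4:
            continue
        ip = parts[0]
        date = parts[3].lstrip("[").split(":")[0]  # e.g. 10/Jan/2008
        if (ip, date) not in seen:
            seen.add((ip, date))
            visits[ip] += 1
    return {ip for ip, days in visits.items() if days > 1}

log = [
    '1.2.3.4 - - [10/Jan/2008:10:00:00 +0000] "GET / HTTP/1.1" 200 512',
    '1.2.3.4 - - [11/Jan/2008:09:30:00 +0000] "GET / HTTP/1.1" 200 512',
    '5.6.7.8 - - [10/Jan/2008:12:00:00 +0000] "GET / HTTP/1.1" 200 512',
]
print(count_return_visitors(log))  # {'1.2.3.4'}
```

Counting by IP is only an approximation (shared and dynamic addresses blur the picture), but it is a cheap first look at how much of your traffic is coming back.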

People who bookmark your pages are quite likely to come back once in a while. Buyers, prospective or otherwise, will, in all likelihood, visit your site more than once. Return traffic also fosters viral marketing - when users buy something from you, they talk about it to their family, friends and peers, thus publicising your business by word of mouth.

Retaining users is not as easy a task as it sounds. Your website should have appealing content: something that users have not read elsewhere, something that interests them enough to make a return visit, and above all, something that benefits them. You also need to upgrade your site constantly, catering to the latest SEO service guidelines and design techniques. By doing so, you can ensure that users will not find your site boring or static.

It is also imperative that you do away with elements that irk visitors. Things like drop-down menus, scrollers, draggable objects, colours, fonts, etc. have to be dealt with smartly. Review the content of your site periodically; this supports effective search engine optimization. Avoid redirects as much as possible.

When people come back to your website, they would definitely not like to see the same mistakes on your pages. Let your users feel refreshed after every visit to your site. This return traffic will ultimately go on to benefit your campaign in a major way. Use it to your advantage!

Jumat, 11 Januari 2008

How to Create Search Engine Friendly Web Site Copy

Search engines read text and not much else. Because they can’t generally index graphics, search engines rely on the text in web sites to provide information about the site content, which they can compare with search queries.

Webmasters therefore need to use body text on any pages on the site that they want indexed by the search engines and ranked highly for matching search queries. Not graphical text that was created in design software, but actual, visible body text. Not sure if your site uses graphical or body text? A good rule of thumb that I learnt from search engine guru Danny Sullivan is to try and highlight the text with your mouse. If you can drag your mouse over individual words in the text when viewing it in a browser, chances are this is body text and the search engines can read it.

Figure 1

Figure 2

The most important page on which to use body text is the home page. Above is an example of a home page that uses graphical text instead of body text. Figure 1 shows what content the site visitors see, while Figure 2 shows the content a search engine sees and indexes.

How much information about a site’s content does a page like the one above provide a search engine? That’s right, very little. With next to no text to be found, the search engine would have to rely on the page’s Title and META Tags to tell it what the page is about. With such little information to go on, it is unlikely that a search engine would consider this page a relevant match for search queries relating to its content. To remedy this, it is widely recommended that each web page you want listed in search engines should contain at least 250 words of visible body text.
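If you want to check this yourself, the following sketch uses Python's built-in html.parser to count the words a crawler can actually read on a page (the sample page and class name are invented for illustration; real crawlers are more sophisticated, so treat this as a rough gauge against the 250-word guideline):

```python
from html.parser import HTMLParser

class VisibleTextCounter(HTMLParser):
    """Roughly count indexable words: text content only, skipping
    script and style blocks and everything inside tags (images,
    graphical text, etc. contribute nothing)."""
    def __init__(self):
        super().__init__()
        self.words = 0
        self._skip = 0
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1
    def handle_data(self, data):
        if not self._skip:
            self.words += len(data.split())

page = ("<html><body><h1>Miami Florists</h1>"
        "<p>Beautiful floral bouquets for weddings.</p>"
        "<img src='banner.gif'></body></html>")
counter = VisibleTextCounter()
counter.feed(page)
print(counter.words)  # 7 - far short of the recommended 250
```

A page built entirely from graphical text would score close to zero here, which is exactly what the search engine in Figure 2 sees.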

Keyword-Rich Text

While it’s a good idea to use plenty of body text on web pages, if that text doesn’t contain relevant keywords and phrases that people type in to the search engines, there’s not much point, because a site isn’t going to be found for logical search queries anyway. Many web sites make the mistake of including text on their site that is either unrelated to their products and services, or full of marketing-speak like “Internet solutions” or “superior services”. The Internet is plagued with web sites selling particular items without once making reference to those items in their site text. Weird huh?

For a search engine to find a site relevant for a particular search query, it MUST find that search query somewhere in that site. The easiest way to ensure this is to include logical keywords and phrases within the visible text on web pages, as well as in the Title and META tags. The best way for webmasters to find keywords that searchers are actually using is by conducting keyword research of their target market on a site such as Keyword Discovery or WordTracker.

Once it is determined what search terms prospective visitors are commonly typing in to search engines, they can then be compared to the goods and services offered on the site and the body text can be adjusted accordingly. Sites lacking any keyword research tend to use very generic, unfocused body copy, or sales-oriented “hype”. Neither style contributes to high search engine rankings.

Target keywords and search phrases placed strategically throughout your body copy give your pages a much higher ranking potential on search engines for related searches. But it’s not as easy as throwing the keywords into your site text willy-nilly. You must ensure that the keywords are integrated seamlessly so their repetition is unobvious and so that the text flows smoothly for the reader.

Don't compromise the readability of your copy to achieve this - hire an expert copywriter to strike the right balance if need be.
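As a rough aid for striking that balance, here is a small Python sketch (the sock copy, phrases and function name are invented for illustration) that counts how often each target phrase appears in your copy, so you can spot both missing keywords and obvious over-repetition before a reader does:

```python
import re

def keyword_counts(copy, phrases):
    """Count whole-phrase, case-insensitive occurrences of each
    target phrase in the body copy."""
    text = copy.lower()
    return {p: len(re.findall(r"\b" + re.escape(p.lower()) + r"\b", text))
            for p in phrases}

copy = ("Our green wool socks are hand-knitted. "
        "Wool socks keep feet warm; cotton socks suit summer.")
print(keyword_counts(copy, ["wool socks", "cotton socks", "slippers"]))
# {'wool socks': 2, 'cotton socks': 1, 'slippers': 0}
```

A zero count flags a phrase you still need to work in; a suspiciously high count suggests the repetition has become obvious and the copy needs smoothing.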

SEO Copywriting

Before writing your web site copy, you should research potential keywords and phrases that your target audience may use in search engines and then narrow the list down to your priority terms for each page, sorted in order of importance. You should then use those target search terms as a basis for the creation of optimized Title and META tags for each page on your site. Once you’ve done that, it’s time to integrate those same target search terms into your visible web page copy. We call this SEO copywriting. But exactly how do we do it?

Speak to Your Audience

Don’t lose sight of the reader when writing your body copy. Integrating your keywords is important, but not if you are sacrificing the readability of your site and losing the attention of your audience. Put yourself in their shoes like you did when researching your keywords. What are they looking for? What do they need? How will your product/service help them? Does it represent value for money?

Be emotive when describing your products and services. Describe how your product/service will make them feel or look, how it will improve their lives, give them more time etc. Use trigger words that people respond to such as “free”, “success”, “you”, “cash” etc. Not sure what these are? Check out Words That Sell reports. These reports are perfect if you are targeting a specific industry or profession because they define what keywords people in over 38 industries respond to and what they expect when making a buying decision.

Not sure who your audience is or what they’re looking for? Why not ask them? Use a free survey service such as Survey Monkey to learn more about them so you can write “to” them and not “at” them. You could even draft various styles of body copy and obtain feedback from your site visitors to determine which copywriting style works better for them.

Use Easy to Understand Language

The Internet is no place for verbosity. People are in a hurry - they want to find what they seek quickly and easily, with the least hassle possible. You can help them in this quest by ensuring your site pages use simple language and easy-to-grasp concepts throughout. For example, instead of "brand-building web information architects", use "website designers specializing in brand promotion". Keep large chunks of text on each page to a minimum, using bullet points, white space, graphics, lists and sub-headings to break it up and make it easier to read. This rule of thumb is especially important when creating landing pages for pay-per-click and other advertising campaigns.

Use examples to get your main points across or to demonstrate your product benefits. Use the old WIIFM (What's In It For Me?) adage when composing your body copy to keep the user's interests top of mind. Remember your international visitors by incorporating regional word usage (such as organize versus organise or jewelry versus jewellery) and avoid technical jargon that could alienate them. Want your visitor to take a particular action? Spell it out for them in plain English, for example Click here to Buy Now, Subscribe to our free newsletter, Bookmark this page now etc. These references are called “Calls to Action”.

Build Your Copy Around Your Keywords

You should always build your page copy around your keywords and not the other way around. If your existing page copy doesn’t contain any of your target search keywords, you’re going to have to rewrite it! Start from scratch if you have to. The secret is to focus. Search engines aren’t going to rank your web site about socks highly if your body copy talks about foot sizes. You need to get specific. It sounds really obvious, but if you sell socks, make sure your site copy has plenty of references to the word socks! If you sell green wool socks, target the phrase "green wool socks" and not "foot apparel in lovely shades of emerald"! Who's going to search for socks using that phrase?

At the risk of sounding like Dr Seuss, if you want to be found for, big socks, small socks, cotton socks and wool socks, then mention them all. Better still, sort your copy into categories based on your various products and services. If you sell wool socks AND cotton socks, then have a page dedicated to each kind. This allows you to target niche keywords within your copy and meet the search engine's relevancy guidelines for related search queries.

Keyword Integration

So imagine you've added plenty of text to your pages and the copy flows well for the reader. You've researched your keywords and phrases and now you're faced with the dilemma of integrating the keywords into your copy. So how do you satisfy the search engine's craving for keywords without interrupting the copy flow for the reader? The answer is: very carefully.

Let's take a look at a practical example. We have a client that specializes in luxury adventure travel. Before I optimized their site, part of the home page copy read like this:

"We specialize in providing vacations for people who want a personal service. We bring to our efforts a fanatical obsession with quality and exclusivity. We also bring a freshness, an outward-going passion for discovery which justifies our growing reputation as one of the world's top travel providers. We can put together packages that include all adventure activities, accommodation, transport and food".

Extensive Keyword Discovery keyword research for the client had determined that the site should target the following key phrases:

• adventure travel
• best adventure vacations
• tailored travel
• overseas adventure travel
• luxury travel packages

So taking our original home page text, the challenge was to integrate these keywords carefully and naturally so as not to disturb the logical flow of the copy and lose the interest of the visitor. Here's how I did it:

"We specialize in providing the best adventure vacations for people who want a personal and tailored travel service. We bring to our efforts a fanatical obsession with quality and exclusivity. We also bring a freshness, an outward-going passion for discovery which justifies our growing reputation as one of the world's top overseas adventure travel providers. We can put together luxury travel packages that include all adventure activities, accommodation, transport and food".

Note that the key phrase "overseas adventure travel" accommodates the phrase "adventure travel" too. Voila! The search engines are happy because the site contains text content relevant to related search queries, the client is happy because we were able to integrate the keywords without distracting the visitor and I'm happy because I know the site is going to rank highly for the client's target search terms.
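A quick way to double-check this kind of integration is to confirm that every target phrase really appears in the final copy. A minimal Python sketch (reusing the travel example above; the function name is invented):

```python
def missing_phrases(copy, phrases):
    """Return the target phrases that do not yet appear in the copy
    (simple case-insensitive substring check)."""
    text = copy.lower()
    return [p for p in phrases if p.lower() not in text]

optimized = ("We specialize in providing the best adventure vacations for "
             "people who want a personal and tailored travel service. Our "
             "reputation as one of the world's top overseas adventure travel "
             "providers lets us put together luxury travel packages.")
targets = ["adventure travel", "best adventure vacations", "tailored travel",
           "overseas adventure travel", "luxury travel packages"]
print(missing_phrases(optimized, targets))  # [] - every phrase is covered
```

Note how the substring check automatically credits "adventure travel" via "overseas adventure travel", just as described above.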

Now it’s your turn – go tackle your web site copy!

How to Create Search Engine Friendly Title and META Tags (Part 2)

In Part 1 of this article, I defined Title Elements and META Tags and took you step-by-step through how to create an optimized Title Element. Now it's time to create your optimized META Description and META Keywords Tags.

Create Your META Description Tag

Now it's time to create your optimized META Description Tag.

Take your list of target keywords and phrases and open another text file. Again, you can use an existing sample META Description Tag as your template. Let's say our existing description is:

[META name="description" content="Miami Florists create beautiful floral bouquets, arrangements, tributes and displays for all occasions, including weddings, Valentines Day, parties and corporate events. Deliveries throughout Florida."]

You can make your META Description Tag as long as you like, but only a certain portion of it will be indexed and displayed by search engines. According to Danny Sullivan in his article How to Use HTML Meta Tags, 200 to 250 characters of the META Description get indexed, but fewer than that get displayed, depending on the search engine. So you want to make sure all your important keywords appear towards the start of the tag.
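If you want to sanity-check your description length before publishing, here is a small Python helper (the function name is invented, and the 250-character cut-off is just the rough indexing figure mentioned above - actual limits vary by engine):

```python
def check_description(description, limit=250):
    """Report a META description's length, whether it fits within the
    assumed indexing limit, and the portion an engine would keep."""
    length = len(description)
    return length, length <= limit, description[:limit]

desc = ("Miami Florists create beautiful floral bouquets, arrangements, "
        "tributes and displays for all occasions, including weddings, "
        "Valentines Day, parties and corporate events.")
length, fits, indexed_portion = check_description(desc)
print(length, fits)  # well under 250 characters, so nothing gets cut
```

If `fits` comes back False, move your most important keywords into the part of the tag that survives the truncation.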

Now take your list of keywords for the home page in order of importance. For our fictional florist these were:

- florists Miami
- florists Florida
- wedding bouquets

Now you need to create a readable sentence or two describing your web site and incorporating these keywords so they make the best use of the keyword real estate available.

Because search engines often display the contents of the META Description Tag in the search results, it is very important that your sentences make grammatical sense and are enticing enough to encourage readers to click on your link. Let's start with:

If you're seeking a florist in Miami Florida, Funky Florists create unforgettable wedding bouquets, floral arrangements, tributes and displays for all occasions.

Ok, so that's around 150 characters long and gets our three important keyword phrases included. But it’s a bit bland. We need to add something to entice the searcher to click on it. How about:

Order online for a 10 percent discount!

So now we have the following completed META Description Tag:

[META name="description" content="If you're seeking a florist in Miami Florida, Funky Florists create unforgettable wedding bouquets, floral arrangements, tributes and displays for all occasions. Order online for a 10 percent discount!"]

Our new tag is optimized for our keyword phrases, it's around 200 characters in length, it describes our site accurately, it speaks to the reader and it (hopefully) entices them to click on the link and view the site.

Create Your META Keywords Tag

We're almost there. Now it's time to create your optimized META Keywords Tag. Let me stress here that this Tag is quite unimportant in the grand scheme of things. Not many of the search crawlers even support it any more. You can see which ones do on this page. If you have the time and you really want to create META Keywords tags for your pages, then go ahead, but if not, then leave them out of your code altogether. This tag will have very little impact on your overall SEO campaign.

Assuming you do want to create a Keywords tag, take your list of target keywords and phrases and open another text file. Again, you can use an existing sample META Keywords Tag as your template. Let's say our existing Keywords Tag is:

[META name="keywords" content="flowers, roses, weddings bouquets, florists, floral arrangements, flower deliveries, Valentines Day gifts, Christmas decorations, Mother's Day, tributes, wreaths, clutches, sprays, in sympathy, funerals, corporate functions, parties, floral displays, Miami, Florida"]

This tag is simply a list of related keywords. Now take your list of keywords for the home page in order of importance. For our fictional florist these were:

- florists Miami
- florists Florida
- wedding bouquets

Because you have a lot more room in this tag, a good rule of thumb for creating a META Keywords Tag is to include the keywords and phrases you are targeting with your site content, as well as some terms that you don't necessarily want to use in your site copy but are still relevant to the site content. For example, the site copy, TITLE and META description tags would include the most important search keywords, but the META Keywords Tag could be used for keyword variations and combinations that don't appear in the visible site text, but that people may also search for. Examples include plurals, contractions, slang, variations, misspellings, cultural nuances and industry jargon.

For our fictional florist, these may include things like:

- wedding flowers
- roses
- wedding roses
- Valentine's Day roses
- sympathy gifts
- Mother's Day gifts
- funeral wreaths
- flower deliveries
- floral arrangements
- birthday gifts
- flowers
- flowers for wedding
- wedding decorations

So now we have the following draft META Keywords Tag:

[META name="keywords" content="florists Miami, florists Florida, wedding bouquets, wedding flowers, roses, wedding roses, Valentine's Day roses, sympathy gifts, Mother's Day gifts, funeral wreaths, flower deliveries, floral arrangements, birthday gifts, flowers, flowers for wedding, wedding decorations"]

However, when creating your Keywords Tag, you should not repeat any particular keyword more than five times, and you can exclude commas so that all your keywords can be indexed in combination with each other.

So we need to fix the draft tag to remove the excess repetition of the words "flowers" and "weddings". This is easy to do because some of the keyword phrases already incorporate these single generic keywords.

For starters, we can lose the single "flowers" as it is already covered by some of the other phrases like "wedding flowers". Next, we can drop "roses" for the same reason. Then we can combine some keyword phrases together to save space, e.g. "flowers for wedding" and "wedding decorations" can be integrated to become "flowers for wedding decorations" so we can lose the extra instance of "wedding".

So now we have the following completed META Keywords Tag:

[META name="keywords" content="florists Miami florists Florida wedding bouquets wedding flowers wedding roses Valentine's Day roses sympathy gifts Mother's Day gifts funeral wreaths flower deliveries floral arrangements birthday gifts flowers for wedding decorations"]

Tailored TITLE and META Tags

While some webmasters remember to include a META Description and a META Keywords Tag in their home page HTML code, many forget to include them on every page of the site that they want indexed. Or worse, they duplicate the homepage TITLE and META Tags on all other pages. To give a web site the best ranking ability possible, it is highly recommended that each page of the site include a unique TITLE tag and unique META tags, individually tailored to the content of that specific page.

For example, our fictional Miami florist may have a page devoted to wedding bouquets and another devoted to funeral wreaths. The TITLE and META tags for the first page should include keywords relating to weddings and the page about wreaths should utilize keywords relating to funerals and sympathy.

The use of tailored TITLE and META Tags on each page creates multiple entry points to a web site and enables relevant content to be found in search engines no matter where it resides on a site. For example, instead of relying on visitors to arrive via the Home Page, the optimization of individual site pages makes each page more visible in the search engines, providing additional gateways to the site's content. The more pages optimized, the wider the range of keywords and phrases that can be targeted and the more entry points are created to a site.
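One practical way to catch duplicated tags before they hurt your rankings is a quick audit script. A minimal Python sketch (the page list and function name are invented for illustration) that flags TITLEs shared by more than one page:

```python
from collections import defaultdict

def find_duplicate_titles(pages):
    """Given {url: title}, return the titles used by more than one
    page, mapped to the offending URLs."""
    by_title = defaultdict(list)
    for url, title in pages.items():
        by_title[title.strip().lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

pages = {
    "/index.html": "Miami Florists - Wedding Bouquets and Floral Gifts",
    "/weddings.html": "Wedding Bouquets by Miami Florists",
    "/wreaths.html": "Miami Florists - Wedding Bouquets and Floral Gifts",
}
print(find_duplicate_titles(pages))
# the wreaths page reuses the home page TITLE and gets flagged
```

The same dictionary-of-lists approach works for META description tags: every flagged group is a page that is missing its own tailored entry point.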