Tuesday, August 5, 2008

Questions On SEO

1) What is SEO?

Ans) SEO stands for Search Engine Optimization and is defined as (in my own words):

"The process of finding out the best keywords for a web site and by the use of optimizing the web site along with other off-page work making that web site attain a higher position in the search engine result pages (SERPs) for those selected words."

Although the exact calculations used by the search engines are kept secret, there is a lot of knowledge and observation in this field from thousands of webmasters worldwide.

It could be considered a branch of online marketing. In general terms, it means making a web site more visible and more important in the eyes of the search engines.

The difference between not being familiar with SEO and actually doing the right things can be huge in terms of visitors to your web site.


2) How do I find out the best keywords to target?

Ans) The "best" keyword depends on the following main factors:

1.) The amount of traffic it will generate.
2.) The difficulty of attaining a top ranking.
3.) The profitability of that keyword.

In this answer I will address each point and give recommendations on tools to use to help you in your assessment.

The Amount of Traffic It Will Generate

Often people choose keywords based on how popular they think they may be. Mostly this is based on "real world" impressions rather than the facts, which are readily available. For instance, I recently saw someone who proposed going after the term "nursing homes" due to the aging population.

Although this area may be growing quickly, those interested in finding out more information often do not use a computer or, if they do, would not be researching it online. Such a site would still get visitors, but there are much better keywords to target with profit in mind.

When it comes to traffic, the best measurement is actual searches. This tells you how many people search for that term in a day or month, and it can be a great indicator. By far, the most popular tool for finding this out is the Keyword Suggestion Tool. This tool combines the two most popular ways of judging popularity, Wordtracker and Overture. In addition, it suggests related keywords and lists their traffic. Always remember, however, that this is total searches. These search numbers will always be divided among the SERPs.

The Difficulty of Attaining a Top Ranking

If you simply chose the keywords with the highest amount of traffic, you could still lose money. This is because these keywords typically require a lot more work to rank for. A perfect keyword is one that has a lot of searches, little SEO competition, and is moderately easy to rank for. The best tool I have found for this is the Keyword Difficulty Tool created by Rand Fishkin. It will give you an indication of the amount of SEO work required, which you can balance against the number of searches.

The Profitability of that Keyword

There are keywords where you may get thousands of visitors with only one conversion, while with others you can achieve one for every 100. This should be factored in because, unless you make your money per impression, you want the highest number of conversions per visitor. The best way I know to evaluate this is to run an AdWords account. The amount of data you receive by starting a campaign can be very useful in establishing the conversion rate. I believe it is always better to spend $10 to find out a keyword isn't profitable than to spend 6 months getting it to number one, THEN find out it's a dog.
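To make this concrete, here is a small back-of-the-envelope sketch in Python (all the numbers are made up purely for illustration) comparing revenue per visitor for two hypothetical keywords:

# Hypothetical figures for illustration only - plug in your own AdWords data.
profit_per_conversion = 20.0  # dollars earned per converted visitor

keywords = {
    "keyword A": {"visitors": 1000, "conversions": 1},   # 1 in 1000 converts
    "keyword B": {"visitors": 1000, "conversions": 10},  # 1 in 100 converts
}

for name, stats in keywords.items():
    conversion_rate = stats["conversions"] / stats["visitors"]
    revenue_per_visitor = conversion_rate * profit_per_conversion
    print(f"{name}: {conversion_rate:.2%} conversion rate, "
          f"${revenue_per_visitor:.2f} revenue per visitor")

Even at the same traffic level, keyword B earns ten times more per visitor, which is exactly the kind of difference a cheap test campaign can reveal.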

The moral of the story is although there is no such thing as a "perfect" keyword, you can find the best ones for you by using a combination of the factors above.

3) What is KEI and how do I use it?

Ans) KEI stands for Keyword Effectiveness Index. KEI is a ranking system based on how popular a keyword is and how much competition it has on the Internet. The higher the KEI number, the more popular your keywords are and the less competition they have. It also means that you'll have a much better chance of ranking high in a search engine. A low KEI score means not many people are searching for that keyword and it has too much competition. Hence, eliminate keywords with low KEI scores and choose those with high KEI scores. The higher the score, the more profitable your keywords will be for your web site.
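The exact formula varies from tool to tool, but the version popularized by Wordtracker is roughly KEI = (searches)^2 / (competing pages). Here is a minimal sketch assuming that formula, with made-up numbers:

def kei(searches: float, competing_pages: float) -> float:
    """Keyword Effectiveness Index: popularity squared divided by competition."""
    if competing_pages == 0:
        return float("inf")  # no competition at all
    return searches ** 2 / competing_pages

# Made-up example figures - substitute real search and competition counts.
candidates = {
    "seo software": (300, 90_000),
    "search engine optimization tips": (120, 15_000),
}

for phrase, (searches, competition) in sorted(
        candidates.items(), key=lambda kv: kei(*kv[1]), reverse=True):
    print(f"{phrase}: KEI = {kei(searches, competition):.2f}")

The phrase with the higher KEI is the better candidate, all else being equal.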

4) What are the most important things for on-page optimization?

Ans) On-page optimization is the part of SEO where you deal with the pages themselves, as opposed to off-page optimization and keyword analysis.

Here is my opinion on the most important elements in on-page optimization and some brief information about them.

* Content

It has been said over and over for years that a successful way of establishing a web site is by adding good, unique content to it on a regular basis. It cannot be stressed enough, yet people still do not do it and instead hope for some "magic" to happen.

Make sure that you have your targeted keywords included in the content of your web pages in a natural way. My rule of thumb is to write without thinking about it; when you finish, look over the text and add the keyword in one or two extra places where it fits in. If it looks spammy or excessive in any way, reduce it. You are writing for the visitor and not for the search engines - never forget that.

Also make sure that the text is unique. If it is the same as on other web pages, it can raise a red flag at the major search engines, and duplicate web pages and even whole domains get erased from the index.

* Weight factors

By placing your targeted keyword in places such as the title, H1 and H2 headings, STRONG tags and EM (emphasis) tags, you put more weight on those words, and the page becomes more relevant for them in the eyes of the search engines.

But don't overdo it, because if you place the same word or phrase in all the weight tags on the same page you can get hit by the over-optimization penalty, which basically means the search engines figured out you tried to cheat them and pushed you down the rankings.

* Navigation structure

Make sure that the search engine spiders can follow the internal links on your site. If you have a database-driven site, it is recommended that you use mod_rewrite to get the best benefit.

If you have a lot of pages on your site and they are buried 3 or more clicks deep in the navigation tree, then I recommend the use of a sitemap and a link to it from each page of your site. A sitemap makes it easy for search engines to find all your pages, and it can also be a great resource for your visitors to find a specific page quickly.

If one or more web pages of your site are more important than the others, like the home page, then get more links to them from the other web pages of the site. A good example is to have a "home" link on each web page back to your home page. That is also useful for your visitors, and it gives more power to the home page in the search engine rankings.


5) Is the use of meta tags dead?

Ans) Yes; in fact, they have not been relied upon for many years. An article from Danny Sullivan stating exactly this was released in October 2002.

There are still some minor “stone age” search engines around that use them.

The main reasons they ceased to work come down to these factors:

* Most webmasters tried to fool the search engines with meta tags unrelated to their content and services.
* With improved FTS (full-text search) toolkits from Verity and many other companies, search engines can index your web pages and determine their theme. With such advanced APIs, search engines like Google can easily decide what your website is about and what it offers.

Some of the basic features of an FTS API are that it can extract the text of your web page and gather important statistics such as:

1. How many times a word gets repeated.
2. How far apart repeated words are from each other.
3. How many times a particular word gets repeated in a particular sentence.
4. How far a word such as 'Online' appears from words like 'Party' and 'invitations', to see if the sentence makes sense.
5. Whether you are doing keyword dumping.
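As a toy illustration of the kind of statistics listed above (this is not any vendor's actual API, just a sketch), here is how word frequency and word distance could be computed from page text:

import re
from collections import Counter

text = ("Order party invitations online. Our online shop prints "
        "party invitations and ships them the same day.")

words = re.findall(r"[a-z']+", text.lower())

# 1. How many times each word gets repeated.
frequency = Counter(words)
print(frequency.most_common(3))

# 2 and 4. How close together two words appear, e.g. 'online' and 'invitations'.
def min_distance(words, a, b):
    positions_a = [i for i, w in enumerate(words) if w == a]
    positions_b = [i for i, w in enumerate(words) if w == b]
    return min(abs(i - j) for i in positions_a for j in positions_b)

print("online/invitations distance:", min_distance(words, "online", "invitations"))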

So with such APIs, the webmaster should concentrate on the content and layout and not treat the meta tags as a main concern.

However, it has been tested that the meta keyword tag still has a minor influence on the rankings, and the meta description tag should be used as it is sometimes shown in the SERPs (search engine result pages).


6) Where in my code should I put the keywords?

Ans) We all know it is not enough to have your keyword in the meta keyword tag.

Here is a list of places to put it in the source code, ordered by estimated weight:

* Title tag.
* H1 and H2 headings.
* In paragraphs and general text on the page.
* In STRONG tags.
* ALT attributes on image tags.
* TITLE attributes on anchor tags.
* SUMMARY attributes on tables.
* In the file names of images.
* Meta description tag.
* Meta keyword tag.
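As a quick self-check, a small script can report which of these places a given keyword already appears in. This is only a rough sketch built on Python's standard HTML parser, not a full audit tool, and the sample page is invented:

from html.parser import HTMLParser

class KeywordPlacementChecker(HTMLParser):
    """Report where a keyword shows up: title, headings, strong, attributes, etc."""

    VOID_TAGS = {"img", "br", "meta", "link", "hr", "input"}

    def __init__(self, keyword):
        super().__init__()
        self.keyword = keyword.lower()
        self.open_tags = []      # stack of currently open tags
        self.found_in = set()

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if value and self.keyword in value.lower():
                self.found_in.add(f"<{tag}> {name} attribute")
        if tag not in self.VOID_TAGS:
            self.open_tags.append(tag)

    def handle_endtag(self, tag):
        if self.open_tags and self.open_tags[-1] == tag:
            self.open_tags.pop()

    def handle_data(self, data):
        if self.open_tags and self.keyword in data.lower():
            self.found_in.add(f"<{self.open_tags[-1]}> text")

page = """<html><head><title>Blue widgets for sale</title></head>
<body><h1>Blue widgets</h1><p>We sell <strong>blue widgets</strong> online.</p>
<img src="widgets.jpg" alt="blue widgets"></body></html>"""

checker = KeywordPlacementChecker("blue widgets")
checker.feed(page)
print(sorted(checker.found_in))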


7) What is the best way to write the title?

Ans) The title is most probably the single most important place to put your keyword.

Have the keyword at the beginning of the title and also at the end. Try to vary its form as well.

If you want to brand your company name, you should keep that name at the end.

Try to follow this and at the same time make the title look natural and appealing to visitors. Remember that this is what is most visible to the visitor in the SERPs.


8) What is the best way to write the URLs?

Ans) In regards to Google, it has been stated by two of their staff members involved in the SEO community (GoogleGuy and Matt Cutts) that dashes (-) are better than underscores (_) when writing URLs. This has also been confirmed by my own tests on the matter.

In regards to Yahoo, MSN and other search engines I actually don't know but I think it varies.
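For illustration only, here is one simple way (an assumption about a reasonable approach, not an official Google recommendation) to turn a page title into a hyphen-separated URL slug:

import re

def slugify(title: str) -> str:
    """Lowercase the title and join the words with hyphens rather than underscores."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words)

print(slugify("Questions On SEO"))        # questions-on-seo
print(slugify("What is the best way?"))   # what-is-the-best-way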

9) Which factors are considered unethical or black hat SEO?

Ans) Page cloaking is a BIG one. It basically consists of using server-side scripts to determine whether the visitor is a search engine or a human and serving up different pages depending on the answer. The script would serve up a keyword-rich, totally souped-up page to search robots, while giving humans an entirely different page.

Link farms are considered black hat. They are basically sites consisting of masses of links, built for the purpose of getting rankings in search engines and turning a profit (usually from affiliate program advertisements on the website).

Duplicate content can keep a page from getting indexed, and some people have even reported trouble with entire websites being duplicated by unethical webmasters, causing problems with their rankings.

Spamming keywords, in meta tags, title tags, or in your page's content is a black hat SEO trick that used to work well back in the 90's. It's long since been basically exterminated as a useful trick.

Linking to "bad neighborhoods", or sites that have been banned from search engines for using the above tricks, while not particularly black hat is definitely unhealthy for your own sites rankings.

10) What should good navigation look like?

Ans) Good Navigation can be broken down to one word - "Breadcrumbs"

If you remember the fairy tale "Hansel and Gretel", you will recall that when the children were kidnapped, they dropped breadcrumbs so that their rescuers would be able to find them. In our situation, imagine it's your site that has been kidnapped and the search engines are your "rescuer".

An example of breadcrumbs would be the following:

* Home --> Sublevel 1 --> Sublevel 2

This type of navigation allows the Search Engines to find all of your pages in the most efficient and thorough way. It also aids "Deep Crawls" which are crucial to dynamic sites which may have 100,000 or more pages. Not only should you use this style of navigation behind the scenes but displaying the Breadcrumbs somewhere on the site will help both the Search Engines and visitors alike. A perfect example of this is www.dmoz.org
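A breadcrumb trail can be generated directly from a page's position in the site hierarchy. Here is a minimal sketch; the page names and URLs are invented for the example:

# Hypothetical hierarchy for the current page, from the home page down.
trail = [
    ("Home", "/"),
    ("Sublevel 1", "/sublevel1/"),
    ("Sublevel 2", "/sublevel1/sublevel2/"),
]

def breadcrumb_html(trail):
    """Render a Home --> Sublevel 1 --> Sublevel 2 trail as HTML links."""
    links = [f'<a href="{url}">{label}</a>' for label, url in trail[:-1]]
    links.append(trail[-1][0])   # the current page is plain text, not a link
    return " --> ".join(links)

print(breadcrumb_html(trail))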

The second part of navigation is the "site map". This is a page which contains a link to every one of your pages. To be fully optimized, the links should have descriptive anchor text to further help the pages you are linking to. In addition, your site map should always be within one click of the index page, as this will help the search engines find it quickly.

Using these two methods of navigation will ensure your site gets fully indexed and will add to your users' experience.

Types of Links

The major type of navigation to avoid, or at least to compensate for, is JavaScript pulldown menus. Because search engine bots will not follow these, it is important to compensate by having text links somewhere else on the page. These can be in the footer or worked in elsewhere in the content. In fact, JavaScript navigation in general has been shown to hinder indexing. There are a few alternative ways to code your JavaScript; however, if you always code a backup plan, you will enjoy easy indexing without the worry.








Thursday, May 15, 2008

General SEO information:

In the early days of Internet development, its users were a privileged minority and the amount of available information was relatively small. Access was mainly restricted to employees of various universities and laboratories who used it to access scientific information. In those days, the problem of finding information on the Internet was not nearly as critical as it is now.

Site directories were one of the first methods used to facilitate access to information resources on the network. Links to these resources were grouped by topic. Yahoo was the first project of this kind, opened in April 1994. As the number of sites in the Yahoo directory inexorably increased, the developers of Yahoo made the directory searchable. Of course, it was not a search engine in its true form because searching was limited to those resources whose listings were put into the directory. It did not actively seek out resources, and the concept of seo was yet to arrive.

Such link directories have been used extensively in the past, but nowadays they have lost much of their popularity. The reason is simple – even modern directories with lots of resources only provide information on a tiny fraction of the Internet. For example, the largest directory on the network is currently DMOZ (or Open Directory Project). It contains information on about five million resources. Compare this with the Google search engine database containing more than eight billion documents.

The WebCrawler project started in 1994 and was the first full-featured search engine. The Lycos and AltaVista search engines appeared in 1995, and for many years AltaVista was the major player in this field.

In 1997 Sergey Brin and Larry Page created Google as a research project at Stanford University. Google is now the most popular search engine in the world.

Currently, there are three leading international search engines – Google, Yahoo and MSN Search. They each have their own databases and search algorithms. Many other search engines use results originating from these three major search engines and the same seo expertise can be applied to all of them. For example, the AOL search engine (search.aol.com) uses the Google database while AltaVista, Lycos and AllTheWeb all use the Yahoo database.

Principles Of Search Engines:

To understand seo you need to be aware of the architecture of search engines. They all contain the following main components:

Spider - This program downloads web pages just like a web browser. The difference is that a browser displays the information presented on each page (text, graphics, etc.) while a spider does not have any visual components and works directly with the underlying HTML code of the page. You may already know that there is an option in standard web browsers to view source HTML code.

Crawler – This program finds all links on each page. Its task is to determine where the spider should go either by evaluating the links or according to a predefined list of addresses. The crawler follows these links and tries to find documents not already known to the search engine.

Indexer - This component parses each page and analyzes the various elements, such as text, headers, structural or stylistic features, special HTML tags, etc.

Database – This is the storage area for the data that the search engine downloads and analyzes. Sometimes it is called the index of the search engine.

Results engine – The results engine ranks pages. It determines which pages best match a user's query and in what order the pages should be listed. This is done according to the ranking algorithms of the search engine. It follows that page rank is a valuable and interesting property, and any seo specialist is most interested in it when trying to improve his site's search results. In this article, we will discuss the seo factors that influence page rank in some detail.

Web server – The search engine web server usually contains an HTML page with an input field where the user can specify the search query he or she is interested in. The web server is also responsible for displaying search results to the user in the form of an HTML page.
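To make the spider and crawler components concrete, here is a minimal sketch that downloads one page and collects the links found on it. It is a toy illustration of the idea, not how any real search engine is implemented:

from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Crawler-style link extraction: remember every href seen on the page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def fetch(url):
    """Spider-style download: return the raw HTML of a page."""
    with urlopen(url) as response:
        return response.read().decode("utf-8", errors="replace")

page_url = "https://example.com/"          # any reachable page will do
collector = LinkCollector()
collector.feed(fetch(page_url))
print([urljoin(page_url, link) for link in collector.links])   # candidates for the next crawl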

Ranking Factors In Search Engine:

Several factors influence the position of a site in the search results. They can be divided into external and internal ranking factors. Internal ranking factors are those that are controlled by seo aware website owners (text, layout, etc.) and will be described next.

Amount of text on a page – A page consisting of just a few sentences is less likely to get to the top of a search engine list. Search engines favor sites that have a high information content. Generally, you should try to increase the text content of your site in the interest of seo. The optimum page size is 500-3000 words (or 2000 to 20,000 characters).

Number of keywords on a page – Keywords must be used at least three to four times in the page text. The upper limit depends on the overall page size – the larger the page, the more keyword repetitions can be made. Keyword phrases (word combinations consisting of several keywords) are worth a separate mention. The best seo results are observed when a keyword phrase is used several times in the text with all keywords in the phrase arranged in exactly the same order. In addition, all of the words from the phrase should be used separately several times in the remaining text. There should also be some difference (dispersion) in the number of entries for each of these repeated words.

Let us take an example. Suppose we optimize a page for the phrase “seo software” (one of our seo keywords for this site). It would be good to use the phrase “seo software” in the text 10 times, the word “seo” 7 times elsewhere in the text and the word “software” 5 times. The numbers here are for illustration only, but they show the general seo idea quite well.

Keyword density and seo – Keyword page density is a measure of the relative frequency of the word in the text expressed as a percentage. For example, if a specific word is used 5 times on a page containing 100 words, the keyword density is 5%. If the density of a keyword is too low, the search engine will not pay much attention to it. If the density is too high, the search engine may activate its spam filter. If this happens, the page will be penalized and its position in search listings will be deliberately lowered.

The optimum value for keyword density is 5-7%. In the case of keyword phrases, you should calculate the total density of each of the individual keywords comprising the phrases to make sure it is within the specified limits. In practice, a keyword density of more than 7-8% does not seem to have any negative seo consequences. However, it is not necessary and can reduce the legibility of the content from a user’s viewpoint.
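Keyword density is easy to compute yourself. Here is a minimal sketch that follows the definition above (occurrences of a single keyword divided by total words, as a percentage):

import re

def keyword_density(text: str, keyword: str) -> float:
    """Occurrences of a single keyword divided by total word count, in percent."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

# The example from above: a word used 5 times on a page of 100 words is 5%.
sample = " ".join(["seo"] * 5 + ["filler"] * 95)
print(f"density of 'seo': {keyword_density(sample, 'seo'):.1f}%")   # 5.0%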


Description Meta tag – This is used to specify page descriptions. It does not influence the seo ranking process but it is very important. A lot of search engines (including the largest one – Google) display information from this tag in their search results if this tag is present on a page and if its content matches the content of the page and the search query.

Experience has shown that a high position in search results does not always guarantee large numbers of visitors. For example, if your competitors' search result description is more attractive than the one for your site then search engine users may choose their resource instead of yours. That is why it is important that your Description Meta tag text be brief, but informative and attractive. It must also contain keywords appropriate to the page.

External Ranking Factors:

Why inbound links to sites are taken into account - As you can see from the previous section, many factors influencing the ranking process are under the control of webmasters. If these were the only factors then it would be impossible for search engines to distinguish between a genuine high-quality document and a page created specifically to achieve high search ranking but containing no useful information. For this reason, an analysis of inbound links to the page being evaluated is one of the key factors in page ranking. This is the only factor that is not controlled by the site owner.

It makes sense to assume that interesting sites will have more inbound links. This is because owners of other sites on the Internet will tend to have published links to a site if they think it is a worthwhile resource. The search engine will use this inbound link criterion in its evaluation of document significance.

Therefore, two main factors influence how pages are stored by the search engine and sorted for display in search results:

* Relevance, as described in the previous section on internal ranking factors.

* Number and quality of inbound links, also known as link citation, link popularity or citation index. This will be described in the next section.



Link importance (citation index, link popularity) - You can easily see that simply counting the number of inbound links does not give us enough information to evaluate a site. It is obvious that a link from www.microsoft.com should mean much more than a link from some homepage like www.hostingcompany.com/~myhomepage.html. You have to take into account link importance as well as number of links.

Search engines use the notion of citation index to evaluate the number and quality of inbound links to a site. Citation index is a numeric estimate of the popularity of a resource expressed as an absolute value representing page importance. Each search engine uses its own algorithms to estimate a page citation index. As a rule, these values are not published.

As well as the absolute citation index value, a scaled citation index is sometimes used. This relative value indicates the popularity of a page relative to the popularity of other pages on the Internet. You will find a detailed description of citation indexes and the algorithms used for their estimation in the next sections.



Google PageRank – Theoretical Basics - Google was the first company to patent a system that takes inbound links into account. The algorithm was named PageRank. In this section, we will describe this algorithm and how it can influence search result ranking.

PageRank is estimated separately for each web page and is determined by the PageRank (citation) of other pages referring to it. It is a kind of “virtuous circle.” The main task is to find the criterion that determines page importance. In the case of PageRank, it is the possible frequency of visits to a page.

I shall now describe how a user's behavior when following links to surf the network is modeled. It is assumed that the user starts viewing sites from some random page. Then he or she follows links to other web resources. There is always a possibility that the user may leave a site without following any outbound link and start viewing documents from a random page. The PageRank algorithm estimates the probability of this event as 0.15 at each step. The probability that our user continues surfing by following one of the links available on the current page is therefore 0.85, assuming that all links are equal in this case. If he or she continues surfing indefinitely, popular pages will be visited many more times than the less popular pages.

The PageRank of a specified web page is thus defined as the probability that a user may visit the web page. It follows that the sum of probabilities for all existing web pages is exactly one, because the user is assumed to be visiting at least one Internet page at any given moment.

Since it is not always convenient to work with these probabilities the PageRank can be mathematically transformed into a more easily understood number for viewing. For instance, we are used to seeing a PageRank number between zero and ten on the Google Toolbar.

According to the ranking model described above:
* Each page on the Net (even if there are no inbound links to it) initially has a PageRank greater than zero, although it will be very small. There is a tiny chance that a user may accidentally navigate to it.
* Each page that has outbound links distributes part of its PageRank to the referenced page. The PageRank contributed to these linked-to pages is inversely proportional to the total number of links on the linked-from page – the more links it has, the lower the PageRank allocated to each linked-to page.
* A “damping factor” is applied to this process so that the total distributed PageRank is reduced by 15%. This is equivalent to the probability, described above, that the user will not visit any of the linked-to pages but will navigate to an unrelated website.

Let us now see how this PageRank process might influence the process of ranking search results. We say “might” because the pure PageRank algorithm just described has not been used in the Google algorithm for quite a while now. We will discuss a more current and sophisticated version shortly. There is nothing difficult about the PageRank influence – after the search engine finds a number of relevant documents (using internal text criteria), they can be sorted according to the PageRank since it would be logical to suppose that a document having a larger number of high-quality inbound links contains the most valuable information.

Thus, the PageRank algorithm "pushes up" those documents that are most popular outside the search engine as well.
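For readers who want to see the mechanics, here is a tiny implementation of the basic PageRank iteration described above, using a 15% damping leak and assuming all links on a page are equal. The three-page link graph is invented for the example:

def pagerank(links, damping=0.85, iterations=50):
    """Basic PageRank: 'links' maps each page to the pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}        # start from a uniform distribution

    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outgoing in links.items():
            if not outgoing:
                continue
            share = damping * rank[page] / len(outgoing)   # rank is split among outbound links
            for target in outgoing:
                new_rank[target] += share
        rank = new_rank
    return rank

# Toy link graph: A links to B and C, B links to C, C links back to A.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
for page, score in sorted(pagerank(graph).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))

Page C ends up with the highest score because it collects links from both A and B, which matches the intuition that more inbound links mean more importance.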


Google PageRank – Practical Use - Currently, PageRank is not used directly in the Google algorithm. This is to be expected since pure PageRank characterizes only the number and the quality of inbound links to a site, but it completely ignores the text of links and the information content of referring pages. These factors are important in page ranking and they are taken into account in later versions of the algorithm. It is thought that the current Google ranking algorithm ranks pages according to thematic PageRank. In other words, it emphasizes the importance of links from pages with content related by similar topics or themes. The exact details of this algorithm are known only to Google developers.

You can determine the PageRank value for any web page with the help of the Google ToolBar that shows a PageRank value within the range from 0 to 10. It should be noted that the Google ToolBar does not show the exact PageRank probability value, but the PageRank range a particular site is in. Each range (from 0 to 10) is defined according to a logarithmic scale.

Here is an example: each page has a real PageRank value known only to Google. To derive a displayed PageRank range for their ToolBar, they use a logarithmic scale, as shown in this table:

Real PR == ToolBar PR

1-10 == 1
10-100 == 2
100-1,000 == 3
1,000-10,000 == 4

This shows that the PageRank ranges displayed on the Google ToolBar are not all equal. It is easy, for example, to increase PageRank from one to two, while it is much more difficult to increase it from six to seven.
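The logarithmic mapping in the table can be sketched in a couple of lines. The base of the logarithm here is a guess purely for illustration; the real scale Google uses is not public:

import math

def toolbar_pr(real_pr: float, base: float = 10.0) -> int:
    """Map a hypothetical 'real' PageRank onto a 0-10 toolbar-style scale."""
    if real_pr < 1:
        return 0
    return min(10, int(math.log(real_pr, base)) + 1)

for real in (5, 50, 500, 5000):
    print(real, "->", toolbar_pr(real))   # 1, 2, 3, 4 - matching the ranges in the table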

In practice, PageRank is mainly used for two purposes:

1. Quick check of the site's popularity. PageRank does not give exact information about referring pages, but it allows you to quickly and easily get a feel for the site's popularity level and to follow trends that may result from your seo work. You can use the following rule-of-thumb measures for English language sites: PR 4-5 is typical for most sites with average popularity. PR 6 indicates a very popular site, while PR 7 is almost unreachable for a regular webmaster. You should congratulate yourself if you manage to achieve it. PR 8, 9 and 10 can only be achieved by the sites of large companies such as Microsoft, Google, etc. PageRank is also useful when exchanging links and in similar situations. You can compare the quality of the pages offered in the exchange with pages from your own site to decide if the exchange should be accepted.

2. Evaluation of the competitiveness level for a search query is a vital part of seo work. Although PageRank is not used directly in the ranking algorithms, it allows you to indirectly evaluate relative site competitiveness for a particular query. For example, if the search engine displays sites with PageRank 6-7 in the top search results, a site with PageRank 4 is not likely to get to the top of the results list using the same search query.

It is important to recognize that the PageRank values displayed on the Google ToolBar are recalculated only occasionally (every few months) so the Google ToolBar displays somewhat outdated information. This means that the Google search engine tracks changes in inbound links much faster than these changes are reflected on the Google ToolBar.

Friday, April 4, 2008

Positive ON-Page SEO Factors

Keyword in URL:
First word is best, second is second best, etc.

Keyword in Domain name:
Same as in page-name-with-hyphens

Keyword in Title tag:
Keyword in Title tag - close to the beginning. Title tag 10-60 characters, no special characters.

Keyword in Description meta tag:
Shows theme - less than 200 chars. Google no longer "relies" upon this tag, but will often use it.

Keyword in Keyword metatag:
Shows theme - less than 10 words. Every word in this tag MUST appear somewhere in the body text; if not, it can be penalized for irrelevance. No single word should appear more than twice; if one does, it may be considered spam. Google purportedly no longer uses this tag, but others do.

Keyword density in body text:
5-20% (all keywords / total words). Some report topic sensitivity - the keyword spamming threshold % varies with the topic.

Individual keyword density:
1 - 6% - (each keyword/ total words)

Keyword in H1, H2 and H3:
Use Hx heading tags appropriately.

Keyword font size:
"Strong is treated the same as bold, italic is treated the same as emphasis". Matt Cutts July 2006

Keyword proximity (for 2+ keywords):
Directly adjacent is best

Keyword phrase order:
Does word order in the page match word order in the query? Try to anticipate query, and match word order.

Keyword prominence (how early in page/tag):
Can be important at top of page, in bold, in large font

Keyword in alt text:
Should describe graphic - Do NOT fill with spam (Was part of Google Florida OOP - tripped a threshold - may still be in effect to some degree as a red flag, when summed with all other on-page optimization - total page optimization score - TPOS).

Keyword in links to site pages (anchor text):
Does the anchor text of links to your pages use the keyword?

To internal pages - keywords?:
Links should contain keywords. The filename linked to should contain the keywords. Use hyphenated filenames, but not long ones - two or three hyphens only.

All Internal links valid?:
Validate all links to all pages on site. Use a free link checker. I like this one.
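If you prefer to script the check yourself, here is a rough sketch that requests each URL and reports anything that does not come back successfully. The URLs listed are placeholders:

from urllib.error import HTTPError, URLError
from urllib.request import Request, urlopen

def check_links(urls):
    """Print the HTTP status of each URL so broken links stand out."""
    for url in urls:
        try:
            request = Request(url, method="HEAD")
            with urlopen(request, timeout=10) as response:
                print(f"{response.status} OK  {url}")
        except HTTPError as error:            # e.g. 404 Not Found, 500 Server Error
            print(f"{error.code} BAD {url}")
        except URLError as error:             # DNS failure, timeout, refused connection
            print(f"ERR {error.reason} {url}")

check_links(["https://example.com/", "https://example.com/no-such-page"])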

Efficient - tree-like structure:
TRY FOR two clicks to any page - no page deeper than 4 clicks

Intra-site linking:
Appropriate links between lower-level pages

To external pages - keywords?:
Google patent - Link only to good sites. Do not link to link farms. CAREFUL - Links can and do go bad, resulting in site demotion. Unfortunately, you must devote the time necessary to police your outgoing links - they are your responsibility.

Outgoing link Anchor Text:
Google patent - Should be on topic and descriptive.

Link stability over time:
Google patent - Avoid "Link Churn".

All External links valid?:
Validate all links periodically.

Less than 100 links out total:
Google says limit to 100, but readily accepts 2-3 times that number.

Sunday, March 30, 2008

Web Marketing Services

Email Marketing

E-Mail Marketing is the process of sending sales letters or customer newsletters via electronic mail (e-mail). Even though some people find it annoying, many businesses find it a cost-effective marketing tool. The main benefit of e-mail marketing compared to other marketing channels is that an interested audience can visit the site directly by clicking on the link in the e-mail.

Blog Marketing

A blog is simply a web site where you can post thoughts, interact with customers online, and advance corporate communications on an ongoing basis. Nowadays, blogs are being used to build personal and corporate credibility, which helps to interact with and attract more customers. If you have a good blog marketing strategy, you can sell your services or products to interested and even dormant customers too.

Banner Advertising

Banner Advertising is intended to attract traffic towards the advertiser's website. This type of advertising should be done on the most popular and most-visited websites, such as newspaper sites. It can also bring business if done on business-to-business (B2B) websites. For example, if you are in the hotel industry and provide special honeymoon packages, you can tie up with leading matrimony or wedding sites to advertise on their websites, from where you can get targeted customers.

Product Feed Submission

This is one more marketing medium, in which we submit your product and company details to the shopping search engines. Froogle is the best example of product feed submission. Some websites charge a fee to list your product on their site, e.g. eBay.com, Amazon.com, Pricegrabber.com, Shopping.com, Froogle.com, etc.


Classified Ad Management

A classified ad is a short text advertisement. It is an instant way of displaying your ads online on popular websites. It brings good traffic, which can be converted into leads.

Wednesday, March 26, 2008

SEO

SEO can also stand for "search engine optimizers". Search engine optimizers may offer SEO as a stand-alone service or as part of a broader Internet marketing campaign. Because effective SEO may require changes to the HTML source code of a site, SEO tactics may be incorporated into web site development and design.
Search engine optimization (SEO) is the process of improving the volume and quality of traffic to a web site from search engines via natural search results for targeted keywords. The higher a site is placed in the search results according to its page rank, the more searchers will visit that site.
One tactic as introduced by James Martin was to create unique content, a story so to speak in order to gain search engine "Authority". James created a series of short stories which revolved around a monk named Tuttle. These stories were about his journey through life, and included the relevant keywords needed to gain success.
The "Tuttle SEO Tactic" as it is now commonly referred to, is an interesting, yet complicated way of gaining search engine rankings. It requires a creative mind to introduce concepts, writings and stories that are unique, keyword rich and full of life.

Major search engines
* Google
* Yahoo!
* Live Search by Microsoft, formerly MSN
* Ask.com, formerly Ask Jeeves

Tactics

Rules and limitations can make it harder to benefit from the ranking algorithm, including quirks, of the targeted search engine. For example, the January 2006 Redscowl Bluesingsky contest issued by seologs.com was open for new domains only. That meant that the contestants couldn't benefit from the ranking advantage old web sites are thought to have over new ones. An example of that is the age advantage Anil Dash' blog page had over the well-received but brand new Nigritude Ultramarine FAQ - respectively ended 1st and 6th in the Nigritude Ultramarine challenge. It was expected that the Redscowl Bluesingsky game would be won by a domain of the style "redscowl-bluesingsky.tld" - bound to attract natural links and to benefit from the fact that the URL is made up entirely of the search words.click here