I am an SEO copywriter, SEO executive, SEO strategist, SEO analyst and search quality evaluator at a software development company in India.

What makes a quality link target?

Many people use Google PageRank as a measure of the quality of a link target - usually derived from the little green bar that you see when you download the Google toolbar. This is a mistake.

The score that is shown on the toolbar does not represent the real PageRank that Google assigns to the page. It is notoriously inaccurate almost to the point of being meaningless: it should only be taken as a very rough guide and in my opinion should be marked ‘for amusement only’.

So if the PageRank scores Google shares with us are meaningless, how can we gauge the quality of a potential link target? Here are some suggestions:

1. The target website is somewhere you'd like to be seen

This is an absolute must when looking for link targets. If a site is a respected source of information in your industry, it's probably safe to say that Google will regard it as an authority site, and a link from it to your site will be valuable.

2. The target website must be relevant to your business


A relevant site is likely to be used by people who would be interested in what you have to offer. That doesn't mean you should stick rigidly to sites that are exactly on topic; look also for sites that are related.

For example, a company that offers data recovery services will want to be seen on technology sites. It may also try to get on to sites whose main focus is on health management, education or local government but which have a section on ‘technology in health management’ or ‘technology in education’.

3. The website should be able to drive appropriate traffic

That not only means the same target market as your website, but the right stage of the buying cycle. As John Alexander shows in Wordtracker Magic, the best time to target new mothers is not after the baby has been born but before: during pregnancy, when the mother is searching the web to find potential names for her child.

4. The website should perform well on Google

The pages upon which your link might sit should be found in the Google index. To find out, simply select a unique group of about 6-10 words from the page, enclose them in quotation marks and put them into the Google search box. If the page has been crawled it should come up in the search results.
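As a rough illustration, the quoted-phrase check can be automated by building the exact-phrase search URL yourself. This is a minimal sketch: `quoted_search_url` is a hypothetical helper, and actually fetching results from Google programmatically is subject to its terms of service.

```python
from urllib.parse import quote_plus

def quoted_search_url(phrase):
    """Build a Google search URL for an exact-phrase query.

    Wrapping the phrase in quotation marks asks Google only for pages
    containing that exact sequence of words.
    """
    return "https://www.google.com/search?q=" + quote_plus('"' + phrase + '"')

# A unique 6-10 word snippet copied from the target page:
url = quoted_search_url("some unique sentence copied from the target page")
```

Paste the resulting URL into a browser: if the target page appears in the results, it has been crawled.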

5. The links must be visible to the search engines

There are some dynamic linking techniques designed to hoard PageRank by not letting the search engine robots follow the links on a page. Such links are valuable only for the traffic that they bring: they will not help your search engine rankings in any way. The simplest of these is the ‘nofollow’ attribute, agreed by the major search engines to prevent ‘blog spamming’.
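To see which of a page's links a robot can actually follow for ranking purposes, you can inspect the rel attribute on each anchor. Below is a minimal sketch using Python's standard html.parser module; `LinkAuditor` is an invented name, not part of any library.

```python
from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    """Collects a page's links, separating those marked rel="nofollow"
    (ignored by search engines for ranking) from normal links."""
    def __init__(self):
        super().__init__()
        self.followed = []
        self.nofollowed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if href is None:
            return
        if "nofollow" in (attrs.get("rel") or "").split():
            self.nofollowed.append(href)
        else:
            self.followed.append(href)

auditor = LinkAuditor()
auditor.feed('<a href="/a">x</a> <a rel="nofollow" href="/b">y</a>')
```

After feeding the page's HTML, `auditor.followed` holds the links that can pass ranking value and `auditor.nofollowed` the ones that cannot.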

6. The website embeds the links in the body copy

This is much better than listing them at the side or bottom of an article. I've found that a link placed by another writer in the body of an article generates more traffic than a link included at the bottom.

7. The page with the links should be near the homepage

Search engine bots are unlikely to go more than three levels deep on any website. Links buried deeper may not be found. When looking for quality links look for websites that provide links as close to the homepage as possible. For those of you designing websites, follow the example of the BBC: their rule is that every piece of content on the BBC News site must be available within three clicks of the News home page.
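Click depth can be estimated with a breadth-first walk over a site's internal link graph. The sketch below assumes you already have the site map as a simple dictionary; the URLs are invented for illustration.

```python
from collections import deque

def click_depths(site, home):
    """Breadth-first walk over a site's internal link graph, returning the
    minimum number of clicks needed to reach each page from the homepage.
    `site` maps each page to the pages it links to."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for linked in site.get(page, []):
            if linked not in depths:
                depths[linked] = depths[page] + 1
                queue.append(linked)
    return depths

site = {
    "/": ["/news", "/about"],
    "/news": ["/news/story-1"],
    "/news/story-1": ["/news/archive/old-story"],
}
depths = click_depths(site, "/")
# Pages at a depth greater than three may never be reached by a crawler.
```

Pages missing from the result are unreachable from the homepage at all, which is worse still.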

8. The target website lets you use your own linking text

Good webmasters know the value of linking text and should take the trouble to link to you with meaningful text rather than just your URL.

9. The target website links to specific content

External links to your site, particularly if they are included in editorial, should point to a specific resource, not just your homepage.

Conclusion

You are unlikely to find link targets that satisfy all these criteria, but use the list as a checklist and concentrate your link building efforts on those that score highest.

On-page Versus Off-page SEO Techniques

The first thing most companies talk about when discussing SEO is what can be done within the site. Your site is your greatest asset on the net, and as the focus of all your search engine hopes, it’s only natural to focus on what it can do for you. The next step, however, is to focus on what the net can be doing for you.

It is well known that off-page factors play a significant part in a site's ranking. The links toward a site and the mentions of it contribute to the weighting of reputation and popularity within the ranking algorithm. While a site can get by leaving off-page factors to develop independently, it's much smarter to give your off-page factors a boost as part of your SEO plan.

There is a fair amount of debate within the SEO community as to the right balance between on-page and off-page search engine optimisation. There is little doubt, however, that both types of factors need to be taken into account when you want to mount a successful SEO campaign.

On-page and off-page SEO – what’s the difference?

The difference between on-page and off-page factors in your SEO plan is fairly straightforward. On-page is whatever you can do within your site, and most traditional SEO techniques fall into this category. Things such as keyword placement, meta tags, your URLs, internal links and quality content are regular on-page factors in an SEO plan. Off-page is everything you do outside your site to draw attention to it, usually hinging on the links you can get.

What to do with your on-page SEO

As stated above, on-page SEO has been the focus of techniques for a very long time. The first step for most on-page SEO will be to research a keyword list and decide which keywords the site should compete for. Keywords are then distributed through content, tags are rewritten and internal links re-focussed to support the site's relevance for those chosen keywords. After that, the paths through the site are tidied up and code polished so that everything within the site will run smoothly whenever a search engine spider crawls past. This is, of course, an oversimplification of what happens for on-page factors in an SEO plan, and you can talk with our consultants at SEO Consult about using your site's assets to their full ability for your SEO.

8 SEO Techniques

I am keeping this article very simple. You need to follow 8 very simple techniques to make your website SEO friendly. Here are the 8 points.

1. Decide the right keyword

Even before you start designing a site or writing a blog, think about what keywords people are searching for. You can use the Google Keyword Tool to find out which relevant keywords people are searching for. Then look at how many pages a Google search for that keyword returns. If the number of results is several million pages, it will probably be difficult to reach the top. If, however, it is only a few hundred thousand pages, it will probably be easier to bring your page to the top for that keyword.
2. Title

Once you have decided what keyword is right for your page, optimize your webpage for that keyword. Start with the title of the page. The title of the webpage should have the keyword for which you are optimizing the website.
3. URL

The URL of the webpage should also contain the keyword for which you are optimizing your website. This may not be convenient from a design point of view, but it is very important from an SEO point of view.
4. Meta tag

The meta tags must contain the keywords that the page targets. Many programmers make the mistake of keeping a common header, with common meta tags, for all their pages. This does not serve the SEO purpose well. The meta section should reflect the keywords for which each page is being optimized.
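One quick way to spot the common-header mistake is to collect each page's meta description and flag any description shared by more than one page. A minimal sketch; the page URLs and descriptions are invented examples.

```python
from collections import Counter

def duplicated_descriptions(pages):
    """Given a mapping of URL -> meta description, return descriptions
    shared by more than one page - a sign of a common header reused
    everywhere instead of per-page keywords."""
    counts = Counter(pages.values())
    return {desc for desc, n in counts.items() if n > 1}

pages = {
    "/index.html": "Widgets and widget repair",
    "/about.html": "Widgets and widget repair",
    "/blog.html": "Widget industry news and tips",
}
```

Any description in the returned set should be rewritten so that each page reflects its own keywords.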
5. Use Related Terms and Keywords in h1 tag

Whatever is written in the h1 header tag is given more weight for SEO purposes. Therefore, use the right keywords in the header section. The h2 and h3 headers should also be used to represent the keywords for which you are optimizing the site.
6. Use Related Terms and Keywords in the webpage

Your webpage should contain the keywords for which you are optimizing the website, and its contents should be relevant to those keywords.
7. Do not use excessive outgoing links

The more outgoing links a page has, the less optimized it is for SEO. So use only as many outgoing links as are appropriate for the article.
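Counting a page's outgoing links is easy to script. The sketch below uses Python's standard html.parser and treats any absolute link to a different host as outgoing; `OutgoingLinkCounter` is an invented name, and the domains are examples.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class OutgoingLinkCounter(HTMLParser):
    """Counts links that point away from your own domain."""
    def __init__(self, own_domain):
        super().__init__()
        self.own_domain = own_domain
        self.outgoing = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        host = urlparse(href).netloc
        # Relative links have no host; links to our own host don't count.
        if host and host != self.own_domain:
            self.outgoing += 1

counter = OutgoingLinkCounter("example.com")
counter.feed('<a href="/internal">in</a>'
             '<a href="http://other.com/page">out</a>')
```

A high count relative to the article's length is the signal to prune.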
8. Build Links and get incoming links

The more incoming links your page has, the more likely it is to come to the top when someone searches for a keyword. You can get backlinks from web forum signatures and by writing blogs. Eventually you would like to write good blogs or create good pages so that people start linking to your page organically.

Critical steps to take before submitting

After developing a Web site and selecting the best hosting company, don't rush out and submit it to search engines immediately. A Web site manager would be wise to take a little time to:

Fine tune the TITLE tag to increase traffic to the site

Improving the TITLE tag is one technique that applies to just about all the search engines. The appearance of key words within the page title is one of the biggest factors determining a Web site's score in many engines. It's surprising how many Web sites have simple, unimaginative titles like "Bob's Home Page" that don't utilize keywords at all. In fact, it's not unusual to see entire Web sites that use the same title on every page in the site. Changing page titles to include some of the site's key words can greatly increase the chance that a page will appear with a strong ranking in a query for those key words.
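A simple script can flag titles like "Bob's Home Page" that use none of the site's key words. A minimal sketch; `title_uses_keywords` is a hypothetical helper, and the keywords are invented examples.

```python
def title_uses_keywords(title, keywords):
    """Return the subset of keywords that appear in the page title
    (case-insensitive). An empty result flags a title that uses
    no keywords at all."""
    lower = title.lower()
    return [kw for kw in keywords if kw.lower() in lower]

title_uses_keywords("Bob's Home Page", ["data recovery", "hard drive"])  # []
title_uses_keywords("Data Recovery Services | Bob's", ["data recovery", "hard drive"])
```

Running this over every page in a site also exposes pages that all share one generic title.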

Create gateway pages that are specific to the focus of each site

Key word selection must be done carefully with great forethought and understanding of the search engine's selection criteria for key words. The larger the number of key words that are used, the more the relevance of any one key word is diluted. One way to get around this is to create gateway pages.

Gateway pages are designed specifically for submission to a search engine. They should be tuned with a specific set of key words, boosting the chance that these key words will be given a heavy weight. To do this, several copies of a page should be made, one for each set of key words. These pages will be used as entry points only, to help people find the site; therefore, they don't need to fit within the normal structure of the site. This provides the page developer with greater flexibility in establishing key words and tags that will encourage a stronger ranking with the search engines. Each gateway page can then be submitted separately to the search engines.

Ensuring that site technology won't confuse the search engines

Often the latest technology being built into a site can confuse the search engine spiders. Frames, CGI scripts, image maps and dynamically generated pages are all recently created technologies that many spiders don't know how to read. With frames, for instance, the syntax of the FRAMESET tag fundamentally changes the structure of an HTML document. This can cause problems for search engines and browsers that don't understand the tag: some browsers can't find the body of the page, and viewing the page through them can result in a blank screen.

Today only 2% of browsers don't support frames, but many search engine spiders still don't support them. A search engine spider is really just an automated Web browser and like browsers they sometimes lag behind in their support for new HTML tags. This means that many search engines can't spider a site with frames. The spider will index the page, but won't follow the links to the individual frames.

Setting up a NOFRAMES section on the page

Every page that uses frames should include a NOFRAMES section on the page. This tag will not affect the way a page looks but it will help a page get listed with the major search engines. The NOFRAMES tag was invented by Netscape for backward compatibility with browsers that didn't support the FRAME and FRAMESET tags.
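A quick check for this is to parse a page and confirm that any FRAMESET is accompanied by a NOFRAMES section. A sketch using Python's standard html.parser; the class name and the sample page are invented.

```python
from html.parser import HTMLParser

class FramesetChecker(HTMLParser):
    """Flags a page that uses FRAMESET without a NOFRAMES section,
    which frame-blind spiders and browsers cannot read."""
    def __init__(self):
        super().__init__()
        self.has_frameset = False
        self.has_noframes = False

    def handle_starttag(self, tag, attrs):
        if tag == "frameset":
            self.has_frameset = True
        elif tag == "noframes":
            self.has_noframes = True

page = '<html><frameset><frame src="menu.html"></frameset></html>'
checker = FramesetChecker()
checker.feed(page)
needs_fix = checker.has_frameset and not checker.has_noframes
```

Here `needs_fix` is true, because the sample frameset page provides no NOFRAMES fallback.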

Performing a maintenance check

All Web sites should be thoroughly tested using a site maintenance tool in order to catch errors in operation before customers are brought to the site. HTML errors can hinder a search engine spider's ability to index a site; they can also keep a search engine from reading a page or cause it to be displayed differently from how it was intended. In fact, a recent report by Jupiter Communications suggested 46% of users have left a preferred Web site because of a site-related problem. With NetMechanic's HTML Toolbox or another site maintenance tool, all Webmasters, from the novice to the expert, can avoid potential visitor disasters due to site errors.

Finding the best submission service

Selecting a search engine submission service requires careful thought and important decisions. Using an auto submission service is a good place to begin. Most search engines, like Alta Vista, HotBot and InfoSeek, automatically spider a site, index it and hopefully add it to their search database without any human involvement. Some engines, like Yahoo, rely entirely on human review and for many reasons are best submitted to individually. Chances are also good that on the first submission a site will be rejected by several of the engines and will need to be individually resubmitted. There are several online resources for auto submissions. The best ones won't submit a site to Yahoo, where the customer is better served doing this on his own.

Understanding the waiting periods

A variety of waiting periods must be endured with each search engine before there is even a hope of being listed. Knowing and understanding these waiting periods before beginning the process can eliminate, or at least minimize, frustration and confusion. Typical waiting periods for some of the more popular engines are six months with Yahoo, one to two months with Lycos and 4-6 weeks with Excite (or is that 4-6 months?). What they say and what happens in reality can be very different.

Tips For Site Optimization

It is not hard to be barking up the wrong tree when it comes to getting search engine traffic because there is so much out of date information being circulated.

That is why it is always best to also learn many other traffic generation methods like viral marketing for example.

Not only is there out of date or invalid SEO advice getting around, there is also information which if acted upon, can result in your pages being banned.

The SEO tips below should assist the reader in forming a basic understanding of how to create human friendly web pages which are easily understood by the most popular search engines.

Know this. There are thousands of search engines but only two of them will bring you most of the traffic: Google and Yahoo. Another search engine that brings me a little traffic is MSN but I do not focus too much on tactics for that engine.

Focus your attention on the engines that will bring you the most visitors first and work your way down.

Basic SEO

1. Insert keywords within the title tag so that search engine robots will know what your page is about. The title tag is located right at the top of your document within the head tags. Inserting a keyword or key phrase will greatly improve your chances of bringing targeted traffic to your site.

Make sure that the title tag contains text which a human can relate to. The text within the title tag is what shows up in a search result. Treat it like a headline.

2. Use the same keywords as anchor text to link to the page from different pages on your site. This is especially useful if your site contains many pages. The more keywords that link to a specific page the better.

3. Make sure that the text within the title tag is also within the body of the page. It is unwise to have keywords in the title tag which are not contained within the body of the page.

Adding the exact same text for your h1 tag will tell the reader who clicks on your page from a search engine result that they have clicked on the correct link and have arrived at the page where they intended to visit. Robots like this too because now there is a relation between the title of your page and the headline.

Also, sprinkle your keywords throughout your article. The most important keywords can be bolded or colored in red. A good place to do this is once or twice in the body at the top of your article and in the sub-headings.

4. Do not use the exact same title tag on every page on your website. Search engine robots might determine that all your pages are the same if all your title tags are the same. If this happens, your pages might not get indexed.

I always use the headline of my pages as the title tag to help the robots know exactly what my page is about. A good place to insert the headline is within the h1 tag. So the headline is the same as the title tag text.

5. Do not spam the description or keyword meta tags by stuffing them with meaningless keywords, and do not spend too much time on these tags. SEO pros all agree that these tags are not as important today as they once were. I just place my headline once within the keywords and description tags.

6. Do not link to link-farms or other search engine unfriendly neighborhoods.

7. Do not use doorway pages. Doorway pages are designed for robots only, not humans. Search engines like to index human friendly pages which contain content which is relevant to the search.

8. Title attributes for text links. Insert a title attribute within the HTML of your text links to add weight to the link and the page where the link resides. This is like the alt tag for images.

My site contains navigation menus on the left and right of the page. The menu consists of links not images. When you hover over the link with your mouse, the title of the link appears. View the source of this page to see how to add this tag to your links.

9. Describe your images with the use of the alt tag. This will help search engines that index images to find your pages and will also help readers who use text only web browsers.
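Pages can be audited for missing alt text with a few lines of the standard html.parser. A minimal sketch; `MissingAltFinder` is an invented name, and the image filenames are examples.

```python
from html.parser import HTMLParser

class MissingAltFinder(HTMLParser):
    """Collects the src of every img that lacks a descriptive alt attribute
    (absent or empty both count as missing here)."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        if not attrs.get("alt"):
            self.missing.append(attrs.get("src", ""))

finder = MissingAltFinder()
finder.feed('<img src="logo.gif" alt="Company logo"><img src="banner.gif">')
```

Every entry in `finder.missing` is an image that text-only browsers and image indexes cannot describe.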

10. Submit to the search engines yourself. Do not use a submission service or submission software. Doing so could get your site penalized or even banned.

Here is the submission page for google: http://www.google.com/addurl.html

Submit only once. There is no need to submit every two weeks. There is no need to submit more than one page. Robots follow links. If your site has a nice link trail, your entire site will get indexed.

My site has a nice human friendly link trail which robots follow easily. All my pages get indexed without ever submitting more than the main index page once.