Wednesday 25 January 2012

Online Internet Marketing news & Updates

How Search Engines Work: Learn About Search Engines and How They Work


It’s been 10 years since I wrote the second edition of a book about search engines called “Search Engine Marketing: The Essential Best Practice Guide”. It was a very big seller and, in fact, it carried on selling through to the beginning of 2010 when I took it offline.

I’ve decided to start this year by revisiting the chapter in the book about how search engines work. I’ve said many times over the years that most books about SEO have a section called how search engines work. But rarely (if ever) do they describe the interdisciplinary approach to information retrieval (IR) covering mathematics, computer science, library science, information architecture, cognitive psychology, linguistics, and statistics – to name but a few.

Previously, I had written mainly about methods of manipulating rankings by keyword stuffing and other black hat type techniques of the time. But as I began to realize the importance of linkage data and even more so, link anchor text, I became more and more inquisitive as to what it was exactly that search engines used in their ranking technologies.

After talking to one of the pioneers in web search (Brian Pinkerton of WebCrawler) I was introduced to the work of foremost information retrieval scientist, Gerard Salton. This was a major breakthrough for me.

Salton’s work was cited in just about every IR research paper I read at the time. So I tracked down and bought a copy of his seminal work “Modern Information Retrieval” (written back in 1983, though Salton’s work in the field goes back to the early 1970s).

As a marketer, not a scientist, this was no easy read. Yet, as I began to grasp the basic concepts and drivers behind information retrieval (and the way it is applied to the web), I was able to understand the major challenges involved. And that led me not just to change the amateurish and spammy techniques I’d used previously, but to think about SEO in an entirely different way.

And to this day, I still firmly believe that a basic understanding of the science of information retrieval on the web goes a long way toward helping search marketers dispel myths and do their jobs more professionally and proficiently.

Of course, 10 years later my personal library has grown to include a very large section of information retrieval and data mining texts as more and more become available. This is also largely due to the fact that the subject matter is so fascinating it’s hard not to become engrossed.

As I revisited the chapter I’d written on how search engines work from a decade ago, I expected it to be a bit stale, but it wasn’t at all. Although, I dare say, to an IR scientist, if not stale it probably seems about as elementary as it gets. I wrote the chapter placing great emphasis on trying to make it non-mathematical. By that I mean highlighting concepts and background theory rather than matrices and formulae. That said, it’s extremely hard to cover the subject without references to linear algebra and other mind-numbing math.

Anyway, if you’re genuinely interested in how search engines work (but really, not the anecdotal stuff generally bandied around) then it’s as good a place as any to start. I mention in the introduction that it is totally unchanged from the very quirky, very British flavor it had when it was first published. A few pages were eliminated purely because they were totally irrelevant a decade later. There are a few little gems in it which I’d forgotten about.

?!

No, the subhead above isn't a typo or spelling mistake. It's actually a conversation.

When the French author Victor Hugo had Les Misérables published, he was not living in Paris. He was waiting to hear from his publisher about the kind of reception his new book was having. When he could wait for news no longer, he sent a letter to his publisher which contained only the character: ?

On receiving this, his publisher knew exactly what it meant and he returned a note to him containing only the character: ! This let Victor Hugo know that his book was a huge success. It is said that this is the shortest correspondence in history.

What’s that got to do with anything? It was an analogy I used for the length of the average query at search engines at the time, and for how difficult short queries are to deal with.

I’m seriously thinking about trying to find the time to update the entire book this year and make it available free to Search Engine Watch and ClickZ subscribers. More recently I’ve been reading about a feature-centric view of information retrieval, as well as learning to rank for information retrieval and natural language processing (very hot research topics). I’ll be writing a couple of follow-up columns covering these subjects, combined with fascinating insights into the strength of end-user data, and, of course, weaving all of that into the update of the book once I get time to make a start.

But right now, feel free to download the PDF of “How Search Engines Work”. If nothing else, I hope it acts as a very basic introduction.



Better page titles in search results: Google Webmaster Central Blog Guidelines


Page titles are an important part of our search results: they’re the first line of each result and they’re the actual links our searchers click to reach websites. Our advice to webmasters has always been to write unique, descriptive page titles (and meta descriptions for the snippets) to describe to searchers what the page is about.
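For illustration, here is a hedged sketch of the difference (the page title and site name are invented):

<title>Home</title>

tells searchers nothing, whereas a unique, descriptive pairing such as

<title>Adopting a Rescue Dog – Paws Shelter</title>
<meta name="description" content="A step-by-step guide to adopting a rescue dog, including costs, paperwork and what to expect in the first month.">

names both the topic and the site, and gives the snippet something useful to show.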

We use many signals to decide which title to show to users, primarily the <title> tag if the webmaster specified one. But for some pages, a single title might not be the best one to show for all queries, and so we have algorithms that generate alternative titles to make it easier for our users to recognize relevant pages. Our testing has shown that these alternative titles are generally more relevant to the query and can substantially improve the clickthrough rate to the result, helping both our searchers and webmasters. About half of the time, this is the reason we show an alternative title.

Other times, alternative titles are displayed for pages that have no title or a non-descriptive title specified by the webmaster in the HTML. For example, a title using simply the word “Home” is not really indicative of what the page is about. Another common issue we see is when a webmaster uses the same title on almost all of a website’s pages, sometimes exactly duplicating it and sometimes using only minor variations. Lastly, we also try to replace unnecessarily long or hard-to-read titles with more concise and descriptive alternatives.


For more information about how you can write better titles and meta descriptions, and to learn more about the signals we use to generate alternative titles, we’ve recently updated the Help Center article on this topic. Also, we try to notify webmasters when we discover titles that can be improved on their websites through the HTML Suggestions feature in Webmaster Tools; you can find this feature in the Diagnostics section of the menu on the left hand side.

Google Page Layout Algorithm Improvement: Google Algorithm Update

In our ongoing effort to help you find more high-quality websites in search results, today we’re launching an algorithmic change that looks at the layout of a webpage and the amount of content you see on the page once you click on a result.

As we’ve mentioned previously, we’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away. So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience. Such sites may not rank as highly going forward.

We understand that placing ads above-the-fold is quite common for many websites; these ads often perform well and help publishers monetize online content. This algorithmic change does not affect sites that place ads above-the-fold to a normal degree, but affects sites that go much further to load the top of the page with ads to an excessive degree or that make it hard to find the actual original content on the page. This new algorithmic improvement tends to impact sites where there is only a small amount of visible content above-the-fold or relevant content is persistently pushed down by large blocks of ads.

This algorithmic change noticeably affects less than 1% of searches globally. That means that in less than one in 100 searches, a typical user might notice a reordering of results on the search page. If you believe that your website has been affected by the page layout algorithm change, consider how your web pages use the area above-the-fold and whether the content on the page is obscured or otherwise hard for users to discern quickly. You can use our Browser Size tool, among many others, to see how your website would look under different screen resolutions.

If you decide to update your page layout, the page layout algorithm will automatically reflect the changes as we re-crawl and process enough pages from your site to assess the changes. How long that takes will depend on several factors, including the number of pages on your site and how efficiently Googlebot can crawl the content. On a typical website, it can take several weeks for Googlebot to crawl and process enough pages to reflect layout changes on the site.

Overall, our advice for publishers continues to be to focus on delivering the best possible user experience on your websites and not to focus on specific algorithm tweaks. This change is just one of the over 500 improvements we expect to roll out to search this year. As always, please post your feedback and questions in our Webmaster Help forum.

SEO Tips and Tricks: Things You Should and Shouldn’t Do!


Search Engine Optimization (SEO) is a basic necessity for businesses to survive in the online marketplace. SEO is widely used by individuals and businesses to optimize their websites in order to achieve a prominent place on the search engine results page (SERP), thereby drawing the attention of users looking for products/services.

This technique serves to drive traffic to a website, improves its business revenue, and helps to maintain or improve the SERP ranking. Although the concept and its implementation may seem quite simple to a novice in this domain, there is more to it than meets the eye. The optimization process usually spans several steps, right from analyzing your website's current traffic and ranking, to optimizing keywords, content, website structure, links and meta tags, and building inbound links and a social media presence.


It is essential that all these steps be well planned before execution. Online marketing certainly has a wider reach than conventional marketing techniques, and one wrong step is enough to mar your reputation with both search engines and your potential customers. You certainly will have to watch each step of your SEO journey. Here are a few tips to help you with your SEO campaign.

List of Must Do’s

Use relevant keywords in the right proportion to convey messages in a crisp, meaningful manner. A single theme per page helps to communicate better with the user.

Titles, headings, content, anchor text for links, URLs and meta tags should all be keyword-rich and convey the nature of your business to the end user. The descriptive text used in these cases also helps the search engine match user queries to your description and return your website on the SERP.

Make sure that all images/pictures on your website have descriptive, keyword-rich alt text (see the example after this list). Alt text is not only used by search engines to match results but is also displayed to users whose browsers do not load or support the image format.

Generate quality inbound links to your websites from authentic external websites that already rank high on the SERP. These links are bound to drive traffic to your website and also serve to promote your business and ranking.

Ensure that the file names of your web pages include relevant keywords, and use hyphens instead of underscores so that keywords are read correctly. Hyphens are treated as word separators, while underscores are not, so keywords joined by underscores may be read as a single word.

Refresh your website content, update blogs and submit articles at regular intervals; fresh content prompts search engines to re-crawl and re-evaluate your website.

Create a good sitemap, update the robots.txt file to facilitate crawling, and fix any broken links, blank pages, distorted images and similar issues.

Opt for social media marketing, affiliate marketing and other online marketing techniques (paid and free), which also serve to improve your SERP ranking.

Continuously analyze, track and improve your rankings by constantly optimizing your content and keeping pace with the evolving SEO scenario.
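Putting a few of these points together, here is a minimal sketch of keyword-rich markup (the keyword, file names and alt text are invented for illustration):

Eg:

<!-- hyphenated file name is read as the separate words "women sport shoes" -->
<a href="women-sport-shoes.html">Women sport shoes</a>

<!-- descriptive, keyword-rich alt text for the product image -->
<img src="women-sport-shoes.jpg" alt="Women sport shoes for running">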

Things to Avoid

Steer clear of verbose content stuffed with irrelevant keywords. Keywords, when used in the right manner, improve your website's ranking; an overdose, however, may prove fatal, as search engines will blacklist your site. Visitors to your website, too, are bound to shy away from confusing content.

Never resort to link, content or keyword spamming techniques, or to websites that promote such techniques, to achieve a high website ranking.

Avoid plagiarized content, as unique content alone is effective in capturing the attention of search engines and users. Do not repeat the same or copied content across different pages of your website, or in article/blog submissions.

Minimize the use of animations as they will not contribute to your website’s ranking even though they may appeal to users visiting your page.

Avoid multiple URLs to your website. A single main URL with traffic redirected from other URLs prevents your website from competing with itself.
Complacency may not really help you as the indexing algorithms of search engines are constantly evolving. You will have to keep pace with the latest guidelines and make sure that your website meets all the standard requirements in order to achieve a high ranking on the SERP.

By following these tips you will certainly be able to do a good job on your SEO implementation, without having to burn a hole in your pocket by opting for expensive services.

Alternatives to the Google Analytics Website Analysis Tool


Since its launch in 2005 Google Analytics has become almost unassailable in the world of website analytics, with 57% of the world’s 10,000 most popular websites using the popular site statistics suite.
Prior to the arrival of Google Analytics, the choices were largely between the inferior data of ‘server stats’ packages, lightweight 3rd party services or paying several hundred dollars a month for an enterprise level solution.  Google Analytics brought powerful, accurate analytics to the masses and as site owners we lapped it up.

Tools such as Google Analytics give us the data to make smarter decisions about our websites and our businesses.  Whether we are looking to increase traffic, improve conversions, conceive content ideas or do any of a myriad of other tasks, our analytics suite will often be the starting point.


However, analytics != Google Analytics. There are alternatives out there, and what's more, some of them are rather good.

Why wouldn’t you use Google Analytics?

Google Analytics is undoubtedly a strong product. The combination of in-depth data, powerful analysis and creative visualisations makes it compelling, and the fact that you get all of this for free is certainly a draw as well. It is certainly my own first choice for almost all projects that I work on.

The Ubiquitous Google

However, there are growing concerns among many about the all-seeing and all-knowing nature of Google. For some, the idea of one company controlling so many parts of their online operation is uncomfortable. For those working in the areas of online marketing that might require head-wear of a darker shade, it just isn't an option.
Complexity

One complaint that I hear frequently, particularly from new clients, is the complexity of Google Analytics. For the occasional user, and those just seeking fast answers to simple questions, that complexity can be off-putting. With Google Analytics version 5 likely to become compulsory early in the new year, that complexity is only going to increase.

Feature Set

Finally there is the simple fact that you sometimes either need or want something a little different. Whilst Google Analytics certainly offers a lot, it doesn’t offer everything or suit everyone. Between them, the alternatives below offer a host of features that are not available in Google’s product.  For some projects that might be just what is needed.

What alternatives are there?

Google Analytics fulfills different needs for different people, so your choice of alternatives will really be governed by what you are actually looking for from a solution.  However there is little that Google Analytics offers that is truly unique, which leaves plenty of interesting alternatives to look at. I’ve picked 7 alternatives that should offer something for everyone.

1. Clicky

Clicky prides itself on being easy to use, in fact they confidently claim to be the easiest analytics service you have ever used. Ease of use often means simplicity and Clicky certainly doesn’t provide the depth of data that an expert Google Analytics user might expect, but to Clicky and its loyal users that is one of the key advantages! Most site owners never look at most of the in depth data and the addition of live data makes Clicky appealing to many.

Despite its claimed simplicity, Clicky does offer useful click-stream data: visitor-level information that Google doesn't share through Analytics. Clicky is free for up to 3,000 daily page views, with paid packages starting at under $5/month.

2. Mixpanel

Mixpanel is another package that headlines with real-time analytics; however, it is their handling of conversion funnels that stands out for me, in particular the ability to create and analyse funnels retroactively in a way that is both quick and elegant.

Cohort analysis in Mixpanel allows you to track the retention of your product, graphing how often customers return after their initial visit – a great metric for building a stickier (and more profitable) site.
Mixpanel’s pricing structure is based on how many events you track, which might seem confusing to some. However, they have a free package allowing you to track up to 25,000 data points, which can be increased to 200,000 if you give them a footer link in return. For most sites that would be more than adequate to at least test this innovative offering.

3. FoxMetrics

FoxMetrics gives you the ability to track metrics that are specific to your business, in the form of events. Using their API you could, for instance, track software installs, newsletter views, media consumption or almost any event that you can get to trigger an API call.

These events, along with more standard metrics, can be used to trigger personalization of your website based on user behaviour. Simple examples of this might be displaying a “subscribe” call to action to visitors who have read multiple pages, or a different banner to newsletter subscribers.

FoxMetrics offers a free package for up to 25,000 events and premium packages from under $10/month.

4. Open Web Analytics

Open Web Analytics is the open source community's answer to Google Analytics and has a look and feel that will be rather familiar to many.  Rather than being a hosted solution, OWA is a downloadable program that you install on your own server. Whilst this means some extra work at the outset, it also means retaining control and ownership of your site's analytics data.

In terms of features OWA does its best to mimic Google Analytics and covers the key features quite well. OWA adds tracking of mouse movements and visual heatmaps to the feature set, which will be of use to those with a casual interest in usability.  However, the key selling point of Open Web Analytics is not really its feature set, but its offer of a self-hosted and open source alternative to Google Analytics.

5. Kissmetrics

Many site owners will be aware of Kissmetrics thanks to the excellent blog they run, yet I suspect far fewer have tried their analytics solution.  Kissmetrics tries to make analytics more personal by tracking, and allowing you to easily visualise, the user life-cycle.  If you've ever found yourself wondering why some of your site visitors are so much more valuable than others, Kissmetrics allows you to drill down to see the behaviour of individual visitors on your site and how that has changed over time.

The other great draw of Kissmetrics is the ability to analyse conversion funnels in real time and produce clear visualisations of your site's ability to convert visitors to whatever goals you define.
Kissmetrics don’t offer a free service level, but their focus on actionable data should mean that the $30/month starting subscription isn’t too difficult to recoup for any commercial site.

6. Log File Analysis

Depending on your hosting set-up you might already have an alternative to analytics installed and collecting data, in the form of log file analysis.  Webservers collect masses of usage data by default, and many hosts include free software for their customers to analyse this data.  Popular choices include AWStats and Webalizer.

The log file data that these packages analyse (and the way that they collect it) does differ from what you might be used to through the likes of Google Analytics, but it will still give you valuable information on who is using your website and how they are using it.

One advantage of log file analysis programs is speed for the visitor.  Because these solutions analyse data that your server is already collecting, there is no overhead at all for the site visitor.   Another aspect that appeals to some is data security.  Because you are not generating additional data, and in particular not sharing that data with a third party, information about your site is kept safely on your server.

7. Website Tracking Tools

3rd party tracking tools are the precursor to modern analytics suites and share many similarities with them. In most cases you are using an embedded snippet of code to pass data to a third party service, which collects, collates and presents it in a meaningful way to you.

The line between what I would term "website tracking" and "analytics" is blurry at best, and technically the terms are synonymous. However, there is a clear difference between the likes of Google Analytics or Kissmetrics and Sitemeter or eXTReMe Tracking.  Both groups of products deal with website visitors and their behaviour, but those that position themselves as analytics tend to offer additional dimensions such as conversion tracking, segmentation and campaign tracking that are essential to the professional marketer.

However many site owners not only don’t want those features, but are actively put off by them. Being able to log in and see visitor numbers, most popular pages and what the last X visitors to the site did is all that is wanted and needed.  For such projects website tracking tools such as eXTReMe Tracking, Sitemeter, GoStats or Statcounter are ideal solutions.

Which Analytics Package Will I Use Next?

With all these options available, the question is which I will use on my next project.  The answer will really depend on what that project is, but if I were a betting man I would put my money on Google Analytics.  Whilst there are brilliant alternatives out there, Google Analytics provides a solution that is ideal for most projects. It's also the package that I (like most people) am most familiar with, and so the one that I can pull actionable data from most quickly.

However there are undoubtedly times when Google’s offering is not the best analytics product for the job and in those cases I am more than happy to turn to one of the options above to understand my site visitors better.





Friday 20 January 2012

SEO Tutorials, Principles, Techniques, Strategies and Guides



10 Steps to Start a Good Website with Common SEO Principles

This article will help you to create a good website using common SEO principles in the current industry. When I started my career, my dream was to start a website with good traffic that would help everyone. But at that time I was not aware of web hosting and SEO. Later I studied the basics of web hosting and SEO, then started to experiment with that knowledge on my site (www.seofreetools.net). That is the best way to learn SEO: just start a site and experiment. The only cost is about the price of a shirt.

Nowadays I am planning to write a simple article that will help everyone who is new to web hosting and SEO. I strongly feel that everyone has a desire to own a website, and everyone would like to add their website address to their contact details.

These are the 10 simple steps that I feel are the best way to create a good website. Just go through them and enjoy the world of web hosting. :)

 
Domain Name Registration
This is the first and very important step. You should be careful about the domain name. To find a good domain name, first select a common keyword related to your website. That will be the primary keyword of your website.

There are a lot of tools to find a suitable keyword for your business.

I recommend the Google Keyword Tool:

a) https://adwords.google.com/select/KeywordToolExternal
b) http://www.google.com/sktool/

Eg: If your business is related to sports shoes, search with that keyword in these tools.

You can see two types of variables in the search results.

i) Advertiser Competition

ii) Approx Avg Search Volume

You should select a keyword with high search volume and low advertiser competition.

This will be the primary keyword of your site. The next step is to find a domain name containing this primary keyword. This helps a lot in Google search, because keywords in the domain name get more preference.

Eg: bestsportshoe.com, getsportshoe.com, etc.

Try to select a domain name with fewer characters and meaningful words, because it will be easier to remember.

Refer to these sites to register a domain.
yourdomainvalue.com
triple.com


Domain Space Registration (Buy Space to Host Your Website)
A domain name gives you an address, but for your site to exist you also need to buy some space to host it.

You have to select a web hosting plan according to your site's characteristics.

The main factors you have to consider are

a) Plan fee => the amount you have to pay for a plan (monthly or yearly)

b) Disk space => the total space (GB) to host your site

c) Monthly traffic => the total bandwidth allowed for the site (if your site needs file upload and download functionality, you need more bandwidth)

d) Addon domains => if you need to host more than one site, consider this

e) Developer tools => the tools needed to develop a site, like PHP, ASP, JSP, Python, Perl, etc.

f) Company => the quality of the web hosting company's technical support

I recommend http://www.gethostingplans.com/ for selecting a suitable web hosting plan.

There is a good search option to find a suitable plan based on the main factors:
http://www.gethostingplans.com/search-web-hosting-plans/ss/ 


Point Your Domain to the Web Hosting Server
The next step is to point your domain to the web hosting server where you are going to host your site.

Eg: If your domain name is getsportshoe.com and the IP of the hosting server is 208.43.209.22, then you have to point the domain to 208.43.209.22.
Contact your domain registrar to set this up. It will take at least 24 hours for the domain to propagate all over the world.


Create a Template for Your Website (Look and Feel)
It is very important to have a good look and feel for your site. A key feature of a successful site is that users remain on it for a long period, so the design should be simple, attractive and user friendly.

There are a lot of free, quality templates available on the market.

Please refer to these sites:

http://www.templateworld.com/free_templates.html

http://myfreetemplatehome.com/

http://www.oswd.org/


Title, Keyword and Meta Description Tags
a) Primary meta keyword (The first keyword in the meta keyword tag)

First, we should select a suitable primary keyword for each page, one that is related to the page content.

We can select this using Google's keyword suggestion tool.

https://adwords.google.com/select/KeywordToolExternal

By comparing the advertiser competition and search volume values of each keyword, we can select a suitable primary keyword for each page.

b) Use at most 5 keywords in our meta keywords tag

If we use more, search engines might think we are keyword spamming and decline to list our page. Search engines rank our keywords in the order in which they appear: the first one is treated as the primary keyword, the second one as a secondary keyword, and so on. We should line our keywords up in order of relevance to the page.

c) Use a maximum of 3 words in the primary keyword

People rarely search for a keyword phrase over 3 words long. Two words in a keyword phrase is ideal.

d) Use our primary keyword in the first and last paragraph of each page

The search engines will give more relevance to our page if they find our keywords at the beginning and at the end of the page. Our primary keyword should also appear in the first 90 characters of our content.

e) Use the primary keyword as the text of at least one link in our page.

f) Title Tag

The title tag of a page plays the most important role in search results at Google, which uses the title tag as the heading of each search result.

Our primary keyword should be placed at the start of the title tag. We should make sure that every page's title tag is relevant, unique, and contains that page's primary keyword.

Eg: women sport shoes provider

g) Meta description

The meta description should be between 100 and 200 characters. It is important to write a solid description of our page, because most of the time Google will show this as the description in its search results. If it is less than 100 characters, search engines may not find it relevant enough to list; if it is more than 200 characters, search engines may truncate it. All pages should have different meta descriptions. Use all of our keywords in the meta description, with the primary keyword as close to the front as possible – definitely within the first 5 words.

Put our meta description all on one line in each page. This will ensure that the search engines read it all.
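Putting points (a) to (g) together, a sketch of the head section of an optimized page might look like this (the keyword and wording are invented for illustration):

Eg:

<head>
<!-- primary keyword placed at the start of the title tag -->
<title>Women sport shoes provider - running and training footwear</title>
<!-- at most 5 keywords, primary keyword first, in order of relevance -->
<meta name="keywords" content="women sport shoes, running shoes, training shoes, sports footwear">
<!-- 100-200 characters, all on one line, primary keyword within the first 5 words -->
<meta name="description" content="Women sport shoes for running and training: compare our range of lightweight, durable shoes for every budget and find the right pair.">
</head>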


Page URL and Links
Again, we should select a suitable primary keyword for each page, related to the page content.

We can select this using the keyword suggestion tool (https://adwords.google.com/select/KeywordToolExternal).

a) Primary keyword must be an exact match with the file name of our page.

The search engines give more weight to pages whose URL contains the search keyword.

Eg: Primary keyword : women sport shoe

url1 : women-sports-shoe.html

url2 : shoe.html

The page with url1 will be listed first when a user searches with the keyword women sport shoe.

Methods to implement this are

i) Use frameworks

Use SEO-friendly frameworks to develop your site.

Eg: Joomla, Magento, Django, Symfony, etc.

ii) Use a .htaccess file (Linux server)

Using .htaccess files we can rewrite our URLs to include the primary keyword of each page.

Eg:

Options +FollowSymlinks
RewriteEngine on
# Rewrite a keyword-rich URL such as /sp/some-keyword/ to the real script,
# passing the keyword segment as the lid parameter.
# [NC] = case-insensitive match, [L] = stop processing after this rule.
RewriteRule ^sp/([^/]+)/$ show-plans.php?lid=$1 [NC,L]

b) Link text (the text of the links in a page)

Link text is very important. Use the primary keyword, and synonyms of your keywords, as link text.

c) Backlinks (incoming links to a website or web page)

These play a very important role in Google PageRank (the rank, or privilege, given to each web page by Google).

Google PageRank depends on the number of quality backlinks from other sites.

Eg :

URL : http://www.gethostingplans.com/

Backlinks : 7

To find the number of backlinks, search in Google using the query "link:sitename".

Eg: link: http://www.gethostingplans.com/

We have several methods to increase backlinks.

i) Articles :

Writing and distributing industry specific articles is a great way to increase our backlink count and site traffic.

ii) Forum Posts :

Forum posts can help to raise our link counts. Add a link to our site in the signature of each post.

iii) Blog Comments :

Add comments to different blogs related to our business, and remember to include a link to our site.

iv) Free and Paid Directory Submissions :

This is a commonly used technique in the SEO field. Directory-based links can be of significant help, especially if they are from highly reputable directories, the two biggest being DMOZ.org and the Yahoo! Directory. You can also add your site to directory.seofreetools.net.

Note: Find the perfect category for our site and check to see if it has an editor. If we see a "Volunteer to edit this category" link, try to find another relevant location, as pages without active editors take much longer to get listed in. Once we find the perfect category, submit our site every 4-6 months until listed.

Please Check this Link : http://directory.seofreetools.net/

Create quality backlinks to your site by adding it to the seofreetools.net directory.

Site Navigation

It is also important that our website be fully spiderable by the search engines. We should use breadcrumbs for site navigation.

Breadcrumbs provide a trail for the user to follow back to the starting/entry point of a website.

Eg: Home page => Section page => Subsection page

This also spreads Google PageRank from the main page to the subpages.
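A minimal sketch of breadcrumb navigation in plain HTML (the page names are invented for illustration):

Eg:

<div class="breadcrumbs">
<a href="/">Home</a> &gt;
<a href="/shoes/">Shoes</a> &gt;
<a href="/shoes/women-sport-shoes.html">Women Sport Shoes</a>
</div>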


Image Alt and Title Attribute

Image alt and title attributes play an important role in image search. Also add them to image-based site navigation and links (where an image serves as the link text).

If we have an image linked to another page, the alt and title attribute text is treated much the same way as standard link anchor text.
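For example, a sketch of an image used as a link (file and page names invented for illustration), where the alt and title text stands in for anchor text:

Eg:

<a href="women-sport-shoes.html">
<img src="women-sport-shoes.jpg" alt="Women sport shoes" title="Women sport shoes">
</a>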


Sitemaps

A sitemap is a representation of the architecture of a web site, and it plays a major role in search engine optimization.
There are two types of sitemaps.

i) Sitemap page in the website

Eg : http://www.seofreetools.net/sitemap/

It should be updated regularly, because these sitemaps help search engine robots to crawl all the pages of our site.
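A sketch of such a page: a plain HTML list with one link per page of the site (the page names are invented for illustration):

Eg:

<h1>Sitemap</h1>
<ul>
<li><a href="/">Home</a></li>
<li><a href="/shoes/">Shoes</a></li>
<li><a href="/shoes/women-sport-shoes.html">Women Sport Shoes</a></li>
<li><a href="/contact.html">Contact Us</a></li>
</ul>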

ii) Sitemap file submitted to Google

Eg : http://www.gethostingplans.com/sitemap.txt

Create a sitemap containing all the pages of your site, then submit it to Google through Google Webmaster Tools (https://www.google.com/webmasters/tools/). Check Webmaster Tools regularly to track whether all URLs are indexed.

Use this site to create a sitemap file:

http://www.xml-sitemaps.com/


Track the Performance of Your Site
Now your site is ready with basic search engine optimization principles, but you have to track its performance using the different tools available on the market.

Use Seo Panel (a free SEO control panel) to track and improve the performance of your websites.

a) Check keyword positions in different search engines on a daily basis.

Eg: The keyword "300gb hosting" has rank 6 for http://www.gethostingplans.com/ because that site is listed in 6th position in Google's search results when we search with "300gb hosting".

We must watch this position using the different tools available on the market.

Use this tool to find out the keyword position of your site in different search engines.

http://www.seofreetools.net/seo-tools/keyword-search-engine.php

b) Google page rank

Check your Google PageRank using

http://www.seofreetools.net/seo-tools/google-pagerank-script.php

c) Alexa rank (the traffic rank given to each site by alexa.com)

Check your alexa rank by http://www.seofreetools.net/seo-tools/alexa-rank-script.php

d) Number of visitors, search keywords and visitor details: use these tools to track all of these things.

(Google Analytics) https://www.google.com/analytics/

(Counter) http://counter.bizhat.com/

10 Search Engine Optimization Strategies

10 basic search engine optimization techniques you can use to improve your site rankings:

 
Meta Tags.

Meta tags are simple lines of code at the top of your web page's HTML that tell search engines about your page. Include the title tag, keywords tag, description tag, and robots tag on each page.

Create and update your sitemap.

Developing a site map is a simple way of giving search engines the information they need to crawl your entire website. There are plenty of free software packages on the web that can help you generate a sitemap. Once you create a sitemap, submit it to Google and Yahoo.

Ensure that all navigation is in HTML.

All too often, navigational items are built with JavaScript. Even though such navigation technically still works, it is not optimized. Create your navigation in HTML to enhance internal links throughout your website.

Check that all images include ALT text.

Your image's alt text is spidered by search engines. If you're not including your keywords in alt text, you're missing out on a huge opportunity for improved search engine result placements. Label all of your images properly.

Use Flash content sparingly.

Content generated through JavaScript or Flash is a big no-no. Some webmasters like to use Flash because of the presentation. If you must, use it sparingly, and only after your site has been properly optimized with basic search engine optimization in mind.

Make sure that your website code is clean.

Keep in mind when optimizing a web page that crawlers are basically only looking at your source code. When programming your web pages, having W3C-compliant code can make all the difference. Run your code through a W3C validator before promoting.

Place keywords in your page content.

Search engines scan your website and web pages for keywords. Shoot for a keyword density of between two and eight percent. Google likes your page to be at the lower end of this scale and Yahoo at the upper end.
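As a rough worked example (the formula is the usual definition of keyword density; the numbers are invented): keyword density = (keyword occurrences / total words) x 100, so a 500-word page that uses its keyword 10 times has a density of (10 / 500) x 100 = 2%, at the lower end of the suggested range.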

Submit your website to search engine directories.

It's always a good idea to let large search engine directories know that you're out there. Submit your website URL to directories like Google, Yahoo, DMOZ and directory.seofreetools.net.

Build links to your website.

Consider building a link exchange program or creating one-way links to your site using articles or forum posts. All major search engines judge the importance of your website based on how many other websites link to it.
Add your site to directory.seofreetools.net to get quality backlinks.

Learn the basics.

Learning to optimize your website for search engines takes time and patience. Start by applying basic search engine optimization principles. Whether you're new to website optimization or a well-seasoned veteran, begin by prioritizing which pages are most important to you and go from there. Soon you'll find yourself moving up the rankings.


Thursday 19 January 2012

Promoting Your Site to Increase Traffic



The main purpose of SEO is to make your site visible to search engines, leading to higher rankings in search results pages, which in turn brings more traffic to your site. And having more visitors (and above all buyers) is ultimately the goal of site promotion. For truth's sake, SEO is only one way to promote your site and increase traffic – there are many other online and offline ways to accomplish the goal of getting high traffic and reaching your target audience. We are not going to explore them in this tutorial, but just keep in mind that search engines are not the only way to get visitors to your site, although they seem to be a preferable choice and a relatively easy way to do it.

1. Submitting Your Site to Search Directories, Forums and Special Sites

After you have finished optimizing your new site, the time comes to submit it to search engines. Generally, with search engines you don't have to do anything special in order to get your site included in their indices – they will come and find you. Well, it cannot be said exactly when they will visit your site for the first time, or at what intervals they will visit it later, but there is hardly anything you can do to invite them. Sure, you can go to their Submit a Site pages and submit the URL of your new site, but do not expect them to hop to you right away. What is more, even if you submit your URL, most search engines reserve the right to judge whether to crawl your site or not. Anyway, here are the URLs of the submission pages of the three major search engines: Google, MSN, and Yahoo.

In addition to search engines, you may also want to have your site included in search directories. Although search directories also list sites that are relevant to a given topic, they differ from search engines in several aspects. First, search directories are usually maintained by humans, and the sites in them are reviewed for relevancy after they have been submitted. Second, search directories do not use crawlers to get URLs, so you need to go to them and submit your site; but once you do this, you can stay there forever and no further effort on your side is necessary. Some of the most popular search directories are DMOZ and Yahoo! (the directory, not the search engine itself), and here are the URLs of their submission pages: DMOZ and Yahoo!.

Sometimes posting a link to your site in the right forums or special sites can work miracles in terms of traffic. You need to find the forums and sites that are leaders in the fields of interest to you, but generally even a simple search in Google or the other major search engines will retrieve their names. For instance, if you are a hardware freak, type “hardware forums” in the search box and in a second you will have a list of sites that are favorites of other hardware freaks. Then you need to check the sites one by one, because some of them might not allow posting links to commercial sites. Posting in forums is more time-consuming than submitting to search engines, but it can also be pretty rewarding.

2. Specialized Search Engines

Google, Yahoo!, and MSN are not the only search engines on Earth, nor even the only general-purpose ones. There are many other general-purpose and specialized search engines, and some of them can be really helpful for reaching your target audience. You just can't imagine how many niches specialized search engines exist for – from law, to radio stations, to education! Some of them are actually huge sites that gather Web-wide resources on a particular topic, but almost all of them have sections for submitting links to external sites of interest. So, after you find the specialized search engines in your niche, go to their sites and submit your URL – this could prove more traffic-worthy than striving to get to the top of Google.

3. Paid Ads and Submissions

We have already mentioned some alternatives to search engines – forums, specialized sites and search engines, search directories – but if you need to make sure that your site will be noticed, you can always resort to paid ads and submissions. Yes, paid listings are a fast and guaranteed way to appear in search results, and most of the major search engines accept payment to put your URL in the Paid Links section for keywords of interest to you. But you must also bear in mind that users generally do not trust paid links as much as they do normal ones – in a sense it looks like you are bribing the search engine to place you where you can't get on your own – so think twice about the pros and cons of paying to get listed.


Visual Extras and SEO

As already mentioned, search engines have no means to directly index extras like images, sounds, Flash movies, and JavaScript. Instead, they rely on you to provide meaningful textual descriptions, and based on those they can index these files. In a sense, the situation is similar to the situation with text 10 or so years ago – you provide a description in the meta tag and search engines use this description to index and process your page. If technology advances further, one day it might be possible for search engines to index images, movies, etc., but for the time being this is just a dream.

1. Images

Images are an essential part of any Web page, and from a designer's point of view they are not an extra but a mandatory item for every site. However, here designers and search engines are at two poles, because for search engines every piece of information that is buried in an image is lost. When working with designers, it sometimes takes a while to explain that having textual links (with proper anchor text) instead of shining images is not a whim, and that clear text navigation really is mandatory. Yes, it can be hard to find the right balance between artistic performance and SEO-friendliness, but since even the finest site is lost in cyberspace if it cannot be found by search engines, a compromise in its visual appearance cannot be avoided.


With all that said, the idea is not to skip images altogether. Sure, nowadays this is impossible because the result would be a most ugly site. Rather, the idea is that images should be used for illustration and decoration, not for navigation or, even worse, for displaying text (in a fancy font, for example). And most important – in the alt attribute of the <img> tag, always provide a meaningful textual description of the image. The HTML specification does not require this but search engines do. Also, it does not hurt to give meaningful names to the image files themselves rather than naming them image1.jpg, image2.jpg, imageN.jpg. For instance, in the next example the image file has an informative name and the alt text provides enough additional information: <img src="one_month_Jim.jpg" alt="A picture of Jim when he was a one-month puppy">. Well, don't go to extremes like writing 20-word alt attributes for 1-pixel images, because this also looks suspicious and starts to smell like keyword stuffing.

2. Animation and Movies


The situation with animation and movies is similar to that with images – they are valuable from a designer's point of view but are not loved by search engines. For instance, it is still pretty common to have an impressive Flash introduction on the home page. You just cannot imagine what a disadvantage with search engines this is – it is a number one rankings killer! And it gets even worse if you use Flash to tell a story that could be written in plain text, and hence crawled and indexed by search engines. One workaround is to provide search engines with an HTML version of the Flash movie, but in this case make sure that you have excluded the original Flash movie from indexing (this is done in the robots.txt file, but the explanation of this file is not a beginner's topic, which is why it is excluded from this tutorial); otherwise you can be penalized for duplicate content.

There are rumors that Google is building new search technology that will allow searching inside animation and movies, and that the .swf format will contain new metadata that can be used by search engines. But until then, you'd better either refrain from using (too much) Flash, or at least provide a textual description of the movie (for instance, as alternative text alongside the embedded movie).

3. Frames

It is good news that frames are slowly but surely disappearing from the Web. Five or 10 years ago they were an absolute hit with designers, but never with search engines. Search engines have difficulties indexing framed pages because the URL of the page is the same no matter which of the separate frames is open. This was a shock for search engines, because where there were actually 3 or 4 pages there was only one URL, while for a search engine 1 URL is 1 page. Of course, search engines can follow the links to the pages in the frameset and index them, but this is a hurdle for them.

If you still insist on using frames, make sure that you provide a meaningful description of the site in the <noframes> tag. The following example is not for beginners, but even if you do not understand everything in it, just remember that the <noframes> tag is the place to provide an alternative version (or at least a short description) of your site for search engines and for users whose browsers do not support frames. If you decide to use the <noframes> tag, you may want to read more about it before you start using it.

Example: <noframes> <p> This site is best viewed in a browser that supports frames. </p><p> Welcome to our site for prospective dog adopters! Adopting a homeless dog is a most noble deed that will help save the life of the poor creature. </p></noframes>

4. JavaScript


This is another hot potato. Everybody knows that pure HTML is powerless to make complex sites with a lot of functionality (HTML was never intended to be a programming language for building Web applications, so nobody expects to use HTML to handle writing to a database, or even to store session information) as required by today's Web users, which is why other programming languages (like JavaScript or PHP) come to enhance HTML. For now, search engines simply ignore the JavaScript they encounter on a page. As a result, first, if you have links inside JavaScript code, chances are they will not be spidered. Second, if JavaScript is in the HTML file itself (rather than in an external .js file that is invoked when necessary), it clutters the HTML file, and spiders might just skip it and move on to the next site. For your information, there is a <noscript> tag that allows you to provide an alternative to running the script in the browser, but because most of its applications are pretty complicated, it is hardly suitable to explain in full here.
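Still, a minimal sketch of the <noscript> tag in use may help (the file name and wording are invented for illustration):

<script src="menu.js"></script>
<noscript>
<p>Our menu requires JavaScript. You can also browse all pages via the
<a href="/sitemap/">sitemap</a>.</p>
</noscript>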



Wednesday 18 January 2012

What Is SOPA And Why Does It Matter?

The tech industry is abuzz about SOPA and PIPA, a pair of anti-piracy bills. Here's why they're controversial, and how they would change the digital landscape if they became law.

What is SOPA?

SOPA is an acronym for the Stop Online Piracy Act. It's a proposed bill that aims to crack down on copyright infringement by restricting access to sites that host or facilitate the trading of pirated content.

SOPA's main targets are "rogue" overseas sites like torrent hub The Pirate Bay, which are a trove for illegal downloads. Go to The Pirate Bay, type in any current hit movie or TV show like "Glee," and you'll see links to download full seasons and recent episodes for free.

Content creators have battled against piracy for years -- remember Napster? -- but it's hard for U.S. companies to take action against foreign sites. The Pirate Bay's servers are physically located in Sweden. So SOPA's goal is to cut off pirate sites' oxygen by requiring U.S. search engines, advertising networks and other providers to withhold their services.

That means sites like Google wouldn't show flagged sites in their search results, and payment processors like eBay's PayPal couldn't transmit funds to them.

Both sides say they agree that protecting content is a worthy goal. But opponents say that the way SOPA is written effectively promotes censorship and is rife with the potential for unintended consequences.

Silicon Valley woke up and took notice of the implications when SOPA was introduced in the House of Representatives in October. But its very similar counterpart, PIPA, flew under the radar and was approved by a Senate committee in May. PIPA is now pending before the full Senate and scheduled for a vote on January 24, though some senators are pushing for a delay.

Isn't copyright infringement already illegal?

Yes. The 1998 Digital Millennium Copyright Act lays out enforcement measures.
Let's say a YouTube user uploads a copyrighted song. Under the current law, that song's copyright holders could send a "takedown notice" to YouTube. YouTube is protected against liability as long as it removes the content within a reasonable time frame.

When it gets a DMCA warning, YouTube has to notify the user who uploaded the content. That user has the right to file a counter-motion demonstrating that the content doesn't infringe on any copyrights. If the two sides keep disagreeing, the issue can go to court.

The problem with DMCA, critics say, is that it's useless against overseas sites.
SOPA tackles that by moving up the chain. If you can't force overseas sites to take down copyrighted work, you can at least stop U.S. companies from providing their services to those sites. You can also make it harder for U.S. Internet users to find and access the sites.

But SOPA goes further than DMCA and potentially puts site operators -- even those based in the U.S. -- on the hook for content that their users upload. The proposed bill's text says that a site could be deemed a SOPA scofflaw if it "facilitates" copyright infringement.
That very broad language has tech companies spooked.

Sites like YouTube, which publishes millions of user-uploaded videos each week, are worried that they would be forced to more closely police that content to avoid running afoul of the new rules.

"YouTube would just go dark immediately," Google public policy director Bob Boorstin said at a conference last month. "It couldn't function."

Tech companies also object to SOPA's "shoot first, ask questions later" approach.

The bill requires every payment or advertising network operator to set up a process through which outside parties can notify the company that one of its customers is an "Internet site dedicated to theft of U.S. property." Once a network gets a notification, it is required to cut off services to the target site within five days.

Filing false notifications is a crime, but the process would put the burden of proof -- and the legal cost of fighting a false allegation -- on the accused.

As the anti-SOPA trade group NetCoalition put it in their analysis of the bill: "The legislation systematically favors a copyright owner's intellectual property rights and strips the owners of accused websites of their rights."

Who supports SOPA, and who's against it?

The controversial pair of bills, SOPA and PIPA, have sparked an all-out war between Hollywood and Silicon Valley. In general, media companies have united in favor of them, while tech's big names are throwing their might into opposing them.

SOPA's supporters -- which include CNNMoney parent company Time Warner, plus groups such as the Motion Picture Association of America -- say that online piracy leads to U.S. job losses because it deprives content creators of income.

The bill's supporters dismiss accusations of censorship, saying that the legislation is meant to revamp a broken system that doesn't adequately prevent criminal behavior.

But SOPA's critics say the bill's backers don't understand the Internet's architecture, and therefore don't appreciate the implications of the legislation they're considering.

In November, tech behemoths including Google and Facebook lodged a formal complaint letter to lawmakers, saying: "We support the bills' stated goals. Unfortunately, the bills as drafted would expose law-abiding U.S. Internet and technology companies to new uncertain liabilities [and] mandates that would require monitoring of web sites."

Where does the bill stand now?

SOPA was once expected to sail quickly through committee approval in the House. But after a massive pushback from tech companies and their supporters, it's being extensively reworked. House Majority Leader Eric Cantor has said SOPA won't come up for a committee vote as-is.

That means the bill could change a lot from day to day -- and one major tenet of the original legislation has already been removed. As originally written, SOPA would have required Internet service providers (ISPs) to block access to sites that law enforcement officials deemed pirate sites.

But the White House said its analysis of the original legislation's technical provisions "suggests that they pose a real risk to cybersecurity," and that it wouldn't support legislation that mandates manipulating the Internet's technical architecture.

The White House's statement came shortly after one of SOPA's lead sponsors, Texas Republican Lamar Smith, agreed to remove SOPA's domain-blocking provisions.

Smith's office says it's still planning to work through amendments to the bill, but his representatives declined to estimate how long that will take; they plan to resume revising it in February.

A markup process once expected to take days is now likely to last for months. As the outcry around SOPA grows louder, the bill's momentum in Congress appears to be fading.

What are the alternatives?

One option, of course, is that Congress does nothing and leaves the current laws in place.

Alternative legislation has also been proposed. A bipartisan group of House members has begun drafting the Online Protection and Enforcement of Digital Trade Act (OPEN), a compromise bill.

Among other differences, OPEN offers more protection than SOPA would to sites accused of hosting pirated content. It also beefs up the enforcement process by allowing digital rights holders to bring cases before the U.S. International Trade Commission (ITC), an independent agency that handles trademark infringement and other trade disputes.

OPEN's backers have posted the draft legislation online and invited the Web community to comment on and revise the proposal.

SOPA supporters counter that the ITC doesn't have the resources for digital enforcement, and that giving it those resources would be too expensive.

Monday 16 January 2012

Google AdWords


Google AdWords is Google's main advertising product and main source of revenue. Google's total advertising revenues were US$28 billion in 2010.[2] AdWords offers pay-per-click (PPC) advertising, cost-per-thousand (CPM) advertising, and site-targeted advertising for text, banner, and rich-media ads. The AdWords program includes local, national, and international distribution. Google's text advertisements are short: one 25-character headline and two additional text lines of 35 characters each. Image ads can be one of several different Interactive Advertising Bureau (IAB) standard sizes.
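
Those length limits are easy to sanity-check programmatically. A trivial sketch in Python (the sample ad copy here is invented for illustration):

```python
# Quick check of the classic AdWords text-ad limits quoted above:
# a 25-character headline and two 35-character description lines.

HEADLINE_MAX, LINE_MAX = 25, 35

def fits_text_ad(headline: str, line1: str, line2: str) -> bool:
    return (len(headline) <= HEADLINE_MAX
            and len(line1) <= LINE_MAX
            and len(line2) <= LINE_MAX)

# Hypothetical ad copy, invented for illustration.
print(fits_text_ad("Cheap Flights to London",
                   "Compare fares from top airlines.",
                   "Book online in minutes."))  # True
```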

Pay-per-click advertisements (PPC)


Advertisers select the words that should trigger their ads and the maximum amount they will pay per click. When a user searches on Google, ads (also known as "creatives" within Google) for relevant words appear as "sponsored links" on the right side of the screen, and sometimes above the main search results. Click-through rates (CTR) for the ads are about 8% for the first ad, 5% for the second, and 2.5% for the third. A given search can return anywhere from 0 to 12 ads.

The ordering of the paid-for listings depends on other advertisers' bids (PPC) and the Quality Score of all ads shown for a given search. The search-engine system calculates the Quality Score on the basis of historical click-through rates, the relevance of an advertiser's ad text and keywords, the advertiser's account history, and other relevance factors as determined by Google. The Quality Score is also used by Google to set the minimum bids for an advertiser's keywords.[6] The minimum bid takes into consideration the quality of the landing page as well, which includes the relevancy and originality of content, navigability, and transparency into the nature of the business.[7] Though Google has released a full list of guidelines for sites,[8] the precise formula, and what exactly counts as relevance, remain in part a Google secret, and the parameters used can change dynamically.
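
The exact weighting is, as noted, a Google secret, but a commonly used simplified model ranks ads by maximum bid multiplied by Quality Score. A minimal sketch of that model, with invented bids and scores:

```python
# Illustrative only: ranks ads by Ad Rank = max CPC bid * Quality Score,
# a simplified model of the ordering described above, not Google's formula.

ads = [
    # (advertiser, max CPC bid in dollars, Quality Score on a 1-10 scale)
    ("A", 2.00, 4),
    ("B", 1.50, 8),
    ("C", 1.00, 6),
]

ranked = sorted(ads, key=lambda ad: ad[1] * ad[2], reverse=True)

for position, (name, bid, qs) in enumerate(ranked, start=1):
    print(f"Position {position}: advertiser {name} (ad rank = {bid * qs:.1f})")
```

Under this model, advertiser B outranks A despite bidding less, because its higher Quality Score more than compensates.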

The auction mechanism that determines the order of the ads is a generalized second-price auction.[9][10] This mechanism is claimed not to be truthful: unlike in a classic single-item Vickrey auction, participants do not necessarily fare best by bidding the keyword's true value to them.
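
To make that concrete, here is a sketch of the quality-weighted generalized second-price rule as it is usually described in the literature: each advertiser pays just enough to hold its position against the ad ranked immediately below it. This is an illustrative textbook formulation, not Google's disclosed pricing code:

```python
# Textbook quality-weighted GSP pricing: each winner pays the smallest
# CPC that keeps its ad rank above the next ad's rank (illustrative only).

def gsp_prices(ads, increment=0.01):
    """ads: list of (name, max_cpc_bid, quality_score) tuples."""
    ranked = sorted(ads, key=lambda a: a[1] * a[2], reverse=True)
    results = []
    for i, (name, bid, qs) in enumerate(ranked):
        if i + 1 < len(ranked):
            _, next_bid, next_qs = ranked[i + 1]
            price = (next_bid * next_qs) / qs + increment
        else:
            price = increment  # the last ad shown pays only the minimum
        results.append((name, round(min(price, bid), 2)))
    return results

print(gsp_prices([("A", 2.00, 4), ("B", 1.50, 8), ("C", 1.00, 6)]))
# [('B', 1.01), ('A', 1.51), ('C', 0.01)]
```

Note that B wins the top slot yet pays $1.01 rather than its $1.50 bid; whether shading a bid pays off depends on what everyone else bids, which is exactly the non-truthfulness property mentioned above.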


AdWords features


IP address exclusion

In addition to controlling ad placements through methods such as location and language targeting, ad targeting can be refined with Internet Protocol (IP) address exclusion. This feature enables advertisers to specify IP address ranges where they don't want their ads to appear.

Up to 100 IP addresses, or ranges of addresses, can be excluded per campaign. All ads in the campaign are prevented from showing for users with the specified IP addresses.

Location-based exclusion is also offered as a way to narrow the targeted audience.
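
Conceptually, IP exclusion is just a range check against the campaign's exclusion list before an ad is served. A minimal sketch (not AdWords' actual implementation) using Python's standard ipaddress module:

```python
# Conceptual sketch of IP-exclusion filtering, not AdWords' actual code.
import ipaddress

# Hypothetical per-campaign exclusion list of addresses and CIDR ranges
# (AdWords caps the list at 100 entries per campaign).
EXCLUDED = [ipaddress.ip_network(n) for n in ("203.0.113.7/32", "198.51.100.0/24")]

def ad_may_show(viewer_ip: str) -> bool:
    ip = ipaddress.ip_address(viewer_ip)
    return not any(ip in net for net in EXCLUDED)

print(ad_may_show("198.51.100.25"))  # False: inside an excluded range
print(ad_may_show("192.0.2.1"))      # True: not on the exclusion list
```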

Frequency capping


Frequency capping limits the number of times ads appear to the same unique user on the Google Content Network. It doesn't apply to the Search Network. If frequency capping is enabled for a campaign, a limit must be specified as to the number of impressions allowed per day, week, or month for an individual user. The cap can be configured to apply to each ad, ad group, or campaign.
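
Mechanically, a frequency cap amounts to an impression counter keyed per user, per campaign (or ad group, or ad), per period. A minimal sketch, assuming a hypothetical cap of three impressions per user per day:

```python
# Minimal sketch of a per-user daily frequency cap (illustrative only;
# a real system would identify users by cookie and persist the counts).
from collections import defaultdict
from datetime import date

CAP_PER_DAY = 3  # hypothetical limit chosen for the example

impressions = defaultdict(int)  # (user_id, campaign_id, day) -> count

def may_serve(user_id: str, campaign_id: str) -> bool:
    key = (user_id, campaign_id, date.today())
    if impressions[key] >= CAP_PER_DAY:
        return False
    impressions[key] += 1
    return True

for _ in range(5):
    print(may_serve("user-42", "campaign-1"))  # True, True, True, False, False
```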


Placement targeted advertisements (formerly Site-Targeted Advertisements)

In 2003, Google introduced site-targeted advertising. Using the AdWords control panel, advertisers can enter keywords, domain names, topics, and demographic targeting preferences, and Google places the ads on what it deems relevant sites within its content network. If domain names are targeted, Google also provides a list of related sites for placement. Advertisers may bid on a cost-per-thousand-impressions (CPM) or cost-per-click (CPC) basis for site targeting.

With placement targeting, it is possible for an ad to take up the entire ad block rather than have the ad block split into 2 to 4 ads, resulting in higher visibility for the advertiser.

The minimum cost-per-thousand impressions bid for placement targeted campaigns is 25 cents. There is no minimum CPC bid, however.
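
Since CPM and CPC bids can compete for the same placements, the usual back-of-the-envelope comparison converts a CPC bid into an effective CPM, i.e. the expected cost per thousand impressions. This is an illustrative calculation with invented numbers, not Google's disclosed auction mechanics:

```python
# Convert a CPC bid to an effective CPM (eCPM) for comparison with a CPM bid.
# All figures are invented for illustration.

def ecpm_from_cpc(cpc_bid: float, predicted_ctr: float) -> float:
    # Expected spend per 1,000 impressions = bid * CTR * 1000.
    return cpc_bid * predicted_ctr * 1000

cpm_bid = 0.25              # the 25-cent minimum CPM bid mentioned above
cpc_bid, ctr = 0.40, 0.001  # hypothetical: $0.40 per click at a 0.1% CTR

print(ecpm_from_cpc(cpc_bid, ctr))  # 0.4 -> the CPC bid beats the $0.25 CPM bid
```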


AdWords distribution

All AdWords ads are eligible to be shown on www.google.com. Advertisers also have the option of enabling their ads to show on Google's partner networks. The "search network" includes AOL Search, Ask.com, and Netscape. Like www.google.com, these search engines show AdWords ads in response to user searches, but performance there does not affect an advertiser's Quality Score.

The "Google Display Network" (formerly referred to as the "content network") shows AdWords ads on sites that are not search engines. These content network sites are those that use AdSense and DoubleClick, the other side of the Google advertising model. AdSense is used by website owners who wish to make money by displaying ads on their websites. Click through rates on the display network are typically much lower than those on the search network and are therefore ignored when calculating an advertiser's quality score.

Google automatically determines the subject of pages and displays relevant ads based on the advertisers' keyword lists. AdSense publishers may select channels to help direct Google's ad placements on their pages, to increase the performance of their ad units. Many different types of ads can run across Google's network, including text ads, image ads (banner ads), mobile text ads, and in-page video ads.

Google AdWords' main competitors are Yahoo! Search Marketing and Microsoft adCenter.

In 2010, Yahoo formed a partnership with Microsoft that gave Microsoft control of the technology powering Yahoo's search marketing ads. Both services are now run through Microsoft adCenter: ads displayed on Yahoo are powered by adCenter and managed through the Microsoft software interface.



AdWords account management




To help clients with the complexity of building and managing AdWords accounts, search engine marketing agencies and consultants offer account management as a business service. This has allowed organizations without advertising expertise to reach a global, online audience. Google has started the Google Advertising Professionals program to certify agencies and consultants who have met specific qualifications and passed an exam. Google also provides account management software, called AdWords Editor.

Correct keyword choice is critical: Google targets ads entirely on the selected keywords, which largely determine how much exposure an ad receives and, to a great extent, who sees it.

Another useful feature is My Client Center, available to Google Advertising Professionals (even those who have not yet passed the exam or met the budget requirements), which gives a professional a dashboard summary of several accounts and lets them move between those accounts without logging in to each one separately.

The Google AdWords Keyword Tool provides a list of related keywords for a specific website or keyword.


Google Click-to-Call


Google Click-to-Call was a service provided by Google that allowed users to call advertisers from Google search results pages. Users entered their phone number, and Google called them back and connected them to the advertiser, with Google paying the calling charges. It was discontinued in 2007.[16] For some time, similar click-to-call functionality was available for results in Google Maps. In the Froyo release of Google's Android operating system, certain advertisements offer very similar functionality, letting a user easily call an advertiser.