Showing posts with label Search Engine Optimization. Show all posts

Sep 8, 2013

Is my Bounce Rate Normal or does it change with the type of my website?

Bounce rate is one of those web metrics which have always baffled webmasters. Bounce rate determines the fate of a website: whether it will grow or not. Forums and comment boxes are filled with questions like "Is my bounce rate good enough?", "Does bounce rate change according to website type?" or "What should I do? My bounce rate is too high!"

Just know this: yes, bounce rate changes with the type of website you run, and there is no high-class Google wizardry in it, just simple logic. Since websites are of different types, the bounce rate for any website differs with its functionality. So an online web store will have a different bounce rate than a personal blog.
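For the record, the metric itself is simple arithmetic: the percentage of sessions that viewed exactly one page. Here is a minimal sketch over made-up session data (the numbers are illustrative, not from any real analytics account):

```python
# Hypothetical session log: each entry is the number of pages viewed in one visit.
sessions = [1, 3, 1, 5, 2, 1, 1, 4]

def bounce_rate(pageviews_per_session):
    """Percentage of sessions that viewed exactly one page (a 'bounce')."""
    bounces = sum(1 for views in pageviews_per_session if views == 1)
    return 100.0 * bounces / len(pageviews_per_session)

print(f"Bounce rate: {bounce_rate(sessions):.1f}%")  # 4 of 8 sessions bounced -> 50.0%
```

Real analytics suites apply extra rules (session timeouts, event hits), but the core ratio is the same.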

The following article is particularly useful if you doubt your bounce rate or are not sure whether it is normal. Here is a range of typical bounce rates for websites of different functions.

Online Shopping/Services Websites

You run a website which offers products and allows users to purchase the products online through a payment gateway.

Ideal bounce rate: such websites are hugely popular, with the world swarming with examples like Amazon, eBay etc. An online shopping website should have a bounce rate of less than 35%. The obvious question is: why such a low value? The only logical answer is the large number of steps in the transaction. (For example, you select a product to buy, you are redirected to a page to select or enter a shipping address, then comes the order confirmation, then another page for the actual monetary transaction where you are directed to your credit card or bank's website, and finally the purchase confirmation.) So the bounce rate must be less than 35% for the website to prosper.




Offline Shopping/Services Website

You run a website that offers a catalog of products but allows users to purchase only through offline methods like money order, VPP, telephone etc.

Ideal bounce rate: a new business would not plunge directly into online services; it would tend to keep its transactions out of the web sphere. Also, certain services involve huge transactions which are too risky for the client and the provider to be handled on the website, so such deals need to be done on the phone or through email correspondence. For example, a housing developer's website looking for people to invest, or a mattress specialist not ready to indulge in online trading. Such websites must have a bounce rate of less than 40%.



Directories and online Encyclopedias


Ideal bounce rate: you run a website which functions like a dictionary, where users can search and find information. You should not expect a bounce rate of more than 20-25% for such websites. Since such a website has everything a person needs or might need, he or she is not expected to leave it for another one. That explains the low bounce rate.




Blogs

Ideal bounce rate: you do not run a website but a blog which is updated regularly with new posts. If you have a blog where you publish content from time to time, you are in for a surprise: content-based websites and webpages tend to have huge bounce rates, in the range of 50 to 70%. If yours is below 50%, you are in a very safe zone and need not worry, because with such a low bounce rate your PageRank would be high too.


Video Blogs or PodCasts


Though most people assume both are the same, one must understand that they are two different things. A video blog is a conventional blog on a popular platform like Blogger or WordPress with video(s) hosted on it. A podcast, on the other hand, is a page that actually hosts only videos (like YouTube). You run a YouTube page or a website which hosts video, music or similar media.


Ideal bounce rate: for websites which host videos, music or similar media, the bounce rate tends to be very low. As soon as the user lands on the page, a lot of entertainment/infotainment options are available, and the user stays on the page for a long time, which opens him up to new options. So a bounce rate of 20 to 30% is not a value to raise an eyebrow at.


Some exceptions

If you have a very high bounce rate, i.e. near 90 to 100%, do not lose heart, as there are some exceptions to the above-mentioned rules.

  1. A website with only 1-5 pages will have a bounce rate near or above 90%. This is because a website with a low number of pages provides no incentive for the visitor to visit other pages. For example, a website with a single webpage will have a bounce rate of 100%. So it is not a bad thing if your few-page website has an unbelievably high bounce rate.
  2. New websites/blogs need not worry, as their bounce rate will stabilize in a few weeks. New blogs do have one problem: your bounce rate will not decrease unless you have a decent number of blog posts which users might visit after the landing page.
  3. Certain blogs also tend to have a bounce rate near 70 to 80%. This depends on the number of articles already present on the blog and how long the blog has existed in the web sphere. This is because readers come to the website looking for information on a particular topic and leave when satisfied, so the bounce rate tends to be high. For such situations, I would recommend the related-posts technique, where links to related articles are displayed at the end of each article.
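The related-posts technique from point 3 can be sketched in a few lines: rank every other post by how many labels it shares with the one the reader just finished. The post names and labels below are made up for illustration:

```python
# Each post maps to its set of labels.
posts = {
    "bounce-rate-basics":  {"seo", "analytics", "bounce-rate"},
    "pagerank-explained":  {"seo", "pagerank"},
    "schedule-your-posts": {"blogging", "analytics"},
    "improve-bounce-rate": {"seo", "bounce-rate", "blogging"},
}

def related_posts(current, all_posts, limit=3):
    """Other posts sorted by the number of labels shared with `current`."""
    scores = {
        title: len(labels & all_posts[current])
        for title, labels in all_posts.items()
        if title != current
    }
    ranked = sorted(scores, key=scores.get, reverse=True)
    return [title for title in ranked if scores[title] > 0][:limit]

print(related_posts("bounce-rate-basics", posts))  # most related post comes first
```

Displaying the top matches under each article gives the bouncing reader one more page to click.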

Where You Went Wrong

After reading this, if you still think your bounce rate is on the higher side, there might be reasons behind it. I have listed a few of them below; comment if you know another.

  1. You are lazy (either a potato blogger or just a lazy one)
  2. You have not posted recently (learn what happens when a daily-post blog is left untouched for a week)
  3. You are not giving enough attention to the content of your website, or to the steps that come after good content
  4. Lack of external and internal links
  5. High page load time, possibly so high that your users leave before your website loads
  6. You have not read the article on 4 sure-shot ways to improve your bounce rate.


This must have answered your question,
Is my Bounce Rate Normal or does it change with the type of my website?

May 27, 2013

Why do Wikipedia pages come on top in Google Search Results

One of the questions posed by many SEO enthusiasts is why Wikipedia always comes on top of a Google search.

Wikipedia is the one website which is said to help youngsters with their reports, elders with their projects and the savvy to satisfy their cravings. But what makes Google tick? How does a simple content-based website soar to a PageRank 8 giant with more than 6 million external links?

Why does wiki score when it comes to Google? Another meaningful question comes up: can Wikipedia sustain its position among the top corporations who play by the money?

A really powerful yet meaningful statement was once made by Jimmy "Jimbo" Wales, the co-founder of Wikipedia, on its creation: "The core of Wikipedia is something people really believe in. That is too valuable for the world to screw it up." So let's hear all the myths and truths about the web giant which floats on donations...

The Myths

Many answers were found on Answers.com and Yahoo Answers, along with several forums, to start with. Though the answers surely deserve a face-palm, one must know what people think too.

Pay me some

So some not-so-knowledgeable people say that Wikipedia pays money to Google to be put on top. Let me remind you that in my own 7-year journey, I've never seen a single premium ad for Wikipedia. Let us assume for the sake of it that this is true where premium ads are concerned; but Wikipedia runs on charity. With no other means of getting money (adverts included), the website cannot even think of premium listings.

Ol' Reciprocal Way

Some have even discussed the possibility of reciprocal links shared between Google and wiki, which seems thoroughly far-fetched, as Google itself seems to be against reciprocal links. But can it be true? Of course not! The pages at Wikipedia are written, moderated and maintained by the general public.

Business hotshots

Some forums suggested big shots pay Wikipedia to generate articles on their business, which finally amounts to paying for links. This again leads to a fallacy, because Wikipedia has just about 400 servers and 95 to 100 people on staff. In short, the number 5 site serves 454 million different people every month, with billions and billions of page views. Even after all this, the web giant is merely holding its ends together, let alone buying links to get on top. Haven't you seen the donation requests by its founders? People can still donate at the keep-wiki-free campaign.

The Truths

Among the illogical and flawed reasons offered by the seemingly less knowledgeable webmaster community, people seem to forget the reason which lies right in front of them: proper and legal search engine optimization techniques. The following are the techniques and methodologies adopted by the Wikimedia community which automatically lead Wikipedia right up to the doorway of great SERPs.

Profound content system

This is the one thing that slaps the face of every two-cent critic who says 'pages at wiki are basically a collection of copied statements from various websites'. It is quite evident from studies that content on the pages at wiki is never copied; instead, just the idea and the logic are represented in a more tasteful and detailed manner. To find out more about the ways Wikipedia could copy, take a look at the list at SEOmoz.

In short, the system of User Generated Content can be seen here in its true form. It is, however, increasingly evident that it's all the other way round: people actually copy the complete text from wiki to impress their superiors with a report. There are cases where people try to prevent others from copying from their website (there are, however, methods to get around it). But nobody has ever seen Wikipedia do something like that.

Update Frequency

It is a well-known fact that search bots for different search engines visit websites according to their update frequency. In short, the more a page is updated, the more it will be visited by Google search bots to keep the Google directory up to date. The system was explained very well in previous articles. The following are the revision histories for two separate pages on Wikipedia.


The same methodology of updating is followed by several webmasters to make sure that their blog/website stays up there all the time. The philosophy of frequent updates is followed even in the backrooms of codemakit.com. After writing quality content, this is the next big thing a website must do.


Detailed and deep linking

When a search engine bot finds a hyperlink, the bot visits the page it links to, and this is noted in the directory. If the said page again contains a link which has not yet been included in the directory, the required steps are taken. Wikipedia's attempt at helping search engines refresh their directories is often rewarded with higher SERPs.
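The deep-linking behavior described above is essentially a breadth-first traversal: every hyperlink the bot meets is queued until the whole reachable site sits in the directory. A toy sketch over an invented link graph:

```python
from collections import deque

# Toy link graph standing in for a site: page -> pages it links to.
LINKS = {
    "/home":              ["/about", "/articles"],
    "/about":             ["/home"],
    "/articles":          ["/articles/seo", "/articles/crawling"],
    "/articles/seo":      [],
    "/articles/crawling": ["/home"],
}

def crawl(start):
    """Breadth-first discovery: visit every page reachable from `start`."""
    index, queue = set(), deque([start])
    while queue:
        page = queue.popleft()
        if page in index:
            continue                       # already in the directory
        index.add(page)                    # record the page
        queue.extend(LINKS.get(page, []))  # schedule its outgoing links
    return index

print(sorted(crawl("/home")))
```

A site whose pages interlink densely, as Wikipedia's do, is fully discovered in one such pass.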

Psychology

Wiki pages are often regarded as ones that affect the psychology of users by overwhelming them with huge amounts of information and innumerable hyperlinks. This finally leads users to believe that everything on a wiki page is 'Sanatan' truth, which is one of the reasons why Wikipedia pages are always the ones cited in reports and briefs. This repercussion gives rise to another good thing: the users, in line with their belief, spend more time on the page than on any other website.
The psychological technique, however, is arguably a side effect of its profound content and moderation systems, mixed with an inherent need for betterment and detail among the moderators and contributors. There are several other psychological methods which can help webmasters pull the user's focus; they are discussed in detail in a previous article on the Psychology of SEO.

Keyword and domain Authority

One of the main reasons for Wikipedia's higher SERPs for different terms is its keyword and domain authority.
Let's not pull facts out of thin air; see an example of domain authority on its page for SEO:
http://en.wikipedia.org/wiki/Search_engine_optimization


It is pretty clear that wiki has won the round for domain as well as keyword authority. Not many pages would have such clear-cut separation for different terms.

Exceptional linking system

According to Majestic SEO, the Wikipedia domain has about 50,713 referring domains and an impressive 6,611,255 external back-links. Two very important link-based metrics come under purview here; Majestic SEO uses them to determine the authority and importance of a website.
Citation flow, which reflects the number of citations to a particular URL or domain, stands at 70, which certainly speaks for itself. Trust flow reflects the number of clicks from a set of trusted websites.

Brilliant use of NoFollow

Wiki has one side which is untouched and unseen by many. Let us look at the selfish side of Wikipedia and the people behind it. Here are some examples.

Total Links: 1732
Internal Links: 1545
Out-Going Links: 187
No-Follows: 188

Total Links: 445
Internal Links: 407
Out-Going Links: 38
No-Follows: 38

Total Links: 167
Internal Links: 165
Out-Going Links: 2
No-Follows: 1

The above are three cases from three different Wikipedia pages. As is quite evident, nearly every outgoing link is a NoFollow.

“Wiki has been famous for using and throwing away links. I was on top of the world when I saw a link towards my website from Wikipedia. The disheartening news came afterwards, when I found out that it was a NoFollow link,” says a frustrated anonymous webmaster.

The above cases have been observed on many occasions. Here the editing and the references are handled in such a way that the link juice does not flow out. This methodology has been used by webmasters over the years to preserve PageRank, and the same NoFollow is used by bloggers in their blog comments to take care of comment spammers. On Wikipedia, even if the editing is done honestly, the moderators or proofreaders will convert the innocent-looking link into a NoFollow. Such usage keeps open the possibility of users going directly to the website from wiki itself, but limits the possibility of the website gaining from an increase in PageRank or domain authority. In short, the excellent use of NoFollow wherever required helps wiki float above others in the rat race.
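To see how counts like the ones above are gathered, here is a rough sketch that tallies outgoing links and NoFollows in a snippet of HTML; the markup is invented for illustration:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count outgoing <a> links and how many carry rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.outgoing = 0
        self.nofollow = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        # Treat absolute http(s) URLs as outgoing; relative ones as internal.
        if attrs.get("href", "").startswith("http"):
            self.outgoing += 1
            if "nofollow" in attrs.get("rel", ""):
                self.nofollow += 1

html = """
<a href="/wiki/SEO">internal link, not counted</a>
<a href="http://example.com" rel="nofollow">external, no link juice</a>
<a href="http://example.org" rel="nofollow">external, no link juice</a>
"""
counter = LinkCounter()
counter.feed(html)
print(counter.outgoing, counter.nofollow)  # 2 2
```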

Back-links in Overwhelming numbers

Here another psychological technique comes into play. Webmasters and enthusiasts keep quoting Wikipedia on different subjects in the process of making their articles more and more authentic. It's their way of saying, "See! Didn't I tell ya? It's on Wikipedia too, go check it out." Now multiply this attitude by the millions of bloggers and webmasters and you'll get a jaw-dropping number.


According to data from Majestic SEO via checkpagerank, Wikipedia has 862,698,514 external back-links, 2,749,595 referring domains, 2,144,395 back-links from educational websites and 314,508 back-links from government websites; acquiring the latter two in such numbers is a feat in itself.

This authenticates the statement of my friend Samuel Johnson, who used to say, "Next time you feel like pointing a finger at the content giant, try matching their numbers and then we'll talk."

This answered your question,
Why do Wikipedia pages come on top in Google Search Results

Apr 12, 2013

Google’s Crawling Process De-Mystified


It is interesting to know that the process of crawling is done with the help of a program known as Google-bot. The program is written so that it follows an algorithmic process to determine which sites to crawl, when to crawl, and how often to crawl.

The crawling process starts from one URL; the bot then encounters external or internal links. Within a particular URL, Google-bot processes meta tags, image sources, hyperlinks and ALT tags. However, Google confesses that its algorithm cannot process rich media files or dynamic pages.

The frequency of Google’s crawl process is a grey area. Two different types of crawl process happen on a website.
  • Fresh Crawl: takes a maximum of 6 days; very few URLs are crawled at a time.
  • Deep Crawl: an extensive amount of crawling; might take up to a month to complete.

Parts of Google

Similarly, for Google's search process, there are three main parts:
  • Google Bot
  • The Indexer
  • The Query Processor

Google Bot

Google-Bot, or Google's web crawler, is a program which crawls and pulls pages from the web and gives them to the Indexer. In detail, its function is to find a page, download or cache the entire web page and send it to the Indexer. It is interesting to note that, to prevent overloading of servers by Google's requests, the Google-bot deliberately crawls through a site slower than it actually can. Have a look at the Google Guide.

Google’s Indexer

The indexer is basically a dictionary or directory operative. The pages retrieved by Google-Bot are stored in the Indexer, sorted alphabetically. The indexer ignores common words like 'of', 'on', 'how' and 'why', as well as punctuation marks and multiple spaces. Read more at Google Truths.
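The indexer's behavior can be sketched as a tiny inverted index: each word maps to the set of pages containing it, with stop words and punctuation thrown away. The stop-word list and sample pages below are illustrative only:

```python
import string

# Illustrative stop words; Google's real list is not public.
STOP_WORDS = {"of", "on", "how", "why", "the", "a", "is", "and"}

def build_index(pages):
    """Map each non-stop word to the set of page ids containing it."""
    index = {}
    strip_punct = str.maketrans("", "", string.punctuation)
    for page_id, text in pages.items():
        for word in text.lower().translate(strip_punct).split():
            if word not in STOP_WORDS:
                index.setdefault(word, set()).add(page_id)
    return index

pages = {
    "p1": "How crawling works, and why it matters.",
    "p2": "The basics of crawling.",
}
index = build_index(pages)
print(sorted(index["crawling"]))  # both pages mention 'crawling'
```

A query processor then only has to intersect these sets to answer a multi-word search.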

Google’s Query Processor

The processor's main function is to process the queries given by the user and match them to the documents already in the indexer. The complete process is carried out in accordance with the so-called Google PageRank. You can find out more about PageRank in previous articles.
It is interesting to note that a single query made by the user in the search box involves about 1,000 machines and takes about 0.2 seconds, as per Amit Agarwal from Labnol.

This was all about,
Google’s Crawling Process De-Mystified

Apr 8, 2013

Enslave and Control Search Engines to your wishes

Find out the answers to some of the most pressing questions in the minds of all webmasters: How do you know if your site has just been crawled? What do you do to keep your site regularly crawled? And how do you force search engines to crawl your website more often?

Imagine a situation where you do not need to fret in the morning worrying about how your articles fared among visitors, where the search engines treat your website with respect and honor. Wonder what it would feel like. Now imagine another, more aggressive situation where all the search engines are in your pocket and, because of it, visitors worship your website.

The only way this is going to happen is when you learn how search engines function. For example, when you know how your car works, you can always modify it to run faster. So when you understand how search engines function, you can always modify your site to function better and give better returns.

Graph Example

How do you know if your site has just been crawled? 

There are certain indications that tell you your site has just been crawled, but the most definitive one is a sharp increase in visitor statistics, as shown in the graph (from an actual blog). Though your case might not be as dramatic as the one given here, you get the point. You must find this particular peak in your own statistics and work out its frequency, then time your articles accordingly to get maximum viewership. To change the frequency of crawling, you have to change the frequency of posting on your blog or of updating your website. A previous article explains pretty clearly how you can create a perfect schedule for your blog posts in a week, and also when you should post/update in a day to get the maximum visitors.

What do you do to keep your site crawled regularly? 

The only answer to this question is: update your website regularly. Post more frequently and you'll get your site crawled frequently. But you must remember, as omnipresent and omnipotent as Google appears to be, not all pages appear in Google's search results. To phrase it properly, not all pages are indexed in Google's records. The reasons vary, but you can appeal against Google's decision by submitting a page manually through your Google Webmaster account; the only limitation is that you can submit only 500 links this way. The above was about having your webpage crawled, but what should you do to keep a webpage from being crawled? The answer is a robots.txt file. Since the Google-bots are automated, the robots.txt file is checked before a bot enters the site for the crawling process. So, through a directive in that file, you can prevent the Google-bots from entering a particular webpage, thus avoiding any crawl.
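Such a directive is only a couple of lines in robots.txt. The sketch below uses Python's standard robotparser to confirm the effect; the blocked path is hypothetical:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: all bots are barred from the /private/ directory.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "/articles/seo.html"))    # True, crawlable
print(parser.can_fetch("Googlebot", "/private/drafts.html"))  # False, blocked
```

Note that robots.txt keeps pages out of the crawl, not necessarily out of the index if other sites link to them.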

How to force Google to crawl your website more frequently?

  • One of the ways is the use of internal and external links, which work in two different manners: links harbored on your own pages, and links hosted on external websites. Internal links on your website give search engines an incentive to crawl it more frequently so as to index more links. For crawl frequency, however, external links are even more valuable than internal ones: Google values external links because it considers them a source of new links, increasing the overall strength of the web giant's directory, and another way of adding value to hyperlinks.
  • A higher-priority method is having your link hosted on some other website. When a link to your website is found on another website, Google treats it as a recommendation that your website be crawled further. Your link status can easily be found using your Google Webmaster Tools. Some web experts claim that your website's server must also be fast; for those of you who host your own website on your own hardware, it is better to upgrade and make your site fast enough.
  • Another method is sitemaps. A sitemap is basically a way of telling search engines how you want your website to be crawled. Personally, I've had some bitter experiences with sitemaps, and my views are shared by several others. I would not recommend using sitemaps, even though they can be called a possible solution to the web-crawl scenario.

Last but not least, the only sure-shot way of ensuring frequent visits by Google is frequent updating of your content.

This was a Mumbo-Jumbo about,
Enslave and Control Search Engines to your wishes

Mar 22, 2013

Requirements of web Meta Descriptions with Examples


Meta tags have always been the most hyped thing in any SEO practice. And why shouldn't they be? Meta tags are what a search engine (or a visitor) first looks at before going through a website. Find out the most effective meta description for your website, learn the requirements for writing the most effective description to pull in maximum visitors, and have a look at some of the top websites and their meta descriptions.

A very relevant quote comes into context here: "Don't judge a book by its cover." Contrary to your expectations, the 'meta' business works in exactly the opposite way.
There have been articles that claim the CTR of a blog can effectively be increased by the proper use of meta tags.
There are three major types of meta-tags, 
  1. Title Tag
  2. Keyword Tag
  3. Description Tag
The title tag has always been easy to think of: you just need 70 fitting characters that suit your website. Keyword tags can be understood through the Google Keywords tool, and a detailed video can be found in no time. But what about the description? How can you write the most efficient description tag for your website, so that everyone finds you and your business flourishes?
So here are some tips to help you write the perfect meta description for your beloved page. Also learn from the mistakes of some famous websites like Wikipedia, Reddit, eBay, CNN etc.

Here are the requirements for a superior website description.
  1. Use catchy or trustworthy text, or else you'll lose visitors.
  2. Keep your description within 150 characters
  3. Always use your blog keywords for your Meta description
  4. Never blurt out the entire story. For bloggers this tip is important: people make the mistake of giving away the entire blog post in a single line, without even saying 'spoiler alert'.
  5. At least say what it is about; never be clandestine about the content in the hope that people will visit and see what it's all about. Believe me, people don't.
  6. Make it as specific and to the point as you can
  7. Do not stuff keywords
  8. Keep it unique and original
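Some of these rules (the 150-character cap, keyword presence, no stuffing) can be checked mechanically. A rough sketch; the thresholds and sample text are illustrative, not official limits:

```python
def check_description(description, keywords, max_len=150, max_repeats=2):
    """Return a list of problems found in a meta description."""
    problems = []
    if len(description) > max_len:
        problems.append(f"too long: {len(description)} > {max_len} chars")
    words = description.lower().split()
    for kw in keywords:
        count = words.count(kw.lower())
        if count == 0:
            problems.append(f"keyword missing: {kw!r}")
        elif count > max_repeats:
            problems.append(f"keyword stuffed: {kw!r} appears {count} times")
    return problems

desc = "Learn practical SEO tips for bloggers: bounce rate, crawling and meta tags."
print(check_description(desc, ["seo", "blogging"]))  # flags the missing keyword
```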
To follow leadership by example, let us have a look at some famous meta descriptions by some very famous websites.



  • First in the list, as always, is Wikipedia, with a short (to the point) and sweet little description. Just enough to bring a smile to the face of the reader.
  • Next is Reddit: boastful, with a description identical to its title tag.
  • eBay tests the capacity of Google's bots to parse long information. If one resorts to counting, 48 words is a rope stretched far enough.
  • WordPress has one of the best-written meta descriptions ever. Clear and concise, just the way search engines want it.
  • CNN too has a long list of words. An ambitious 38-worder, the meta description is a bit long but filled with information and keywords; very business-like.

This was all about,

Requirements of web Meta Descriptions with Examples

Mar 4, 2013

What PageRank Should You have to reach the top 500?


A perfect PageRank is one of the most important things a webmaster looks for in his creation. But what if you could get something beyond a good PageRank? What if you had a chance to enter the top 500 websites of all time? There are only three main things you need before your website can enter the top-500 clan: first, sufficient internal links; second, more than sufficient external links; and third, a perfect PageRank. So here are the observations and the analysis for websites with PageRanks acceptable enough to brag about and a chance to feature in the top 500. And if you don't have an acceptable PageRank yet, you can always improve it in no time.

The Observations

Do not be misled by the term perfect PageRank; it just denotes a PageRank above 4, that's all. According to the results from the research division, with the help of data from SEOmoz, the minimum PageRank among all the top websites never went below 4. So if you have a PageRank of 4 or more, you stand a fighting chance.

The P.I.E. Triad stands for the PageRank, Internal Links and External Links Triad. It is a diagram representing the relative importance of a particular area for any website: the larger the area of interest in the diagram, the higher the priority of improvement. The triad consists of only three areas, as it is these areas which lead to any alteration in a website's rank. The Triad clearly signifies that internal and external links share the same responsibility in affecting a website's rank.
A PageRank, however, carries a lower responsibility in the overall progress of the website up the ranking ladder. According to the pie chart, just one percent of websites have a PageRank of 10. But one must also keep in mind that just one percent of websites have a PageRank of 4 (i.e. the minimum).

About 41% of the websites have a PageRank of 8, the hypothetical perfect score. Second comes PageRank 7, with 29% of websites falling in that category, and third PageRank 9, with 14% of websites in that category. The only thing that comes to mind with this pie is the equality with which the websites are categorized. This goes to prove that even though you might have a high PageRank, among the things needed to get you into the top 500, PageRank is the one that matters least. This opinion is not the author's alone but follows from the results, and even from some of the leading experts in the web industry like Paul Crowe.
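The PageRank idea itself fits in a short power iteration: each page repeatedly shares its rank across its outgoing links, with a damping factor of 0.85 as in the original paper. The four-page link graph below is made up for illustration; Google's production system is of course far more elaborate:

```python
# Toy link graph: page -> pages it links to (no dangling pages).
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}

def pagerank(graph, damping=0.85, iterations=50):
    """Power iteration: ranks converge to the stationary distribution."""
    n = len(graph)
    rank = {page: 1.0 / n for page in graph}
    for _ in range(iterations):
        # Every page starts each round with the teleportation share.
        new_rank = {page: (1 - damping) / n for page in graph}
        for page, outgoing in graph.items():
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share
        rank = new_rank
    return rank

ranks = pagerank(links)
print(max(ranks, key=ranks.get))  # "C", the page with the most inbound links
```

Note how "D", which nothing links to, ends up with the minimum rank no matter how many links it gives out: links pointing in, not out, are what raise a page.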

Change in the number of websites

The Concept

The Gaussian curve in the graph clearly depicts that the highest number of websites have a PageRank of around 8. Not only does the curve depict the lowest number of websites at the ends, it also clearly shows high concavity up to a PageRank of 6. This again shows that you have a high chance after a PageRank of 6. The high PageRankers like AddThis or Twitter are exceptions and are beyond the topic of discussion. Think of Twitter as a blog (of course, it is a micro-blogging site), but think of it as a guest blog where you are the guest blogger: you create unique and original content in just 140 characters. Think of what such content will do when millions of people do the same in just a minute.

For websites with a PageRank of 8 or more, webmasters need not worry, because the P.I.E. Triad seems to suffer a form of inversion and the importance of internal and external links is reduced to nil. That means it is at that precise point that your internal and external links do not matter anymore. This is because the PageRank is already based on an algorithm that takes into account your link quantity and quality. In other words, Google is giving you a thumbs-up that your content and content-linking practices are good enough and you need not work on them anymore.

Conclusions

So, one thing you must keep in mind is that if you have a PageRank of 4, you still have a chance if you can improve your internal and external links. People owning a website with a PageRank equal to or greater than 8 need not worry, because that is when the P.I.E. Triad suffers an inversion. A good number and quality of links is essential for best site-building practices, as those are the parameters which enjoy importance over PageRank below 8. So don't just focus on PageRank but also on the amount of quality linking.

This was about,
What PageRank Should You have to reach the top 500? MohitCHar

Feb 25, 2013

How many Internal Links Do I need, to get in the Top 500


Ever wondered how many internal links you need to be inducted into the top-500 clan? The results below will make it clear enough. For aspiring achievers, this is the time to learn the number of internal links you need to have on your blog/website for the best possible results. But if you need the number of external links instead, you should take a look at this.

To find out how many links your competitor has, take a look at this. Also, you must first understand why internal links are important before trying to increase them.

Observations

The graph depicts the distribution of internal links with respect to each website's rank according to SEOmoz. On closer look, the curve is hyperbolic. To a layman, the graph shows very clearly that for a new website there is too much competition at the top, but it's nil at the bottom, which means a new website can quite easily get into the top 500 without much resistance. Considerable resistance is met by websites after the rank of 200. Further analysis will make this very clear to web developers aspiring to the top 500.


How many Internal Links Do I need, to get in the Top 500 links
Take a look at the analysis below: 96% of all websites have fewer than 1,000,000 internal links. So, it's good news for enthusiasts who think highly of internal links; you can set a goal of 1,000,000 links for a chance of entering the top-500 list. Less than 2% of all websites had internal links between 1,000,000 and 2,000,000, and overall only 4% of websites had more than 1,000,000 links. That means the only threshold you need to set is 1,000,000. Once you've crossed it, you'll stand among the top 500 websites, and both your PageRank and your visitor numbers should improve considerably.

Like most people, you might think a target of 1,000,000 is not your cup of tea. What would you do then?

How many Internal Links Do I need, to get in the Top 500 pie
Out of those 96% of websites with fewer than 1,000,000 internal links, about 80% had fewer than 200,000 links; just 20% had between 200,000 and 1,000,000. Less than 14% of websites had between 200,000 and 400,000 links.

Probability-wise, it is still easier to reach a target of 200,000 than 1,000,000, which makes the situation simpler for you. Realists may set a target of 200,000; pessimists can target 40,000 to 80,000. These targets should give you better performance and more visitors. For AdSense and other ad users, that can mean a healthy monetary inflow.

Now for the things you can ignore. Do not set impossibly high goals: less than 1% of websites had more than 3,000,000 links. These are the top websites (i.e. always in the top 5%, even after fluctuations). Within the 96% category, less than one percent of websites had more than 800,000 internal links, so there's your no-no.

How many Internal Links Do I need, to get in the Top 500 the zone graph


The Zones

Here is a break-up of 93% websites among the top 500 websites (i.e. 465). The above graph clearly depicts 5 zones.
Zone A: The zero zone,
No website in the top 500 had fewer than 66,000 internal links.
Zone B: The apex,
The apex peaks at 80 websites. The zone ranges from 66,000 to 90,000 links. Containing 164 of the 500 websites, this zone is among the most significant.
Zone C: The Curve
A parabolic curve from 90,000 to 130,000, which also contains a good share of websites, judging by the area under the curve.
Zone D: The Tail
The tail starts at 130,000 and ends at 250,000 links; many websites fall into this category, acting as filler. The cut-off of 250,000 was chosen because the average of all internal links comes to 250,216.
Zone E: The insignificant few
As the name suggests, the insignificant few do not make much difference in the internal-link analysis of the top 500 websites.

Epilogue

For further analysis, each website's internal-link count was multiplied by its PageRank and divided by 10. The average came down from 250,216 to 202,130. Plotting these on the graph above, neither the zones nor the peaks shifted much, which suggests a direct correlation between internal links and PageRank. In short, internal links appear to contribute directly to PageRank.
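The weighting described above can be sketched in a few lines of Python. The (site, links, PageRank) tuples below are made-up stand-ins for the SEOmoz data, not the actual figures:

```python
# Each site's internal-link count is multiplied by its PageRank and
# divided by 10 (as in the analysis above), then the averages of the
# plain and weighted values are compared.
sites = [
    ("site-a", 250000, 9),   # (name, internal links, PageRank) - invented
    ("site-b", 180000, 8),
    ("site-c", 320000, 7),
]

weighted = [links * pr / 10 for _, links, pr in sites]
plain_avg = sum(links for _, links, _ in sites) / len(sites)
weighted_avg = sum(weighted) / len(weighted)

print(plain_avg, weighted_avg)
```

If the two averages stay close and the distribution keeps its shape, the weighting has not disturbed the ranking much, which is the observation the paragraph above draws on.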

This answered the question,
How many Internal Links Do I need, to get in the Top 500 mohitchar

Feb 15, 2013

4 Sureshot Ways of Improving your Bounce Rate


4 Sureshot Ways of Improving your Bounce Rate Front
First of all, what is a bounce? In the language of web development, a bounce refers to the situation where a visitor lands on your webpage, reads or glances at your content, then closes the page and presumably moves on to another website.

Now let me tell you, you sincerely do not want people bouncing off your website. It would refute all your plans of making money through ads; worse, the posts you've written will go in vain because your potentially high-value visitor has flown away, and so will all the toil you've put into inserting internal links in your pages.
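For reference, bounce rate is commonly computed as the share of sessions in which the visitor viewed only one page. A minimal sketch, with made-up session counts:

```python
def bounce_rate(single_page_sessions, total_sessions):
    """Bounce rate as commonly defined (e.g. by analytics tools):
    the percentage of sessions that viewed only one page."""
    if total_sessions == 0:
        return 0.0
    return 100.0 * single_page_sessions / total_sessions

# Invented example: 420 of 1,200 sessions left after a single page.
print(bounce_rate(420, 1200))  # 35.0
```

So if 420 of your 1,200 visitors leave after one page, your bounce rate is 35%, right at the upper edge of what the earlier post called acceptable for a shopping site.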

Stay Stay Stay,

One of the best ways of improving the bounce rate of any blog is to increase the amount of time a visitor stays on the page. The longer a visitor stays, the more incentive he has to explore the links laid out by the developer rather than close the tab and look elsewhere. Visit other blogs and learn how they keep their visitors around longer.

Links anyone?

In the previous paragraph we saw how visitors can be hooked to the site using internal links. But what are the ways of displaying internal links on your blog?
  1. Blogger blogs usually have a popular posts widget in them, which is something you can use to your advantage. Wordpress blogs have awesome plugins which can also be utilized.
  2. It would help your visitors to know what you’ve actually done during the previous months. So, some bloggers even include an archive section which might contain links month wise or year wise.
  3. Labels do help your visitors to look for similar content written by the author on the subject. Another attraction for the visitors to gain knowledge and the webmaster to gain loyalty.
  4. Not all articles are one-pagers; some have to be written in series. For an example, take a look at the following series by codemakit. Good news for Wordpress users: there is an article-series plugin for the same.

The good old content

Content is king; nearly every blogger would agree. Codemakit has talked a lot about useful content and content-writing tips, so that will not be repeated here. But the point is worth noting, so here it is.

Keywords man!!!

Whenever a visitor sees your website in Google's search listings for a particular query, he clicks your link and enters hoping you're offering what he needs. But if you have not reviewed your keywords, your website is headed for doom no matter what you do. When a visitor fails to find what he came for, he is bound to leave.

This was about,
4 Sureshot Ways of Improving your Bounce Rate MohitChar

Feb 11, 2013

PageRank and SERP


PageRank and SERP Front
PageRank is a method of determining the value of a website by testing it against different parameters. It is an algorithm, named after Larry Page, which assigns a numerical value to each and every page hosted and cached.

PageRank scales can be of different types: percentage, linear, logarithmic and sometimes even parabolic. Google uses a logarithmic scale. A percentage system is discussed below.

As seen above, website C has a huge number of links from D-category websites. At first glance it might seem that C would have a nice PageRank, but on closer look you would find that A ranks higher. Why? Because A has a link from B, a 47-percenter. The search engine compares two situations: one website receives 3 links at 13% apiece plus some nondescript websites, while the other receives a single link at 47% plus some other nondescript websites. So A is chosen as the better one.

PageRank can also be read as the probability of a person clicking a random link and landing on a particular page. The value lies between 0 and 1; a probability of 0.4 corresponds to 40% in the percentage view. In simple words, there is a 40% chance of a person landing on that particular page by clicking a random link.
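This "random surfer" probability is exactly what the original PageRank algorithm computes. Here is a minimal power-iteration sketch in Python; the four-page link graph is invented for illustration, loosely echoing the A/B/C/D example above:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Minimal power-iteration PageRank over a dict mapping
    each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:            # dangling page: spread its rank evenly
                for p in pages:
                    new[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
        rank = new
    return rank

# Invented graph: B links only to A, while C collects several
# links from minor D-type pages (as in the example above).
graph = {
    "A": ["C"],
    "B": ["A"],
    "C": ["B"],
    "D1": ["C"], "D2": ["C"], "D3": ["C"],
}
ranks = pagerank(graph)
print({p: round(r, 3) for p, r in ranks.items()})
```

The ranks sum to 1, so each value is precisely the probability of the random surfer ending up on that page, which is what the percentage view above expresses.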


Ball Diagram for Page Rank
Now coming to the point: the PageRank most developers and SEO specialists have in mind is the popular Google toolbar value. The toolbar displays a natural number from 0 to 10 for the page you just visited. The best websites have a PageRank of 10 and the worst a PageRank of 0. Google's stance on how the number is determined is quite opaque; experts guess that it is based on the size of the website, updates, the number of internal and external links, etc.

SERP Rank,
Basically, it is the position of a website or a particular webpage in a search listing. The higher the position of the website, the higher its SERP rating (for example, Wikipedia always enjoys a very high SERP rating, as searches for technical terms routinely feature Wikipedia in the top 3 results).

The Google Directory PageRank is a bit of a haze: the Directory does not show any numerical value, just a bar indicating your position. It is up to you to guess your PageRank from it.

PageRank and SERP MohitChar

Feb 1, 2013

How many External Links Do I need to be in the Top 500

The article below explains why external links are important for any website and how many external links one should have to reach the top 500 websites of all time.

Here you'll find all you need to know about the number of external links you should have and how websites score on the basis of external links.





Importance of External Links

One cannot ignore the role of external links in the status of a blog. The following are the reasons why the external links on your website are important to visitors.

83 % of external links are less than 20,000 K
First, visitors do not know about the topic (that is one of the reasons they come to you in the first place). Second, a very high percentage of them do not know what they're looking for. So if you have the means to direct them to a better source of knowledge, they will be happy and you will gain loyalty.
As for search engines, it is the external links (in addition to the internal links, of course) that determine the strength of a blog/website.

When external links are greater in number and of good quality, the search engines will help you rise. The complete concept can be read in a previous post, the NoFollow article.

Analysis

About 96% of websites had fewer than 200,000,000 external links. Within that 96%, 83% of websites had fewer than 20,000,000. Just see the change in value: a small fraction of websites accounts for more than 80% of all external links. Also, just 3% of websites had between 200,000,000 and 400,000,000 external links, and only one percent had more than 400,000,000 (i.e. just 5 websites).

Further Analysis

Within those 83% of websites, 80% had fewer than 20,000k external links (i.e. 66.4% of the total lot). So the numbers are good news for people who think 200,000k is too high a goal: you just need to touch the number specified and you're good to go.

The Zones

Here is a break-up of 96% of the top 500 websites (i.e. 480). The graph above clearly depicts 3 zones.
The Zones

Zone A: The peaks and perks
These websites are among the most popular and enjoy a huge number of external links. Most websites fall in this category, which is contained within the range of 7,000,000 links.

Zone B: The Insignificant many
Just as the name suggests, this zone contains many websites, with external links ranging from 7,000,000 to 24,000,000. They do not make much of a difference in the top-500 clan.

Zone C: The Significant few
This zone has websites with more than 24,000,000 external links. They might look like super-websites, and such a huge number of links has equipped them with the best possible PageRanks; they are just as important as they look, constituting the top 15% of the top 500 websites even after fluctuations.

Note. 

1. If you still need to know more about external links and their role in determining the quality of a website, kindly visit SEOmoz's 17 Ways Search Engines Judge the Value of a Link.

2. To check your website's external links and their quality, go to the W3C Link Checker tool.

Jan 28, 2013

Add a NoFollow to your Blog comments

Add a NoFollow to your Blog comments Front
Learn about NoFollow and the process of adding it to your blogger blog’s comments. Also learn about link and content farmers and their goals.

A time comes when you find people (spammers, actually) clamoring all over your blog, posting strange and weird comments. In no time you realize that this is depreciating your blog's value. An article at HubSpot notes that, while crawling, search engines consider the links in the comments section too; so if your blog has weird comments with crappy links hidden in clandestine fashion, at some point search engines will rate your blog as near-crap. So you decide to remove the comments.

So you start removing comments, or moderating them. But either way it's a tedious process. What's the solution?

Content Or Link Farmers

Obviously, the 'NoFollow' attribute. Google has always adhered to its policies diligently. But first, a concept must be explained: a particular group of people who post crappy content, with or without links, on different blogs and websites in order to harvest link juice and keywords are known as content or link farmers. The practice violates Google's guidelines and can lead to a ban.

Why do these people do what they do?

There are some reasons,

  1. When a website has its link on a website with a PageRank higher than theirs (like yours, for example), then in the search engine's eyes you are recommending that website. If the website doesn't turn out to be good, or becomes a 404 error, that reflects badly on you and your website. Some spammers are actually your own competitors trying to wear you down so that your website gets penalized and loses visitors to the competitor.
  2. Also, when you're recommending a website by hosting a link on yours, why let others take advantage of your efforts?
  3. From the keywords perspective, certain blogs have claimed a drop of more than 50% in traffic due to spammer comments. The reason: unnatural and unrelated comments bring altogether weird keywords to your blog, so the main keywords you set aside for your website lose their effect.

The NoFollow attribute tells search engine robots not to consider a link in their calculations. So you need not worry about recommending the site, because you're not, and you will not be penalized.
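To see the effect, here is a rough sketch of how a crawler honoring rel='nofollow' might separate links. It uses Python's standard html.parser, and the HTML snippet (a mock comment section) is invented for illustration:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects hrefs the way a nofollow-honoring crawler might:
    followed and nofollowed links are kept in separate lists."""
    def __init__(self):
        super().__init__()
        self.followed, self.nofollowed = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if href is None:
            return
        rel = (attrs.get("rel") or "").lower().split()
        if "nofollow" in rel:
            self.nofollowed.append(href)
        else:
            self.followed.append(href)

# Invented snippet resembling a post body plus a spam comment.
html = (
    '<a href="/post">My post</a>'
    '<a href="http://spam.example" rel="nofollow">spam</a>'
)
parser = LinkCollector()
parser.feed(html)
print(parser.followed)    # ['/post']
print(parser.nofollowed)  # ['http://spam.example']
```

Only the links without the attribute would count toward link calculations; the spam comment's link is set aside, which is exactly what you want for your comments section.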

Process

Here is a step by step process of applying a NoFollow attribute to your blogger blog.

1. Go to Blogger > Design > Edit HTML
2. Tick the "Expand Widget Templates" box
3. Now find the following pieces of code using Ctrl+F

<a class='comment-link'>
<a expr:href='data:comment.authorUrl'>

4. At the end of each tag, add rel='nofollow', so that the result looks like

<a class='comment-link' rel='nofollow'>
<a expr:href='data:comment.authorUrl' rel='nofollow'>

5. Save your template.

Now that you've completed the procedure, you need not worry any more about search-engine penalties due to weird and crappy comments with links in your blog.

You can even check whether the procedure worked using the Firefox add-on NoDoFollow. It highlights which links on a webpage carry a NoFollow attribute. Here is a screenshot of the add-on on codemakit itself: links highlighted in blue are dofollow, while links highlighted in red are nofollow. As you can see, the comments section is all red, freeing you from any liability and penalty.

Add a NoFollow to your Blog comments examples

This was a walk-through on the process to
Add a NoFollow to your Blog comments MohitChar