
How I Get My PageRank Back While Still Serving Text Link Ads

March 17, 2008

OK, I think a more descriptive and appropriate title would probably be something along the lines of “Google shouldn’t have given my PageRank back when I’m still selling Text Link Ads!” LOL.

While some claim that the long-awaited update took place on March 2nd 2008, it didn’t seem to have any effect on my blog until recently. I first noticed my PageRank value was no longer zero last Friday after I made some changes to my blog.

To make sure that it wasn’t just a normal spike that could happen during a PageRank update, I decided to wait for a couple of days before writing about it. I ran a future PageRank check to see what my PageRank values were across several Google datacenters.

Sure enough, all datacenters agree that Sabahan.com is now a PageRank 3. So what did I do to get my PageRank back while still serving Text Link Ads on my blog?

To be honest, I don’t claim this method is definitive and I can’t say for sure Google will retain my PageRank once the dust settles. I could end up having my blog slapped again once Google catches up with this trick. But for the sake of information sharing, here’s what I did.

I just updated the Text Link Ads plugin to the latest version and changed the title of my link ads from “Our Sponsors” to “Sites”. That’s all, nothing fancy.

That said, I am not suddenly retracting my previous stand that selling (or buying) text link ads is not good for your blog especially if the main motivation is to game the search engine rankings.

People just never get tired of talking about this PageRank thing, do they? 🙂

Now let’s hear something from you. Have you tried the changes above? How does the recent PageRank update affect your blog?

Google’s January 2008 PageRank Update

January 15, 2008

There has been a lot of buzz around the blogosphere and forums about the recent changes in PageRank and backlink counts from Google.

From where I am sitting, I can see that Google is getting tougher towards those selling text link ads and paid reviews. It appears that Sabahan.com is penalized again for selling text links, and this time around my PageRank has been reduced to 0.

Personally I think PageRank is important, but not as important as the amount of traffic you get. For example, if 10K unique visitors stop by your blog daily and you are making tons of money from them, do you really care about your PageRank? I don’t.

On the other hand, when the drop in PageRank is followed by a reduction in your daily traffic, which does happen every now and then, you may find yourself dropping everything you are doing, promising to follow Google’s guidelines to a tee and hoping Google will give your original PageRank back.

It appears that this is what’s happening to Sabahan.com. The drop in PageRank is followed by the reduction in traffic this blog receives daily. That’s a clear signal that it’s time to remove those paid link ads asap. Sorry advertisers, I have no choice but to comply with the big brother’s rules.

Have you noticed any changes so far?

Want to know what your blog’s future PageRank would be? Just run your URL through the Future PageRank tool over at SEO Chat.



Why Google Hates Paid Reviews

December 2, 2007

I haven’t been able to post anything for over a week because something came up and I needed to take care of it right away. Anyway, since I’ll be away from the computer again tomorrow, I guess it’s best to spend some time writing to let you guys know that I am still alive and kicking 🙂

While browsing my RSS feed on a lazy Sunday afternoon looking for something to write about, I saw a post by Matt Cutts which I thought you should know about, especially if you are doing paid reviews.

I’ve written several times on this blog that selling and buying links that pass PageRank is frowned upon by Google. In the post, Matt gives an example, a serious one at that, of paid posts to illustrate his point that some paid post writers couldn’t care less about the accuracy of their reviews. Inaccurate information is not only bad as far as the user experience is concerned, but it can potentially be a matter of life or death for the readers in certain situations.

To illustrate his point, he uses paid reviews about brain tumors and asks you to put yourself in the reader’s shoes. Matt’s main concern is that most of the reviewers knew nothing about the treatment before getting paid to post about it. As a result, the reviews were often inaccurate or uninformed. At the end of the day, the ones who suffer are the end users who believe the reviews are accurate.

Now if using brain tumor treatment as an example is a little too serious for you, ask yourself whether paid reviews actually offer a good user experience and how they could unfairly affect the search engine rankings in the long run. If you think it’s unfair for Google to stop people from selling or buying links because it closes the opportunity for small website publishers to advertise their websites at a low cost, imagine when all the big companies with million-dollar budgets join the party.

If Google were to allow the buying and selling of links to continue, I wouldn’t be surprised to see the day when acquiring top search engine rankings is simply a matter of spending the most money on paid links and reviews. Gone would be the days when the Internet was a level playing field, at least as far as SEO is concerned. Surely nobody wants that to happen, not me, not Google and definitely not you, right?

I’ve posted a comment on the post, and I hope Matt will answer my questions:

I agree with everything about this article but I wonder if your algorithm is able to determine whether a paid post is well written and well researched. What if someone writing a paid review about brain tumors was able to write an amazingly accurate and honest article about it? Do you still demote their PageRank just because they are writing a paid review about brain tumors?

My guess is that such a thing might require a manual review. Take JohnChow.com for example: he’s ignoring everything Google says about not doing paid reviews or selling text link ads and he still has PageRank 4. Google is sending a mixed message here by not demoting his page to 0. Perhaps Google realises that his readers actually find some of his paid reviews useful and that’s how he gets away with it. Is that an accurate assumption?

Then what happens if JohnChow.com suddenly writes a paid review about a brain tumor treatment and the information is not entirely accurate? Will he get a 0 PageRank then?
Perhaps even with PR4, JohnChow.com has lost its ability to pass along Google Juice, so Google couldn’t care less about what he writes?

What about someone promoting an affiliate program related to brain tumor treatments? And let’s say the link was able to pass Google Juice along. Are you going to do anything about that?

What do you think?

By the way, I know I am selling text link ads here, so some readers might perceive that I’m contradicting myself by not practising what I write. Actually, I didn’t escape the wrath of Google either when my PageRank was reduced from 5 to 3. Hey, those ads could be gone soon and I might start selling banner ads as an alternative.

Google’s YouTube Lost Its High PageRank

October 28, 2007

That’s right, somehow Google’s own video-sharing website has suffered a major drop in PageRank after the recent shake-up. From where I am, it’s currently a 3. The Future PageRank tool also indicates that the value is consistent across all datacenters.

I don’t recall YouTube selling text link ads, so obviously Google is looking at other areas when deciding whether a site’s PageRank should be reduced.

[Screenshot: youtubepr.PNG]

If it’s not a glitch, perhaps Google had decided that YouTube is no longer an important website for whatever reason and therefore doesn’t deserve the high PageRank.

Regardless, it goes to show that Google doesn’t discriminate when it comes to setting the PageRank value of their own properties. Then again, if it’s not a spam penalty, I’m inclined to think that this is a normal fluctuation during a PageRank update and surely YouTube is set to regain its former PageRank.

Terrible Internet Connection & A Second Round PageRank Update?

October 25, 2007

I’ve been experiencing a terrible Internet connection since yesterday and probably won’t be able to post anything lengthy for now. I called Streamyx last night but I was disconnected after going through all the ‘mandatory’ troubleshooting for 20 minutes. Yeah, that’s the ‘great’ TMNet customer support you come to expect.

I am on the phone with them right now.

Anyway, apparently Google is doing a second round of the PageRank shake-up and according to HTNet, Sabahan.com is now a 3. I can’t check this myself because my connection doesn’t even load Sabahan.com’s main page properly. (Update: yes, that seems to be the case.)

If you are wondering how this will affect me, well, it’s still business as usual, but I’ll have to realign some of my strategies. I have over 40 domain names and about 30+ are active blogs/websites/forums. Not all of them are affected. In addition, the majority of them do not depend on PageRank to make me money. I’ll write more about this in my next post.

Anyhow, the penalty seems to affect almost everyone regardless of how good the content is.

andybeard.eu is compiling a list of sites affected by this ‘debacle’:

 

http://www.autoblog.com/ PR6 PR4
http://www.engadget.com/ PR7 PR5
http://www.problogger.net/ PR6 PR4
http://www.copyblogger.com/ PR6 PR4
http://www.joystiq.com/ PR6 PR4
http://www.tuaw.com/ PR6 PR4

http://www.searchengineguide.com PR7 PR4
http://www.searchenginejournal.com PR7 PR4
http://www.johnchow.com PR6 PR4
http://www.quickonlinetips.com/ PR6 PR3
http://weblogtoolscollection.com/ PR6 PR4
http://andybeard.eu PR5 PR3
Vlad PR4 PR2

http://www.seroundtable.com/ PR7 PR4
http://www.blogherald.com/ PR6 PR4

www.Forbes.com PR7 PR5
http://www.sfgate.com PR7 PR5
www.washingtonpost.com PR7 PR5

Are your sites or blogs affected (again)?

Did You Lose Your PageRank Because You Are Selling Paid Links?

October 9, 2007

For the past few weeks, news about blogs losing their PageRank has been widely discussed around the blogosphere. Besides this blog, several of my other blogs weren’t spared either.

This has affected almost everyone regardless of the quality of their content, including big names such as JohnChow.com, AndyBeard.eu and Yaro Starak’s Entrepreneurs-Journey.com.

I think it’s important to be aware that there are actually two different types of PageRank. One is the actual value that Google uses internally to rank a page, and the other is the one visible on the Google toolbar. A reduction in the visible PageRank may affect a site’s text link prices, but based on Google’s internal PageRank value, the site should continue to rank just as it always has. Then again, an overzealous link-selling promotion would definitely affect the rankings eventually.

While I tend to believe the visible PageRank carries little weight when it comes to deciding where you rank in the search engine, it’s still used by many people to gauge a site’s credibility. As I wrote earlier, I thought this incident was probably nothing more than a normal PR update exercise, where some sites enjoy an increase while others experience a drop. That was until I came across a post by Danny Sullivan over at Search Engine Land.

Danny points out that it’s now official that selling paid links can hurt your PageRank or rankings on Google, based on feedback he got from Google.

While I was aware that sites selling paid links might lose their ability to pass along link love, this is perhaps an indication that Google has started taking concrete action to penalize link-selling sites. The news seems to be spreading like wildfire around the blogosphere at the moment.

Danny uses The Stanford Daily, a student newspaper of Stanford University, as an example of a site that continued to sell paid links despite widespread attention to its actions, without any penalty being imposed by Google.

The Stanford Daily is NOT banned from Google. The site’s homepage still has a PR9 score. Nothing indicates that the Stanford Daily’s links aren’t passing ranking juice, not in the ways that Google could control, if it wanted. Maybe they aren’t, but how would most people know? How would other publishers thinking of doing the same know? Certainly not from reading the paper’s rate card (PDF), where there’s nothing said about text links relating to search engines. The only thing said is the price: $350 per month.

Then Danny adds that last week he noticed the Stanford Daily’s PageRank had been reduced from PR9 to PR7:

Last week, I noticed the Stanford Daily had dropped from PR9 when I wrote the above in April to PR7 today. That’s a huge drop that has no apparent reason to happen. Some others were also reporting PageRank drops. So I pinged Google, and they confirmed that PageRank scores are being lowered for some sites that sell links.

In addition, Google said that some sites that are selling links may indeed end up being dropped from its search engine or have penalties attached to prevent them from ranking well.

So guys, it’s official: from a seller’s point of view, making money from text link ads is no longer as exciting as before, because you will be penalized regardless of your intention. This will definitely change the way site owners monetize their sites. If selling paid links is one of your main sources of income, you will have to look elsewhere.

I’m inclined to believe that this move by Google will improve the quality of their search results in the long run, as it weeds out those who buy links to boost their rankings regardless of the quality of their content. But at the moment, it’s easier to put the blame on Google for being senseless and unfair to the rest of us. Talking about fair treatment, if you think you don’t deserve a PR reduction (nobody does, right?), you might want to write to Google via Webmaster Central and request a review. I am going to do that next.

Oddly enough, there are several blogs that are unaffected by this move. Those sites continue to sell text links but have received no penalty. Let me give one example. Now I’ve nothing against AdesBlog.com; in fact I think it’s one of those well-written blogs with high-quality content that I like to read. However, if Google’s judgement were based solely on selling paid links, there’s no way AdesBlog.com could have escaped unnoticed.

Anyway, do you plan to keep selling text links on your website? Are you concerned about your PageRank? If so, why?

Update: Adesblog.com’s PageRank was recently reduced.

Super Fast One Hour or Less Google Indexing

October 5, 2007

After I published my previous post, I noticed from my users-online page that Googlebot came munching on the content almost immediately.

If you Google the term “latest google pagerank update Sabahan” you will notice that the post appears first on the list, but that’s beside my point.

When I did the search at 9:30PM, I noticed the post had been indexed 7 hours earlier, as shown in the screenshot below. That’s about an hour after I published the post at 1:30PM. It appears that it now takes Google about an hour or less to index my posts.

[Screenshot: fast-indexing.png]

While I’m not certain whether a blog’s PageRank and update frequency have any influence over Google’s indexing speed, I tend to think they do, and this shouldn’t be confined to blogs alone. Can you imagine the potential traffic you could drive to your blog if you are the first to break an important story?

Have you experienced the same effect in your blog? If you haven’t, I suggest you create a sitemap for your blog and see what happens.

8 Things to Do When Your Site Is Removed From Google

August 21, 2007

If you depend on Google for 90% of your website traffic, the consequences of losing that traffic when your site is removed from Google can be devastating.

It’s even more so if you are making a living working on the Internet. All the long hours you had put in seem meaningless. Watching your cash flow dry up can be a real blow to your motivation. I know, I’ve been there.

But before we go insane over the whole debacle, take a deep breath and calm down. There will always be an explanation as to why your pages disappear from Google.

The good news is, having your site removed from Google isn’t a death sentence because you can always request re-inclusion. If you know why your pages disappeared, you should be able to make the necessary changes and have them re-indexed as soon as possible.

Here are the things you can do when you find your pages missing from Google’s index.

 

  1. Register With Google Webmaster Central

    Actually you don’t have to wait until your site is removed from Google before registering with Webmaster Central. If you haven’t done it already, do yourself a big favour and register now.

    Webmaster Central should be your first stop to help you understand what’s happening. It’s the only place where Google alerts site owners of penalties for their sites.

    Google may explicitly confirm a penalty and offer you a re-inclusion request specifically for that site. Once you have your site verified, click on the tab labelled “Diagnostic” to find a section called “Indexing summary”. It might say:

    No pages from your site are currently included in Google’s index due to violations of the webmaster guidelines. Please review our webmaster guidelines and modify your site so that it meets those guidelines. Once your site meets our guidelines, you can request reinclusion and we’ll evaluate your site. [?] Submit a reinclusion request

    Now that’s a lifesaver if you ask me.

     

  2. Create A Google Sitemap

    Again, this is one of the things that you should have done by now. The sitemap will ensure you get back in as quickly as possible by notifying Google of all your URLs at once. Learn how to create a sitemap for your blog here.
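    In case you haven’t seen one before, a bare-bones sitemap.xml looks something like this (the URLs below are just placeholders; a plugin or sitemap generator will produce the real thing for you):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- one <url> entry per page you want Google to know about -->
      <url>
        <loc>http://www.yoursite.com/</loc>
        <lastmod>2007-08-21</lastmod>
        <changefreq>daily</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>http://www.yoursite.com/about.html</loc>
        <lastmod>2007-08-01</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.5</priority>
      </url>
    </urlset>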

     

  3. Check If Your Site Was Down

    One of the common reasons why your pages are removed from Google is that your server was down during a Googlebot visit. When that happens, Googlebot will report a network unreachable error or a robots.txt unreachable error in the Webmaster Central control panel.

    In most cases, you don’t have to worry about it because Googlebot is only postponing the crawl and will return to your site later when your server is reachable.

    However, if you are certain that your server hasn’t been down, something might be blocking Googlebot from visiting your site. This is what I experienced recently.

    If your server is always down, it’s time to look for a new and more stable web host.

     

  4. Check Your robots.txt File

    If you don’t know what a robots.txt file is, chances are you don’t have one on your site. That shouldn’t be a problem, because without one Googlebot will simply index everything, which may be exactly what you want.

    If you have a robots.txt file, you might have inadvertently included an instruction in the file that stops Googlebot from crawling your site.
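    As a quick illustration (the rules below are made-up examples), a robots.txt like this quietly blocks your entire site from every crawler:

    User-agent: *
    Disallow: /

    while this one only keeps crawlers out of a single folder:

    User-agent: *
    Disallow: /private/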

    Again, the Webmaster Tools will help you analyse your robots.txt file and tell you whether it allows Googlebot to crawl or not.

    [Screenshot: allow-robots.PNG]

     

  5. Ensure That You Are Not Spamming

    Before you put the blame on Google for removing your pages, it’s a good idea to check whether you are doing something that violates Google’s Webmaster Guidelines.

    In particular, make sure that you don’t have the following on your pages:

    Hidden text, often achieved by using Cascading Style Sheets (CSS) as in this example, or with plain HTML such as white text on a white background.
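    To be clear about what that looks like in practice, here’s a made-up snippet of the kind of thing to avoid: keyword-laden text that visitors never see but crawlers do.

    <div style="display:none">personal loan cheap personal loan best personal loan rates</div>
    <font color="#ffffff">personal loan personal loan personal loan</font> <!-- white text on a white background -->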

    Another practice that you should avoid is cloaking, often done using server-side scripts. The purpose is to serve different pages to the search engines than you display to your users.

    Sneaky JavaScript redirects could also put your site at risk of being banned. Since Googlebot is unable to index links hidden in JavaScript, some site owners use this to show JavaScript-based content to users while search engines see different, noscript-based text.

    If you are creating doorway pages containing many links that are made specifically for the search engines and don’t benefit your users, your site may be removed from Google’s index.

    Another practice to avoid is stuffing keywords into your pages. This is often done in combination with hidden text. Keyword stuffing occurs when you repeat the same keywords over and over in an attempt to manipulate a page’s ranking.

    You should also make sure you don’t serve substantially duplicate content on multiple pages, subdomains or domains when the motivation is to manipulate the search engine rankings in an attempt to attract more traffic.

    If you have restructured your site and have pages that are accessible via several URLs, eg:

    http://www.yoursite.com/games/bubble-hunter.html
    http://www.yoursite.com/bubble-hunter.html

    Make sure to use 301 redirects (“RedirectPermanent”) in your .htaccess file so that only one URL is associated with the bubble-hunter.html page.
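    A quick sketch of what that could look like in .htaccess, assuming the version without the /games/ folder is the old URL you want to retire:

    RedirectPermanent /bubble-hunter.html http://www.yoursite.com/games/bubble-hunter.html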

    Now if you are using WordPress, and you are changing your permalink structures, you can use the Permalink Redirect WordPress plugin to help you do this.

     

  6. Are You Doing Excessive Link Exchange?

    As I’ve written previously, Google recently updated their Webmaster Guidelines, adding that excessive reciprocal links or excessive link exchanging ("Link to me and I’ll link to you.") can negatively impact your site’s ranking in search results.

    Avoid using any software that automates the link exchange effort while ignoring the quality of the links.

    The best type of links are the editorial ones given by choice by other site owners. You can also submit your site to relevant directories such as Yahoo! or the Open Directory Project.

    In addition, if for some reason you lose some incoming links, it’s possible that it will affect your site’s ranking. This is often overlooked, especially if you have a new site that hasn’t gathered a substantial number of incoming links.

     

  7. Take Extra Care When Buying and Selling Links

    I know selling text links from your site or blog can offer a good source of income. But make sure it’s done in the users’ best interest. Buying links to improve PageRank violates Google’s quality guidelines.

    When you are selling links, avoid accepting sites unrelated to yours. Sure, having those penis elongation or Viagra text links on your tech blog might put money in your pocket, but it would be at the expense of having your blog removed from Google.

    If all you are after is short term profits, this probably doesn’t concern you.

     

  8. Is It Google’s Problem?

    It’s not always your fault when your site is removed from Google. With billions of pages to crawl every month, Googlebot may encounter its own technical problems during one of its normal crawls.

    Do a search on Google and see if anybody else was having the same problem during the same period.

    If none of the above seven reasons applies to you and you are certain that it’s Google’s problem, send them feedback. Ask them politely why your pages were removed. If it’s their fault, you can be sure your site will be re-included fairly easily. However, be prepared to give them several days (or even weeks) to respond to your email.

How to Request Reinclusion Into Google’s Index

Just log into your Webmaster Central account and click on the “Submit a reconsideration request” link under Tools on the right sidebar.

Before you do that, it’s important to make sure you have made all the necessary changes to your site so that it adheres to the Google Webmaster Guidelines.

You’ll be given the chance to admit your mistake, explain what you have done to correct it, and agree not to repeat it. It’s important that you provide some evidence of good faith before they will reconsider your site.

“Banned” By Google? Find Out How to Entice Googlebot to Recrawl Your Site

August 18, 2007

In my previous post I wrote about a problem I had where many of my sites were suddenly removed from Google search result pages.

It was ‘unsettling’ to say the least, because I could have easily lost hundreds of dollars per day from AdSense and the affiliate programs that depend on Google organic traffic during that debacle.

I found it strange to see Googlebot repeatedly spewing robots.txt unreachable or network unreachable errors in my Google Webmaster Console when I was absolutely sure that my server uptime had been nothing but one hundred percent during that period.

 

[Screenshot: roboterror.png]

If you are not familiar with robots.txt, it’s a file used to tell search engines which pages on your site they should not crawl and index.

A few questions came to mind when it happened to me.

 

Did Google finally penalise me for selling text link ads?

I know Google often regards sites that engage in the selling and buying of text links as ignoring their users’ best interest. This is clearly stated in their webmaster guidelines, where sites participating in such schemes risk having their rankings dropped from Google search result pages.

I’ve written before about Google’s opinion on paid links and the addition of a paid-link reporting form at Google Webmaster Central. Did I finally become a victim of their position on this matter?

That said, I only accept quality, relevant links on my sites and blogs. Besides, several of the affected sites didn’t participate in selling text links at all. So I concluded that it had nothing to do with selling text link ads.

 

Did they consider my Partner page as excessive link exchange practice?

Google recently updated their Webmaster Guidelines, adding that excessive reciprocal links or excessive link exchanging ("Link to me and I’ll link to you.") can negatively impact your site’s ranking in search results.

However, I doubt what I am doing is excessive, at least not by what I think they consider excessive at the moment.

While the Partner page at Sabahan.com was created exclusively for cross linking, I only accept useful sites or blogs related to technology, marketing or blogging. A site that doesn’t offer any value to my users isn’t good enough for the search engine and will be deleted.

Occasionally though, some low-quality, unrelated blogs might slip through, but they would be deleted in the manual reviews I do every now and then. The other affected sites were never involved in any link exchange practice. So I crossed this off my list of possible reasons.

 

Were my sites struck by algorithm changes?

Perhaps what I had been doing to optimise those sites for the search engines is now regarded as spamming.

This might be plausible if it affected one or two sites, but not when 15 or more sites, including several blogs, forums, static HTML sites and e-commerce sites across several different niches, were simultaneously affected. It didn’t make sense.

 

Perhaps it was Google that was having technical difficulties with their servers.

That’s possible, but if that were the case, I am sure there would have been many other site owners affected during that period.

A quick search at Google to find similar occurrences in the past 3 months didn’t return any result that supports this notion. Those that I’ve discovered seemed to be isolated incidents.

 

So what was really happening here?

Then it struck me that the answer was right in front of me. I guess like some people, I tend to overlook the simple details in favour of a more complicated explanation.

When something like this happens to you, Google Webmaster Central will be your best friend – seriously. Googlebot had been trying to tell me that my robots.txt file or network was unreachable, and that was exactly what was causing the problem… duh!

The trick was to figure out how the robots.txt file or my server became unreachable when I knew for sure my server had had nothing but 100% uptime during that debacle. Obviously, something was preventing Googlebot from accessing my robots.txt file or server, and that something must have been blocking Googlebot’s IP address.

After some searching I discovered the following error message from my server log file.

[Sat Jul 14 01:39:32 2007] [error] [client 66.249.66.228 ] mod_security: Access denied with code 406. Pattern match "=(http|www|ftp)\\\\:/(.+)\\\\.(c|dat|kek|gif|jpe?g|jpeg|png|sh|txt|bmp|dat|txt|js|html?|tmp|asp)\\\\x20?\\\\?" at REQUEST_URI [hostname " www.portable-cd-mp3-player.com "] [uri "/frame/index.php?url= http://reviews.cnet.com/SanDisk_Sansa_m240_1GB_silver/4505-6490_7-31563923.html?subj=fdba&part=rss&tag=MP_Portable+Audio+Devices "]

It looked like mod_security had blocked the request. I have CSF (ConfigServer Security & Firewall) running, and a further check revealed that it had blocked Googlebot’s IP address.

Fixing the problem was a matter of removing Googlebot’s IP address from the csf.deny file and adding it to the /etc/csf/csf.allow file. Of course, this can be done easily via the CSF graphical user interface.
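If you prefer the command line, something along these lines should do it (double-check the options against your CSF version; the IP below is just the Googlebot address from my log):

# remove the IP from the deny list
csf -dr 66.249.66.228
# add it to the allow list so it won't be blocked again
csf -a 66.249.66.228 Googlebot
# restart the firewall rules
csf -r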

Once that was done, I resubmitted my sitemap.xml file via Google Webmaster Central and it didn’t take long before Googlebot started to recrawl my sites.

[Screenshot: sitemap.PNG]

[Screenshot: googlebot.png]

 

In some situations, the problem may appear to go away by itself and Googlebot will start to crawl your site again. This can happen if Googlebot comes from a different IP address that is not blocked by your server.

Having a sitemap file for your blog allows Google to index it faster. Check out my other article to learn more. If you have a static HTML site, or any site other than a blog, you can use the tool at XML-Sitemaps.com to generate a sitemap.xml file for your sites easily. It’ll include up to 500 pages from your site in the sitemap file for free.

To prevent a similar problem from recurring in the future, I’ve added the following lines to my mod_security configuration file (modsec.user.conf) so that Googlebot won’t be blocked.

# GoogleBot by user-agent…
SecFilterSelective HTTP_USER_AGENT "Google" nolog,allow
SecFilterSelective HTTP_USER_AGENT "Googlebot" nolog,allow
SecFilterSelective HTTP_USER_AGENT "GoogleBot" nolog,allow
SecFilterSelective HTTP_USER_AGENT "googlebot" nolog,allow
SecFilterSelective HTTP_USER_AGENT "Googlebot-Image" nolog,allow
##
SecFilterSelective HTTP_USER_AGENT "AdsBot-Google" nolog,allow
SecFilterSelective HTTP_USER_AGENT "Googlebot-Image/1.0" nolog,allow
SecFilterSelective HTTP_USER_AGENT "Googlebot/2.1" nolog,allow
SecFilterSelective HTTP_USER_AGENT "Googlebot/Test" nolog,allow
SecFilterSelective HTTP_USER_AGENT "Mediapartners-Google/2.1" nolog,allow
SecFilterSelective HTTP_USER_AGENT "Mediapartners-Google*" nolog,allow
SecFilterSelective HTTP_USER_AGENT "msnbot" nolog,allow

 

Of course if you don’t manage your own server, this is probably something that you don’t have to worry about, although you might want to refer your server admin to this article if something similar happens to you.

If your blog or site is suddenly removed from Google index for no apparent reason, just head over to Google Webmaster Central. It’ll offer a hint as to the cause of the problem.

An Introduction to On-Page Search Engine Optimization (SEO) Factors

August 8, 2007

The on-page factors form part of the hundreds of criteria used by the search engines to rank a page. In the absence of a significant off-page advantage, good on-page SEO practice will help your pages rank better than your competitors’.

On-page factors are elements on a web page that the search engines weight differently when deciding where to rank your page in the search engine result pages.

These elements include the page title, headlines, alternate image descriptions, anchor text, keyword density and so on.

It’s estimated that Google alone uses over 200 ranking factors. As to the exact factors used to rank a page, nobody knows except the search engines themselves.

The following are page elements identified by most search engine optimizers to have some influence over your search engine rankings. The term “keyword” below refers to one word or a phrase containing more than one word.

Now, say a keyword you are targeting is “personal loan”; it should be included in the following on-page elements.

 

Keyword in the URL, e.g. http://www.sabahan.com/category/personal-loan

Keyword in the Domain Name, e.g. http://www.personal-loan.com/application.html

Keyword in the Page Name, e.g. http://www.domain.com/personal-loan.html

Keywords in the Header Tags

Keyword in the Title Tag

This is probably the single most important on-page factor. You can see it used by most search engines in their result pages.

There’s more to creating an effective title tag than just including your keywords in it. Here are some essential title tag strategies that could improve your search engine rankings.
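For instance, a title tag targeting “personal loan” might look like this (the wording is just an illustration):

<title>Personal Loan Tips: How to Apply for a Personal Loan</title>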

Keyword in Description Tag

Example: <meta name="description" content="your site description">

While Google no longer puts any importance on the description tag, some search engines may use it to display what your page is all about in their search results. So include some of your important keywords here, but avoid overstuffing them.

Keyword in Keywords Tag

Example: <meta name="keywords" content="keyword 1, keyword 2, etc.">

As far as Google is concerned, placing your keywords in the meta tags won’t improve your ranking. In contrast, MSN does take the presence of keywords in the meta tags into account.

If you decide to use it, avoid cramming all your keywords. Try using synonyms, plurals and common misspellings. I personally feel it’s important to include a keywords tag to make a page more complete.

   
Keywords in the Body  
Keyword Density

Keyword density is the ratio of the targeted keywords to the total number of words on a page.

Each search engine has its own optimum percentage, but in general it should be between 3% and 7%. Avoid creating a higher-density page as it can be viewed as a spamming attempt.
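As a rough example of the arithmetic: if your targeted keyword appears 15 times in a 500-word article, the keyword density is 15 ÷ 500 = 3%, right at the lower end of that range.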

Here’s a tool to help you check your page’s keyword density.

Keywords in the Headlines

Headlines are the text enclosed within the H1, H2, H3 tags and so on. H1 is considered more important than H2, H2 more important than H3, and so on.

It’s recommended that you put important keywords in your headlines.
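For example, again using the hypothetical “personal loan” keyword:

<h1>Personal Loan Basics</h1>
<h2>How to Apply for a Personal Loan</h2>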

Keyword Font Size

In general, text in a larger font is considered more important than text in a smaller font.
Keyword Proximity

Keyword proximity refers to the closeness between two or more words. If you are targeting a phrase containing several words, make sure they appear close to one another throughout your article.

Eg.

  1. How to Make Money Online Selling Ebook
  2. How Selling Ebook Can Make You Money

The first line is likely to rank higher than the second one under the keyword “make money”.

Keyword Prominence

Keyword prominence refers to how early the keywords appear on the page. Keywords that appear earlier on the page are considered more important than those that appear later.

Generally, keywords appearing near the start of a page, in the title tag, in the headers and so on are considered more important.

Keywords in Alt Text

Alt text refers to the alternative text for an image; it’s displayed when the image can’t be shown, and some browsers also show it as a tooltip when you hover over the image. The HTML for inserting alt text is:

<img src="image.gif" alt="Alternative text goes here" >

It’s crucial not to stuff the alt text with your keywords repetitively.

Keywords in Anchor Text

Anchor text is the clickable text of a hyperlink, typically underlined, that you click on to go to another page or location.

Google puts significant emphasis on the anchor text on your pages as it’s an important pointer to a page’s relevancy. In addition, keywords in the anchor text tell the search engine the subject or theme of the page being linked to.

Try placing your important keywords in the anchor text when linking to other pages, and make sure the page you are linking to is related to the anchor text.
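For example, when linking to a (hypothetical) page about personal loans, a link like

<a href="http://www.domain.com/personal-loan.html">personal loan application guide</a>

tells the search engines far more about the target page than a generic “click here” link.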

 

Conclusion

There are many other on-page factors that make up a search engine ranking algorithm besides the ones above.

Search engine optimisation demands a lot of time and effort in order to produce meaningful results. It’s important to avoid doing what the search engines consider spamming.

Instead of worrying too much about it, your time would be better spent producing good content while employing good SEO practice. Pay attention to your website’s quality; the general rule of thumb is that if it’s not good for human readers, it’s not good for the search engines.

