OK, I think a more descriptive and appropriate title would probably be something along the lines of "Google shouldn't have given my PageRank back when I'm still selling Text Link Ads!" LOL.
While some claim that the long-awaited update took place on March 2nd, 2008, it didn't seem to have any effect on my blog until recently. I first noticed my PageRank value was no longer zero last Friday, after I made some changes to my blog.
To make sure that it wasnâ€™t just a normal spike that could happen during a PageRank update, I decided to wait for a couple of days before writing about it. I ran a future PageRank check to see what my PageRank values were across several Google datacenters.
Sure enough, all datacenters agree that Sabahan.com is now a PageRank 3. So what did I do to have my PageRank back while I still serve Text Link Ads on my blog?
To be honest, I don't claim this method is definitive, and I can't say for sure Google will let me keep my PageRank once the dust settles. I could end up having my blog slapped again once Google catches up with this trick. But for the sake of information sharing, here's what I did.
I just updated the Text Link Ads plugin to the latest version and changed the title of my link ads from "Our Sponsors" to "Sites". That's all, nothing fancy.
That said, I am not suddenly retracting my previous stand that selling (or buying) text link ads is bad for your blog, especially if the main motivation is to game the search engine rankings.
People just never get tired of talking about this PageRank thing, do they? 🙂
Now let's hear something from you. Have you tried the changes above? How does the recent PageRank update affect your blog?
There has been a lot of buzz around the blogosphere and forums about the recent changes in PageRank and backlink counts from Google.
From where I am sitting, I can see that Google is getting tougher on those selling text link ads and paid reviews. It appears that Sabahan.com has been penalized again for selling text links, and this time around my PageRank has been reduced to 0.
Personally I think PageRank is important, but not as important as the amount of traffic you get. For example, if 10K unique visitors stop by your blog daily and you are making tons of money from them, do you really care about your PageRank? I don't.
On the other hand, when the drop in PageRank is followed by a reduction in your daily traffic, which does happen every now and then, you start dropping everything you are doing, promise to follow Google's guidelines to a tee, and hope Google will give your original PageRank back.
It appears that this is what's happening to Sabahan.com. The drop in PageRank was followed by a reduction in the traffic this blog receives daily. That's a clear signal that it's time to remove those paid link ads asap. Sorry advertisers, I have no choice but to comply with the big brother's rules.
Have you noticed any changes so far?
Want to know what your blog's future PageRank would be? Just enter your URL below.
I haven't been able to post anything for over a week because something came up and I needed to take care of it right away. Anyway, since I'll be away from the computer again tomorrow, I guess it's best to spend some time writing to let you guys know that I am still alive and kicking 🙂
While browsing my RSS feed on a lazy Sunday afternoon looking for something to write about, I saw a post by Matt Cutts which I thought you should know about, especially if you are doing paid reviews.
I've written several times in this blog that selling and buying links that pass PageRank is frowned upon by Google. In the post, Matt gives an example, a serious one at that, of paid posts to illustrate his point that some paid post writers couldn't care less about the accuracy of their reviews. Inaccurate information is not only bad as far as the user experience is concerned, but it can potentially be a matter of life or death for the readers in certain situations.
To illustrate his point, he uses paid reviews of brain tumor treatments, and asks you to put yourself in the reader's shoes. Matt's main concern is that most of the reviewers knew nothing about the treatment before getting paid to post about it. As a result, the reviews were often inaccurate or uninformed. At the end of the day, the ones who suffer are the end users who believe the reviews are accurate.
Now if using brain tumor treatment as an example is a little too serious for you, ask yourself if paid reviews actually offer a good user experience, and how they could unfairly affect the search engine rankings in the long run. If you think it's unfair for Google to stop people from selling or buying links because it closes the opportunity for small website publishers to advertise their websites at a low cost, imagine what happens when all the big companies with million-dollar budgets join the party.
If Google were to allow the buying and selling of links to continue, I wouldn't be surprised to see a day when acquiring top search engine rankings is simply a matter of spending the most money on paid links and reviews. Gone would be the days when the Internet was a level playing field, at least as far as SEO is concerned. Surely nobody wants that to happen, not me, not Google and definitely not you, right?
I've posted a comment on the post, and I hope Matt will answer my questions:
I agree with everything about this article, but I wonder if your algorithm is able to determine whether a paid post is well written and well researched. What if someone writing a paid review about brain tumors was able to write an amazingly accurate and honest article about it? Do you still demote their PageRank just because they are writing a paid review about brain tumors?
My guess is that such a thing might require a manual review. Take JohnChow.com for example; he's ignoring everything Google says about not doing paid reviews or selling text link ads, and he still has PageRank 4. Google is sending a mixed message here by not demoting his page to 0. Perhaps Google realises that his readers actually find some of his paid reviews useful, and that's how he gets away with it. Is that an accurate assumption?
Then what happens if JohnChow.com suddenly writes a paid review about a brain tumor treatment and the information is not entirely accurate? Will he get a 0 PageRank then?
Perhaps even with PR4, JohnChow.com has lost its ability to pass along Google Juice, so Google couldn't care less about what he writes?
What about someone promoting an affiliate program related to brain tumor treatments? And let's say the link was able to pass Google Juice along. Are you going to do anything about that?
What do you think?
By the way, I know I am selling text link ads here, so some readers might perceive that I'm contradicting myself by not practising what I preach. Actually, I didn't escape the wrath of Google: my PageRank was reduced from 5 to 3. Hey, the ads could be gone soon, and I might start selling banner ads as an alternative.
That's right, somehow Google's own video sharing website has suffered a major drop in PageRank after the recent shake-up. From where I am, currently it's a 3. The Future PageRank tool also indicates that the value is consistent across all datacenters.
I don’t recall YouTube selling text link ads, so obviously Google is looking at other areas when deciding whether a site’s PageRank should be reduced.
If it’s not a glitch, perhaps Google had decided that YouTube is no longer an important website for whatever reason and therefore doesn’t deserve the high PageRank.
Regardless, it goes to show that Google doesn't discriminate when it comes to setting PageRank values for its own properties. Then again, if it's not a spam penalty, I'm inclined to think that this is a normal fluctuation during a PageRank update, and surely YouTube is set to regain its former PageRank.
I've been experiencing a terrible Internet connection since yesterday and probably won't be able to post anything lengthy for now. I called Streamyx last night but was disconnected after going through all the 'mandatory' troubleshooting for 20 minutes. Yeah, that's the 'great' TMNet customer support you've come to expect.
I am on the phone with them right now.
Anyway, apparently Google is doing a second round of the PageRank shake-up, and according to HTNet, Sabahan.com is now a 3. I can't check this myself because my connection doesn't even load the Sabahan.com main page properly. (Update – yes, that seems to be the case.)
If you are wondering how this would affect me: well, it's still business as usual, but I'll have to realign some of my strategies. I have over 40 domain names, and about 30+ of them are active blogs, websites and forums. Not all of them are affected. In addition, the majority of them do not depend on PageRank to make me money. I'll write more about this in my next post.
Anyhow, the penalty seems to affect almost everyone regardless of how good their content is.
andybeard.eu is compiling a list of sites affected by this 'debacle':
http://www.searchengineguide.com PR7 → PR4
http://www.searchenginejournal.com PR7 → PR4
http://www.johnchow.com PR6 → PR4
http://www.quickonlinetips.com/ PR6 → PR3
http://weblogtoolscollection.com/ PR6 → PR4
http://andybeard.eu PR5 → PR3
Vlad PR4 → PR2
Are your sites or blogs affected (again)?
For the past few weeks, news about blogs losing their PageRank has been widely discussed around the blogosphere. Besides this blog, several of my other blogs weren’t spared either.
This has affected almost everyone regardless of the quality of their content, including big names such as JohnChow.com, AndyBeard.eu and Yaro Starak of Entrepreneurs-Journey.com.
I think it's important to be aware that there are actually two different types of PageRank. One is the actual value that Google uses internally to rank a page, and the other is the one visible on the Google toolbar. A reduction in the visible PageRank may affect a site's text link prices, but based on Google's internal PageRank value, the site should continue to rank just as it always has. Then again, an overzealous link selling promotion would definitely affect the rankings eventually.
While I tend to believe the visible PageRank carries little weight when it comes to deciding where you rank in the search engine, it's still used by many people to gauge a site's credibility. As I wrote earlier, I thought this incident was probably nothing more than a normal PR update exercise, where some sites would enjoy an increase while others experience a drop, until I came across a post by Danny Sullivan over at Search Engine Land.
Danny points out that it's now official that selling paid links can hurt your PageRank or rankings on Google, based on feedback he got from Google.
While I am aware that sites selling paid links might lose their ability to pass along link love, this is perhaps an indication that Google has started taking concrete action by penalizing link selling sites. The news seems to be spreading like wildfire around the blogosphere at the moment.
Danny uses the example of The Stanford Daily, Stanford University's student newspaper, which continued to sell paid links despite widespread attention to its actions and without any penalty being imposed by Google.
The Stanford Daily is NOT banned from Google. The site’s homepage still has a PR9 score. Nothing indicates that the Stanford Daily’s links aren’t passing ranking juice, not in the ways that Google could control, if it wanted. Maybe they aren’t, but how would most people know? How would other publishers thinking of doing the same know? Certainly not from reading the paper’s rate card (PDF), where there’s nothing said about text links relating to search engines. The only thing said is the price: $350 per month.
Then Danny adds that last week he noticed The Stanford Daily's PageRank had been reduced from PR9 to PR7:
Last week, I noticed the Stanford Daily had dropped from PR9 when I wrote the above in April to PR7 today. That's a huge drop that has no apparent reason to happen. Some others were also reporting PageRank drops. So I pinged Google, and they confirmed that PageRank scores are being lowered for some sites that sell links.
In addition, Google said that some sites that are selling links may indeed end up being dropped from its search engine or have penalties attached to prevent them from ranking well.
So guys, it's official: from a seller's point of view, making money from text link ads is no longer as exciting as before, because you will be penalized regardless of your intention. This will definitely change the way site owners monetize their sites. If selling paid links is one of your main sources of income, you will have to look elsewhere.
I'm inclined to believe that this move by Google will improve the quality of their search results in the long run, as it weeds out those who buy links to boost their rankings regardless of the quality of their content. But at the moment, it's easier to put the blame on Google for being senseless and unfair to the rest of us. Talking about fair treatment, if you think you don't deserve a PR reduction (nobody does, right?), you might want to write to Google via Webmaster Central and request a review. I am going to do that next.
Oddly enough, there are several blogs that are unaffected by this move. Those sites continue to sell text links but have received no penalty. Let me give one example. Now, I've nothing against AdesBlog.com; in fact I think it's one of those well written blogs with high quality content that I like to read. However, if Google's judgement were based solely on selling paid links, there's no way AdesBlog.com could have escaped unnoticed.
Anyway, do you plan to keep selling text links on your website? Are you concerned about your PageRank? If so, why?
Update: Adesblog.com’s PageRank was recently reduced.
After I published my previous post, I noticed from my users-online page that Googlebot came munching on the content almost immediately.
If you Google the term "latest google pagerank update Sabahan" you will notice that the post appears first on the list, but that's beside my point.
When I did the search at 9:30PM, I noticed the post had been indexed 7 hours earlier, as shown in the screenshot below. That's about an hour after I published the post at 1:30PM. It appears that it now takes Google about an hour or less to index my posts.
While I'm not certain whether a blog's PageRank and update frequency have any influence over Google's indexing speed, I tend to think they do, and this shouldn't be confined to blogs alone. Can you imagine the potential traffic you could drive to your blog if you were the first to break an important story?
Have you experienced the same effect in your blog? If you haven’t, I suggest you create a sitemap for your blog and see what happens.
If you depend on Google for 90% of your website traffic, the consequences of losing that traffic when your site is removed from Google can be devastating.
It's even more so if you are making a living working on the Internet. All the long hours you had put in seem meaningless. Watching your cash flow dry up can be a real blow to your motivation. I know, I've been there.
But before we go insane over the whole debacle, let's take a deep breath and calm down. There will always be an explanation as to why your pages disappeared from Google.
The good news is, having your site removed from Google isn't a death sentence, because you can always request re-inclusion. If you know why your pages disappeared, you should be able to make the necessary changes and have them re-indexed as soon as possible.
Here are the things you can do when you find your pages missing from Google's index.
Actually, you don't have to wait until your site is removed from Google before registering with Webmaster Central. If you haven't done it already, do yourself a big favour and register now.
Webmaster Central should be your first stop to help you understand what’s happening. It’s the only place where Google alerts site owners of penalties for their sites.
Google may explicitly confirm a penalty and offer you a re-inclusion request specifically for that site. Once you have your site verified, click on the tab labelled "Diagnostic" to find a section called "Indexing summary". It might say:
No pages from your site are currently included in Google's index due to violations of the webmaster guidelines. Please review our webmaster guidelines and modify your site so that it meets those guidelines. Once your site meets our guidelines, you can request reinclusion and we'll evaluate your site. [?] Submit a reinclusion request
Now that’s a lifesaver if you ask me.
Again, this is one of the things that you should have done by now. The sitemap will ensure you get back in as quickly as possible by notifying Google of all your URLs at once. Learn how to create a sitemap for your blog here.
One of the common reasons why your pages are removed from Google is that your server was down during a Googlebot visit. As a result, Googlebot will report a network unreachable error or a robots.txt unreachable error in the Webmaster Central control panel.
In most cases, you don’t have to worry about it because Googlebot is only postponing the crawl and will return to your site later when your server is reachable.
However, if you are certain that your server hasn't been down, something might be blocking Googlebot from visiting your site. This is what I experienced recently.
If your server is always down, it's time to look for a new and more stable web host.
If you don't know what a robots.txt file is, chances are you don't have one on your site. That shouldn't be a problem, because it allows Googlebot to index everything, if that's what you want.
If you have a robots.txt file, you might have inadvertently included an instruction in the file that stopped Googlebot from crawling your site.
Again, the Webmaster Tools will help you analyse your robots.txt file to see whether it allows Googlebot to crawl your site or not.
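If you'd like to double-check a robots.txt file yourself, Python's standard library can parse one directly. This is just a sketch; the file contents and URLs below are made-up examples, not my actual robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents -- substitute your own file here
robots_txt = """\
User-agent: *
Disallow: /wp-admin/

User-agent: Googlebot
Disallow:
""".splitlines()

parser = RobotFileParser()
parser.parse(robots_txt)

# An empty Disallow means Googlebot may crawl everything
print(parser.can_fetch("Googlebot", "/2007/10/some-post/"))
# Other crawlers are still kept out of /wp-admin/
print(parser.can_fetch("*", "/wp-admin/settings.php"))
```

If the first check prints False against your real file, you've found the instruction that's keeping Googlebot out.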
Before you put the blame on Google for removing your pages, it's a good idea to check whether you are doing something that violates Google's Webmaster Guidelines.
In particular, make sure that you don’t have the following on your pages:
Hidden text, often achieved by using Cascading Style Sheets (CSS) or plain HTML, such as white text on a white background.
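For illustration only (these are patterns to recognise and remove, not to copy), hidden text usually looks something like this in the page source:

```html
<!-- White text on a white background: invisible to readers, visible to crawlers -->
<body bgcolor="#ffffff">
  <font color="#ffffff">cheap loans cheap loans cheap loans</font>
</body>

<!-- The CSS variant: the block never renders at all -->
<div style="display: none;">keyword keyword keyword</div>
```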
Another practice that you should avoid is cloaking, often done using server-based scripts. The purpose is to serve different pages to the search engines than you display to users.
If you create doorway pages containing many links that are made specifically for the search engines and don't benefit your users, your site may be removed from Google's index.
Another practice to avoid is stuffing keywords into your pages. This is often done in combination with hidden text. Keyword stuffing occurs when you repeat the same keywords excessively in an attempt to manipulate the page ranking.
You should also check that you don't serve substantially duplicate content on multiple pages, subdomains or domains, if the motivation is to manipulate the search engine rankings in an attempt to attract more traffic.
If you have restructured your site and have pages that are accessible via several URLs, eg:
Make sure to use 301 redirects ("RedirectPermanent") in your .htaccess file so that only one URL is associated with the bubble-hunter.html page.
Now if you are using WordPress and changing your permalink structure, you can use the Permalink Redirect WordPress plugin to help you do this.
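If you'd rather handle it in .htaccess directly, a line like the following does the job. The old path here is a hypothetical example of where the page used to live:

```apache
# Permanently (301) redirect the old location to the one true URL
RedirectPermanent /2005/10/bubble-hunter.html http://www.sabahan.com/bubble-hunter.html
```

RedirectPermanent takes the old path and the new absolute URL; it's provided by Apache's mod_alias, which most shared hosts enable by default.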
As I’ve written previously, Google had recently updated their Webmaster guideline where they’ve added that excessive reciprocal links or excessive link exchanging ("Link to me and I’ll link to you.") can negatively impact your site’s ranking in search results.
Avoid using any software that automates the link exchange effort which ignores the quality of the links.
The best type of links are the editorial ones given by choice by other site owners. You can also submit your site to relevant directories such as Yahoo! or the Open Directory Project.
In addition, if for some reason you lose some incoming links, it's possible that this will affect your site's ranking. This is often overlooked, especially if you have a new site that hasn't gathered a substantial number of incoming links.
I know selling text links from your site or blog can offer a good source of income. But make sure it's done in the users' best interest. Buying links to improve PageRank violates Google's quality guidelines.
When you are selling links, avoid accepting sites unrelated to yours. Sure, having those penis elongation or Viagra text links on your tech blog might put money in your pocket, but it would be done at the expense of having your blog removed from Google.
If all you are after is short term profits, this probably doesn’t concern you.
It's not always your fault that your site was removed from Google. With billions of pages to crawl every month, Googlebot may encounter its own technical problems during one of its normal crawls.
Do a search on Google and see if anybody else was having the same problem during the same period.
If none of the above seven reasons applies to you and you are certain that it's Google's problem, send them feedback. Ask them politely why your pages were removed. If it's their fault, you can be sure your site will be re-included fairly easily. However, be prepared to give them several days (or even weeks) to respond to your email.
How to Request Reinclusion Into Google's Index
Just log into your Webmaster Central account and click on the "Submit a reconsideration request" link under Tools in the right sidebar.
Before you do that, it's important to make sure you have made all the necessary changes to your site so that it adheres to Google's Webmaster Guidelines.
You'll be given the chance to admit your mistake, explain what you have done to correct it, and agree not to repeat it. It's important that you provide some evidence of good faith before they will reconsider your site.
In my previous post I wrote about a problem I had where many of my sites were suddenly removed from Google search result pages.
It was 'unsettling' to say the least, because I could have easily lost hundreds of dollars per day from AdSense and affiliate programs that depend on Google organic traffic during that debacle.
I found it strange to see Googlebot repeatedly spewing the robots.txt unreachable or network unreachable errors in my Google Webmaster Console when I was absolutely sure that my server uptime had been nothing short of one hundred percent during the period.
If you are not familiar with robots.txt, it's a file used to keep web pages from being indexed by search engines.
A few questions came to mind when it happened to me.
Did Google finally penalise me for selling text link ads?
I know Google often regards some sites that engage in selling and buying of text links as ignoring the users’ best interest. This is clearly stated in their webmaster guidelines where sites participating in such schemes would risk having their rankings dropped from Google search result pages.
I've written about Google's opinion on paid links before, and about the addition of a paid link reporting form at Google Webmaster Central. Did I finally become a victim of their position on this matter?
That said, I only accept quality, relevant links on my sites and blogs. Besides, several of the affected sites didn't participate in selling text links at all. So I concluded that it had nothing to do with selling text link ads.
Did they consider my Partner page as excessive link exchange practice?
Google had recently updated their Webmaster guideline where they’ve added that excessive reciprocal links or excessive link exchanging ("Link to me and I’ll link to you.") can negatively impact your site’s ranking in search results.
However, I doubt what I am doing is excessive, at least not by what I think they consider excessive at the moment.
While the Partner page at Sabahan.com was created exclusively for cross linking, I only accept useful sites or blogs related to technology, marketing or blogging. A site that doesn’t offer any value to my users isn’t good enough for the search engine and will be deleted.
Occasionally, some low quality, unrelated blogs might slip through, but they would be deleted in the manual reviews that I do every now and then. The other affected sites were never involved in any link exchange practice. So I crossed this off my list of possible reasons.
Were my sites struck by algorithm changes?
Perhaps what I had been doing to optimise those sites for the search engines is now regarded as spamming.
This might be possible if it affected one or two sites, but not when 15 or more sites, including several blogs, forums, static HTML sites and e-commerce sites across several different niches, were simultaneously affected. It didn't make sense.
Perhaps it was Google that’s having technical difficulty with their server.
That's possible, but if that were the case, I am sure there would have been many other site owners affected during that period.
A quick search on Google for similar occurrences in the past 3 months didn't return any results that support this notion. Those that I discovered seemed to be isolated incidents.
So what was really happening here?
Then it struck me that the answer was right in front of me. I guess like some people, I tend to overlook the simple details in favour of a more complicated explanation.
When something like this happens to you, Google Webmaster Central will be your best friend, seriously. Googlebot had been trying to tell me that my robots.txt file or network was unreachable, and that's exactly what was causing the problem… duh!
The trick was to figure out how the robots.txt file or my server became unreachable when I knew for sure my server had maintained 100% uptime during that debacle. So obviously, something was preventing Googlebot from accessing my robots.txt file or server. And that something must have been blocking Googlebot's IP address.
After some searching I discovered the following error message from my server log file.
[Sat Jul 14 01:39:32 2007] [error] [client 188.8.131.52 ] mod_security: Access denied with code 406. Pattern match "=(http|www|ftp)\\\\:/(.+)\\\\.(c|dat|kek|gif|jpe?g|jpeg|png|sh|txt|bmp|dat|txt|js|html?|tmp|asp)\\\\x20?\\\\?" at REQUEST_URI [hostname " www.portable-cd-mp3-player.com "] [uri "/frame/index.php?url= http://reviews.cnet.com/SanDisk_Sansa_m240_1GB_silver/4505-6490_7-31563923.html?subj=fdba&part=rss&tag=MP_Portable+Audio+Devices "]
Now it looked like mod_security had blocked Googlebot's IP address. I have CSF (ConfigServer Security & Firewall) running, and a further check revealed that it had indeed blocked Googlebot's IP address.
Fixing the problem was a matter of removing Googlebot's IP address from the csf.deny file and adding it to the /etc/csf/csf.allow file. Of course, this can be done easily via the CSF graphical user interface.
Once that was done, I resubmitted my sitemap.xml file via Google Webmaster Central, and it didn't take long before Googlebot started to recrawl my sites.
In some situations, the problem would appear to go away by itself and Googlebot would start to crawl your site again. This could happen if Googlebot comes from a different IP address which is not blocked by your server.
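If you manage your own server, one way to spot this is to scan csf.deny for entries that fall inside Googlebot's crawl range. The sketch below assumes the 66.249.64.0/19 block, which is where Googlebot was commonly seen crawling from; treat that range as an assumption and verify it against Google's own documentation, since it can change:

```python
import ipaddress

# Assumed Googlebot crawl range -- verify against Google's documentation
GOOGLEBOT_NET = ipaddress.ip_network("66.249.64.0/19")

def blocked_googlebot_ips(deny_lines):
    """Return entries from csf.deny-style lines that fall inside the Googlebot range."""
    hits = []
    for line in deny_lines:
        # csf.deny entries look like "1.2.3.4 # comment"; ignore blanks and comments
        entry = line.split("#")[0].strip()
        if not entry:
            continue
        try:
            ip = ipaddress.ip_address(entry)
        except ValueError:
            continue  # skip CIDR entries or malformed lines
        if ip in GOOGLEBOT_NET:
            hits.append(entry)
    return hits

# Made-up csf.deny contents for illustration
deny_lines = [
    "220.181.7.1 # some scraper",
    "66.249.66.1 # tripped a mod_security rule",
]
print(blocked_googlebot_ips(deny_lines))
```

Any address the function reports is a candidate to move out of csf.deny and into csf.allow.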
Having a sitemap file for your blog allows Google to index it faster. Check out my other article to learn more. If you have a static HTML site, or any site other than a blog, you can use the tool at XML-Sitemaps.com to generate a sitemap.xml file for your sites easily. It’ll include up to 500 pages from your site in the sitemap file for free.
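If you're comfortable with a few lines of Python, you can also roll a minimal sitemap yourself; the URLs below are placeholders for your own pages:

```python
from xml.sax.saxutils import escape

# Hypothetical list of pages to include in the sitemap
urls = [
    "http://www.example.com/",
    "http://www.example.com/about.html",
]

entries = "\n".join(
    "  <url><loc>{}</loc></url>".format(escape(u)) for u in urls
)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + entries + "\n</urlset>"
)
print(sitemap)
```

Save the output as sitemap.xml in your site root and submit it via Webmaster Central.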
To prevent a similar problem from recurring in the future, I've added the following lines to my mod_security configuration file (modsec.user.conf) to prevent Googlebot from being blocked.
# GoogleBot by user-agent…
SecFilterSelective HTTP_USER_AGENT "Google" nolog,allow
SecFilterSelective HTTP_USER_AGENT "Googlebot" nolog,allow
SecFilterSelective HTTP_USER_AGENT "GoogleBot" nolog,allow
SecFilterSelective HTTP_USER_AGENT "googlebot" nolog,allow
SecFilterSelective HTTP_USER_AGENT "Googlebot-Image" nolog,allow
SecFilterSelective HTTP_USER_AGENT "AdsBot-Google" nolog,allow
SecFilterSelective HTTP_USER_AGENT "Googlebot-Image/1.0" nolog,allow
SecFilterSelective HTTP_USER_AGENT "Googlebot/2.1" nolog,allow
SecFilterSelective HTTP_USER_AGENT "Googlebot/Test" nolog,allow
SecFilterSelective HTTP_USER_AGENT "Mediapartners-Google/2.1" nolog,allow
SecFilterSelective HTTP_USER_AGENT "Mediapartners-Google*" nolog,allow
SecFilterSelective HTTP_USER_AGENT "msnbot" nolog,allow
Of course if you don’t manage your own server, this is probably something that you don’t have to worry about, although you might want to refer your server admin to this article if something similar happens to you.
If your blog or site is suddenly removed from Google's index for no apparent reason, just head over to Google Webmaster Central. It'll offer a hint as to the cause of the problem.
On-page factors form part of the hundreds of criteria used by the search engines to rank a page. In the absence of a significant off-page factor advantage, good on-page SEO practice will help your pages rank better than your competitors'.
On-page factors are elements on a web page that the search engines weight differently when deciding where to rank your page in the search engine result pages.
These elements include the page title, headlines, alternate image descriptions, anchor text, keyword density and so on.
It’s estimated that Google alone uses over 200 ranking factors. As to the exact factors used to rank a page, nobody knows except the search engines themselves.
The following are page elements identified by most search engine optimizers as having some influence over your search engine rankings. The term "keyword" below refers to a single word or a phrase containing more than one word.
Now, say a keyword you are targeting is "personal loan"; it should be included in the following on-page elements:
Keyword in URL
Keyword in Domain Name
Keyword in The Page Name
Keywords in Header Tags
Keyword in the Title Tag
This is probably the single most important on-page factor. You can see it used by most search engines in their result pages.
There's more to creating an effective title tag than including your keywords in it. Here are some essential title tag strategies that could improve your search engine rankings.
Keyword in Description Tag
Example: <meta name="description" content="your site description">
While Google no longer puts any importance on the description tag, some search engines may use it to display what your page is all about in their search results. So include some of your important keywords here, but avoid overstuffing them.
Keyword in Keywords Tag
Example: <meta name="keywords" content="keyword 1, keyword 2, etc.">
As far as Google is concerned, placing your keywords in the meta tags won't improve your ranking. In contrast, MSN does take into account the presence of keywords in the meta tags.
If you decide to use it, avoid cramming all your keywords. Try using synonyms, plurals and common misspellings. I personally feel it’s important to include a keywords tag to make a page more complete.
Keywords in the Body
Keyword density is the ratio of the targeted keywords to the total number of words on a page.
Each search engine has its own optimum percentage, but in general it should be between 3% and 7%. Avoid creating a higher density page, as it can be viewed as a spamming attempt.
Here’s a tool to help you check your page’s keyword density.
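If you'd rather compute the number yourself, here's a rough sketch; real tools differ in how they tokenize text, so treat the percentage as an approximation:

```python
import re

def keyword_density(text, keyword):
    """Percentage of the page's words taken up by occurrences of `keyword`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw = re.findall(r"[a-z0-9']+", keyword.lower())
    if not words or not kw:
        return 0.0
    # Count every position where the keyword phrase matches word-for-word
    hits = sum(
        1 for i in range(len(words) - len(kw) + 1)
        if words[i:i + len(kw)] == kw
    )
    # Each hit accounts for len(kw) of the page's words
    return 100.0 * hits * len(kw) / len(words)

sample = "Get a personal loan today, because our personal loan rates are low."
print(round(keyword_density(sample, "personal loan"), 1))
```

The sample sentence is deliberately tiny, so its density comes out far above the 3% to 7% range you'd aim for on a real page.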
Keywords in the Headlines
Headlines are text enclosed within the H1, H2 and H3 tags and so on. H1 is considered more important than H2, H2 is more important than H3, and so on.
It's recommended that you put important keywords in your headers.
|Keywords Font Size
In general, text in a larger font is considered more important than text in a smaller font.
|Keyword Proximity
Keyword proximity refers to the closeness between two or more words. If you are targeting a phrase containing several words, make sure they appear close to one another throughout your article.
For example, a page where the words in “make money” appear right next to each other is likely to rank higher for that phrase than one where they are separated by other words.
|Keyword Prominence
Keyword prominence refers to how early the keywords appear on the page. Keywords that appear first on the page are considered more important than those that appear later.
Generally, keywords appearing near the start of a page, title tag, header and so on are considered more important.
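As a rough sketch of how proximity could be measured, the snippet below finds the smallest gap (in words) between two target words, where 0 means they are adjacent. The function name and sample sentences are my own, purely for illustration.

```python
def min_gap(text, first, second):
    """Smallest number of words between `first` and a later `second`."""
    words = text.lower().split()
    positions_a = [i for i, w in enumerate(words) if w == first]
    positions_b = [i for i, w in enumerate(words) if w == second]
    # Only consider occurrences where `second` comes after `first`.
    gaps = [b - a - 1 for a in positions_a for b in positions_b if b > a]
    return min(gaps) if gaps else None

print(min_gap("learn how to make money online", "make", "money"))    # → 0 (adjacent)
print(min_gap("make a plan to manage your money", "make", "money"))  # → 5 (far apart)
```

The first sentence keeps the phrase intact and would likely rank better for “make money” than the second.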
|Keywords in Alt Text
Alt text is the alternative text displayed when an image cannot be shown (some browsers also display it as a tooltip when you put your mouse over the image). The HTML for inserting alt text is:
<img src="image.gif" alt="Alternative text goes here" >
It’s crucial not to stuff the alt text with your keywords repetitively.
|Keywords in Anchor Text
Anchor text is the clickable text of a hyperlink, typically underlined, that you click on to go to another page or location.
Google puts a significant emphasis on the anchor text on your pages as it’s an important pointer to a page’s relevancy. In addition, keywords in the anchor text tell the search engine the subject or theme of the page being linked to.
Try placing your important keywords in the anchor text when linking to other pages, and make sure the page that you are linking to is related to the anchor text.
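To see why this matters, the sketch below extracts anchor text the way a crawler might, using only Python’s standard library. The class name and sample links are my own invention: descriptive anchor text like “personal loan tips” tells the engine what the target page is about, while “click here” tells it nothing.

```python
from html.parser import HTMLParser

class AnchorTextParser(HTMLParser):
    """Collect (href, anchor text) pairs from an HTML fragment."""
    def __init__(self):
        super().__init__()
        self.in_anchor = False
        self.anchors = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_anchor = True
            self.href = dict(attrs).get("href", "")
            self.text = []

    def handle_data(self, data):
        if self.in_anchor:
            self.text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self.in_anchor:
            self.in_anchor = False
            self.anchors.append((self.href, "".join(self.text).strip()))

parser = AnchorTextParser()
parser.feed('<p>Read our <a href="/loans.html">personal loan tips</a> '
            'or just <a href="/misc.html">click here</a>.</p>')
print(parser.anchors)
# → [('/loans.html', 'personal loan tips'), ('/misc.html', 'click here')]
```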
There are many other on-page factors that make up a search engine ranking algorithm besides the above.
Search engine optimisation demands a lot of time and effort in order to produce meaningful results. It’s important to avoid doing what the search engines consider spamming.
Instead of worrying too much about it, your time would be better spent producing good content while employing good SEO practices. Pay attention to your website’s quality; the general rule of thumb is that if it’s not good for human readers, it’s not good for the search engines.
There’s no question that having tons of links pointing to your blog will not only increase your traffic but also improve your search engine rankings. It helps users discover your blog from other places too.
However, before you begin exchanging links with other sites, there are several things you need to consider to determine whether a link will provide value that benefits you in the long run.
The following are six issues that could affect the effectiveness of your link exchange practice.
Google typically tries to serve content relevant to the user’s geographic location. When it comes to ranking your site geographically, Google considers both the site’s IP address and the top-level domain, e.g. .com or .com.my.
So if you want to rank properly for certain keywords in Malaysia, it helps to have a domain name ending with .com.my. In the absence of a significant top-level domain, Google often uses the server’s IP address to get a hint about your content’s geographic relevancy.
The problem with this is that many non-US site owners prefer to host their sites in the US due to the pricing and performance advantages over their local counterparts. Someone who uses the domain Sabahan.com.my would likely enjoy better rankings in Malaysia for the keyword “sabahan” than Sabahan.com, provided that they are employing good SEO practices to optimise their blog.
This is because Sabahan.com is hosted in the US and obviously it’s not a .com.my domain.
Exchanging links with other sites hosted in Malaysia, or with those using .com.my extensions may give Google hints that your content is targeted towards the Malaysian audience.
The reason relevancy matters is that cross-linking between non-related sites (e.g. a blogging site linking to a site selling Viagra) carries little or no weight, because Google recognises the themes of websites.
Keep this in mind when you are buying links from text-link-ads.com or exchanging links with other blogs: make sure the site’s theme is related to yours so that the Google juice is passed properly.
Linking between unrelated sites, even if you own them both, carries ramifications and is not something that you should pursue.
One of the best ways to increase links to your site or blog is to create useful content that attracts an audience and other site owners who might link to it.
When you have good content, people will be more likely to link to you when they think your content would benefit their users. This way, you’ll enjoy natural link growth without the need for any time-consuming link exchange emails.
Link baiting is one way to attract certain blog owners to link to you. You can write something interesting that catches people’s attention, write an analysis that generates interesting information, or say something controversial that contradicts what other people say about a certain issue.
Link baiting doesn’t have to be negative. A properly created link bait article can capture a large number of links on its own.
The viral effect of having your site in front of social bookmarking audiences like those of Digg and StumbleUpon will make your link building effort much easier.
These sites attract thousands of visitors daily, and once a user recommends your website, it is exposed to a much larger audience almost instantly.
Popular articles on any of these sites often enjoy a snowball effect in that they will also become popular on other social bookmarking sites.
You may consider placing social bookmarking buttons on your blog to allow your visitors to highlight your content on the various social bookmarking sites.
Sometimes, hoping others will link to you is less effective than asking for it. Contrary to what some people might believe, asking for a link does not go against the definition of natural link growth, unless of course there is some sort of financial compensation involved.
You must make people aware of your blog’s existence, and emailing other site owners is one way to do it. Don’t beg for links; instead, point them to an article that they might find interesting, especially if you are writing to a prominent blogger.
Get to know them before you write to them. Even better, be a subscriber and a commentator on their blog. It helps when the relationship is beneficial to both sides in the long run.
Each time you conduct a link exchange campaign, ask yourself if it’s creating a better user experience in the long run.
Having a bunch of links to blogging, Viagra, and casino sites won’t benefit your users. Google knows this and will devalue your pages accordingly.
Consider the user’s perspective and whether the links provide any value. If not, remove them.
I was reading JohnChow.com when I came across a post about a plugin for Kontera from Shupe.ca. The developer claims his plugin will help Kontera publishers install Kontera ContentLink on their blogs easily.
As a Kontera publisher myself, I could see how this plugin would facilitate integrating Kontera ads into my blogs. Since I was planning to install Kontera on several of my blogs and didn’t fancy tinkering with the code, I decided to give the plugin a try.
Once activated, I noticed that the plugin was inserting extra code into my blog’s footer in addition to the Kontera ContentLink code. However, the extra code was hidden with Cascading Style Sheets (CSS).
Below was the extra code:
<a href="http://www.shupe.ca/articles/wordpress/plugins/kontera-dynamicontext/?version=2.0" title="Rodney’s Kontera DynamiContext Plugin">Rodney’s Kontera DynamiContext Plugin</a> plugged in.
<li><a href="http://www.rodneyshupe.ca/" title="RodneyShupe.ca">RodneyShupe.ca</a></li>
<li><a href="http://www.rodneyshupe.com/" title="RodneyShupe.com">RodneyShupe.com</a></li>
<li>Check out Joomla Resources at Rodney Shupe’s <a href="http://joomla.rodneyshupe.com/" title="Rodney Shupes Joomla Resource">Rodney Shupes Joomla Resource</a>.</li>
<li>Check out WordPress Resources at Rodney Shupe’s <a href="http://wordpress.rodneyshupe.com/" title="Wordpress Resource">Wordpress Resource</a>.</li>
Below is the resulting output if the text isn’t hidden.
According to Google Webmaster Guidelines, and I quote
hiding text or links in your content can cause your site to be perceived as untrustworthy since it presents information to search engines differently than to visitors.
If your site is perceived to contain hidden text and links that are deceptive in intent, your site may be removed from the Google index, and will not appear in search results pages.
Now, I don’t see how the hidden text added by the plugin developer is necessary here. Clearly it’s there to gain a search engine advantage in a deceptive manner, and this can be mistaken for spamming by Google.
The code hides the text using the CSS declarations visibility: hidden and display: none.
The visibility: hidden declaration hides the text but it still takes up space in the layout, while display: none removes the text from the rendered page completely so it does not take up any space, even though the HTML is still present in the source code.
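As an illustration, here is a rough sketch of how one might flag CSS-hidden links like the ones this plugin injected. The function name, regex, and sample markup are my own; it only catches inline styles, and real detection (including whatever Google does) is far more involved.

```python
import re

def hidden_link_blocks(html):
    """Return elements whose inline style hides them and that contain links."""
    pattern = re.compile(
        r'<(\w+)[^>]*style="[^"]*(?:display:\s*none|visibility:\s*hidden)'
        r'[^"]*"[^>]*>.*?</\1>',
        re.IGNORECASE | re.DOTALL,
    )
    # Keep only hidden blocks that contain an anchor tag.
    return [m.group(0) for m in pattern.finditer(html) if "<a " in m.group(0)]

page = ('<p>Visible text.</p>'
        '<div style="display: none">'
        '<a href="http://example.com">hidden link</a></div>')
flagged = hidden_link_blocks(page)
print(len(flagged))  # → 1
```

The point is that hidden links are trivially detectable in the page source, which is exactly why relying on them is so risky.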
Matt Cutts has written on his blog that he doesn’t recommend using the above CSS formatting to hide text.
Now that the cat is out of the bag, I hope the developer will consider removing the unnecessary code to save unwitting bloggers from Google penalties.
I came across an alternative Kontera plugin by Big Bucks Blogger.com that seems to be more versatile than the Kontera WordPress Plugin.
Pay per post publishers will be happy to know that this plugin will let them keep Kontera links out of their paid posts. To be honest, I haven’t tried it myself. If any of you have, please share your experience in the comments section.