Bug Tracking Banter

Thoughts, articles and musings of the BugAware development team on all things bug tracking, software development and other "rant-worthy" topics.

Saturday, August 11, 2007

Let's be honest

OK, I admit it. This is just another stab in the dark at SEO and a fruitless attempt to increase our Google ranking for our chosen keywords, so it may as well be the topic of my first blog post!

It's not easy!

In the good old days of BugAware 1.07, with no attempt whatsoever at SEO (I'm not even sure the concept existed back then), we were consistently in the top 10 of Google search results for various keywords, including "bug tracking software", "issue tracking software" and so on.

These days, it's a totally different story. You cannot find BugAware on Google for our major keywords unless you are prepared to navigate past the top 200 search results. Over the last few months I've tried every trick in the "SEO book", and I have only just discovered what our major problem was.

Failed Attempt

I totally redesigned our web site, added loads of valuable incoming links, wrote press releases and waited patiently... and waited... and waited. Although there were minor fluctuations in our rankings ("Oooh, we moved from 498 to 475 for 'bug tracking software'"), my hard work had not paid off. We had a great PageRank of 6 and loads of incoming links, so what was the problem?

I then resorted to Google AdWords, and although it brought some nicely targeted traffic, I felt nauseous every time someone clicked a paid link and a few more dollars topped up the Google coffers. I wanted organic Google traffic; why should I have to pay for this? We should be ranking well. Bring back the good old days of 2002!

One of the most infuriating things was the fact that we ranked number 7 on Yahoo and number 2 (!) on MSN/Live.com. Sorry, Yahoo and MSN, but hardly anyone uses those search engines, and I get more traffic from my pitiful ranking on Google than from being in the top 10 anywhere else.

So, what was my big discovery?

Well, the battle is far from over, as we still rank poorly, though I'm hoping that over the next few weeks we will see a dramatic improvement. I came across a forum post discussing duplicate content. One poster described exactly the same symptoms we had: ranking really well at one point, then vanishing from the top 300 results for no apparent reason. What he discovered was that he was being penalised for duplicate content. He rewrote all of his content, and within a few days he was back to where he had been years before.

Interesting, I thought, but all of my content is original. Why would I be penalised?

I then came across a very valuable tool: http://www.copyscape.com/.

You simply put in your web site URLs and it finds all of the other sites out there whose content duplicates your own, and my content was all over the place! The press releases and web site descriptions that I had used when adding links to BugAware.com over the years all came from source content on our web site. I can only assume that at some point Google decided we were no longer the originator of this content, and we were penalised for having duplicate content.

I rewrote every page that Copyscape flagged as duplicated elsewhere, and a few days later I noticed a leap of several hundred places in the rankings for all of our major keywords. Our rankings are improving every day, and with fingers crossed, here's hoping we'll be back where we were in the good old days of BugAware 1.07!
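As a rough illustration of what a duplicate-content check is actually doing, here's a minimal sketch in Python that compares two pages by the overlap of their word "shingles" (short runs of consecutive words). This is not how Copyscape works internally, and the URLs and the threshold below are just placeholders for the sake of the example.

# Compare two pages by word-shingle overlap (Jaccard similarity).
# Purely illustrative - not Copyscape's method; URLs and threshold are placeholders.
import re
import urllib.request


def fetch_text(url):
    """Download a page and crudely strip tags, leaving lowercased plain text."""
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    text = re.sub(r"<script.*?</script>|<style.*?</style>", " ", html,
                  flags=re.S | re.I)
    text = re.sub(r"<[^>]+>", " ", text)        # drop remaining tags
    return re.sub(r"\s+", " ", text).lower()


def shingles(text, size=5):
    """Return the set of all runs of `size` consecutive words."""
    words = re.findall(r"[a-z0-9']+", text)
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}


def similarity(url_a, url_b):
    """Jaccard similarity of the two pages' shingle sets (0 = no overlap, 1 = identical)."""
    a, b = shingles(fetch_text(url_a)), shingles(fetch_text(url_b))
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)


if __name__ == "__main__":
    # Hypothetical example: compare our home page against a copied press release.
    score = similarity("http://www.bugaware.com/",
                       "http://example.com/press-release.html")
    print("Similarity: %.2f" % score)
    if score > 0.5:  # arbitrary threshold for this sketch
        print("Pages share a lot of text - likely duplicate content.")

A high overlap score between two pages is exactly the sort of signal that got our recycled press releases and site descriptions flagged.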

Hopefully this tale will help other people out there who have been scratching their heads over why they have lost ranking so dramatically over the last few years: refresh your content, as you may no longer be considered the original source of your own original content.