Another blow to my sites and income

In February, websites reported on how a few simple changes suggested by Aaron increased his Google traffic tenfold. As much as I enjoyed both of them and their posts on escaping the supplemental index, I never thought about implementing any of the advice myself. That was until today! As you can see from the graph below, Google searches represent only 16.35% of my overall traffic; on most of my other blogs this figure is well over 60%.

Preparing for action

To take a quote from Get Your Blog Out Of Google's Supplemental Result Hell:

Between slightly lower internal PageRank scores (minor issue), increasingly aggressive duplicate content filters (major issue) and significant duplication from page to page on your site (major issue), much of your site is in Google's supplemental index.

From this, it seems as though duplicate content is the main contributing factor in going supplemental. So how does this affect this blog? Have a look at the screenshot below, taken from Google's cache.

In single posts of between 200 and 300 words, the content is outweighed by all of the crap that appears in the sidebars. It completely dilutes the content site-wide. To quote the SEO post again:

Reduce Sitewide Repetitive Features:

You need to make your page titles and meta descriptions unique on each page. You may also want to re-sort your code order to put unique content higher in the page content and have duplicated and sitewide template-related issues occur later on.

Taking Action

In no particular order, here is a list of the amendments I made to the template to resolve the issue.

Removed the about me section.
Increased the number of Related Posts displayed from 3 to 10. You can set this value under Options > Related Posts Options in WordPress.
Wrapped each of the duplicate content blocks in a condition which stops the components from appearing on single posts. It's as simple as checking whether is_single() is false. You can exclude them from categories and archives by evaluating is_archive() (see the sketch after this list).
In the archive template each post was originally printed in full with the_content(); now it uses the_excerpt(). Ideally I should be printing a unique description tag here.
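
For anyone wanting to do the same, here is a minimal sketch of both changes. The div id and surrounding markup are placeholders from my own templates rather than anything WordPress requires; is_single(), is_archive(), the_excerpt() and the rest are standard WordPress template tags.

<?php // sidebar.php (sketch): only print the repetitive blocks when we are not on a single post or an archive page ?>
<?php if ( ! is_single() && ! is_archive() ) : ?>
	<div id="about-me">
		<!-- the duplicated "about me" blurb, badges, blogroll and so on -->
	</div>
<?php endif; ?>

<?php // archive.php (sketch), inside The Loop: print a short summary instead of duplicating the full post ?>
<?php while ( have_posts() ) : the_post(); ?>
	<h2><a href="<?php the_permalink(); ?>"><?php the_title(); ?></a></h2>
	<?php the_excerpt(); // previously the_content() ?>
<?php endwhile; ?>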

What else needs to be done

Create unique description tags for each post by adding a new custom description field (a sketch of this follows the list).
Convert the text logo into a graphic, adding relevant keywords to the alt tag (most people searching for the words in this blog's name won't be interested in my blog).
And most importantly... continue to increase the quality of my posts. Quantity isn't such an issue if my content attracts citations from authority sites.
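
As a rough sketch of the description idea, something like the following could sit in header.php. The custom field key "description" is my own choice, not a WordPress default; get_post_meta() is the standard way to read a custom field.

<?php
// header.php (sketch): output a unique meta description on single posts,
// pulled from a per-post custom field named "description" (an assumption).
if ( is_single() ) {
	global $post;
	$description = get_post_meta( $post->ID, 'description', true );
	if ( ! empty( $description ) ) {
		echo '<meta name="description" content="' . htmlspecialchars( $description, ENT_QUOTES ) . '" />' . "\n";
	}
}
?>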

Robots.txt and Duplication

Steven Bradley has outlined another technique to escape duplication problems in his post Problems With WordPress Posts Going Supplemental. It involves placing the following code into your robots.txt:

User-agent: Googlebot
Disallow: /*/feed/$
Disallow: /*/feed/rss/$
Disallow: /*/trackback/$

Let me explain: by default, Google indexes both your feeds and your HTML pages. In many cases these will contain the same information, causing the duplicate content alarm to sound at the Googleplex. The code within robots.txt forbids Googlebot from indexing your feed URLs, reducing the risk of duplication occurring.
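
To give a hypothetical example, with those rules in place a feed URL such as example.com/some-post/feed/ would be blocked from crawling, while the post itself at example.com/some-post/ remains fully indexable.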

Results, results, results

Hopefully this will go some way to increasing my traffic from Google. I'll write up a proper update on the results in a month's time. In the meantime I'll be investigating other optimization techniques. It would be good to hear about your own experiences with SEO and the Supplemental Index!