Wednesday, September 29, 2010

Essential List of Free Content Related SEO Tools

Content is King in SEO – we all know it's true, but many people still fall into the trap of focusing on everything else except their content! It's easy to get distracted by your offsite SEO strategies and forget the fundamental rule of SEO: create valuable, relevant, unique content. So to help you research, create and analyse your content, I've created this list of essential free content SEO tools.

Keyword Research: Keyword research should be the starting point of any solid SEO strategy. These tools and services will help you find new keywords and check whether your existing keyword strategy is viable.

Thesaurus & Dictionary: Often overlooked, but a good place to start is basic dictionary and thesaurus websites. They help you think of other ways searchers could be looking for your products/services.


Google External Keyword Tool: This free keyword tool offered by Google is the first stop for keyword research for most SEO professionals. It also provides valuable insights into traffic levels.

Wordtracker: Wordtracker (and the many other keyword research tools available) utilize data that includes sources outside of Google.

Website Keyword Suggestion Tool: This keyword suggestion tool searches the existing content on your website to determine the keywords and search terms to target.

Duplicate content checker: Duplicate content is a big no-no in the SEO world. Getting caught with duplicate content can result in penalization by the search engines – which often includes delisting from their results. These tools will help you check whether the content on your product pages or articles is too closely matched to other content on the web.
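To illustrate the idea, here's a rough sketch of how a duplicate content checker might compare two pieces of text: split each into overlapping word "shingles" and measure the overlap (Jaccard similarity) of the two sets. This is illustrative only – real tools crawl the web looking for matches, and the shingle size of 3 is an arbitrary choice.

```python
def shingles(text, size=3):
    """Break text into overlapping word n-grams ("shingles")."""
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def similarity(text_a, text_b, size=3):
    """Jaccard similarity of the two texts' shingle sets (0.0 to 1.0)."""
    a, b = shingles(text_a, size), shingles(text_b, size)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

original = "create valuable relevant unique content for your website visitors"
copied = "create valuable relevant unique content for your website visitors"
rewritten = "write fresh and original articles that your readers will enjoy"

print(similarity(original, copied))     # 1.0 -- exact duplicate
print(similarity(original, rewritten))  # 0.0 -- no shared shingles
```

A score near 1.0 means the two texts are close to identical; how high is "too closely matched" is a judgment call.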

Page Comparison Tools: Sometimes, the best way to optimize your content for better rankings is to see what your competitors are doing better. These two tools will check two web pages (yours and a competitor's) and provide some vital stats about each.

Keyword Density: When determining relevance, search engines like Google use keyword density as one of their key measures. These tools will help you check whether the copy on your pages provides adequate density for your primary keywords.
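The calculation behind these tools is simple – the keyword's share of the total word count, as a percentage. A minimal sketch (single-word keywords only; real tools also handle phrases):

```python
import re

def keyword_density(text, keyword):
    """Return the keyword's share of total words, as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

copy = ("Our SEO tools help you check keyword density. "
        "Good SEO copy uses each keyword naturally.")
print(round(keyword_density(copy, "seo"), 1))  # 13.3
```

Here "seo" appears 2 times in 15 words, giving roughly 13%.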

Text to Code Ratio Tools: In the world of the web, being lean in terms of code is now more important than ever. Ideally you want web pages to be more content than code to keep the search engines happy – it makes their indexing of your content faster and more thorough. Use these tools to check the ratio of your website's code to content (indexable text).
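As a rough sketch of what these tools measure, you can strip the tags from a page with Python's standard-library HTML parser and compare the length of the remaining text to the length of the full source:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects only the indexable text between tags."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def text_to_code_ratio(html):
    """Percentage of the page source that is visible text rather than markup."""
    parser = TextExtractor()
    parser.feed(html)
    text_len = len("".join(parser.chunks).strip())
    return 100.0 * text_len / len(html)

page = "<html><body><h1>SEO tips</h1><p>Content is King.</p></body></html>"
print(round(text_to_code_ratio(page), 1))  # 36.4
```

The higher the percentage, the leaner the page; online tools compute the same figure for a live URL.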

Search Engine Spider Simulator: While good-looking pages (with images and fancy media) can be good from a user perspective, you have to remember that search engines primarily index text – and that's how they assess your content's relevance. These tools will help you see your webpages the same way search engine spiders do.
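A spider simulator essentially throws away everything but the text a crawler can read. A minimal sketch, assuming a hypothetical page about a toy store – note that images contribute nothing except their alt text:

```python
from html.parser import HTMLParser

class SpiderView(HTMLParser):
    """Rough simulation of what a crawler keeps: text, plus alt text for images."""
    def __init__(self):
        super().__init__()
        self.seen = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt")
            if alt:
                self.seen.append(alt)

    def handle_data(self, data):
        if data.strip():
            self.seen.append(data.strip())

page = ('<h1>Toy Store</h1><img src="banner.png" alt="wooden toys">'
        '<a href="/trains.html">model trains</a>')
spider = SpiderView()
spider.feed(page)
print(spider.seen)  # ['Toy Store', 'wooden toys', 'model trains']
```

If your key phrases don't show up in this stripped-down view, the search engines aren't seeing them either.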

With all these free tools in your content optimization tool chest, it’s a good chance to go and review your existing content and see if it needs a spruce up, or create some new content on your website with the new content opportunities you’ve found.

If you have any other content SEO tools that you use, share them with our readers via the comments below.


Wednesday, September 22, 2010

Creating Effective and Optimized Website Navigation

Following on from my posts about website architecture and PageRank leakage, today I’ve decided to cover another important element of your site’s structure – Navigation.

There are two main considerations when building your site's navigation:

  • First and foremost should be users. Your site’s navigation needs to help users find different content on your website quickly and easily.
  • The second consideration should be search engines. Navigation should also be useful for search engines wanting to crawl your website and index your content.

It is possible to create navigation that will satisfy both users and search engines, so let’s take a look at some advice which can help to achieve this.

Simple text-based navigation is the most search engine friendly. Apart from being easily followed by spiders, text-based navigation gives you an opportunity to include keywords in the page content.
The keywords included in your navigation are valued by all search engines, and the anchor text used for each link can help other pages to rank well.
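A minimal example of what this looks like in practice – a plain HTML list of links with keyword-rich anchor text (the page names here are hypothetical):

```html
<ul id="nav">
  <li><a href="/wooden-toys.html">Wooden Toys</a></li>
  <li><a href="/model-trains.html">Model Trains</a></li>
  <li><a href="/contact.html">Contact Us</a></li>
</ul>
```

Spiders can follow every link, and the anchor text tells them what each destination page is about.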

Search engines will still be able to crawl image based links, but you will lose the keyword based targeting you can achieve with text.

Generally speaking, it is best to keep your navigation along the top or left hand side of your website. There are a couple of main reasons for this:

  • It keeps your navigation links towards the top of your HTML which is good for search engines.
  • It keeps navigation links in the eye path, which makes them easy for users to locate.

It’s also important to keep a standard location for your navigation across all pages on your website. If navigation switches location on various pages it may become confusing for users to locate.

Avoid Using Javascript: Search engines are getting better at reading JavaScript, but it can still cause crawling issues. The best option is to avoid JavaScript based linking as it may result in search engines missing some pages and not performing a complete crawl of your site.


The Definitive Guide to Robots.txt for SEO

A robots.txt file provides restrictions to search engine robots (known as “bots”) that crawl the web. Robots are used to find content to index in the search engine’s database.

Surprisingly, many website owners forget to maintain, let alone create a robots.txt file for their websites, so here’s a guide on what a robots.txt file is and how best to use it for SEO purposes.

These bots are automated, and before they access any sections of a site, they check to see if a robots.txt file exists that prevents them from indexing certain pages.

The robots.txt file is a simple text file (no HTML) that must be placed in the root directory of your website. There are 3 primary reasons for using a robots.txt file on your website:

  • Information you don’t want made public through search
  • Duplicate Content
  • Manage bandwidth usage

The robots.txt file is just a simple text file. To create your own robots.txt file, open a new document in a simple text editor (e.g. notepad).

The content of a robots.txt file consists of “records” which tell the specific search engine robots what to index and what not to access.

Each of these records consist of two fields – the user agent line (which specifies the robot to control) and one or more Disallow lines.

For SEO purposes, you’ll generally want all search engines indexing the same content, so using “User-agent: *” is the best strategy.
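For example, a robots.txt file that applies to all robots and blocks two directories (the directory names here are just placeholders) would look like this:

```
User-agent: *
Disallow: /admin/
Disallow: /duplicate-archive/
```

Each record is the user agent line followed by one or more Disallow lines; an empty `Disallow:` line (or no Disallow at all) means the robot may crawl everything.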

If you want to check that your robots.txt file is implemented correctly, visit your Google Webmaster Central account. It allows you to test your robots.txt – Google will automatically retrieve the current robots.txt from your website in real time.


Wednesday, September 15, 2010

Is your website leaking PageRank?

To understand PageRank leakage, you must first understand how PageRank or ‘link juice’ is distributed between the various links on a page. Taking a very simplistic view, let’s say that a page has 20 ‘PageRank points’ and passes 100% of its link power evenly across 5 other pages – 4 points per link.

Let’s say that your homepage now has 5 additional external links to other web pages. Using the same calculation, each link now only carries 2 ‘PageRank points’, with half of your link juice being distributed to external web pages.
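The arithmetic behind this simplistic model is easy to sketch – a page's points are split evenly across all of its outbound links, so every external link you add shrinks the share each internal page receives:

```python
def points_per_link(page_points, internal_links, external_links):
    """Split a page's PageRank points evenly across all outbound links."""
    total_links = internal_links + external_links
    return page_points / total_links

print(points_per_link(20, 5, 0))  # 4.0 points to each internal page
print(points_per_link(20, 5, 5))  # 2.0 -- half the juice now leaks out
```

Real PageRank is far more involved than this even split, but the dilution effect is the same.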

PageRank leakage is when outbound links to external websites give away PageRank that could be better distributed among pages within the website.

The most effective way to prevent PageRank leakage is to eliminate redundant external links. By doing this you will focus PageRank distribution amongst other pages on your website and improve their overall ranking in search results.

Here are some common sources of external PageRank leakage:

  • Footer links – this may contain links to the company who designed your website or the website template you are using
  • Blog roll links – your blog may contain many external blog roll links
  • Advertising links – links to other advertisers can also be a source of PageRank leakage.

Some webmasters have suggested adding a rel=”nofollow” attribute to each external link to prevent it from passing any PageRank, effectively funneling all link juice internally. Unfortunately, this will not have any real impact, as the PageRank for each nofollow link simply evaporates, reducing the amount of link juice that can be passed to other pages.


Five tips for greater press release exposure

Press releases are a simple way of getting the latest news from your company out to the masses. Traditionally, you wrote a press release and sent it to media contacts, who would consider putting your news in their newspaper or magazine. However, the internet has now become the fastest and easiest way to get your news in front of more people’s eyes.

Just like your website, you need to optimize your press release to give it the best exposure in the search engines.

There are five ways to optimize your press release to gain the best exposure, so I wanted to share these with you:

1. Meet audience demand: Prior to drafting a release, you need to understand what your audience is demanding. Meeting audience demand is integral to accomplishing your press release visibility objectives.

2. Stay focused: By keeping your keywords and topics focused, your release can rank better in search engines and resonate more with media. As you are writing releases, remember you are writing about one topic per release. By segmenting the message or trying to say too much at once, you dilute your key points and risk that prospects and media will walk away without taking next steps or remembering the point. Keep it simple, focused and impactful.

3. Use images for search: Images can increase the click through rate on releases in both regular and news search by 15 – 25%. It’s a simple step, but can’t be stressed enough. Additionally, using images creates more traction in media – journalists and bloggers both love images as it helps them tell their story.

4. Use videos to engage visitors: By using video in news releases, we have seen up to a 500% increase in time on pages. As the web shifts to a rich media experience, bloggers, media and end users are becoming more accustomed to video. In the future, it may be common that video is included with releases. But since today it is not as frequently used, it’s a chance to make your news stand out.

5. Optimize your release:

  • Anchor text links
  • Alt tags
  • URL keywords
  • Description tag
  • Title of release

And there you have it, five very useful tips to ensure your next press release gets the attention it deserves.


Thursday, September 9, 2010

Five SEO Tips To Improve Your Search Engine Ranking

Here are my top five tips for making your website easy for those "gazillions" of searchers to find.

1. A picture might be worth a thousand words, but search engines don't read pictures. Make sure your key search terms are written out in text, not part of a graphic title you hire somebody to prepare for you. That also means you should not just show pictures of toys, but also write out the names, and possibly a keyword description with the title.

2. Have several pages of articles related to your website's topic. Use a different keyword search term for each article.

3. What's the URL of your website? Your name won't help you there. Your key search term will. Hire somebody who knows what he is doing to develop the right keyword strategy for you BEFORE you choose your domain name.

4. What's the title of your page? I don't know how many times I see titles such as "Article" or "Contact us". Don't expect the search engine robots to get all excited about that term. And don't expect anybody to search for that term, either. By the way, this is the single most important place to include your keyword phrases.
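For example (the store name and keywords here are hypothetical), the difference is simply what goes in the title tag in your page's head:

```html
<head>
  <!-- Weak: tells searchers and robots nothing -->
  <!-- <title>Article</title> -->

  <!-- Better: leads with the keyword phrase people actually search for -->
  <title>Wooden Toys and Model Trains | Example Toy Store</title>
</head>
```

Each page should get its own descriptive title, with the most important phrase at the front.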

5. What about that navigation menu that appears on every single page of your website? Make sure it uses text links with your keywords in the anchor text – search engines value those navigation keywords on every page they appear.

In fact, there are dozens, if not hundreds of things you can do to win the search engine race. These top five search engine optimization tips are a great start, whatever your website is about.

Wednesday, September 8, 2010

Improve Your Rankings with a Flat Website Architecture

Optimizing your website architecture is an important step in laying the foundations for any successful SEO campaign. Site architecture can impact the ranking of your website in search results and also the number of pages that are actually indexed.

Site architecture refers to the structure of your website and the way each page is connected to another. Creating a ‘flat’ site involves minimising the number of clicks it takes to reach each level of depth across your website.

The major problem with a deep site architecture is that with every level of depth you are losing some amount of link juice. If the majority of your content is two or more clicks away from the homepage, search engines are going to see those pages as less important than in a flat structure, where the majority of content is just one click from the homepage.

Using HTML Sitemaps to Improve Crawling - If changing your site’s structure is very difficult, or you’ve got a large website, HTML sitemaps can also be used to help get more pages included in the index.

If you have linked to the sitemap directly from your homepage, Google will see this page as important and begin crawling each of the links on it. Creating an HTML sitemap that links directly to your key pages can help build out the flat site architecture I’ve mentioned above.
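A minimal sketch of such an HTML sitemap (section and page names are hypothetical) – a plain page of links, grouped by section, that puts every important page one click from wherever the sitemap is linked:

```html
<h1>Sitemap</h1>
<h2>Products</h2>
<ul>
  <li><a href="/wooden-toys.html">Wooden Toys</a></li>
  <li><a href="/model-trains.html">Model Trains</a></li>
</ul>
<h2>Articles</h2>
<ul>
  <li><a href="/blog/seo-tips.html">SEO Tips</a></li>
  <li><a href="/blog/choosing-toys.html">Choosing Toys</a></li>
</ul>
```

Because every link is plain text, spiders can follow all of them in a single crawl of the sitemap page.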

Cross Linking Between Pages: One final tip to maximize the flow of link juice within your site is to use cross linking. This involves linking to related pages from within each level of depth on your website.

For example, if you’re writing a blog, one way to do this would be linking to other articles you have previously written on the same topic. That way, the link juice of the current page is also shared amongst other content pages on your site, which can help to improve the ranking of those pages as well.