How to Evaluate a Site’s Condition and Effectiveness

There’s been a lot of talk lately about how to check your site to make sure there aren’t any problems that could cause a penalty or a loss in rankings. When I check out a website, it’s often after something bad has already happened, so in those instances I’m usually looking for something very specific. That often turns out to be link-related, which requires significant work on the link profile. I’m not going to get into that in detail today, though.

For a new client, I nearly always have to do a complete site audit, and when I do, I like to look at everything. Here’s the way I go about auditing a site:

Do a site crawl

There are a couple of tools I like: Screaming Frog, which is free and does a very decent job of checking most of the architectural aspects of a site that I need to look at, and MicroSys Website Analyzer, mostly because I like the format of its output, which I use to cross-check Screaming Frog. There are other good tools, too, of course, but I have yet to find a single tool that tells me everything I want to know about a site (especially when it comes to links).
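To give a rough idea of what a crawler collects, here’s a minimal sketch in Python using only the standard library. The sample HTML and page are hypothetical, and a real tool like Screaming Frog does far more (status codes, redirect chains, canonical tags, and so on); this just shows the kind of per-page data an audit starts from.

```python
from html.parser import HTMLParser

class PageAuditParser(HTMLParser):
    """Collects the basics a site crawl reports: title, links, headings."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.links = []
        self.headings = []
        self._in_title = False
        self._current_heading = None

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)
        elif tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self._current_heading = tag

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False
        if self._current_heading == tag:
            self._current_heading = None

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        elif self._current_heading:
            self.headings.append((self._current_heading, data.strip()))

# Hypothetical page standing in for a fetched URL.
sample_html = """<html><head><title>Widget Guide</title></head>
<body><h1>Widgets</h1><a href="/about">About</a>
<a href="/contact">Contact</a></body></html>"""

parser = PageAuditParser()
parser.feed(sample_html)
print(parser.title, parser.links, parser.headings)
```

Run this over every crawled page and you have the raw material for the title, heading, and link checks described below.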

Google Webmaster Tools and Google Analytics

In order to do a decent site audit, you’ll need to have access to the Webmaster Tools and Analytics accounts. If the site doesn’t have them both already, get them set up now. If you want to be really thorough, you can do the same with Bing’s tools, but I rarely do, unless I see something strange.


Between your crawl tool’s output, GWT, and Analytics, you’ll be able to see what keywords the site is targeting and how much success it’s having in ranking for them.


URLs

Take a look at the site’s URLs. If they’re not user-friendly, fix them and put the necessary 301 redirects in place. If the site has an XML sitemap, you can sometimes find some interesting discrepancies by comparing it against your crawl.
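A quick way to surface those discrepancies is to diff the sitemap’s URL set against the crawl’s. A minimal sketch, using a made-up sitemap and crawl list:

```python
import xml.etree.ElementTree as ET

# Hypothetical sitemap content; a real audit would fetch /sitemap.xml.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
  <url><loc>https://example.com/old-page</loc></url>
</urlset>"""

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
sitemap_urls = {loc.text for loc in
                ET.fromstring(sitemap_xml).findall(".//sm:loc", ns)}

# URLs your crawler actually found (hypothetical).
crawled_urls = {"https://example.com/", "https://example.com/about",
                "https://example.com/contact"}

in_sitemap_only = sitemap_urls - crawled_urls  # possibly removed or orphaned
in_crawl_only = crawled_urls - sitemap_urls    # missing from the sitemap
print(in_sitemap_only, in_crawl_only)
```

URLs in the sitemap but not the crawl often point to removed or orphaned pages; URLs in the crawl but not the sitemap are pages the sitemap has fallen behind on.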

Title tags

Titles should be unique to each page and descriptive of the page’s content. They should include the keyword the page is meant to rank for and should run approximately 60–70 characters.
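These checks are easy to automate over a crawl’s output. A sketch, using invented URL-to-title data, that flags duplicate and over-length titles:

```python
from collections import Counter

# Hypothetical titles pulled from a crawl.
titles = {
    "/": "Acme Widgets - Handmade Widgets and Widget Repair",
    "/about": "About Acme Widgets",
    "/contact": "About Acme Widgets",  # duplicate of /about
    "/blog": "The Acme Widgets Blog: News, Tutorials, Reviews, "
             "Interviews, and Everything Else About Widgets",  # too long
}

MAX_LEN = 70  # rough upper bound from the guideline above
counts = Counter(titles.values())
duplicates = sorted(url for url, t in titles.items() if counts[t] > 1)
too_long = sorted(url for url, t in titles.items() if len(t) > MAX_LEN)
print(duplicates, too_long)
```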

Meta descriptions

The meta description should accurately describe what the page is about, and should contain the main keyword for that page. Be careful not to stuff keywords, though. Readability and relevance are the keys here, as the description will normally be what appears in the SERPs, and you want it to convince the user to click through.
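The same kind of automated pass works for descriptions. A sketch that flags missing or keyword-stuffed descriptions; the length limit, stuffing threshold, and sample data are my own assumptions, not a published spec:

```python
# Hypothetical (url, target keyword, meta description) rows from a crawl.
pages = [
    ("/", "widgets", "Handmade widgets, widget repair, and widget "
                     "advice from the Acme team."),
    ("/about", "widget makers", ""),  # missing description
    ("/blog", "widgets", "widgets widgets widgets widgets buy widgets now"),
]

MAX_LEN = 160        # assumed character budget before truncation
MAX_KEYWORD_USES = 2  # assumed cutoff for "stuffing"

problems = {}
for url, keyword, desc in pages:
    issues = []
    if not desc:
        issues.append("missing")
    else:
        if len(desc) > MAX_LEN:
            issues.append("too long")
        if desc.lower().count(keyword.lower()) > MAX_KEYWORD_USES:
            issues.append("possible keyword stuffing")
    if issues:
        problems[url] = issues
print(problems)
```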

Meta keywords tag

The meta keywords tag is no longer heeded by most search engines, so it’s unnecessary, unless you’re specifically trying to rank in one of the engines that still supports it.


Heading tags

Your headings should follow a sequential hierarchy, beginning with the H1. Your H1 will normally contain your principal keyword for the page, and your H2, H3, H4, etc. tags can introduce other keywords, provided those keywords are present in the copy the headings precede.
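Checking that hierarchy is mechanical once you have the heading levels in document order. A sketch over an invented outline, flagging any heading that skips more than one level down:

```python
def hierarchy_violations(levels):
    """Return (position, heading) pairs that skip a level, e.g. H2 -> H4."""
    violations = []
    prev = 0
    for i, tag in enumerate(levels):
        level = int(tag[1])
        if level > prev + 1:
            violations.append((i, tag))
        prev = level
    return violations

# Heading tags in document order, as a crawler might report them (hypothetical).
headings = ["h1", "h2", "h3", "h2", "h4"]  # the final h2 -> h4 skips h3
print(hierarchy_violations(headings))
```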


Content

The content on your pages should be relevant to the topic at hand, original, and of high quality. That means entertaining, informative, or interesting; well written, with no spelling, grammar, or punctuation errors; and above all, WRITTEN FOR THE USERS, NOT THE SEARCH ENGINES.


Links

Take a look at your linking, both internal and outbound. Anchor text diversity is important, so don’t be bashful about using anchors like “more” or “click here”, and don’t overdo the use of the same keywords in your anchors. Ideally, any textual link should flow naturally in the surrounding copy. It’s okay to use a nofollow attribute on internal links to things like your Privacy Policy or Terms of Service, but don’t overdo this either, as it can look like you’re trying to sculpt PageRank. And avoid repeating the same link more than once on a page.
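Anchor-text diversity is easy to eyeball from a crawl export. A sketch that tallies anchor texts and flags any single one dominating; the 30% threshold and the sample data are arbitrary assumptions for illustration:

```python
from collections import Counter

# Hypothetical (anchor text, href) pairs collected across the site.
anchors = [
    ("cheap widgets", "/shop"), ("cheap widgets", "/shop"),
    ("cheap widgets", "/shop"), ("cheap widgets", "/shop"),
    ("more", "/blog"), ("click here", "/contact"),
]

THRESHOLD = 0.30  # assumed: flag anchor texts on more than 30% of links
counts = Counter(text for text, _ in anchors)
total = len(anchors)
overused = sorted(t for t, n in counts.items() if n / total > THRESHOLD)
print(overused)
```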

Image file names and alt text

Accessibility standards dictate the use of alt text on all images, as this is the text a screen reader will announce for a visually impaired user. It will also display in place of any image that cannot be loaded for some reason. Don’t use labels like “image3214abc” as file names or alt text; “East Tower Facade” or an equally descriptive name is much better. Use keywords only when they’re relevant to the image.
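This is another check worth scripting over a crawl. A sketch that flags missing alt text and filename-style labels; the regex pattern and sample data are assumptions of mine, not a standard:

```python
import re

# Hypothetical (src, alt) pairs pulled from a crawl.
images = [
    ("east-tower-facade.jpg", "East Tower Facade"),
    ("image3214abc.jpg", ""),   # missing alt text
    ("photo.png", "DSC_0042"),  # camera-style label, not descriptive
]

# Assumed heuristic for "looks like a file name, not a description".
FILENAME_LIKE = re.compile(r"^(image|img|dsc|photo)[\s_-]*\d+", re.IGNORECASE)

flagged = []
for src, alt in images:
    if not alt.strip():
        flagged.append((src, "missing alt"))
    elif FILENAME_LIKE.match(alt):
        flagged.append((src, "non-descriptive alt"))
print(flagged)
```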

URL redirects

Redirects should be in place for any pages you no longer want to display. If a page is simply gone, the server will return a 404 (not found) error, which, at present, won’t hurt your site, but does hurt the user’s experience. In nearly all cases, a redirect should be a 301 (permanent redirect) rather than a 302 (temporary redirect). A 302 passes no link equity, while a 301 passes most of the link equity available (some minor bleed-off occurs).
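Given the status codes a crawl observed, flagging the problem cases is straightforward. A sketch over invented results; a real audit would issue the requests itself (for example with a library like `requests`, following redirects manually):

```python
# Hypothetical (url, observed status code) results from a crawl.
results = [
    ("/old-page", 301),      # fine: permanent redirect
    ("/promo-2012", 302),    # should probably be a 301
    ("/deleted-page", 404),  # consider redirecting to a relevant page
]

needs_attention = {url: status for url, status in results
                   if status in (302, 404)}
print(needs_attention)
```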

Duplicate content

Duplicate content issues can arise for a number of reasons and generally have a negative impact in some fashion. If you have duplicate or highly similar content on your site, you need to do some rewriting. If you have content that also appears on other sites, you need to remedy that in whatever fashion is appropriate – rewriting if your copy isn’t the original, or getting the other versions taken down if they’ve copied you.
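For a rough on-site check, a plain text-similarity ratio will surface near-duplicates. A sketch using Python’s difflib; the 0.9 threshold and the sample copy are my own assumptions, and dedicated tools do this far better at scale:

```python
from difflib import SequenceMatcher
from itertools import combinations

# Hypothetical page texts from a crawl.
pages = {
    "/widgets": "Our handmade widgets are built to last and ship worldwide.",
    "/widgets-2": "Our handmade widgets are built to last and ship fast worldwide.",
    "/about": "Acme was founded in 1999 by two widget enthusiasts.",
}

THRESHOLD = 0.9  # assumed near-duplicate cutoff
near_duplicates = [
    (a, b) for (a, ta), (b, tb) in combinations(pages.items(), 2)
    if SequenceMatcher(None, ta, tb).ratio() > THRESHOLD
]
print(near_duplicates)
```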

I’ll take a look at the .htaccess and robots.txt files and will also check for things like broken links, page-load speed, cross-platform compatibility, and general usability and accessibility issues. Sometimes one or more of these will highlight an issue that should be addressed at some point. If the scope of work includes them, things like conversion optimization, competitor analysis, and authority/attribution will be looked at separately, as they don’t really fit in an audit.
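The robots.txt check in particular is easy to script with the standard library’s robotparser, which can parse the file’s contents directly. A sketch with a made-up robots.txt (a real check would fetch the site’s own /robots.txt):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents.
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /tmp/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("*", "https://example.com/admin/login"))  # blocked
print(rp.can_fetch("*", "https://example.com/widgets"))      # allowed
```

Cross-checking the disallowed paths against pages you actually want indexed catches one of the most common self-inflicted crawl problems.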

By the time you’ve gone through all the above items, you’ll have a good idea of the site’s condition and effectiveness. This isn’t an all-inclusive list by any means, but it’ll keep you busy for a while and will probably give you plenty to work on.
