Where Do I Begin with an SEO Audit?

As intimidating as it might sound, a site audit can help you improve your user experience and your overall organic search performance.

Wondering how to get started? We’ve got some tips to help you move forward.

Choose a Crawler

We’re starting things off at a crawl, literally. A crawl is essentially a diagnostic test for your website, and it provides a prioritized list of page issues to address. A quality crawl tool acts as a reference guide: something to draw data from and to weigh optimization decisions against.

We like SEMrush and GTmetrix. These tools let us see a site the way the search engines do. The one caveat with most crawl programs is that they usually require you to register and pay for their services. Xenu’s Link Sleuth is one exception: it is a reliable, free crawl resource, but it only runs on Windows (Drat!). Screaming Frog also offers a free version of its tool, but it is capped at 500 URLs, so it’s best suited for smaller sites.

Check Your Indexing

Google uses “bots” to crawl websites and evaluate their content and usability. Once crawled, a site’s pages are entered into Google’s index. Like a book’s index, Google’s index is a record of the pages it has “seen” on a website.

Conducting a “site:” search will give you a very rough estimate of the number of pages Google has indexed for your site.
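
For example, typing the following into Google’s search bar (with your own domain swapped in for the placeholder yoursite.com) returns the pages Google has indexed for that domain:

site:yoursite.com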

Because both Google’s Penguin algorithm and your crawl budget can cause the number of results in Google’s index to fluctuate, a “site:” search is not the most reliable way to check your indexing. It does, however, give you an idea of whether Google is indexing duplicate content for your site. It’s also the easiest way to check for misuse of the noindex tag, which can prevent search engines from adding your content to their index and spell disaster for your SEO efforts.
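
For reference, the noindex directive usually appears as a meta tag in a page’s head section; spotting it on pages you actually want ranked is a red flag:

<meta name="robots" content="noindex">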

If you see that the number of results returned is significantly higher than your total number of site pages (based on your site crawl), then Google could be detecting duplicate content for your site.

On the other hand, if a “site:” search returns significantly fewer results than your actual page count, some of your pages could be either penalized or inaccessible. We’ll discuss duplicate content and accessibility later on.

Find Your Organic Landing Pages in Google Analytics

It’s handy to check your site’s indexing in Google, but in terms of auditing and measuring your success, it’s best to investigate actual traffic to your website’s pages. Researching site traffic in Google Analytics will give you a more accurate picture of which pages are generating organic traffic and which ones need to be improved.

To access your traffic data in Google Analytics, go to the side panel and click the Acquisition drop-down menu. Then, select All Traffic and Channels. Next, choose Organic Search. Go to the Primary Dimension filter at the top of the table and select Landing Page. The number at the bottom of the table shows how many web pages are receiving traffic from search engines. From here, you can investigate metrics like drop-off rate or bounce rate and compare date ranges to see how you performed last year vs. this year.

Now that we have usable data from our crawl report and indexing research, we can start on some repair work.

Fix Accessibility Issues

Accessibility issues are usually critical fixes. In order for your site to be crawled, indexed, and ranked in search results, it needs to be accessible to search engine bots and to users. Be on the lookout for these things:


Blocked Crawlers:

Websites have what are known as robots.txt files. These files tell search engine bots about a site owner’s crawling preferences. For example, if you have content you don’t want crawlers poking through, you can add a Disallow directive to your robots.txt file. Disallow directives also let developers block crawlers from accessing unfinished sites. The problem is, sometimes this directive isn’t removed when a site goes live. You can check your site’s robots.txt file by typing www.yoururl.com/robots.txt into your browser’s address bar. (Keep in mind that robots.txt controls crawling; keeping a page out of Google’s index entirely is the job of the noindex tag mentioned earlier.)
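
As a rough sketch, a robots.txt file that keeps all bots out of a staging area might look like this (the /staging/ path is just a placeholder):

User-agent: *
Disallow: /staging/

A leftover “Disallow: /” under “User-agent: *”, on the other hand, blocks crawlers from the entire site, which is exactly the launch-day mistake to watch for.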


Duplicate Content:

The content “Clone Wars,” as I like to call them, happen when crawlers recognize the same content in multiple areas of your site. This confuses search engines because they cannot figure out which piece of content is the original and the one worthy of ranking. Your crawl tools can help identify areas of duplicate content on your site. Depending on the circumstances, this content can be removed or corrected with a 301 redirect, which permanently sends the duplicate URL to the authoritative page, or a rel=canonical tag, which points the search engines to the authoritative version of the content on your site.
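
To illustrate, a rel=canonical tag sits in the head section of the duplicate page and points search engines to the preferred URL (the URLs below are placeholders):

<link rel="canonical" href="https://yoursite.com/original-page/" />

And if your site runs on Apache, a simple 301 redirect can be added to your .htaccess file along these lines:

Redirect 301 /duplicate-page/ https://yoursite.com/original-page/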


XML Sitemap and Friendly URL Structure:

A sitemap is a roadmap for search engines, and it must adhere to a specific protocol (the XML sitemap format defined at sitemaps.org). If you have a developer, follow up with them to make sure your sitemap is accessible and formatted correctly for search engines. Also, for user friendliness, you should ensure that your site has a clean URL structure. Your URLs can be optimized by cutting out unnecessary clutter and parameters. For instance, a blog URL should read yoursite.com/blog, and URLs for the entries should read yoursite.com/blog/entry/your-post. This creates URLs that are more search engine friendly and easier for your visitors to remember.
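
For reference, a minimal XML sitemap following the sitemaps.org protocol looks something like this (the URL and date are placeholders, and the file commonly lives at yoursite.com/sitemap.xml):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yoursite.com/blog/entry/your-post</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>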

We’re Just Getting Started

An SEO audit of your website can get very detailed, and we’ve only seen the tip of the iceberg. Check out Amanda’s post, Five Ways Your Site Is Hurting Your SEO, to find more areas to investigate. Moz also offers a handy checklist to help you stay organized and continue your site auditing pursuits. We hope this post is a helpful jumping-off point for your site audit!

Don’t spend all your time trying to do digital marketing by yourself. With a digital marketing strategy and Papercut’s expert assistance, you’ll increase your website traffic and conversions and grow your bottom line.

Schedule a Call Today
