Your Guide to a Perfect SEO Audit

First of all, thank you for believing in us and downloading this guide. We assure you that by the time you are through with it, you will have a better understanding of SEO and of how to optimize your website for it.

So if you are new to the world of SEO, feel free to use this detailed guide brought to you by SEOPro.

Gear Up for SEO

Before you begin auditing your site for SEO, you must make sure you have these tools.

  • Google Search Console Tool
  • Google Analytics Tool

Google’s Search Console Tool

This is Google’s way of communicating to you what’s wrong with your site. All you have to do is sign up on Search Console (formerly Webmaster Tools), if you haven’t already. To give a brief idea, Google Search Console can help you find:

  • Which pages are being crawled by Googlebot
  • How frequently they are being crawled
  • How many clicks you are getting
  • Where you are getting those clicks from
  • Which links point from your site to other sites
  • Which sites are linking to your pages

There’s a lot more to Search Console. We suggest you become a part of it today, if you haven’t signed up yet. With these insights, you will be able to improve your site in ways you never imagined. Read on to learn more.

Analytics Tool

Google Analytics is another free tool that allows you to see where traffic is coming from. Demographics are an essential element in research, product development and marketing, and Analytics takes that data to the next level: it can reveal key statistics like age, gender, region, time, density, browser, device, OS and several others.

Having a firm grip on such valuable data can help you target your audience far more precisely.

Now that you have some knowledge of these two essential tools, let’s get started with outlining the audit.

SEO Audit Checklist

An SEO audit can be overwhelming, so we will break it down section by section as we move forward. Here’s a checklist that covers the major sections.

  • Make sure the site is accessible for both users and the search engines
  • Make sure the site is being properly indexed by Google
  • Make sure the on-page optimization is complete
  • Make sure off-page optimization is complete
  • Make sure relevant competitors have been analyzed

1. User-Engine Accessibility

A website in this era must serve a dual purpose: it must be accessible to both human visitors and the engine’s crawler. The drawbacks of a site users can’t access are obvious. But if a search engine can’t properly crawl the pages of your domain, your site might not appear in the search results the way you want. It might not appear at all.

To make sure the desired pages of your site are crawled by Google, Bing and other engines, you need to check the following:

  • Robots.txt
  • HTTP status code
  • XML sitemap
  • Site Architecture
  • Flash/ JavaScript
  • Performance

Robots.txt

This is the file that tells crawlers what to crawl and what not to. You can access it by typing “/robots.txt” after the URL. For instance:

www.abc.com/robots.txt

Check the file for discrepancies and upload a corrected version if changes are needed. You can also access it via Search Console to check for inconsistencies.
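
To give a rough idea, here is a minimal robots.txt; the disallowed paths and the sitemap location below are placeholders, and your own rules will differ:

    User-agent: *
    Disallow: /admin/
    Disallow: /tmp/
    Sitemap: http://www.abc.com/sitemap.xml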

HTTP Status and Redirections

If a request to your site returns a 4xx or 5xx status code, users will not be able to see the content. A 4xx code usually means the requested page doesn’t exist; a 5xx code points to a problem on the server. Fix the underlying issue so that the next time a visitor or crawler goes there, there’s something worth the time.

If you would like, you can also redirect the visitor and the crawler using 301 redirects, which are permanent redirects. Make sure the page you are redirecting to is relevant.
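
If you want to see what a given URL actually returns, one quick check (a sketch, assuming you have curl installed) is:

    curl -o /dev/null -s -w "%{http_code}\n" http://www.abc.com/old-page

And a permanent redirect, assuming an Apache server with .htaccess support, can be as simple as:

    Redirect 301 /old-page http://www.abc.com/new-page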

Sitemap for the Algorithm

An XML sitemap assists crawlers on their tour of your site. It lists the pages in order so the crawler knows which ones matter most. The sitemap is accessible via Search Console as well. Make sure it has all the pages you need and that it’s free from errors. Also make sure that it’s up to date and conforms to your new structure (if you have made any changes).
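
For reference, a minimal sitemap looks something like this; the URL, date and priority values are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.abc.com/</loc>
        <lastmod>2015-06-01</lastmod>
        <priority>1.0</priority>
      </url>
    </urlset>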

JavaScript and Flash

Search engines have gotten smarter with time, but to be on the safe side, ensure that if JavaScript and Flash have been used, both the crawler and the user are able to access the content.

Site Loading Time and Performance

Sites that are fast and easy to use get more views – it’s simple. Crawlers run on a tight schedule, and so do users. To encourage more visits from both Google and your visitors, optimize your site’s speed and performance.
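
For a rough first measurement of load time, one quick check (again a sketch, assuming curl is available) is:

    curl -o /dev/null -s -w "Total time: %{time_total}s\n" http://www.abc.com/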

2. Indexing Issues and How to Deal with Them

With the accessibility issues out of the way, your site is one step closer to perfection. Now let’s see how many of the site’s pages are actually being indexed by Google or the engine of your preference. To optimize your site’s indexability, here is a brief checklist:

  • The ‘site:’ command yields the ideal result (the same number of indexed and actual pages)
  • Manual verification that the desired pages are being indexed
  • Making sure Google recognizes your brand

‘Site:’ Command

site:www.abc.com

Enter the above command in Google’s search bar and the engine will provide you with all the pages it has indexed from your domain.

Note that this is a rough count, and:

  • If the number of indexed pages and actual pages is the same, you have nothing to worry about.
  • If the actual pages are more and the indexed pages fewer, Google isn’t crawling every page of your site. There could be various reasons for this, most of which have been dealt with in the accessibility section. However, you should also check Search Console for any error or penalty notifications.
  • If the actual pages are fewer and the indexed pages greater, there could be a duplication issue with the site.

If there is duplication, Google typically appends a notice to the results saying that some entries very similar to those already displayed have been omitted.

Manual Verification

Search a specific URL on Google or Bing to see if their crawlers recognize and list it in their search results. If it’s not there, the page is either blocked in robots.txt, marked ‘noindex’ in a robots meta tag, or the site is penalized.
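
For reference, the ‘noindex’ directive lives in the page’s head and looks like this:

    <meta name="robots" content="noindex">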

Google’s Recognition of Your Brand

One of the biggest problems for starters or sites with little popularity is that Google and Bing are unable to recognize them as a brand. Imagine if Google returned you a link to an e-mart every time you searched the term ‘Apple’.

What about your brand? Is it the first thing Google shows when you type the brand’s name? If not, Google may not recognize it as a brand; and if it is nowhere in Google’s results at all, it may be penalized.

Now seems as good a time as any to bring penalties to light.

What are Search Engine Penalties?

A penalty is when Google stops indexing the page(s) of your site. A penalty effectively means anyone searching the web will never see your site in the SERPs (search engine result pages).

How to Deal With a Penalty?

To successfully deal with a search engine penalty, otherwise known as a ‘manual action’ (because Search Console will notify you), you need to follow these three steps:

  • Ensure it is a penalty – remember when we said it could be robots.txt? Hundreds of people have claimed they were penalized when, in fact, their pages were blocked in robots.txt or tagged ‘noindex’. An algorithm update is not a penalty, but if it has shuffled your rankings, treat it with the same urgency.
  • Find out why you got ‘hit’ – if you receive a message or notification in Search Console, you can be dead sure it’s a manual action. If not, it is most likely an algorithm update. Luckily, or unluckily, you are not alone: you will find hundreds of people on the internet complaining about the same problem post-update. So head to those forums and play detective to find the cause.
  • ‘Fix’ the issue and submit your reconsideration request – do what the engines want you to do. Once that is done, use Search Console to file a request to reconsider your site. Remember that reconsideration is only for manual actions; ranking issues arising from an algorithm update will only be resolved with the next update.

3. On-Page Ranking Factors

Okay, now that your site is free from indexing and accessibility issues, let’s move on to on-page SEO factors that have a direct effect on your site’s traffic and rankings.

The term ‘on-page’ is pretty self-explanatory. These are the little things on your site that add up to offer immense value to the visitor. There are two types of on-page factors you need to put in your checklist:

  • Page level – factors that enhance the quality of the page(s) concerned
  • Domain level – factors that amplify the presence and appearance of your domain as a whole

On-page Factors Checklist

Engines as well as users tend to prefer sites that offer value. In terms of on-page, the real value comes from:

  • URL
  • Content
  • HTML

Optimize URLs

Your URL is going to be the first thing a user sees, so it’s only reasonable to begin on-page optimization there. At the page level, make sure:

  • The site’s URL is precise and has fewer than 110 characters.
  • It is self-explanatory and has the right keywords.
  • The URL contains subfolders instead of subdomains, as subfolders are more SEO-friendly.
  • It prefers ‘-’ (hyphen) over ‘_’ (underscore).

And at the domain level, check whether:

  • The URL is in line with best practices
  • The page links are optimized (see page-level factors)
  • The URLs follow a pattern
  • The URL is under-optimized, over-optimized or perfectly optimized

Optimize URL-based Content

If two URLs on your site lead the user to the same page, Google’s algorithm is fooled into believing they are two different pages carrying identical content. Similarly, there are other alarms that indicate duplicate content on your site, which is what Googlebot despises.

Why Content is King

Frankly, nobody likes to waste time reading lifeless, valueless content. A piece of work that doesn’t connect does not sell. Users don’t like it, at all!

It’s not just users. Google and Bing are always on the lookout for poor or duplicated content so they can purge their indexes of it. To really make your content feel like a king:

  • Offer substantial content. At least 300 words should be your aim.
  • Don’t waste your words. Deliver quality.
  • Never stuff keywords. Always work the keywords into your content naturally.
  • Don’t make spelling and grammatical errors. They are out of the question.
  • Make it immersive, interesting and engaging.
  • Make it search engine friendly. Don’t complicate it by stuffing in too much JavaScript and Flash.

How to Optimize the Content?

Here’s your checklist to ensure your content delivers high value.

  • Plan before you act.
  • Remove keyword cannibalism
  • Remove duplication

Planning the Content

The content and keywords should follow a set, predefined path. URLs, the on-page content and the keywords must all follow a plan.

Absolutely No Cannibalism

One keyword should never lead users to multiple different pages. This is called cannibalism, and it offers the worst experience to both the user and the engine.

Make a rough list of your keywords. See if two similar keywords are directing the users and the crawler to different pages. If this is the case, either repurpose the other page or merge the two into one page with unified content.

No Duplication

Google’s crawler cross-checks your content, within the site and outside of it. The bot matches phrases against its existing database. A page offering nothing new or valuable to Google (or the user) falls in the danger zone.

To make sure Google’s Panda doesn’t penalize your site, here’s a simple tip – keep the content simple, original and valuable.

HTML Optimization

HTML is the skeleton of your site. Without it, it just wouldn’t be the same. The first thing to check is whether all the markup is valid. Be on the lookout for little things: links that appear broken, HTML tags that aren’t working.

Following is your mini checklist for better HTML.

  • Perfect the titles
  • Align meta tags with best practices
  • Make the most of other ‘head’ tags
  • Optimize the images
  • Optimize the out-links
  • Amplify the effect of other ‘body’ tags

Improving the Title

The title of any page gives the user an idea of the content; it is what most people read to decide whether the content that follows is worth their time. So for each page in your domain:

  • Make sure your title describes the content of the page.
  • Make sure the title is fewer than 70 characters so it can be easily tweeted and read by engines.
  • Including keywords is recommended, stuffing isn’t.
  • Again, don’t go overboard with optimization.
  • Make each title unique within your domain.

Aligning the ‘Meta tags’

Meta title, meta description and meta keywords collectively form what we call ‘meta tags’.

These tags aren’t the sole ranking factor, but they sure are important. While the title gives the user a hint of what’s to come, the description lays out a brief overview of what the user can expect in the main body. So, as with titles (a combined example follows this checklist):

  • Make each description unique (Search Console will notify you if it isn’t)
  • Make it short and sweet (150-160 characters is good enough)
  • Make it relevant
  • Use keyword only if it fits in naturally
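
Putting the title and description advice together, the head of a page might contain something like this; the wording is placeholder copy:

    <title>SEO Audit Checklist: Your Guide to a Healthier Site</title>
    <meta name="description" content="A step-by-step SEO audit covering accessibility, indexing, on-page and off-page factors.">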

Optimizing the Rest of the Head Section

With the meta title and description out of the way, your job in the head section is half done. Now, on to the other essentials of the head.

  • Meta keywords were heavily used for spamming in the past, so engines don’t pay much attention to them now. If you still wish to use them, be cautious.
  • Make the most of rel="canonical". This tag is very useful in freeing the site from duplication (see the sketch after this list).
  • The rel="next" and rel="prev" tags are your friends in pagination issues.
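
For instance, a page that lives under several URLs can point engines to the preferred version, and a paginated page can declare its neighbors; the URLs here are placeholders:

    <link rel="canonical" href="http://www.abc.com/guide">
    <link rel="prev" href="http://www.abc.com/guide?page=1">
    <link rel="next" href="http://www.abc.com/guide?page=3">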

Images

One day, Google’s crawler might get smart enough to read an image the way it scans your text. Until then (a small sketch follows this list):

  • Provide meta data for the image
  • Use appropriate alt text
  • Upload the file with a name Google can understand and relate to
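
In practice, that means something like this; the filename and alt text are placeholders:

    <img src="/images/seo-audit-checklist.png" alt="SEO audit checklist infographic">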

Outlinks

One key factor in optimizing your site for the search engine is to provide link(s) only to quality sites. Here are things to remember when outlinking.

  • Link to trustworthy high-quality domains
  • Link only to relevant sites so users and engines can navigate naturally
  • Optimize the anchor text, the hyperlinked part of text that leads to other sites, by keeping it relevant.
  • Google’s Search Console can notify you of broken internal links; make sure there are none. Manually check your outlinks as well, to make sure the sites you are referring to aren’t broken either. Be on the lookout for 4xx and 5xx codes.
  • Don’t make your site a maze for search engine algorithms or your users. Optimize your redirection.
  • “Nofollow” your external links and save the link juice for your internal pages (see the example after this list).
  • Prioritize your internal links and pass the most juice to your most important pages.
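
As a quick illustration, a nofollowed external link looks like this; the URL is a placeholder:

    <a href="http://www.xyz.com/resource" rel="nofollow">External resource</a>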

Optimizing the Rest of the Page’s Body

The body of your content is what will convert a visitor into a lead, so it’s always crucial to stick to best practices in body optimization. Here are some of them, with a small sketch after the list.

  • Use an H1 tag. Don’t go for overkill.
  • Include a keyword in the H1 if possible.
  • Embedded content isn’t considered a part of your site – make sure not to embed too much external content.
  • Optimize content/ad ratio
  • Optimize text/HTML ratio
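
A minimal sketch of a body that follows these rules, with placeholder headings and text:

    <body>
      <h1>SEO Audit Guide</h1>  <!-- a single H1, carrying a keyword -->
      <h2>Why Audit Your Site?</h2>
      <p>Original, valuable content goes here.</p>
    </body>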

4. Off-Page Ranking Factors

The last part, on-page factors, showed how to optimize your site’s content for users and algorithms. This part will focus on off-page factors, the elements that determine how useful external resources are in uplifting your site. The key areas to focus on for better off-page SEO are:

  • Making sure the site is popular and influential
  • Ensuring the site is trustworthy
  • Ascertaining the site receives only relevant and quality backlinks.
  • Ensuring your site is as popular on social platforms as it is on search engines.

Increasing Popularity and Influence of the Domain

If your site is popular, more people know about it. And when people are more likely to visit your site, it’s more likely to make its way onto Google’s radar of popular domains. The benefits of being a popular site are many, but the primary ones are a better crawling rate and more traffic.

So, if you would like your domain to be more visible and more popular, ascertain:

  • Its traffic is trending upward, not the other way around. Make full use of Google Analytics and Search Console to monitor the traffic on your site.
  • Compare your domain’s popularity with that of the sites you link to or receive links from. If they’re more accessible and useful, you might like to take a couple of notes from them.

Making the Site Trustworthy

The word ‘trustworthy’ can mean a couple of things here. It can mean that it’s free from malware and spam. And it can mean that the site is trusted by users, as in, they rely on your site for information.

To keep your site free from malware, integrate Google’s Safe Browsing API. To ensure it’s free from spam, make sure:

  • The site and the contained pages aren’t stuffed with keywords.
  • There is no hidden or invisible text running in the background. The crawler can read it and will penalize the site for it.
  • You avoid cloaking techniques. Cloaking means showing the crawler content that differs from what a user sees. The crawlers of 2015 can spot that too.

The best way to build a trusted relationship with Google, Bing and human-visitors is to get backlinks from sites that both engines and users trust.

Backlink Analysis

We’ve already mentioned backlinking a couple of times, and this seems like the best place to elaborate on the concept.

Backlinks to your site are links within content hosted at other domains that point to your page(s). If your site is being linked from low-quality or poor domains, you need to make amends.

To analyze the backlink structure of linking and the linked sites, there are plenty of tools available online. Some of them are free, but the best ones come at a price. General things that help in analyzing and strengthening the backlink structure are:

  • The more the merrier. A single link from each of multiple high-quality domains is better than hundreds of links from one domain or from several low-quality domains.
  • Smart Nofollow/ Dofollow ratio. Keep most of the links dofollow, but not all of them. Setting all links to dofollow will get the site (or blog) penalized.
  • Search engines will penalize your site if the anchor text isn’t distributed naturally. To keep it more natural and real, diversification is the key.
  • And of course, relevant keywords help in rankings. So try to focus on those keywords.

Promoting and Optimizing Social Engagement

Social media is the next generation of the internet. With nearly a billion and a half users making regular appearances on social media, connecting and socializing become more important with every passing day.

Popular statuses and posts on social media are quick to get the blessings of the Google and Bing algorithms. And thanks to its recent deal with Twitter, Google has made its intention of promoting social media through its algorithm even clearer.

Thus, if you would like a better following you will have to:

  • Get social first of all
  • Monitor and quantify your social signals.
  • If possible, perform a background check of some of the followers. They may be more useful in promotion of your brand than you think.
  • Analyze and enhance your social approach.

5. Competitive Analysis

As the popular saying goes, “Keep your friends close, and your enemies closer.” It couldn’t be more accurate here.

Although it’s nice to have your own house in order, there’s one area you might be ignoring: the competition. Analyzing your competitors in the search rankings warfare is difficult, but never think of backing down from the challenge, even if it means going through every step outlined above for their sites.

The more you know about a competitor, the better. Benchmark a few of your competitors and use the same backlink tools to uncover their backlinks. Try to monitor their social engagement as well.

Once you know what works for your opponents, you can adapt, and maybe even improve on, those tactics. As for the things that aren’t working for your competitors, be thankful you learned about them before blindly following in their footsteps.

SEO Audit Report

SEO audits and analyses are hectic. Carrying out all of these steps seamlessly is tough – arranging the findings in a manageable, presentable way is tougher. So here are some tips to remember as you arrange the information for future reference.

  • Think of every reader when compiling the report.
  • Rank the issues and solutions by priority.
  • Be accurate and specific.

Making an Easy-to-Comprehend Report

The report must be easy to grasp for newbie and professional alike. Avoid jargon and terms that will raise more questions than they answer. Many senior-level executives might be unaware of SEO terms and tactics, so it’s best to keep the report simple yet thorough.

Prioritizing your Findings

Managers and executives might already have more pressing concerns, so it’s a good idea to keep the report as concise as possible. Put the most urgent issues up front and the less important ones at the end.

Being Accurate and Specific

Don’t offer vague recommendations because anyone can come up with ‘rework the content, it’s all wrong’. You need to identify key points and give concrete suggestions with reasoning.

Endnote

When it comes to SEO, there is more than one way to get the job done. Every business has a different audience, a different site, and different requirements. SEO is a far cry from a one-size-fits-all approach; it must be tailored to meet individual needs. So as you begin your site’s audit, remember to analyze and take notes along the way. Apply, learn, adapt, then apply some more of what you learned.

Questions? Comments? Did we miss something? Is there something else you would like to learn about? Browse our other guides or send us your queries at [email protected]
