This post accompanies my talk at the Staffs Web Meetup on the 16th of November 2016.
Following on from my talk, this post goes into a little more detail about each of my 7 top SEO elements to keep an eye out for when you start auditing any site.
Here's what you'll learn today:
7 of the most important on-page elements to look for when conducting an audit.
As well as:
How to download and use my Free Excel Auditing Workbook
Other tools to use
Let’s get started:
If your website doesn’t load, that’s a big cause for concern, right?
Over time, almost every site accumulates internal links to pages that no longer exist – through countless redesigns, page removals, URL renames and other factors. Unless you redirect every old URL and update your old internal links, you are guaranteed to be affected.
Let’s say you built your site at the peak of fab web design – the late 1990s:
I couldn't bear to add the original, animated version of this...
Your pages were built in Microsoft Publisher (*shudder*) and are filled with frames inside of frames, alongside every animation ever created. Your visitor’s cursor also changes into a neon-pink pulsating arrow just to make them go “WOW!”.
You build 100 pages and each of those uses the URL format: mysite.com/home/index.html
After some years you come to your senses, realising your window cleaning business really shouldn’t try to reimagine the web with all those hip, modern design elements, and you hire a WordPress developer to create a brand new, professional looking site.
But, they copy and paste all of your old content onto your pages (which now use the URL format: mysite.com/example/), complete with internal links to your old pages.
Google (along with your favourite engines of the time, like AltaVista) now sees the new design, but when crawling your site it finds links to pages that no longer exist.
Window Cleaning Stock photo time!
Search engines see this as you purposefully trying to tell them about pages that are no longer there, which eventually leads to your site losing any rankings those pages had previously built up.
Now, let’s get a little less sarcastic and a bit more serious:
A page that no longer exists will most often show a 404 error in your browser.
What you want instead is for the server to redirect visitors to a page that does exist, using a 301 status code.
This 301 means that the page has been permanently redirected to another, which by modern search engine standards will result in the old page being removed from Google’s results and being replaced by the page it redirects to.
Remember to use httpstatus.io
After being redirected, the page you land on should report a 200 status code, meaning that the page was found and is OK.
For best practice, you should both redirect an old URL to a new one and change all of your internal links to reflect the new URL.
Pretty simple.... right?
Look at it like this: if you try to visit this link, it won't take you anywhere except an error page. If your site has links like this, you need to change them, and if a link used to work, redirect its URL to a new one that does. You'll notice my error pages just redirect you to my home page, which is also fine to do.
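If you'd rather script this check than paste URLs into httpstatus.io one at a time, the verdict logic boils down to something like this – a minimal sketch, where `audit_link` is my own invented name and the input is simply the list of status codes seen while following a link:

```python
def audit_link(status_chain):
    """Classify an internal link from the HTTP status codes seen while
    following it, e.g. [301, 200] is one permanent redirect that lands
    on a live page."""
    if not status_chain:
        return "no response"
    final = status_chain[-1]
    if final == 404:
        return "broken - add a 301 redirect to a live page"
    if final != 200:
        return f"unexpected final status {final}"
    if len(status_chain) == 1:
        return "ok"
    if all(code == 301 for code in status_chain[:-1]):
        return "ok, but update the internal link to point at the final URL"
    return "redirects via non-301 hops - make them permanent (301)"
```

In practice you'd feed it the chain recorded by your crawler of choice (the `requests` library, for example, exposes the redirect hops as `response.history`).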
The first thing both visitors and search engines will see about your page is the Meta Title – this is the one-liner snippet that appears at the top of your listing on Google.
Some say I'm a little harsh on others who charge for SEO; I just say that they shouldn't be charging you for it if they're just inserting some tags into your template alongside your design
This is an area that most sites have trouble with, and one that requires constant testing to find the best variations.
However, some best practices for you to follow are;
Use CTR Enhancers
There’s really not that much to titles: it’s all about considering what will gain the most interest and cause the most click-throughs, and how you can fit a keyword in there without it looking unnatural.
Just remember though: if your main keyword isn’t in your title somewhere, you’re missing out on an immediate increase in ranking potential, as the title is something Google does use in its algorithm, unlike the Meta Description.
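As a quick sanity check during an audit, you could flag both of these title problems programmatically – a rough sketch, where `title_warnings` is my own invention and the 60-character cutoff is only a proxy for Google's pixel-based truncation:

```python
def title_warnings(title, keyword, max_chars=60):
    """Flag common meta-title problems: missing keyword, empty title,
    or a title long enough that it will likely be truncated."""
    warnings = []
    if not title:
        warnings.append("empty title")
    if keyword.lower() not in title.lower():
        warnings.append("main keyword missing from title")
    if len(title) > max_chars:
        warnings.append(f"title is {len(title)} chars - likely truncated in results")
    return warnings
```

Run it over every page's title and main keyword pair, and an empty list means that title passes the basic checks.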
Heading tags were all the rage in the mid-2000s, when everyone (except me, of course) excessively piled keywords into as many H2–H6 tags as possible, as well as into the H1 tag (which would almost always be nothing more than the keyword itself).
Although this no longer works as it did, Heading tags still hold a lot of importance to search engines.
Best practice says that you should use your keyword in your H1 tag, as well as similar or related keywords in both the H2 and H3 tags – H4 and onward have faded out of use, as SEOs declared them of little value, and now appear only in rare cases.
But what are heading tags?
I like to think of them as headings in a text book, but that can be a little confusing to explain. So here’s an example with cats and dogs:
Let’s say you have a page… about Cats and Dogs…
<h1>Cats and Dogs: Our Fluffy Friends</h1>
This page is going to talk about each species separately, covering their feeding, grooming and healthcare basics.
Below your H1, you can have some intro text:
<p>I like cats and I like dogs, so welcome to my page! :D</p>
Or, something a little less simple…
Below this, you start your first section, covering everything about Cats first, followed by your section about Dogs. Now, this is where you start to structure heading tags under heading tags. Like folders on your computer, your C: drive (H1) contains folders (H2s), which all contain folders (H3s) and so on:
<h2>All about Cats</h2>
<p>Introduction to cats</p>
<h3>Feeding</h3>
<p>Feeding cats entails buying minced reformed grains and meat we wouldn't dare touch…</p>
<h3>Grooming</h3>
<p>Grooming cats entails waiting until the fleas take over before brushing…</p>
<h3>Healthcare</h3>
<p>No one really knows how healthcare works with cats...</p>
At this point, you can now repeat the above, but for your section on Dogs.
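To check this structure automatically across a whole site, Python's built-in `html.parser` can collect the heading levels and flag skipped ones – a sketch, where the class name and the "exactly one H1" rule are my own choices:

```python
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    """Collect h1-h6 tags in document order and flag skipped levels,
    mirroring the 'folders inside folders' rule above."""
    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.levels.append(int(tag[1]))

    def problems(self):
        issues = []
        if self.levels.count(1) != 1:
            issues.append("page should have exactly one h1")
        for prev, cur in zip(self.levels, self.levels[1:]):
            if cur > prev + 1:
                issues.append(f"jumped from h{prev} to h{cur}")
        return issues
```

Feed it a page's HTML and `problems()` returns an empty list for a well-nested heading tree.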
The hierarchy works like this: from H1 onwards, usage increases but importance and ranking potential decrease:
However, what I see quite often is a designer using these tags for sidebars and footers, which is totally the wrong thing to do. These tags should only be used in the main content of your page, for search engines' benefit.
It’s a common occurrence, and the reason you shouldn’t ask a web designer to optimise your website for you (no hate towards designers here – there are exceptions, but most who offer SEO as an add-on, rather than as a consulting service, will generally only know the very basics). Elements outside of the main content should instead be plain text styled with CSS.
Canonicals are small tags in the <head> of your page that say to search engines:
Hey there Googlebot! How’re you doin’? Listen, I know this page is similar to another on this site, so get this, how about we don’t show this page in search results, but we show the other one in your results instead? If that’s cool with you, here’s the url.
If you have duplicate versions of one page that for some reason you can’t remove, or you have multiple pages all with very similar content, just add this tag to every page except for the one you want Google to index and point the links all to the main page:
<link rel="canonical" href="http://www.mywebsite.com" />
It’s always good practice to have this tag on your pages anyway. Just point the URL to the same page, then if anyone finds a way to generate that same page from a different URL (these things can happen if you have a Content Management System), you’ll be covered and won’t have any issues with duplicate content.
One word of warning though: a canonical on the wrong page, or pointing at the wrong URL, can cause a world of pain – make sure every canonical leads to the correct page, and that your site doesn't (as I've seen before) canonicalize every page to the home page!
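Here's a sketch of how you might check this during an audit with Python's built-in `html.parser` – the names are my own, and the trailing-slash normalisation is a simplification (a thorough comparison would also consider protocol and www variants):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Pull the canonical URL out of a page, so it can be compared
    against the URL the page was fetched from."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

def canonical_verdict(page_url, html):
    finder = CanonicalFinder()
    finder.feed(html)
    if finder.canonical is None:
        return "no canonical tag - consider adding a self-referencing one"
    if finder.canonical.rstrip("/") == page_url.rstrip("/"):
        return "ok (self-referencing)"
    return f"canonicalises elsewhere: {finder.canonical} - check this is intended"
```

Running this over every crawled URL makes the "everything canonicalises to the home page" disaster jump out immediately.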
Noindex tags (like canonicals) tell search engines not to index the page they’re on, but this time they simply don’t suggest an alternative page.
<meta name="robots" content="noindex">
These absolutely should not be used on any page you want search engines to see and rank, unless you don’t mind wasting your time optimising those pages (because they will have no chance of ever ranking).
When your agency is building a site, they will most often use one of several methods to stop the development site from being found by search engines. The most common being a noindex tag placed on every page.
Most agencies will remember to remove this when the site launches, but I have seen sites go live with it still in place, only discovering why their organic traffic had died after seeking external help.
Please don't let this go on for any more than a day or two
When auditing your site, just confirm that this useful – but potentially harmful – tag isn’t used anywhere except on pages that aren’t supposed to be indexed by Google.
Now, Google can still find and crawl these pages, the tag just tells them not to show it in their results. Because they can still see it, they can crawl any pages that it links to.
If you want to both stop the page from being indexed and you don’t want Google to crawl any of its links, the full tag would be:
<meta name="robots" content="noindex, nofollow">
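Checking for stray robots meta tags is easy to automate – a minimal sketch using Python's built-in `html.parser`, with the class name being my own invention:

```python
from html.parser import HTMLParser

class RobotsMeta(HTMLParser):
    """Read a page's robots meta tag and answer the two questions that
    matter: can the page be indexed, and will its links be followed?"""
    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives |= {d.strip().lower()
                                for d in a.get("content", "").split(",")}

    def indexable(self):
        return "noindex" not in self.directives

    def links_followed(self):
        return "nofollow" not in self.directives
```

Run it across a crawl and any page that should rank but reports `indexable() == False` is your smoking gun.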
Oh boy, this is a big one. I won’t cover everything about site speed in this post (I don’t have 12 hours of writing time to spare sadly), but I’ll instead briefly cover a few of the most important factors:
If your site as a whole is slow, this section is a big one for you.
Did you know that the location of your server impacts your loading time?
Did you know that there is a BIG difference between your <£10/month hosting plan and a VPS costing £40+/month?
Did you know that the way your server is set up can cut loading times in half?
First up, I used to love HostGator – a whole year of hosting for $100? Heck yes! But my visitors in the UK suffered by trying to connect to a cheap, budget server in the US that also hosted 25,000 other websites.
Download times were dreadful, but it used to cost almost nothing, except for the drastic loss of affiliate commissions from the sites I ran at the time.
Now, I pay £45/month to share an 8-core, 32GB Intel Xeon Virtual Private Server in Warwickshire, running on Samsung SSDs in one of the best datacentres in the country – and that’s sharing the server with just 5 other people.
Not only that, but the server is set up to cache pretty much every piece of data, meaning that pages aren’t re-generated from the database every time they’re accessed, halving the download time for you.
For Google, the most important factor is Time To First Byte (TTFB): the time between your browser requesting a page and the first byte of the response arriving. This does correlate with rankings, and you want to get it below 400ms. A fast server, close to your visitors, will accomplish this.
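If you'd rather measure TTFB yourself than rely on an online tool, a rough sketch using only Python's standard library looks like this (plain HTTP only – for HTTPS you'd swap in `http.client.HTTPSConnection` – and note it includes connection setup time, which is part of what a distant server costs you):

```python
import http.client
import time

def time_to_first_byte(host, path="/", port=80, timeout=10):
    """Measure TTFB in seconds: the time between sending a request and
    the start of the response (status line and headers) arriving."""
    conn = http.client.HTTPConnection(host, port, timeout=timeout)
    try:
        start = time.perf_counter()
        conn.request("GET", path)
        conn.getresponse()  # returns once the status line and headers arrive
        return time.perf_counter() - start
    finally:
        conn.close()
```

Run it a few times and take the median; a single measurement is noisy, especially over a home connection.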
Google likes to rank pages with lots of content, which is why a large total page file size correlates well with rankings (within reason). But, having unnecessary padding in your page size gives you no benefit and only serves to decrease your user experience and satisfaction.
What can you do?
Just go through your pages and identify anything you can remove:
The cats eyes say "Why do you have a 12,000px image of a cat on your site? Resize and compress it!"
This is still in the realm of site speed, but images hold a special place of hatred in my heart.
Look for pages that have a lot of images, or that have a lot of image data loading (or both).
Back when I provided SEO consulting, I worked with several startup companies organised by the West Midlands Manufacturing Consortium. One of these, a new photography duo, wanted a rundown of what they could do to get more traffic from Google.
You can imagine the site, being a photography company it was very much focused on visuals. So much so in fact that the home page, in all its photographic glory, took well over a minute to load and even included a Loading Screen.
A loading screen, on a website.
This isn’t Dino Crisis on the PS1, where every time you entered a room a loading screen would appear and suck the enthusiasm out of you for the next 40 seconds!
Anyway, their home page featured hundreds of images and must have been well over 30MB (thank you, WordPress, for loading thumbnails instead of the full things). Every inner page was much the same, but loaded the full images, pushing some pages well beyond the 30MB of the home page.
This is what happens when you upload non-resized, uncompressed images to a site straight from your £4,000 DSLR.
Quite simply, I resize every image to the maximum it will display on the site, then just run every image I ever upload through FileOptimizer.
This is a free tool that once saved over 60GB of space on my home computer (I think it’s actually more than this, but I’m not sure, I have too many HDDs full of photography and blockchains…), and cut the file size of TCB’s home page by 800kb.
My example below shows how I took a 16MB, 12,000px-wide image, resized it down to just 600KB and 600px wide, then compressed away a further 80% of that data, leaving just over 100KB.
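The resizing step generalises to a tiny calculation – a sketch, where the 600px maximum display width is just the figure from my example above (the compression itself is FileOptimizer's job):

```python
def resize_for_web(width, height, max_width=600):
    """Work out the dimensions an image should be resized to before
    upload, preserving aspect ratio; images already narrow enough
    are left alone."""
    if width <= max_width:
        return width, height
    scale = max_width / width
    return max_width, round(height * scale)
```

Point your image editor (or a batch tool) at those target dimensions before upload, then let the compressor do the rest.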
Now that is called efficiency. Grab Fileoptimizer here.
For those of you using WordPress, a word of warning – do what you can before uploading, then WP will create thumbnails of varying sizes which you will need to recompress.
Unless you want to go through the trouble of downloading the thumbnails from your server, compressing them and then reuploading them, install EWWW.io and run them through this. It’s not free for the (recommended) cloud version, but it definitely helps.
I previously quoted a post from Moz that showed no / very little correlation between total page load time and rankings, which might make you question why you would optimise your images when considering SEO…
Just bear in mind that their study is limited and isn't conclusive, and that if your pages take ages to load and display anything, your visitors are going to leave, sending quite a negative signal to Google about the quality of your pages (and likely causing a drop in rankings).
As promised, you can download a copy of my presentation below, complete with notes (some of which I may have skipped in a blur of my first talk), as well as my auditing workbook.
Enter your email address to get:
This workbook does require the paid version of SEOToolsForExcel to be able to function properly (as well as Microsoft Excel). This purchase will set you back £70+ for a 1 year license, but it is a seriously good investment - for those of you who are Excel geniuses (genii?) you will get even more value out of this tool.
One small part of the audit workbook requires you to input an API key from Moz (you can get that here). It's quite simple, once you load up the workbook with SEOTFE installed it should pop up asking for you to input this.
You can also download the workbook here, but unless you enter your email address above you won't get future updates or anything else I release for free.
If you can't use Excel, or you don't use Windows, you can't use my audit tool I'm afraid.
But you're in luck, as there are some other tools I can recommend.
I'd highly suggest checking out SEOProfiler, which combines auditing with link analysis, linkbuilding and plenty of other tools that can help your site reach new customers.
If you want a tool that just helps you audit your site, use ScreamingFrog.
Input your website's homepage (complete with http / https), your sitemap file URL (if you don't have a simple list of URLs already) and your Google domain, all where indicated (this should be a little obvious I hope).
If you have entered a sitemap, this will now scrape a list of pages from that file, which will start to populate the URL List below.
If you didn't enter a sitemap, you can just paste in a list of URLs in B7, just below the URL List heading.
If you are analysing pages for keywords (I'm not sure why you wouldn't) then enter the main keyword for each page under the Main Keyword heading.
Seroundtable is just a random example, but definitely a site you should check out
Here you can select different analyses, such as whether you want to see data about keywords, whether you want to analyse page content and so on.
Both analysing Page Authority and Size / Load can take a while, particularly if you have entered a large number of URLs (the workbook is capped at 1000, but you can change this), so you might want to leave these turned off for now.
In the settings section, choose your advice type.
It's a little counterintuitive: if your skill set is advanced and you know what you're doing, choose Simple advice; if your SEO skills are more basic, choose Advanced advice.
Simple advice just points out what's wrong and isn't too wordy, while Advanced advice goes into a little more detail to make it easier to understand, at the cost of a little more reading.
Title tag width isn't something you'll really need to change unless Google change their search layout. But keep an eye on the industry chatter in the future.
Content Analysis XPath... this is where you'll need to crack open your browser's Developer Tools to grab the XPath of the element your page content sits in. Here's a quick tutorial.
Paste your XPath in here and, if you turn Content analysis on above, the tool will collect the content from each of your pages - cool, huh?
By this time, this sheet will already be populated, or should be finishing shortly.
From here, you can see all of the following:
Phew! The hours put into this document...
There's only one thing you need to do on this sheet: Get your title widths
Google now use pixel width, rather than character count, to determine where to cut off your titles in search results. At the top of this sheet is a link to where you can get a list of your title widths, and there's a small note on the sheet that tells you how to do this.
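If you want a rough idea of pixel widths without leaving your own scripts, you can approximate them by character class – emphasis on approximate: the per-character widths and the ~600px cutoff below are my own ballpark figures, not official Google values:

```python
# Rough character classes for an ~18px result-title font; these
# width figures are approximations of my own, not official values.
NARROW = set("iljtf.,;:!'| ")
WIDE = set("mwMW@")

def estimated_title_width(title):
    """Estimate a title's rendered pixel width to spot likely
    truncation (Google's cutoff has hovered around 600px)."""
    total = 0
    for ch in title:
        if ch in NARROW:
            total += 5
        elif ch in WIDE:
            total += 15
        elif ch.isupper():
            total += 12
        else:
            total += 9
    return total
```

Treat anything that estimates well over 600px as "check this one in the workbook's proper width tool".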
This is where you'll find practically everything you need to know about your pages, with some summarised advice in the last column.
Of course, you wouldn't leave it there, as you can learn more about your site by digging in manually – think of this sheet as a guide, not the be-all and end-all of auditing.
There's also the Text to Code ratio (an old metric, but semi-useful – the higher, the better) and the response time (look for pages above the average; what counts as average will depend on how many pages you're analysing).
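If you'd like to compute the Text to Code ratio outside the workbook, a quick-and-dirty version takes a couple of regex passes – a sketch, with the usual caveat that regex-stripping HTML is only approximate:

```python
import re

def text_to_code_ratio(html):
    """Percentage of a page's characters that are visible text rather
    than markup - an old-school metric where higher is better."""
    # Drop script/style blocks entirely, then strip remaining tags.
    text = re.sub(r"<script.*?</script>|<style.*?</style>", "", html,
                  flags=re.S | re.I)
    text = re.sub(r"<[^>]+>", "", text)
    text = re.sub(r"\s+", " ", text).strip()
    return 100 * len(text) / max(len(html), 1)
```

It won't match any particular tool's figure exactly, but it's plenty for spotting pages that are almost all markup.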
The PageSpeed score at the end gives you a rating up to 100 (the higher, the better) from Google's tool.
Think of this as saying "You've done X% of everything you could do to make this page as fast as possible on a technical level".
You can click your score to take you to Google's tool (which should auto-fill out, damn I love Excel) so you can see what Google recommends you do next.
Before you use this sheet though, bear in mind the Analysis Type selector and the PageSpeed Score on/off button at the top.
Change the analysis type only if you absolutely need complete accuracy.
Obviously, if you want to see the PageSpeed score, turn that on too, but it does take a while to load.
So, you've had a good introduction to what to look for on your first on-site SEO audit, you now have the tools you need to get started, and the rest is up to you.
Thanks for reading, I really hope you found this useful to your sites / businesses, please remember to leave a comment below and let me know what you think!