
Technical SEO Guide

Whilst Google’s algorithms have gotten significantly better at understanding context within the English language, grouping topical relevancy and identifying spammy backlinks, the core algorithm and many of the ways Google’s bots read your website are much the same today as they were in 2010.

Google’s analysis still happens mostly within the realm of HTML and XML, with things like your CSS files only really affecting your load time – which is still a ranking factor, just not a major one.

So when it comes down to actually being able to “master” technical SEO, you don’t need to be a teenage whizzkid or a senior developer with extensive knowledge of nine different programming languages. You just need a few hours, basic programming knowledge, an understanding of how HTML works and the final, most important key to it all? It’s obvious: this guide.

What is Technical SEO?

Whilst SEO as a whole is simply attempting to influence a site’s rankings within the SERPs, technical SEO specifically is optimizing the more complicated aspects of your website and its surrounding factors, such as optimizing load time at the server level or fixing broken internal links and redirects.

In this guide, I’m hoping to show you exactly how we carry out technical SEO analysis, audits and implementation to a masterful level that has allowed us to rank thousands of clients over the past decade.

Starting With Analysis

A good SEO always begins by analyzing the things on the site that you can fix straight away and that are core, common knowledge to any SEO with their head screwed on. I’m talking about fixing overly long meta titles, redirect chains and 404s – the stuff you’ll be able to get dug into within 10 minutes of pressing go on your preferred crawler, like Screaming Frog or SiteBulb.

There are also two different styles of auditing that you need to be proficient in AND be carrying out within your SEO strategies:

Site Wide Auditing

Unless you’re dealing with a 5-page local business site, you’re going to want a scraping/auditing tool that gives you all the insights within a few minutes or hours of sticking in the site URL and pressing go.

There are a TON of tools out there on the market, and it can be hard to choose which one is right for you, so here is an easy list to base your decision on:

  • Screaming Frog – All-round, easy, desktop-based application that has everything you need for a cheap yearly license fee, albeit in an old-school Windows UI.
  • Ahrefs Site Audit – Built into the Ahrefs SEO tool suite, the site audit tool offers most of what you need but lacks content-related recommendations.
  • SEMRush Site Audit – Much the same as Ahrefs but with a cleaner UI, though slightly slower auditing times.
  • SiteBulb – A premium, monthly desktop tool with a cleaner UI but fewer settings than Screaming Frog. If you’re fairly new to SEO this is the tool for you, as it offers a “Hint”-based recommendation format with mini tutorials on what each recommendation means and how to carry it out.
  • DeepCrawl – Working with high-end enterprise clients that need cloud-based auditing? DeepCrawl offers THE premium solution for this, but it won’t come cheap.

Once you have your preferred tool, it’s just a matter of making sure the host allows that level of scanning (you may have to ask your host or whitelist the scanning IP address yourself) and starting.

We’ll be going through the individual meanings and how to implement them later on, but in this example we ran Web20Ranker.com through Ahrefs site audit to show you some real world examples of what we could fix on one of our own properties, according to one of these tools at least.

Ahrefs tells us a TON of information, so I’ll try to format what we need to do into some usable lists:

Site Wide Problems 

OnPage Problems 

  • 41 Pages with no H1 header
  • 33 Pages with too long meta descriptions & 40 Pages with no meta descriptions
  • 92 Pages with too short meta titles & 2 with meta titles too long.
  • 5 Pages of duplicate content (with no canonical tag)
  • 269 “Slow” pages
  • 289 Images with no alt text & 1 broken image
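Many of these checks are simple enough to script yourself. Below is a minimal sketch using Python’s standard-library `html.parser` that flags the same classes of issue Ahrefs reported above; the 60/160-character thresholds are common rules of thumb, not official Google limits:

```python
from html.parser import HTMLParser

class OnPageAuditor(HTMLParser):
    """Collects the basic on-page elements a crawler like Ahrefs reports on."""
    def __init__(self):
        super().__init__()
        self.h1_count = 0
        self.title = ""
        self.meta_description = None
        self.images_missing_alt = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "h1":
            self.h1_count += 1
        elif tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content", "")
        elif tag == "img" and not attrs.get("alt"):
            self.images_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

    def issues(self):
        found = []
        if self.h1_count == 0:
            found.append("no H1 header")
        if len(self.title) > 60:            # rule-of-thumb title length
            found.append("meta title too long")
        if self.meta_description is None:
            found.append("no meta description")
        elif len(self.meta_description) > 160:  # rule-of-thumb description length
            found.append("meta description too long")
        if self.images_missing_alt:
            found.append(f"{self.images_missing_alt} image(s) with no alt text")
        return found

page = "<html><head><title>Home</title></head><body><img src='a.jpg'></body></html>"
auditor = OnPageAuditor()
auditor.feed(page)
print(auditor.issues())
```

Run this against each crawled page and you have a poor man’s site audit for the on-page basics.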

Ahrefs also found there were 22 pages with the NoIndex tag in our sitemap, and we had a page missing social data such as OpenGraph tags.

This gives us a lot to go off and initially fix on the site, before we even get into optimizing specific pages for their individual keywords.

Page Level Auditing

Page-level auditing is looking at the individual key elements that separate where the page currently sits – if it ranks at all – from being #1. This doesn’t necessarily mean changing your elements to match competitors, as all sites have different factors playing into their rankings, though competitive analysis can be a good insight into finding ways to better optimize your page for Google.

Implementing Your Work

Once you’ve audited the core issues with your overall site and its pages, it’s time to actually start implementing the work so you can start seeing results.

There are a few questions you’ll need to resolve before diving straight into optimizing pages though – things like how you’ll actually edit your site and whether hard-coding changes will break anything.

What CMS?

Most websites on the internet are running on WordPress (over a third of all sites, by most estimates), but if you’re working in the eCommerce world it could be any one of a hundred different CMSs.

WordPress is easy, as there is a TON of WordPress-specific content out there from sites like WPBeginner and from plugin makers themselves like Yoast & RankMath. If you’re on any other CMS, it becomes a research game of finding plugins, hard-coding what you can (or paying a developer to do it for you) and figuring out the best solution for each issue you stumble across.

For WordPress, we recommend the following setups:

Hosting Providers:

  • WPX Hosting – Looking to run small to medium sites en masse for a cheap monthly price with insanely good support?
  • CloudWays – Looking to run larger sites that require VPSs and much more server side configuration?
  • NameCheap WordPress Hosting – Looking for a cheap alternative that allows annual or bi-annual subscriptions for $20 – $40?

When it comes to which plugins to run on your WordPress site, there is a fierce debate running throughout the SEO community, but we have a best-practice set of plugins:

  • RankMath or Yoast SEO – Whichever is your preference; Yoast is more established, but RankMath is coming for their throne.
  • Google XML Sitemaps – A lot of users prefer having more control over their sitemaps, and this plugin gives you far more detailed settings than RankMath or Yoast.
  • Broken Link Checker – Scans your site for broken links and missing images.
  • All In One Schema Snippets – Want to add all types of schema to your site? This plugin has most of them.
  • Table of Contents Plus – Makes an easy, Wikipedia-style, microdata-formatted table of contents that creates the “Jump To” feature in the SERPs.

Then you also need to look at speed optimization plugins such as AutoOptimize, Smush and W3 Total Cache.

Setting Up Search Console

One of the easiest ways to consistently scan your site for problems, review organic traffic and test out things like site speed is to link Google’s Search Console to your site.

Optimizing Your Meta Data

It’s widely considered bad practice to keyword-optimize your meta descriptions since Google stopped using them as a ranking factor. However, optimizing your meta titles can still have a significant impact on your rankings.

We have worked with local clients before who have 10 years of domain age and a decently topically established site, but every meta title on the site is just the company name… We spend five minutes re-optimizing titles and they do 3x the phone calls the next month.

There are two systems we use to optimize our meta titles, depending on the SEO situation at hand.

CRO Meta Titles

CRO stands for “Conversion Rate Optimization”, and what this means is optimizing your titles in such a way as to get as many people as possible to click through to them – using capitals, symbols/emojis and punctuation like exclamation marks has long been a tradition amongst SEOs trying to entice users.

It’s very hard to give you an exact formula that’ll work in your niche, and certain niches consider things like emojis unacceptable. I very much doubt your car accident lawyer client is going to be happy with emojis in front of their multi-generational brand name.

Google also actively works to filter out certain emojis and other symbols/characters (things like the swastika are obviously banned), so you may want to do some testing before committing to symbol-based strategies site-wide or client-wide.

From our testing, things like these have done very well:

  • Adding the current month onto an article (you can use various WordPress plugins or shortcodes within Yoast/RankMath to achieve this), such as “Best SEO Tools April 2020”
  • Capitalizing and adding exclamation marks – though it’s seen as bad practice to capitalize the entire title, you can capitalize the first letter of, or add an exclamation mark to, words you want to draw the user’s attention to.
  • Adding pricing – things like “From $XX” and buyer-intent words like “Cheap”, “Sale” and “Discount” can not only have SEO benefits (for users searching these additional keywords) but also very high click-through potential with users on a budget.
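The first tactic – injecting the current month – is trivial to reproduce outside WordPress too. A quick sketch (the function name is mine, not a plugin API):

```python
from datetime import date

def freshness_title(base_title: str) -> str:
    """Append the current month and year, mimicking what a WordPress
    date shortcode would render into the meta title."""
    today = date.today()
    return f"{base_title} {today.strftime('%B %Y')}"

print(freshness_title("Best SEO Tools"))  # e.g. "Best SEO Tools April 2020"
```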

SEO Meta Titles

This is where you want to get as many factors going for your page as possible, and optimizing your meta title for a specific keyword or a group of keywords can be a powerful signal.

If you don’t already have the keyword(s) in mind for your page, you can use Ahrefs to find them.

Editor’s Note: You can also use the keyword difficulty, position and volume filters to find optimal keywords you’re yet to be ranking top 3 for.

In this example, we’ve found the below keywords:

  • GMB SEO – 23 Keyword Difficulty, 150 US Monthly Searches & Currently Ranked #2
  • google my business optimization service – 47 Keyword Difficulty, 50 US Monthly Searches & Currently Ranked #9
  • google my business seo – 48 Keyword Difficulty, 300 US Monthly Searches & Currently Ranked #22
  • local maps optimization – 7 Keyword Difficulty, 150 US Monthly Searches & Currently Ranked #28

With the above keywords in mind – their current rankings vs. monthly search potential (and its potential to generate sales) vs. keyword difficulty – we can form an SEO’s idea of which keywords to target for the page to grow the most.
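The article doesn’t give a formula for this weighing-up, but as an illustration, here is one toy heuristic (entirely my own assumption, not the author’s process) that favours high volume, low difficulty and pages just outside the top 3, where a small push pays off most:

```python
def keyword_score(volume: int, difficulty: int, current_rank: int) -> float:
    """Toy prioritisation heuristic: volume scaled by how far outside the
    top 3 the page sits (capped), divided by keyword difficulty."""
    upside = max(current_rank - 3, 0)       # no upside if already top 3
    return volume * min(upside, 10) / (difficulty + 1)

keywords = [
    ("GMB SEO", 150, 23, 2),
    ("google my business optimization service", 50, 47, 9),
    ("google my business seo", 300, 48, 22),
    ("local maps optimization", 150, 7, 28),
]
ranked = sorted(keywords, key=lambda k: keyword_score(k[1], k[2], k[3]), reverse=True)
for kw, *_ in ranked:
    print(kw)
```

On the example data this surfaces “local maps optimization” first – decent volume, very low difficulty, lots of room to climb – which matches the intuition described above.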

We can then build an optimized meta title along the lines of:

Google My Business Optimization Service (GMB SEO) – Web 2.0 Ranker

This uses all the keywords we’re looking to target, but moves the relevancy away from the “GMB SEO” keyword (which is one of the most generic) towards the “service” and “optimization” related keywords, which offer much more relevancy for the user.

Editor’s Note: You can see why it’s practically impossible to build software that automatically generates titles as good as an SEO who has learned how to do this.

Optimizing URL Structure

Now, it’s not recommended to change existing URL structure unless it is extremely detrimental to the rankings of the website – things like:

  • URLs that contain the wrong topics/keywords and potentially cannibalize the page.
  • URLs that are too long or contain random number/character sequences that do nothing to inform Google what the page is about.
  • URL structures that lead users and bots to 404s, bad redirects and so on – a great example of this is when websites create additional directory names that don’t actually exist just to group content, such as example.com/doesnt-exist/seo/technical-seo/, with the first folder never actually being a real page.

The optimal URL structure groups topics together, with keywords kept as short as possible while still describing the page.

If we owned a business that was about Plumbing in New York City and specifically wanted a page about the Harlem area, we could have a URL structure such as:
johndoeplumbing.com/new-york-city/harlem/

Now, you may ask why we haven’t included the word plumbing within the URL structure – because it’s already in the domain, and the topical relevancy of plumbing itself should be built through your OnPage and content SEO.
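The slug convention used here (lowercase, punctuation stripped, hyphens for spaces) can be sketched in a few lines:

```python
import re

def slugify(text: str) -> str:
    """Lowercase, strip punctuation, hyphen-join: a common URL-slug recipe."""
    text = text.lower().strip()
    text = re.sub(r"[^a-z0-9\s-]", "", text)    # drop anything non-alphanumeric
    return re.sub(r"[\s-]+", "-", text)         # collapse whitespace to hyphens

path = "/" + "/".join(slugify(part) for part in ["New York City", "Harlem"]) + "/"
print(path)  # /new-york-city/harlem/
```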

Internal Linking

Whilst over-optimizing your anchor text on backlinks can lead to serious consequences, it’s pretty hard to over-optimize your internal links. We still don’t recommend repeating anchors all the time, though, or using automated internal linking software/plugins without any sort of human input to stop them from going wild.

Editor’s Note: You can use Link Whisper for WordPress sites, which gives automated link suggestions within your content and lets you edit the anchors and change URLs at will.

The anchor text strategy we employ for our own websites is to do it on a page-by-page basis:

Create a list of target keywords for the page or a list for each page you’re trying to optimize.

Use Google’s site: search operator to find pages on your site relevant to the core subject you’re trying to internally link your page for. As an example, if I wanted to link to Web20Ranker’s Guest Post page, I could use the Google search:

site:web20ranker.com “guest post”

This gives me back 128 pages I can eye for relevancy. On the first page of results alone, I found several relevant pages to work from.

All we have to do now is right-click each page, select “View Page Source”, then hit Ctrl+F (find, if you’re on Windows) and input the target URL into the search bar. Watch out for the page potentially appearing in the navigation menu – you’re only looking for instances of the page being used as an in-context link.

If there is an instance already, you can leave the page as-is or edit the link to use one of your target keywords.

Then simply go down your list of target keywords, using each one in turn without repeating the previous one, until you’ve gone through your entire list.

This gives you a comprehensive list of relevant pages with internal links to your target pages and should only take a matter of minutes to carry out per page.
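The “View Page Source + Ctrl+F” step can be automated. Here’s a small sketch (the example URL is illustrative) that collects every href on a page and checks whether the target is already linked – note it can’t, by itself, tell a navigation link from an in-context one:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Gathers every href on a page, so we can check for an existing
    link instead of eyeballing 'View Page Source' by hand."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.hrefs.append(href)

def already_links_to(page_html: str, target_url: str) -> bool:
    collector = LinkCollector()
    collector.feed(page_html)
    return any(target_url in href for href in collector.hrefs)

html = '<p>Read our <a href="https://web20ranker.com/guest-post/">guest post</a> guide.</p>'
print(already_links_to(html, "/guest-post/"))  # True
```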

Schema & Structured Data

One of the easiest ways to tell Google about your site and its pages is with formatted code such as schema and microdata. With new schema elements being added to the SERPs every year, if you aren’t utilizing them you’re likely missing out.

Luckily for us, Google has its own totally free (no signup required either) service called the Structured Data Testing Tool. This tool allows you to input a URL and returns all the structured data on a page (with warnings if there are code issues to fix).

Editor’s Note: You’ll need to run each page you’re wanting to test the data on individually – e.g. the structured data for your about page should be very different from the ones used for your blog posts.

Google has actively taken measures to stop schema spam, but there is no “limit” on how much structured data you can use on your pages – just be careful and clever with which types you use and where you put them.

The most common types of structured data you’ll want to employ are:

  • SiteNavigationElement – This schema element allows you to tell Google how your navigation menu is formatted, which can lead to much better branded results.
  • Article / Blog Post / News Article – These are essentially 3 pieces of schema that do the same thing, just for 3 different types of websites.
  • Review Stars – If you’ve ever seen the 5-star box in the SERPs, this is how you get it. Though Google tends to remove this schema the most, and it requires a lot of testing to get the stars to even show up.

Then there is also structured data for recipes, events, courses, fact checking, jobs, podcasts, videos, apps and a whole load more.

Google also announced last year the addition of the FAQ, Q&A and How To structured data formats, which can be used for additional SERP elements around your site.
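For reference, a minimal Article JSON-LD snippet – the kind of markup the Structured Data Testing Tool validates – might look like the following. It belongs inside a `<script type="application/ld+json">` tag in the page head, and every value here is a placeholder:

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Technical SEO Guide",
  "author": { "@type": "Person", "name": "Craig Campbell" },
  "datePublished": "2020-04-01",
  "image": "https://example.com/cover.jpg"
}
```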

Redirects & 404s

Dealing with rogue 404 pages, multi-level redirect chains and pages that aren’t supposed to be there can be a massive pain in the ass – especially if you’re dealing with sites that have tens or even hundreds of thousands of pages.

Luckily, tools like Google’s Search Console and most of the site scanning software we talked about above give an easy review of the current state of your site’s URLs; however, there are always things to improve upon.

Dealing With Redirect Chains

Google says you lose between 20% and 70% of the link juice between each redirect hop. So if your internal redirects are bouncing between 5 different pages before landing on the target destination, you’re going to be losing most of the link juice that page originally had.

Simply cutting the extra hops out of the chain and making sure each redirect goes straight to its final page is the best way to make sure you’re retaining as much link juice as possible.
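Flattening chains is mechanical once you have a source → destination map of your redirects (which a crawler export gives you). A sketch:

```python
def flatten_redirects(redirects: dict) -> dict:
    """Rewrite every redirect to point straight at its final destination,
    turning A -> B -> C -> D chains into single hops."""
    flattened = {}
    for source in redirects:
        seen = {source}
        target = redirects[source]
        while target in redirects:          # keep following the chain
            if target in seen:              # guard against redirect loops
                break
            seen.add(target)
            target = redirects[target]
        flattened[source] = target
    return flattened

chain = {"/old": "/newer", "/newer": "/newest", "/newest": "/final"}
print(flatten_redirects(chain))
```

Every source now points directly at `/final`, so each URL is a single 301 hop instead of up to three.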

When To 301, 404 or Something Else

Sometimes a product goes out of stock, a service is no longer offered or a blog post needs to be removed due to internal company politics.

Deciding these pages’ fate is up to the SEO.

  • If the page has links and is relevant enough, then you want to 301 the page (and its accompanying signals within Google) to the most relevant page you’re trying to rank.
  • If the page doesn’t have anywhere to be pointed to and will never exist again or be redirected anywhere else, then you want to use a 410 HTTP code.
  • If the page needs to be deleted but may be redirected home or to a new page in the future, then you want to 404 the page and hold the link juice for future use.
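On an Apache server, both the 301 and the 410 can be set with one-line mod_alias directives in your .htaccess file; the paths below are hypothetical:

```apache
# 301: the page has link equity and a relevant replacement
Redirect 301 /old-service/ /services/new-service/

# 410: gone for good, nothing to redirect to
Redirect gone /discontinued-product/
```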

Robots.txt & .htaccess

Your robots.txt file tells Googlebot, on the frontend, where and what it should be able to access. Your .htaccess file controls similar behavior, but rather than politely telling Google, it enforces it behind the scenes at the server level.

Mostly, you just want to use your robots.txt file to block out pages or parts of your site that you don’t want crawled – things like tag pages and admin login areas are obvious ones to disallow.

Your .htaccess file can also be used to block spammy bots that are over-crawling your website, but otherwise we don’t recommend touching this file without extensive rewrite-engine knowledge.
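A minimal WordPress-style robots.txt illustrating this might look like the below (the admin-ajax Allow line is a common WordPress convention so front-end AJAX still works, and the sitemap URL is a placeholder):

```txt
User-agent: *
Disallow: /wp-admin/
Disallow: /tag/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemap.xml
```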

Validating Code

One of the biggest problems we run into with clients’ sites is broken schema, messed-up CSS, responsive spaghetti code and other major code-based issues that can cause serious SEO problems.

Validating HTML Tags

More often than not there’ll be some HTML problem on your site – a malformed href, badly coded headers and the like can cause serious problems for your SEO. Freeformatter has a free tool that allows you to check each page for invalid HTML code.
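A full validator like Freeformatter’s checks far more, but even a rough tag-balance pass catches the most damaging mistakes. A sketch with the standard-library parser:

```python
from html.parser import HTMLParser

# Tags that are legitimately self-closing in HTML and never need an end tag.
VOID_TAGS = {"area", "base", "br", "col", "embed", "hr", "img", "input",
             "link", "meta", "param", "source", "track", "wbr"}

class TagBalanceChecker(HTMLParser):
    """Rough check for unclosed or mismatched tags; a real validator
    catches far more than this."""
    def __init__(self):
        super().__init__()
        self.stack = []
        self.errors = []

    def handle_starttag(self, tag, attrs):
        if tag not in VOID_TAGS:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.errors.append(f"unexpected </{tag}>")

    def unclosed(self):
        return list(self.stack)

checker = TagBalanceChecker()
checker.feed("<div><p>Hello <b>world</p></div>")
print(checker.errors, checker.unclosed())
```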

Validating SSL Certificates

There’s a really simple tool to check whether your SSL is working 100%: Why No Padlock. Just input the specific page URL your SSL certificate isn’t showing the green padlock for, and it’ll give you the exact reasons and how to fix them.

Site Speed Optimization

Google announced nearly 3 years ago now that it’d be using site speed as a ranking factor, and we’ve seen many cases of sites with lackluster load times underperforming in the SERPs. Google has set its own Search Console tools to report slow load times for anything over 3 seconds, so we tend to see this as a solid baseline to try to beat every time.

Optimizing your site speed massively depends on your hosting provider, the CMS your site is running on and the level of access you have to edit things – if you’re running on Shopify, for example, you’re limited to a set few plugins that will only be able to optimize theme code and images, which are just a few cogs in a much larger machine.

There are several core principles to optimizing your site speed though:

  • Optimizing your server load time and connection speed (Generally has to be done on a host level)
  • Optimizing your site’s code – generally done by minifying HTML & CSS, removing unnecessary lines and characters.
  • Optimizing your site’s images – generally done by first compressing the images and then serving them from a CDN.
  • Caching your site – caching assets within your visitors’ own browsers allows revisits and new page loads to be substantially faster.

All of these things sound fairly complicated, but if you’re using a CMS like WordPress they can be achieved with just a few plugins.
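As a tiny illustration of what “minifying” actually means, here’s a naive whitespace-and-comment stripper; real minifiers (what plugins like Autoptimize do under the hood) are far more careful about `<pre>` blocks, inline scripts and so on:

```python
import re

def minify_html(html: str) -> str:
    """Naive minification: strip comments and collapse runs of whitespace."""
    html = re.sub(r"<!--.*?-->", "", html, flags=re.DOTALL)  # drop HTML comments
    html = re.sub(r">\s+<", "><", html)                       # whitespace between tags
    return re.sub(r"\s{2,}", " ", html).strip()               # collapse remaining runs

page = """<div>
    <!-- hero section -->
    <h1>  Hello  </h1>
</div>"""
print(minify_html(page))  # <div><h1> Hello </h1></div>
```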

Waiting For Your Results

One of the biggest questions we get from the SEO community and clients alike is: how long is it going to take to see results from my SEO work? Well, luckily for you, with OnPage SEO – as opposed to the usual link building strategies we talk about – you’ll normally be able to see ranking results the next day, as long as you’re submitting the updated pages & sitemaps within Google’s Search Console.

We’ve spent a day updating internal linking, re-doing load times and optimizing metadata/alt tags, and watched every single keyword we were optimizing for jump the very next day.

Conclusion

Whilst technical SEO isn’t rocket science, it isn’t the easiest thing to get your head around, and it takes a lot of time to figure out the various tools, CMSs and languages that go into being able to optimize any site you come across.

This guide hopefully gave you real insight into how to execute the analysis, auditing and implementation of SEO work that can lead to real results for both your own sites and clients alike. If you have any further questions about this post or about any topics we didn’t cover, please drop a comment down below.


Craig Campbell

I am a Glasgow based SEO expert who has been doing SEO for 18 years.
