Whilst Google’s algorithms have become significantly better at understanding context within the English language, grouping topical relevance and identifying spammy backlinks, the core algorithm and many of the ways Google’s bots read your website are still much the same today as they were in 2010.
Google’s analysis still lives mostly within the realm of HTML and XML, with things like your CSS files really only affecting your load time – which is still a ranking factor, just not a major one.
So when it comes down to actually being able to “master” technical SEO, you don’t need to be a teenage whizkid or a senior developer with extensive knowledge of nine different computing languages. You just need a few hours, some basic programming knowledge, an understanding of how HTML works and the final, most important key to it all? It’s obvious: this guide.
Whilst SEO as a whole is simply attempting to influence a site’s rankings within the SERPs, technical SEO specifically means optimizing the more complicated aspects of your website and its surrounding factors, such as optimizing load time at the server level or fixing broken internal links and redirects.
In this guide, I’m hoping to show you exactly how we carry out technical SEO analysis, audits and implementation at the masterful level that has allowed us to rank thousands of clients over the past decade.
A good SEO always begins by analyzing the things on the site that you can fix straight away – the core, common-knowledge items for any SEO with their head screwed on. I’m talking about fixing overlong meta titles, redirect chains and 404s: the stuff you’ll be able to run a tool like Screaming Frog or Sitebulb on and get dug in within 10 minutes of pressing go on your preferred crawler.
There are also two different styles of auditing that you need to be proficient in AND be carrying out within your SEO strategies –
Unless you’re dealing with a five-page local business site, you’re going to want a scraping/auditing tool that gives you all the insights within a few minutes or hours of sticking in the site URL and pressing go.
There are a TON of tools out there on the market, and it can be hard to choose which one is right for you, so here is an easy list to base your decision on:
Once you have your preferred tool, it’s just a matter of making sure the host allows that level of scanning (you may have to ask your host or whitelist the scanning IP address yourself) and starting the crawl.
We’ll be going through the individual meanings and how to implement fixes later on, but in this example we ran Web20Ranker.com through Ahrefs’ Site Audit to show you some real-world examples of what we could fix on one of our own properties – according to one of these tools, at least.
Ahrefs tells us a TON of information, so I’ll try to format what we need to do into some usable lists:
Ahrefs also found 22 pages with the noindex tag sitting in our sitemap, and a page missing social data such as Open Graph tags.
This gives us a lot of things to go off and initially fix on the site, before we even get into optimizing the specific pages for their individual keywords.
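To make that noindex-in-sitemap finding concrete before we move on, here’s a sketch of the conflict (URL is hypothetical): a page carrying this tag in its <head> is asking Google not to index it, so the same URL shouldn’t also sit in sitemap.xml, which is effectively an invitation to crawl and index.

```html
<!-- This page is telling Google NOT to index it... -->
<meta name="robots" content="noindex, follow">

<!-- ...so this same URL should be removed from sitemap.xml (entry shown for illustration):
     <url><loc>https://example.com/some-noindexed-page/</loc></url> -->
```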
Page-level auditing means looking at the individual key elements that separate where the page currently ranks (if it ranks at all) from position #1. This doesn’t necessarily mean changing your elements to match competitors, as every site has different factors playing into its rankings. That said, competitive analysis can be a good way to find better ways to optimize your page for Google.
Once you’ve audited the core issues with your overall site and its pages, it’s time to actually start implementing the work so you can start seeing results.
There are a few issues you’ll need to resolve before diving straight into optimizing pages, though – things like how to actually edit your site and whether hard coding will break anything.
More websites on the internet run on WordPress than on any other CMS (most surveys put it at over a third of all sites), but if you’re working in the eCommerce world it could be any one of a hundred different CMSs.
WordPress is easy, as there is a TON of WordPress-specific content out there from sites like WPBeginner or from the plugins themselves, like Yoast & Rank Math. If you’re on any other CMS, it becomes a research game of finding plugins, hardcoding what you can (or paying a developer to do it for you) and figuring out the best solution for each issue you stumble across.
For WordPress, we recommend the following setups –
When it comes to which plugins to run on your WordPress site, there is a fierce debate running throughout the SEO community, but we have a best-practice set of plugins:
Then you also need to look at speed optimization plugins such as Autoptimize, Smush and W3 Total Cache.
One of the easiest ways to consistently scan your site for problems, review organic traffic and test things like site speed is to link Google Search Console to your site.
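If your site isn’t verified yet, one common method is the HTML tag option: Search Console hands you a meta tag to paste into your homepage’s <head> (the token below is just a placeholder):

```html
<!-- Paste the exact tag Search Console gives you into the <head> of your homepage -->
<meta name="google-site-verification" content="your-verification-token" />
```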
It’s widely considered bad practice to try to keyword-optimize your meta descriptions, since Google stopped using them as a ranking factor. However, optimizing your meta titles can still have a significant impact on your rankings.
We’ve gone to work with local clients before who have ten years of domain age and a decent, topically established site, but every meta title on the site is just the company name… We spend five minutes re-optimizing titles and they do 3x the phone calls the next month.
There are two systems we use to optimize our meta titles, depending on the SEO situation at hand.
CRO stands for “Conversion Rate Optimization”, and here it means optimizing your titles in such a way as to get as many people as possible to click through to them – using capitals, symbols/emojis and punctuation like exclamation marks has long been a tradition amongst SEOs trying to entice their users.
It’s very hard to give you an exact formula that’ll work in your niche, and certain niches consider things like emojis unacceptable – I very much doubt your car accident lawyer client is going to be happy with emojis in front of their multi-generational brand name.
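As a purely illustrative sketch (hypothetical business, keywords and symbols – test what your own niche tolerates), a CRO-styled title tag might look something like:

```html
<title>Emergency Plumber NYC ✔ 24/7 Call-Outs | Same-Day Fixes!</title>
```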
Google also actively works to filter out certain emojis and other symbols/characters (things like the swastika are obviously banned), so you may want to do some testing before committing to symbol-based strategies site-wide or client-wide.
From our testing, things like these have done very well:
This is where you want to get as many factors going for your page as possible, and optimizing your meta title for a specific keyword or a group of keywords can be a powerful signal.
If you don’t already have the keyword/s in mind for your page, you can use Ahrefs to find them.
Editor’s Note: You can also use the keyword difficulty, position and volume filters to find optimal keywords you’re not yet ranking in the top 3 for.
In this example, we’ve found the below keywords:
With the above keywords in mind – weighing each one’s current ranking, its monthly search potential (and that traffic’s potential to generate sales) and its keyword difficulty – we can form an SEO’s judgement of which keywords are the best targets for growing the page.
We can then build an optimized meta title along the lines of:
Google My Business Optimization Service (GMB SEO) – Web 2.0 Ranker
This uses all the keywords we’re looking to target but moves the relevancy away from the “GMB SEO” keyword (one of the most generic) towards the “service” and “optimization” related keywords, which offer much more relevancy for the user.
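In the page’s HTML, that just means setting the <title> element in the <head> – most CMSs and SEO plugins will write this for you:

```html
<head>
  <!-- The optimized meta title from above, as it sits in the page source -->
  <title>Google My Business Optimization Service (GMB SEO) – Web 2.0 Ranker</title>
</head>
```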
Editor’s Note: You can see why it’s practically impossible to build software that automatically generates titles as well as an SEO who has learned how to do this.
Now, it’s not recommended to change an existing URL structure unless it is extremely detrimental to the rankings of the website – things like:
The optimal URL structure groups topics together, using the shortest possible keywords to describe the page in question.
If we owned a plumbing business in New York City and specifically wanted a page about the Harlem area, we could use a URL structure such as:
johndoeplumbing.com/new-york-city/harlem/
Now, you may ask why we haven’t included the word plumbing within the URL structure – because it’s already in the domain, and the topical relevancy of plumbing itself should be built through your on-page and content SEO.
Whilst over-optimizing your anchor text on backlinks can lead to serious consequences, it’s pretty hard to over-optimize your internal links. We still don’t recommend repeating the same anchors all the time, though, or using automated internal linking software/plugins without any human input to stop them from going wild.
Editor’s Note: You can use Link Whisper for WordPress sites, which gives automated suggestions within your content and lets you edit the anchors and change URLs at will.
In terms of the anchor text strategies we employ for our own websites, we handle it on a page-by-page basis:
Create a list of target keywords for the page or a list for each page you’re trying to optimize.
Use Google’s site: search operator to find pages on your site relevant to the core subject you’re trying to internally link your page for. As an example, if I wanted to link to Web20Ranker’s Guest Post page, I could use the Google search:
site:web20ranker.com “guest post”
This gives me back 128 pages I can eyeball for relevancy. On the first page alone, I found these pages to work from:
All we have to do now is right-click each page, select “View Page Source”, then hit Ctrl+F (Find, if you’re on Windows) and input the target page URL into the search bar. Watch out for the page potentially appearing in the navigation menu – you’re only looking for instances of the page being used as an in-context link.
If there is an instance already, you can leave the page alone or edit the link to use one of your target keywords.
Then you simply go down your list of target keywords, using one at a time and never repeating the previous one, until you’ve gone through your entire list.
This gives you a comprehensive list of relevant pages with internal links to your target pages and should only take a matter of minutes to carry out per page.
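For reference, an in-context internal link is just a plain anchor inside the body copy, using one of your target keywords as the anchor text (the URL and surrounding copy below are hypothetical):

```html
<p>If you’d rather outsource placements entirely, our
  <a href="https://web20ranker.com/guest-post-service/">guest post service</a>
  handles the outreach for you.</p>
```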
One of the easiest ways to tell Google about your site and its pages is with formatted code such as schema and microdata. With new schema elements being added to the SERPs every year, if you aren’t utilizing it you’re likely missing out.
Luckily for us, Google has its own totally free (no signup required, either) service called the Structured Data Testing Tool. It allows you to input a URL and returns all the structured data on the page, with warnings if there are code issues to fix.
Editor’s Note: You’ll need to run each page you’re wanting to test the data on individually – e.g. the structured data for your about page should be very different from the ones used for your blog posts.
Google has actively taken measures to stop schema spam, but there is no “limit” on how much structured data you can use on your pages – just be careful and clever about which types you use and where you put them.
The most common types of structured data you’ll want to employ are:
Then there is also structured data for recipes, events, courses, fact checking, jobs, podcasts, videos, apps and a whole load more.
Google also announced last year the addition of the FAQ, Q&A and How-To structured data formats, which can earn additional SERP elements for your site.
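To give you a feel for the format, here’s a minimal JSON-LD sketch for a local business, reusing the hypothetical plumbing company from the URL example earlier (all details are made up). JSON-LD inside a script tag is the format Google recommends:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Plumber",
  "name": "John Doe Plumbing",
  "url": "https://johndoeplumbing.com/",
  "telephone": "+1-212-555-0100",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "New York",
    "addressRegion": "NY",
    "postalCode": "10027"
  }
}
</script>
```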
Dealing with rogue 404 pages, multi-level redirect chains and pages that aren’t supposed to be there can be a massive pain in the ass, especially if you’re dealing with sites that have tens or even hundreds of thousands of pages.
Luckily, tools like Google’s Search Console and most of the site scanning software we talked about above give an easy review of the current state of your site’s URLs; however, there are always things you can improve upon.
Estimates in the SEO community put the link juice lost at each redirect hop anywhere between 20% and 70%. So if your internal redirects are bouncing between 5 different pages before landing on the target destination, you’re going to lose most of the link juice the original page had.
Simply cutting links out of the chain and making sure each redirect points directly at its final page is the best way to make sure you’re keeping as much link juice as possible.
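On an Apache server, collapsing a chain is usually as simple as editing the old rules so every legacy URL points straight at the final destination – a sketch with made-up paths:

```apache
# Before: /old-page -> /newer-page -> /final-page (two hops)
# After: every legacy URL redirects straight to the final destination (one hop each)
Redirect 301 /old-page /final-page
Redirect 301 /newer-page /final-page
```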
Sometimes a product goes out of stock, a service is no longer offered or a blog post needs to be removed due to internal company politics.
Deciding these pages’ fate is up to the SEO.
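The usual options are a 301 to the closest living page (which keeps the link juice) or a 410 to tell crawlers the page is gone for good – an Apache sketch with hypothetical paths:

```apache
# Discontinued product: pass its equity to the closest relevant category page
Redirect 301 /products/widget-2000 /products/widgets/
# Pulled blog post: tell crawlers it's gone permanently
Redirect gone /blog/removed-post
```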
Your robots.txt file openly tells Googlebot where and what it should be able to access on the frontend. Your .htaccess file can do a similar job, but rather than telling Google straight up, it enforces the rules behind the scenes at the server level.
Mostly, you just want to use your robots.txt file to block out pages or parts of your site that you don’t want crawled – things like tag pages and admin login areas are obvious ones to disallow.
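A minimal, WordPress-flavored robots.txt sketch along those lines (paths and sitemap URL are examples, and remember that Disallow stops crawling rather than guaranteeing de-indexing):

```text
User-agent: *
Disallow: /wp-admin/
Disallow: /tag/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemap.xml
```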
Your .htaccess file can be used to block spammy bots that are over-crawling your website, but otherwise we don’t recommend touching this file without extensive rewrite engine knowledge.
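If you do need to block an abusive crawler, a small mod_rewrite rule matching its user agent is the standard approach – the bot names below are placeholders, not recommendations:

```apache
RewriteEngine On
# Return 403 Forbidden to any request whose user agent matches these (example) names
RewriteCond %{HTTP_USER_AGENT} (BadBot|GreedyScraper) [NC]
RewriteRule .* - [F,L]
```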
One of the biggest problems we run into with client sites is broken schema, messed-up CSS, responsive spaghetti code and other major code-based issues that can cause serious SEO problems.
More often than not there’ll be some HTML problem on your site – a malformed href, badly coded headers and the like can cause serious problems for your SEO. Freeformatter has a free tool that lets you check each page for invalid HTML.
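As an example of the kind of thing a validator will catch – an href missing its quotes and an anchor that never closes, alongside the fixed version:

```html
<!-- Broken: unquoted href and no closing tag -->
<a href=https://example.com/services>Our services

<!-- Fixed -->
<a href="https://example.com/services">Our services</a>
```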
There’s a really simple tool to check whether your SSL is working 100%: Why No Padlock. Just input the specific page URL where your SSL certificate isn’t showing the green padlock and it’ll give you the exact reasons and how to fix them.
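The most common culprit Why No Padlock will point to is mixed content – a single asset still loaded over http:// on an https:// page (the asset below is hypothetical):

```html
<!-- Breaks the padlock: insecure asset on a secure page -->
<img src="http://example.com/images/logo.png" alt="Logo">

<!-- Fixed: serve the same asset over https -->
<img src="https://example.com/images/logo.png" alt="Logo">
```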
Google announced nearly three years ago now that it’d be using site speed as a ranking factor, and we’ve seen many cases of sites with lackluster load times underperforming in the SERPs. Google has set its own Search Console tools to report slow load times for anything over 3 seconds, so we tend to treat that as a solid baseline to beat every time.
Optimizing your site speed depends massively on your hosting provider, the CMS your site runs on and the level of access you have to edit things – if you’re running on Shopify, for example, you’re limited to a set few apps that will only be able to optimize theme code and images, which are just a few cogs in a much larger machine.
There are several core principles to optimizing your site speed though:
All of these things sound fairly complicated, but if you’re using a CMS like WordPress they can be achieved with just a few plugins.
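To show how simple the markup side can be, here are three common speed wins in plain HTML – defer render-blocking scripts, lazy-load below-the-fold images and preconnect to third-party hosts (file names below are placeholders):

```html
<!-- Don't let non-critical JS block rendering -->
<script src="/js/analytics.js" defer></script>

<!-- Only load images when they're about to scroll into view -->
<img src="/images/team-photo.jpg" alt="Our team" width="800" height="600" loading="lazy">

<!-- Warm up the connection to a third-party host early -->
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
```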
One of the biggest questions we get from the SEO community and clients alike is: how long is it going to take to see results from my SEO work? Well, luckily for you, with on-page SEO – as opposed to the usual link building strategies we talk about – you’ll normally be able to see ranking results the very next day, as long as you’re pinging the updated pages & sitemaps within Google Search Console.
We’ve spent a day updating internal linking, re-doing load times and optimizing metadata/alt tags, and seen every single keyword we were optimizing for jump the very next day.
Whilst technical SEO isn’t rocket science, it isn’t the easiest thing to get your head around, and it takes a lot of time to figure out the various tools, CMSs and languages that go into being able to optimize any site you come across.
This guide hopefully gave you a real insight into how to execute the analysis, auditing and implementation of SEO work that leads to real results for both your own sites and clients alike. If you have any further questions about this post or any topics we didn’t cover, please drop a comment below.
Want to learn more about the technical side of SEO or other techniques? We have many courses and plans available and we also offer consultancy, both online and offline. Use the button below to reach out.