If you’re like me, the first word someone would use to describe you is not “techie.”
Fortunately, we live in a time where information is at our fingertips, and if you want to learn something new, you open up Google and search.
That’s how I learned about SEO (and technical SEO). So I’ll be giving shout outs to all the articles and tools that have helped me learn along the way.
In this article, we’ll be going under the hood, getting our hands dirty, and learning tips and tricks to tune up your website;
for non-techies like myself.
We’ll be covering:
If you’re wondering just what the heck is technical SEO, here’s a helpful description explaining what it is:
Technical SEO refers to website and server optimizations that help search engine spiders crawl and index your site more effectively (to help improve organic rankings).
That’s what it’s all about at the end of the day. Increased search rankings.
Below is a helpful diagram showing you where technical SEO lives in the SEO ecosystem:
On-page, off-page, and technical SEO all play an important role in the SEO ecosystem.
Here’s a helpful visual from HubSpot showing the role technical SEO plays in rankings.
Crawlability, indexability, and accessibility are the foundations of every website.
If your website can’t be crawled, it can’t be ranked.
We’ll start off with crawlability. What’s meant by crawlability is how easily and readily your site’s pages and content can be crawled by search engine crawlers.
If you think crawlers are a weird, creepy analogy, I assure you, there isn’t a better one.
That’s why they call it the web, after all.
Indexing happens after crawling: once Google has crawled your website’s pages, it adds them to its index so they can appear in Google Search.
So really, if you want to rank, you need to be indexed, and in order to be indexed, your pages need to be crawled.
Getting too technical? Fear not, Google provides a free, invaluable tool that helps everyone with the crawling and indexing of their site.
It’s called Google Search Console.
Want to quickly see how your site’s being indexed in Google Search?
Check out the “Coverage” report in the left-hand navigation.
In this report, you can see all errors that are occurring on your website, what pages they’re occurring on, and how to resolve them.
In my Coverage report, I have one page with a 404 error code. Too many crawl and index errors on your site can be costly to your rankings.
You can’t talk technical SEO without talking about site speed.
Site speed is less a technicality and more a necessity.
In fact, it’s estimated that for every second it takes for your website to load, you can lose out on 10% of sales.
Even just a few seconds of load time can seem much longer when you’re seeking information.
Fortunately for us, Google has yet another free tool at our disposal.
It’s called PageSpeed Insights and you can check it out here.
Here’s a snapshot of what it looks like:
Okay, okay, I’m muddying the waters between technical and informative. You can work with your developer on these issues, but it might be worth poking around in your report to see what you can find and where there’s room for improvement.
But if you want actionable ways of improving your site speed, here are a few tips:
1. Add a CDN
A CDN, or Content Delivery Network, is essentially a group of servers that work together to deliver files like your HTML, images, and videos much faster when a web page loads.
Cloudflare is a widely recommended CDN and has plenty of easy integrations that make setup seamless.
2. Optimize Images
Images can make or break your site speed. They’re usually the largest files on your website.
If you run WordPress, download the Smush plugin. You can reduce the size of your images easily and for free.
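Beyond compressing files with a plugin, a couple of plain-HTML habits also help. Here’s a small sketch (the file name and dimensions are hypothetical):

```html
<!-- Explicit width/height let the browser reserve space and avoid layout shift;
     loading="lazy" defers off-screen images until the user scrolls near them -->
<img src="hero-photo.jpg" alt="Storefront at sunset"
     width="800" height="450" loading="lazy" />
```

Lazy-loading is built into modern browsers, so no plugin is required for this part.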
3. Install a Caching Plugin
Caching is the process of storing a version of your webpage and saving it to load much faster for returning users.
There are a number of caching plugins out there, and most hosting providers even offer their own suite of caching options.
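Plugins aside, part of what caching tools do under the hood is set HTTP caching headers. If your site runs on Apache, a minimal sketch in your .htaccess file might look like this (a sketch only — adjust the lifetimes to your needs, and note it assumes the mod_expires module is enabled):

```apache
# Ask browsers to cache static assets for 30 days
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 30 days"
  ExpiresByType text/css   "access plus 30 days"
</IfModule>
```

Returning visitors then load those files from their own browser cache instead of your server.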
4. Minify HTML, CSS, and JavaScript files
Your caching plugin might also offer the ability to minify existing files on your site to help reduce your page’s size and increase speed.
I know WP Fastest Cache offers this ability and highly recommend it.
Once you have the plugin installed, just go over to your settings and click on these features:
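To make “minify” concrete: minification strips comments, whitespace, and line breaks without changing what the code does. For example, this CSS:

```css
/* Button styling */
.button {
  background-color: #0066cc;
  padding: 10px 20px;
}
```

becomes this after minification:

```css
.button{background-color:#0066cc;padding:10px 20px}
```

Same rules, fewer bytes to download.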
5. Update your hosting
Your hosting is where your website lives and can definitely affect how fast your site loads.
Put simply, you get what you pay for.
When you take a look at Bluehost’s hosting plans, there might not seem to be a big difference between them, except for one major thing.
A dedicated IP address.
Instead of sharing an IP address with hundreds of thousands of other websites on similar hosting, paying a little extra each month can get you the VIP treatment.
On July 1st, 2019, Google announced that mobile-first indexing would be the default for all new websites.
This means that Google will now take into consideration how your website looks and reads on a mobile device over desktop.
Google has even provided a report in Search Console to give us feedback on our mobile usability.
Just log into Search Console again, go to the left-hand navigation, and click the “Mobile Usability” report.
There you can see which of your web pages pass the test, and which could use some mobile-friendly updates.
Another (web design) tip is to use Elementor to design your webpages for mobile.
It’s a compact, free plugin you can use for pre-built page templates, or create your own pages from scratch.
One great feature, if you decide to use this plugin, is the ability to preview your page as it appears on a mobile device.
You may have seen HTTP in front of websites before.
HTTP stands for Hypertext Transfer Protocol. And the S in HTTPS just stands for Secure.
You may have even seen a lock symbol in place of the HTTPS like this:
Back in 2018, Google set a deadline for websites to make their sites secure, or else they could see lower rankings.
Google wants to provide a safe user experience, giving their users the peace of mind that their data has a level of encryption and privacy when browsing the web.
Google even flags websites still without a secure domain like this:
With all of the easy and affordable ways to make your domain secure, there really is no reason for you not to have an HTTPS site.
Almost every hosting provider, such as Bluehost, SiteGround, and HostGator, offers a free SSL integration when you sign up for their hosting plans.
There are also a number of free SSL plugins to ensure that your site has a level of encryption.
Check out SSL Zen or WP Encryption if you’re using WordPress.
Finally, there are free online resources to get a free SSL.
Let’s Encrypt (https://letsencrypt.org/) is a free resource that provides SSL certificates at no cost. In fact, they’ve issued over a billion SSL certificates to websites worldwide.
301 redirects signal a permanent change from one page location to another, whereas a 302 redirect signals a temporary change in location.
There are a few scenarios where you may want to use redirects.
1. The main one: if you’re migrating to a new website, you’ll want to implement 301 redirects from your old URLs to your new ones.
This creates a good user experience: visitors to your old site are simply redirected to your new one, and they may not even notice the change in domain.
Another very important reason 301 redirects matter for site migrations is that they pass 90–99% of link equity.
2. A second reason you might need redirects: if you have tons of content on your site, you may need to set a canonical tag (we’ll get into that in a sec) and redirect to a new piece of content.
3. Redirects can also help your site move to an HTTPS domain.
4. You’ll also need redirects if you’re looking to fix broken pages on your site. Redirecting to a page that has similar content will satisfy the search crawlers (404 pages are definitely not a hot commodity) and lead to a better user experience.
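If your site runs on Apache, redirects are often configured in the .htaccess file. Here’s a minimal sketch covering two of the scenarios above (the paths and domain are hypothetical):

```apache
# 1. Permanently redirect a single old page to its new home
Redirect 301 /old-page/ https://example.com/new-page/

# 3. Send all HTTP traffic to the HTTPS version of the site
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

If editing server files isn’t your thing, WordPress redirect plugins accomplish the same result through a settings screen.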
If you have multiple pages that have similar content or have similar subject matter, you might want to consider adding a canonical tag to your best piece.
Canonical tags help the search crawlers identify which page or article to index (and which pages that are similar to not index).
Here are some best practices when using canonical tags (because you definitely should).
1. You should add a canonical tag to your home page.
2. When you add that canonical tag, make sure it’s pointing to the HTTPS version of the URL (if you have an SSL certificate installed).
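A canonical tag is just a single line in your page’s &lt;head&gt;. Here’s what one looks like (example.com is a placeholder domain):

```html
<head>
  <!-- Tells crawlers this HTTPS URL is the preferred version of the page -->
  <link rel="canonical" href="https://example.com/" />
</head>
```

Most WordPress SEO plugins add this tag for you, but it’s worth knowing what they’re adding.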
Not sure if your site is using canonical tags? Install the Moz Bar, go to your home page, click on your shiny new browser extension and see below:
Internal links are links on a page that direct you to another page on your website. They’re essential, as they provide utility for users as well as for search engine crawlers learning about all of your site’s pages and content.
Broken internal links are exactly what they sound like: a user clicks on a link, and it leads to a 404 page.
Whether a person or a crawler comes across a broken internal link, it’s a bad experience. The user may leave your website altogether, and a crawler may stop crawling, since the dead end suggests there are no more pages to find.
That’s why Google may penalize websites that have a long list of broken internal links. It’s a bad user experience.
There are a number of free, easy-to-use tools online that you can use to check for broken links.
See Dead Link Checker: it’s a free tool that crawls all of your pages and links to see if any lead to a dead end.
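If you’d rather script a quick check yourself, here’s a minimal sketch in Python using only the standard library. It extracts every link from a page’s HTML; a full checker would then request each URL and flag the ones returning 404, but that network step is left out to keep the sketch simple:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href value of every <a> tag it encounters."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    """Return all link targets found in an HTML string."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

page = '<p><a href="/about">About</a> and <a href="/contact">Contact</a></p>'
print(extract_links(page))  # ['/about', '/contact']
```

For anything beyond a small site, though, the hosted tools above are far more convenient.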
You can also use the free version of SEMrush and set up a project to crawl and audit your site.
They grade your site based on site performance, crawlability, the use of HTTPS, and internal linking.
It might look like a lot at first, but once you play around and feel comfortable with this tool, it can provide tremendous insight into your website. It’ll look something like this:
Your site structure (also known as site architecture) plays an important role not only in how crawlers interact with your website, but also in how humans interact with it.
A general rule of thumb: if it takes more than three clicks to reach an article, blog post, or web page from your home page, it won’t be easily found by search crawlers or by your users.
Here’s an example of what your site structure should look like:
Having a site structure that looks like this will make it easy for search crawlers to find and index all of your pages.
Bonus tip: adding internal links to your pages makes it even easier for search crawlers to find all your pages and ensures they’re being accounted for.
An XML sitemap’s main purpose is to list all of your website’s URLs in an easy-to-read XML file for search crawlers.
In a sitemap, you can also indicate how important each page is, so search crawlers know what to prioritize when going through your site.
Your home page will naturally have the highest priority. Sub-pages like your about page, service pages, and your mission statement will generally hold less value than your home page.
In sitemap terms, your home page typically gets a priority of 1.0 and your sub-pages around 0.8.
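For reference, here’s what a minimal sitemap file can look like (example.com is a placeholder domain):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/about</loc>
    <priority>0.8</priority>
  </url>
</urlset>
```

Most SEO plugins generate and update this file automatically, so you rarely have to write it by hand.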
At this point, you’re probably thinking I’m making this up.
But I assure you robots.txt is a legit thing. It works like the inverse of an XML sitemap: instead of telling crawlers which URLs to crawl and index, robots.txt tells them which URLs not to crawl.
You’ll want to use robots.txt for pages you don’t want showing up in search: internal business pages, or a thank-you page reserved for users who go through your funnel. (One caveat: robots.txt is publicly viewable and doesn’t password-protect anything, so truly confidential information needs real access controls.)
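A robots.txt file lives at the root of your domain and is just plain text. Here’s a minimal sketch (the paths and domain are hypothetical):

```
User-agent: *
Disallow: /thank-you/
Disallow: /internal-docs/

Sitemap: https://example.com/sitemap.xml
```

The `User-agent: *` line means the rules apply to all crawlers, and the optional `Sitemap` line points them to your XML sitemap.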
Structured data is last on the list because it’s mostly used for cosmetics.
Here’s an example of how structured data looks:
Structured data markup helps search crawlers grab the most important information quickly and present that data visually in the search results.
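Structured data is usually added as a JSON-LD snippet in your page’s &lt;head&gt;, using the schema.org vocabulary. Here’s a small sketch for an article (the headline, name, and date are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Technical SEO for Non-Techies",
  "author": { "@type": "Person", "name": "Your Name" },
  "datePublished": "2020-01-01"
}
</script>
```

Search engines read this block to understand what the page is about, even though visitors never see it.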
Another example of how structured data can work to your benefit is your brand-name keywords.
With structured data, you can own a major portion of the SERP (search engine results page) for your branded keywords.
Check out Moz‘ example:
As you can see, you have plenty to gain by using structured data markup.
Now the question is: how can I add it to my site?
You can check whether your website is already using structured data markup with Google’s free Structured Data Testing Tool.
Here you can see if your website is using structured data, and if so, how the code looks on your site.
It’s a little more advanced, and you may need some coding background, but you can correct your code and make adjustments in the tool. Google will tell you if the markup is properly coded.
That wraps up my spin on technical SEO. I hope I’ve made it a little less technical for you.
If there’s anything I missed, or you’d like to learn more of, leave a comment or send an email and we can continue the discussion!
Thanks for reading 🙂