Technical SEO

Before you get blogging, vlogging or backlinking, a search engine needs to be able to find your pages online. Once a page is found, the search engine crawls the site to build a good understanding of its key topics and their keywords. It then uses an algorithm to index the page, deciding where it fits amongst everything else.

Whilst this may seem pretty straightforward, Google doesn't read a site the way we do, and just because we can see your site, it doesn't mean Google can. We see a website as images, text, graphics and animation; a search engine sees code, text, navigation and links. This is where technical SEO comes in. Technical SEO gives your site the best chance of getting indexed by search engines by ensuring that the crawlers can get all the information they need as quickly and easily as possible.

These are some of the things you need to consider for great technical SEO.

URL Structures 

Keep it simple, stupid. Just like me, search engines hate reading complicated and lengthy strings of words in a URL. So when you're setting up your site structure, make sure to keep your URLs short and simple. The best URL is as logical as possible: little more than the keyword the page relates to. For instance, www.pulpmedia.com.au/blog/title-with-keywords.

One mistake I see often is the use of underscores in URLs to separate keywords; this is a big no-no. Dashes should be used as word separators, because an underscore in your URL will not be recognised as one. For instance, a URL containing "technical_seo" would be interpreted as "technicalseo" rather than "technical seo". So to increase crawlability and reduce the likelihood of being penalised, use hyphens or dashes to separate the keywords in your URLs.
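If you generate slugs automatically, a tiny helper makes the hyphen rule hard to get wrong. This is a minimal sketch, not part of any CMS; the function name `slugify` and the example title are my own.

```python
import re

def slugify(title):
    """Turn a post title into a short, hyphen-separated URL slug."""
    slug = title.lower()
    # Replace every run of non-alphanumeric characters with a single hyphen
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

print(slugify("Technical SEO: A Beginner's Guide"))  # technical-seo-a-beginner-s-guide
```

Note that underscores are swallowed by the same rule, so "technical_seo" comes out as "technical-seo".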

Tip: Don’t confuse this with UTM tagging, which makes your URLs ridiculously long; those URLs won’t be indexed by Google.

Page speed

One of the key metrics a search engine uses to measure the quality of a website is its load speed, and page speed needs to be considered for both desktop and mobile devices. Many things can affect page speed, including the size of images or embedded videos, but the most common mistake I see is unminified or excessive code on the website. Search engines will be harsher on mobile, as mobile sites need to be able to load even in poor-signal areas. Whilst they do add to the value of a website, animations need to be used sparingly, as they can affect load time, especially if they’re custom built. To see how Google tests your site’s speed, use Google’s PageSpeed Insights tool. If your site is running slow, check out our 4 tips to improve your page speed blog.

My final piece of advice around page speed and technical SEO is to remember that Google doesn’t care if your developer says your site loads fast for them; Google is comparing you to every other site out there and will punish or reward you accordingly.

Broken links and redirects

A broken link sends a visitor to a 404 or nonexistent page, and a broken redirect points the user to a page or resource that no longer exists. Both provide a poor user experience and also prevent search engines from crawling, and therefore indexing, your site. If your site has a large number of broken links, it won’t rank as well as it should.

Make sure that when you’re cleaning up or deleting old blog posts or pages, you know which other pages link to them, so you’re not creating a bunch of broken links. Tools like SEMrush & Moz can help you find and fix broken links on your website.
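When you do have to remove a page, a permanent (301) redirect to its closest replacement keeps both users and crawlers moving. As a rough sketch, on an Apache server with an .htaccess file this could look like the following; both paths here are hypothetical.

```apache
# .htaccess — permanently redirect a deleted post to its replacement
Redirect 301 /blog/old-deleted-post /blog/new-post-with-keywords
```

A 301 tells search engines the move is permanent, so the old URL’s ranking signals are passed along rather than lost to a 404.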

Sitemap and Robots.txt files

A sitemap gives Google the blueprint of your website, simply laying out all of the URLs on your site. Search engines use it to identify which pages to crawl and index. But what if there are pages you don’t want Google to index, like premium members-only content? That’s where an up-to-date robots.txt file comes in: it tells search engines what content not to crawl. Make sure you have both set up on your site to speed up the crawling and indexing of your website content.
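A robots.txt file is just a plain-text file at the root of your domain. A minimal hypothetical example, using a made-up /members/ path and example.com domain, might look like this:

```text
# robots.txt
User-agent: *
Disallow: /members/

Sitemap: https://www.example.com/sitemap.xml
```

The Disallow line keeps compliant crawlers out of the members-only section, and the Sitemap line points them straight at your sitemap so they don’t have to hunt for it.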

Duplicate content

Copy n paste? No! If your pages contain identical or very similar content, it confuses search engine crawlers, because they can’t determine the right version to display in search results. This applies not only to duplicate content within your own site, but also to content you’ve copied from another site. For this simple reason, search engines will penalise your site if it contains duplicate content and not show it in search results.

In cases where you’ve written a great blog article or video and want to share it online via platforms like Medium, it is important that you add a canonical link to tell Google that the content was originally published on your site.
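A canonical link is a single tag in the page’s head. As a sketch, the republished copy on Medium would point back at the original; the blog slug below is hypothetical.

```html
<!-- In the <head> of the republished copy, pointing at the original post -->
<link rel="canonical" href="https://www.pulpmedia.com.au/blog/technical-seo" />
```

With this in place, search engines treat the original URL as the one to index and rank, rather than splitting signals between the two copies.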
