SEO Technical Optimization Explanations

Big Picture Websites’ SEO program includes technical work under the following headings. Here’s our best attempt at a quick explanation of what each of these terms means.

1. Website Architecture

This is the page hierarchy and the way the various pages relate to each other. Good architecture creates a good user experience, produces useful sitelinks in search engine results, improves Googlebot crawling, and makes the site easier to manage. The hierarchy and overall structure should be logical and familiar, not too deep (fewer clicks to reach any page), with 2 – 7 main categories of pages, and ideally a balance of content within each category. The structure and the menu / navigation should match.
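As a rough sketch, a shallow, balanced hierarchy might look like this (the category and page names here are purely illustrative):

```
example.com/
├── services/
│   ├── web-design/
│   └── seo/
├── portfolio/
├── about/
└── contact/
```

Every page sits at most two clicks from the homepage, and the top-level folders map directly onto the main navigation menu.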

2. Mobile

With mobile searches and page views having overtaken desktop within the past few years, search engines have responded by placing a high value on mobile-friendliness. Is the site responsive, and does all the content display on all screen sizes? Is the site easy to use for mobile visitors? Does it have consistent navigation and user experience? All these things matter a lot to Google and Bing.
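A responsive site usually starts with the viewport meta tag plus CSS that adapts to screen width. A minimal sketch (the class name is illustrative):

```html
<head>
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    /* Two columns on wide screens, stacked on narrow ones */
    .columns { display: flex; }
    @media (max-width: 600px) {
      .columns { flex-direction: column; }
    }
  </style>
</head>
```

Without the viewport tag, mobile browsers render the page at desktop width and shrink it down, which is exactly the experience search engines flag as not mobile-friendly.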

3. Site Speed

Site loading speed is not only important to user experience – visitors will leave if pages take too long to load – but search engines measure it and reward sites that are not bloated or slowed down by various technical factors. Large images are the main culprit, and sometimes there are tradeoffs to be made if you want to have lots of high-quality images on a page. But there are ways to optimize images to very small sizes without losing too much quality.
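One common technique is to serve an appropriately sized image for each screen and defer loading images that are below the fold. A sketch in HTML (the file names and widths are illustrative):

```html
<img src="photo-800.jpg"
     srcset="photo-400.jpg 400w, photo-800.jpg 800w, photo-1600.jpg 1600w"
     sizes="(max-width: 600px) 100vw, 800px"
     loading="lazy"
     alt="Project photo">
```

The browser picks the smallest file that fits the layout, so a phone never downloads the 1600-pixel version, and `loading="lazy"` skips offscreen images until the visitor scrolls near them.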

4. Crawl Issues

Crawling is the process by which a search engine tries to visit every page of your website via a bot. Crawl errors occur when a search engine tries to reach a page but fails. We want to make sure that every link on your website leads to an actual page. The link might pass through a 301 redirect, but the page at the very end of that chain should always return a 200 OK server response.
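The idea can be illustrated with a small Python sketch that follows a redirect chain. To keep it self-contained, server responses are simulated with a dictionary rather than live HTTP requests; the URLs and statuses are made up:

```python
# Simulated server responses: URL -> (status code, redirect target or None)
responses = {
    "/old-page": (301, "/new-page"),   # permanent redirect
    "/new-page": (200, None),          # final destination: OK
    "/dead-link": (404, None),         # broken link: crawl error
}

def final_status(url, max_hops=5):
    """Follow redirects and return the status of the page at the end of the chain."""
    for _ in range(max_hops):
        status, target = responses.get(url, (404, None))
        if status in (301, 302) and target:
            url = target  # keep following the chain
            continue
        return status
    return 508  # too many hops: likely a redirect loop

print(final_status("/old-page"))   # 200 – the redirect ends at a real page
print(final_status("/dead-link"))  # 404 – this link needs fixing
```

A real link checker would issue HTTP requests instead of dictionary lookups, but the logic is the same: whatever the chain looks like, the final hop must answer 200 OK.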

5. Sitemap

Giving web crawlers a map of your site helps them discover and index all of your pages, including ones that are hard to reach by following links. The map is usually an .xml file submitted to search engines, for example through Google Search Console or Bing Webmaster Tools.
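A minimal XML sitemap looks like this (the URLs and date are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
  </url>
</urlset>
```

Each `<url>` entry lists one page; the optional `<lastmod>` date tells crawlers when the page last changed.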

6. Robots.txt

Robots.txt is a text file that instructs search engine robots how to crawl pages on the site, and whether certain user agents (web-crawling software) can or cannot crawl parts of a website. It’s important not to give web crawlers incorrect information about how to index your site.
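A simple robots.txt might allow all crawlers everywhere except a private directory, and point them at the sitemap (the paths and domain are illustrative):

```
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

`User-agent: *` means the rules apply to every crawler; a directive like `Disallow: /admin/` tells them to stay out of that part of the site.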

7. Canonicalization

The www and non-www versions of the site (and the http and https variants) should all resolve to a single canonical URL for each page. Otherwise search engines see multiple versions of the same page as duplicate content, which can split ranking signals between them and hurt the site’s performance.
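Canonicalization is typically handled with a 301 redirect from the non-preferred hostname to the preferred one, plus a canonical link element in each page’s head. A sketch (the domain is illustrative):

```html
<!-- In the <head> of https://www.example.com/services/ -->
<link rel="canonical" href="https://www.example.com/services/">
```

The canonical link tells search engines which URL is the “real” one, so any duplicate versions they encounter are credited to that single page.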

8. Semantic markup

Semantic elements are HTML elements that carry a specific meaning. For example, the <h1> tag is a semantic element: it tells search engine bots that the content within the tag is the most significant heading in the HTML document.
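A page skeleton built from semantic elements might look like this (the text content is placeholder):

```html
<body>
  <header>
    <h1>Big Picture Websites</h1>  <!-- the most significant heading -->
    <nav>
      <a href="/about/">About</a>
      <a href="/contact/">Contact</a>
    </nav>
  </header>
  <main>
    <article>
      <h2>Article title</h2>
      <p>Article content goes here.</p>
    </article>
  </main>
  <footer>Contact information</footer>
</body>
```

Tags like `<header>`, `<nav>`, `<main>`, and `<article>` describe what their content *is*, not just how it looks, which helps crawlers understand the structure of the page.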