Technical SEO Audit Report for Fast Cover



May 2017


This document has been prepared to provide SEO best practice recommendations for Fast Cover to consider and reference. Sub-sections for each part are itemised with explanations and the latest best practice recommendations to ensure the Fast Cover website performs well in search engines. Below is a summary of the issues found and their associated priority levels.

Website Technologies Profile

SSL: Thawte SSL

Email: Zendesk, Google Apps for Business, SPF


Ads: AdRoll, Google Remarketing, FB Audience

JS Libraries: Html5Shiv, jQuery

Analytics: GA, Bing, FB Pixel, Ve Interactive

CDN: Google Static Content, jQuery CDN

Hosting: Amazon Sydney Region, Dedicated Hosting

Though the overall SEO strategy is similar for every website, its implementation varies greatly based on the technologies the website uses. Understanding these technologies is therefore the first step in assessing a site's SEO issues from a technical perspective.

High Priority SEO issues             

Page Speed


Based on data from Alexa, pages on fast-loading sites rank significantly higher than pages on slow-loading sites, and Google confirmed in 2010 that it uses site speed as an official ranking signal. The site is not performing well in this area.

Google's PageSpeed Insights tool currently scores the site as only average on both desktop and mobile devices.

Further analysis of your homepage shows it took an average of 6.5s for the page to load in full, with 124 requests and a total size of 1.31 MB. With optimisation, load time can be brought under 1.5s by reducing page size to about 600 KB and bringing requests below 100.


  • Investigate changing server/hosting plans or providers; this could improve performance significantly.

  • Minimize redirects – remove the 45 chained redirects if possible, or at least reduce them.
  • Leverage browser caching for the 19 cacheable resources present.
  • Defer parsing of JavaScript – 473.6 KiB of JavaScript is parsed during initial page load. Defer parsing to reduce blocking of page rendering.

  • Add Expires headers – there are 16 static components without a far-future expiration date.
  • Make fewer HTTP requests – this page has 26 external JavaScript files and 3 external stylesheets. Try combining each group into one file.
  • Use or update a CDN – there are 15 static components not served from a CDN.
  • Reduce DNS lookups – the components are split over more than 4 domains.
  • Minify JS & CSS – there are 6 components that can be minified.
  • Use cookie-free domains – there are 15 components that are not cookie-free.
  • Avoid URL redirects – there are 3 redirects.
  • Reduce the number of DOM elements – there are 1006 DOM elements on the page.
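The caching recommendations above (Expires headers, browser caching) can be audited with a short script. This is a minimal sketch: given a resource's response headers, it reports whether a far-future expiry is set via `Cache-Control: max-age` or an `Expires` header. The 30-day threshold is an illustrative assumption, not a Google requirement.

```python
from datetime import datetime, timedelta, timezone
from email.utils import parsedate_to_datetime

def has_far_future_expiry(headers, min_days=30):
    """Return True if the response headers set an expiry at least
    `min_days` ahead, via Cache-Control max-age or an Expires header.
    The 30-day default is an illustrative threshold."""
    for directive in headers.get("Cache-Control", "").split(","):
        directive = directive.strip()
        if directive.startswith("max-age="):
            try:
                max_age = int(directive.split("=", 1)[1])
            except ValueError:
                continue
            return max_age >= min_days * 86400
    expires = headers.get("Expires")
    if expires:
        try:
            expiry = parsedate_to_datetime(expires)
        except (TypeError, ValueError):
            return False
        return expiry - datetime.now(timezone.utc) >= timedelta(days=min_days)
    return False

# Static assets should carry a far-future expiry; dynamic HTML usually should not.
print(has_far_future_expiry({"Cache-Control": "public, max-age=31536000"}))  # True
print(has_far_future_expiry({"Cache-Control": "no-cache"}))                  # False
```

Running this over the headers of each of the 16 flagged static components would confirm which still lack a far-future expiration date after the fix.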

Accessibility / Findability Issues




Indexation Status

More indexed pages don’t guarantee a higher rank, but indexation is indirectly a big contributing factor. A larger number of indexed pages allows for:

  • More organic traffic
  • The opportunity to rank for more keywords
  • The opportunity to earn a larger number of backlinks

Pages Currently Indexed: 281 Pages

Ensure the correct sitemaps are submitted, and monitor the site's crawl status for 6 months.

HTML Sitemap

A manual check shows there is no HTML sitemap. Unlike the XML sitemap, an HTML sitemap lives on a web page rather than in an XML file. HTML sitemaps provide an easily navigable view for website users. In addition, they provide a page that can evenly distribute link equity to deep, less-crawled pages on your website.

Consider adding one for the benefit of visitors.
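An HTML sitemap can be generated from the existing page inventory. A minimal sketch, where the section names and URLs are hypothetical placeholders:

```python
def html_sitemap(sections):
    """sections: {section_name: [(title, url), ...]} -> nested <ul> markup
    that can be dropped into an HTML sitemap page template."""
    parts = ["<ul>"]
    for section, links in sections.items():
        parts.append(f"<li>{section}<ul>")
        parts += [f'<li><a href="{url}">{title}</a></li>' for title, url in links]
        parts.append("</ul></li>")
    parts.append("</ul>")
    return "\n".join(parts)

# Hypothetical site structure; substitute the real page inventory.
print(html_sitemap({
    "Policies": [("Comprehensive", "/travel-insurance/comprehensive/"),
                 ("Basics", "/travel-insurance/basics/")],
    "Help": [("FAQs", "/faq/")],
}))
```

Regenerating this page from the same source that feeds the XML sitemap keeps the two in sync.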

Success 2xx

Out of a random pool of URLs, only 65% returned a 200 OK status.

Perform an exhaustive site scan for broken links, redirect loops, etc.

Redirections 3xx

Out of a random pool of URLs, 25% returned a 3xx status, which is quite high.

Perform an exhaustive site scan and replace all instances of outdated URLs.
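The 2xx/3xx checks above can be automated: crawl a URL sample, record each status code, and bucket the results. A minimal sketch of the summarising step (the crawl itself is omitted, and the sample data is hypothetical, chosen to mirror the 65%/25% split reported above):

```python
from collections import Counter

def summarise_status_codes(results):
    """Bucket crawl results ({url: status_code}) into 2xx/3xx/4xx/5xx
    percentage shares so redirects and errors are visible at a glance."""
    buckets = Counter(f"{code // 100}xx" for code in results.values())
    total = len(results)
    return {bucket: round(100 * count / total, 1)
            for bucket, count in sorted(buckets.items())}

# Hypothetical 20-URL crawl sample: 65% OK, 25% redirected, 10% broken.
sample = {f"/page-{i}": code
          for i, code in enumerate([200] * 13 + [301] * 5 + [404] * 2)}
print(summarise_status_codes(sample))  # {'2xx': 65.0, '3xx': 25.0, '4xx': 10.0}
```

Re-running the same summary after fixing the outdated URLs gives a quick before/after comparison.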

Image XML Sitemap

A Google search shows many of the site's images are indexed. However, a manual check shows the image XML sitemap is missing. An image sitemap helps search engines quickly identify image content and index it in image search. Depending on the query, people actively bypass organic results in favour of image results, so getting your images ranked in image-based searches can significantly grow organic traffic.

  • Create and submit an image XML sitemap to Google
  • Reference the image sitemap in robots.txt
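An image XML sitemap pairs each page URL with its image URLs using Google's image sitemap namespace. A minimal generator sketch; the example URLs are placeholders:

```python
from xml.etree import ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMAGE_NS = "http://www.google.com/schemas/sitemap-image/1.1"

def build_image_sitemap(pages):
    """pages: {page_url: [image_url, ...]} -> image sitemap XML string."""
    ET.register_namespace("", SITEMAP_NS)
    ET.register_namespace("image", IMAGE_NS)
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for page_url, image_urls in pages.items():
        url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = page_url
        for image_url in image_urls:
            image = ET.SubElement(url, f"{{{IMAGE_NS}}}image")
            ET.SubElement(image, f"{{{IMAGE_NS}}}loc").text = image_url
    return ET.tostring(urlset, encoding="unicode")

# Placeholder URLs; substitute the site's real pages and images.
xml = build_image_sitemap({
    "https://www.example.com/": ["https://www.example.com/img/hero.jpg"],
})
print(xml)
```

The resulting file is submitted in Search Console and referenced with a `Sitemap:` line in robots.txt.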

Content Checkup & Comparison

We took a sample of 250 pages from your site for analysis. Here are some areas that need work.




All comparison figures below are benchmarked against the internet average.



Average Page Size



The average page size for your site is larger than that of 62% of all other sites.

  • Implement an SOP for publishing any new pages on the site.
  • Strategically target the top 20% of pages by size.

Average page load time



Your site takes longer to load than 77% of all other sites.

  • Conduct a speed test for every new page published on the site.
  • Fixing just the slowest 10% of pages will significantly improve user experience and rankings.

Words Per Page



The number of words per page for your site is higher than that of 94% of all other sites.

  • Implement a plan to periodically update old content.
  • Automate social media sharing.
  • Add calls to action.

Text to HTML ratio



The text-to-HTML ratio for your site is greater than that of 86% of all other sites.

  • Cleaning up the markup will reduce invalid HTML issues.
  • Efficiently coded pages improve page speed and are easier for crawlers to read.
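The text-to-HTML ratio can be measured by stripping markup and comparing the length of the visible text to the length of the full source. A minimal sketch using Python's standard HTML parser, run here on a tiny hypothetical page:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping script and style blocks."""
    def __init__(self):
        super().__init__()
        self.text = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.text.append(data)

def text_to_html_ratio(html):
    """Percentage of the page source that is visible text."""
    parser = TextExtractor()
    parser.feed(html)
    visible = "".join(parser.text).strip()
    return round(100 * len(visible) / len(html), 1)

page = "<html><head><script>var x=1;</script></head><body><p>Hello world</p></body></html>"
print(text_to_html_ratio(page))
```

Running this across the 250-page sample would show which templates carry the most markup overhead.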

Duplicate Content



The amount of duplicate content on your site is higher than that of 86% of all other sites.

  • Keep a check on pages auto-generated via tagging, browsable search, and variable-driven paragraphs of text.

Internal Links per Page



The number of internal links per page for your site is higher than that of 82% of all other sites.

  • The number of internal links is very high and needs to be reduced.

External Links per Page



The number of external links per page for your site is higher than that of 78% of all other sites.

  • There appears to be extensive external linking on average; work on reducing this.

Inbound Links per Page



The number of inbound links per page for your site is higher than that of 85% of all other sites.

  • Conduct further analysis to check that the backlinks are not from spammy or shady websites.
  • Bad backlinks can be disavowed via Google Search Console.

Duplicate content


There is a serious duplicate content issue (identical content found on two or more pages) on your site. The issue arises from the way the website is set up: the current method of serving content to users causes content to repeat across pages, so a better content delivery method is needed. Search engines may penalise your site if this is not rectified.


You may want to either remove the duplicate content so it appears on only one of your pages, or modify the content on the various pages to ensure that each page is unique.

The policy comparison should be generated dynamically by allowing the visitor to fill in their requirements. The current approach of listing plan benefits on every page is a severe issue.
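Identical or near-identical pages can be detected by fingerprinting: normalise each page's text (collapse whitespace, lower-case) and hash it, then group URLs that share a hash. A minimal sketch with hypothetical page data:

```python
import hashlib
import re

def content_fingerprint(body_text):
    """Normalise whitespace and case, then hash, so near-identical
    boilerplate pages collapse to the same fingerprint."""
    normalised = re.sub(r"\s+", " ", body_text).strip().lower()
    return hashlib.sha1(normalised.encode("utf-8")).hexdigest()

def find_duplicates(pages):
    """pages: {url: body_text} -> list of URL groups sharing a fingerprint."""
    groups = {}
    for url, body in pages.items():
        groups.setdefault(content_fingerprint(body), []).append(url)
    return [urls for urls in groups.values() if len(urls) > 1]

# Hypothetical crawl sample: two plan pages repeat the same benefits text.
pages = {
    "/plan-a": "Plan benefits:  unlimited medical cover",
    "/plan-b": "plan benefits: unlimited   medical cover",
    "/about":  "About our company",
}
print(find_duplicates(pages))  # [['/plan-a', '/plan-b']]
```

Each flagged group is then a candidate for consolidation, canonical tags, or dynamic generation as recommended above.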

Thin content

A number of URLs have a very low amount of functional content and could be considered 'thin content'. While this is completely natural for About and Contact pages, as well as team profile URLs, there is no reason for the others to be indexed in Google.

A potential solution is consolidating particular types of pages into one, e.g. merging all /faq/ URLs and the content of their subpages into one main FAQs page. The same applies to non-functional admin login/access URLs, which could either be no-indexed or executed through JavaScript instead of using dedicated URLs.

Robots.txt file


The robots.txt file is present but raises a warning; see the screenshot below.
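After fixing the warning, the file can be sanity-checked programmatically. A minimal sketch using Python's standard robots.txt parser; the rules shown are hypothetical, not the site's actual file, which should be fetched from its live URL and checked the same way:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration only.
robots_txt = """\
User-agent: *
Disallow: /admin/
Sitemap: https://www.example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Confirm important pages stay crawlable and private areas stay blocked.
print(parser.can_fetch("Googlebot", "https://www.example.com/"))        # True
print(parser.can_fetch("Googlebot", "https://www.example.com/admin/"))  # False
```

A check like this can run in deployment pipelines so a bad robots.txt edit never silently blocks the whole site.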

Medium Priority SEO issues             



  • On URLs like the one shown below, the breadcrumb path is not displayed.
  • A breadcrumb is a trail, or secondary navigation, clearly visible to website users to help them navigate your website.
  • Search engines crawl from page to page through links; breadcrumbs reinforce page hierarchy and navigation for them.
  • Breadcrumbs also help users to navigate content, particularly on eCommerce websites with many product categories and high page depth.


  • Consider adding breadcrumbs for subpages
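Beyond the visible breadcrumb links, the trail can also be marked up as schema.org BreadcrumbList structured data so search engines can display it in results. A minimal JSON-LD generator sketch; the trail shown is hypothetical:

```python
import json

def breadcrumb_jsonld(trail):
    """trail: ordered [(name, url), ...] -> schema.org BreadcrumbList JSON-LD,
    ready to embed in a <script type="application/ld+json"> tag."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(trail, start=1)
        ],
    }, indent=2)

# Hypothetical trail for a policy subpage.
print(breadcrumb_jsonld([
    ("Home", "https://www.example.com/"),
    ("Travel Insurance", "https://www.example.com/travel-insurance/"),
]))
```

The same trail data can drive both the visible breadcrumb and the structured data, so the two never drift apart.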

URL Issues





Hyphens used as default delimiter in URLs

  • Image file names contain underscores and are not named descriptively.
  • Using underscores or commas in your URL structure causes search engines to misread URL strings.
  • Search engines read hyphens ("-") as spaces, so using them ensures your content is read correctly.

Older images can be renamed to the default URL structure using hyphens ("-").

Overall URL friendliness (short and easy to share)

Consider updating the default PDF URL structure to use hyphens ("-") and making the names short and meaningful.

Overall URL optimisation (usage of target keywords) – it is unclear what purpose a 'travel suitcase' keyword serves in this URL.

Excessive keywords that don't relate to the core topic can be replaced or removed.
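The renaming work above can be scripted: a small "slugify" helper converts underscores and spaces to hyphens, lower-cases the result, and drops stray punctuation. A minimal sketch:

```python
import re

def slugify(name):
    """Rewrite a page, image, or PDF name as a short hyphen-delimited slug:
    underscores and spaces become hyphens, the result is lower-cased,
    and stray punctuation is dropped."""
    slug = re.sub(r"[_\s]+", "-", name.lower())
    slug = re.sub(r"[^a-z0-9-]", "", slug)
    return re.sub(r"-{2,}", "-", slug).strip("-")

print(slugify("Annual Multi_Trip Policy"))  # annual-multi-trip-policy
```

Note that renaming a live URL must be paired with a 301 redirect from the old address to preserve any equity it has earned.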

Expired Content

There are expired, outdated pages, such as the examples below.

Analyse the site in more detail to identify all outdated pages and update them with current information.

Page Title Optimisation


From our research we can point out some important issues.

  • The first thing observed is that the site does not follow a set methodology for adding <title>, <h1>, <h2>, and <h3> tags.
  • There is also a large number of titles that are very short, and some are identical to their H1 tags.


  • It is important to keep implementation consistent across your entire site to maximise usability.
  • A clear method needs to be set to present data in a more organised manner to users as well as search engine crawlers.
  • Map out relevant keywords and manually create page titles for individual URLs, with core pages on the site completed as a priority.
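Title problems of both kinds (too short, duplicated) can be surfaced with a simple audit over a {URL: title} export. A minimal sketch; the 30-60 character window is a common rule of thumb rather than a fixed limit, and the sample titles are hypothetical:

```python
def audit_titles(titles, min_len=30, max_len=60):
    """titles: {url: title}. Flag titles outside the typical display
    window (~30-60 characters) and titles shared by several URLs."""
    issues = {"too_short": [], "too_long": [], "duplicates": []}
    seen = {}
    for url, title in titles.items():
        if len(title) < min_len:
            issues["too_short"].append(url)
        elif len(title) > max_len:
            issues["too_long"].append(url)
        seen.setdefault(title, []).append(url)
    issues["duplicates"] = [urls for urls in seen.values() if len(urls) > 1]
    return issues

# Hypothetical title export.
sample = {
    "/": "Travel Insurance Quotes Online | Example Insurer",
    "/faq": "FAQs",
    "/contact": "FAQs",
}
print(audit_titles(sample))
```

The same pattern extends to H1/H2 checks once those tags are exported alongside the titles.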

Page Headings h1, h2,..h6


  • There are quite a few pages with duplicate titles.
  • Only 44% of pages have multiple H2 tags.


  • It is advisable to use multiple H2s to structure a page into distinct subheadings.
  • It is not advisable to have multiple pages targeting the same content. Manually check these pages to find the one that is relevant, and delete the others after setting permanent (301) redirects.

Link Profile




Backlink Domains














Linked TLDs

Links by Country


  • USA – 55% (112 domains)
  • Australia – 14% (29 domains)
  • India – 7% (15 domains)
  • Germany – 2% (5 domains)
  • Russia – 2% (5 domains)

Anchors Analysis


  • 47% of links have no anchor text assigned


  • Devise a link acquisition / outreach strategy to start obtaining more contextual and relevant references to the site.
  • Review the link profile in more depth to disassociate from any low quality backlinks by adding them to a Disavow file in Search Console.

Low Priority SEO issues           

Meta Descriptions


We tested the meta descriptions against various benchmarks; they are currently not fully optimised. While descriptions have no direct impact on rankings or SEO overall, optimised descriptions can contribute to improved organic click-through rates and dwell time on the site.


  • As a first priority, fix the meta descriptions that are too long.
  • As a long-term strategy, establish a set procedure for creating meta descriptions so that any writer working on the site can follow it and bring uniformity to the site-wide implementation.
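As a stopgap before a proper rewrite, overlong descriptions can be trimmed in bulk at a word boundary. A minimal sketch; the 155-character ceiling is an approximation of Google's display limit, not an exact figure:

```python
def trim_description(text, max_len=155):
    """Trim an overlong meta description at a word boundary, appending an
    ellipsis so the snippet is not cut mid-word. 155 characters is an
    approximation of the display limit, not an exact figure."""
    if len(text) <= max_len:
        return text
    cut = text[:max_len].rsplit(" ", 1)[0]
    return cut.rstrip(" ,.;:") + "…"

print(trim_description("Compare our travel insurance plans."))  # returned unchanged
```

Trimmed descriptions should still be rewritten properly later, since the cut point is mechanical rather than editorial.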

Image Optimization


  • 12.35% of a sample of 81 images from the site were larger than 100 KB.
  • 92% of images are missing alt text.


  • Optimise all website media with relevant information (descriptive file names, alt text, compression) to improve UX and crawlability.
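Images missing alt text can be listed by parsing each page's HTML. A minimal sketch using Python's standard parser; the markup shown is hypothetical:

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Collect the src of every <img> whose alt attribute is missing or empty."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing_alt.append(attrs.get("src", "?"))

# Hypothetical markup; feed each crawled page's HTML in the same way.
auditor = AltAudit()
auditor.feed('<img src="/img/hero.jpg">'
             '<img src="/img/logo.png" alt="Company logo">')
print(auditor.missing_alt)  # ['/img/hero.jpg']
```

Run across the crawl, this produces a per-page worklist for closing the 92% alt-text gap.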