Technical SEO audit

http://www.gatherby.org/

1st June 2017

Summary

This document has been prepared to provide SEO best practice recommendations for http://www.gatherby.org/ to consider and reference. Sub-sections for each part are itemised, with explanations and the latest best practice recommendations, to ensure gatherby.org performs well in search engines. Below is a summary of the issues found and their associated priority levels.

Website Technologies Profile             

Web Server: Apache 2.4.7, Phusion Passenger

OS: Ubuntu

SSL: None

Email: Zendesk, Google Apps for Business, SPF

Frameworks: Ruby on Rails, Twitter Bootstrap

Ads: None

JS Libraries: jQuery, Modernizr

Analytics: Google Analytics, Quantcast

Caching: RackCache

Widgets: Facebook, Google Plus, Twitter

Fonts: Google Fonts, Typekit

Though the SEO strategy is broadly the same for every website, its implementation varies greatly depending on the technologies the website uses. Understanding these technologies is therefore the first step in assessing a site's SEO issues from a technical perspective.

High Priority SEO issues             

Page Speed

Observations:

Based on data from Alexa, pages on fast-loading sites rank significantly higher than pages on slow-loading sites, and Google confirmed in 2010 that it uses site speed as an official ranking signal. The site is not performing well in this area.

As currently scored by the Google PageSpeed Insights tool, the website rates poorly on both desktop and mobile devices.

PageSpeed Score

YSlow Score

Further analysis of your homepage shows it took an average of 3.5s for the page to load in full. There are 66 requests and the total size is 3.81 MB. With optimisation, load time can be brought under 1.5s by reducing page size to about 600KB and bringing requests below 50.

Recommendation

  • Serve scaled images – Serving properly scaled images can reduce page size significantly. The following images are resized in HTML or CSS; serving scaled versions could save 919.2KiB (an 88% reduction).

http://d3n8a8pro7vhmx.cloudfront.net/gatherby/pages/1159/meta_images/original/CutComb-SB-Small.png

http://d3n8a8pro7vhmx.cloudfront.net/gatherby/pages/1331/meta_images/original/Food_Security_Earth.png

http://d3n8a8pro7vhmx.cloudfront.net/themes/56985039221393a6b5000001/attachments/original/1455237109/bees-icon-white.png

http://d3n8a8pro7vhmx.cloudfront.net/themes/56985039221393a6b5000001/attachments/original/1455237113/directories-icon-white.png

http://d3n8a8pro7vhmx.cloudfront.net/themes/56985039221393a6b5000001/attachments/original/1455237105/act-icon-white.png

http://d3n8a8pro7vhmx.cloudfront.net/themes/56985039221393a6b5000001/attachments/original/1455237116/learn-icon-white.png

  • Serve resources from consistent URLs – The following resources have identical contents but are served from different URLs. Serve these resources from a consistent URL to save 1 request and 149.0KiB.

http://d3n8a8pro7vhmx.cloudfront.net/gatherby/pages/1262/meta_images/original/20_.jpg?1463554977

https://d3n8a8pro7vhmx.cloudfront.net/gatherby/pages/1136/attachments/original/1463553763/20_.jpg

  • Optimize Images: Optimize the images to reduce their size by 101.6KiB (22% reduction).
  • Minimize redirects – Remove the 16 chained redirects if possible, or at least minimize them.
  • Leverage browser caching for the 9 cacheable resources present.
  • Defer parsing of JavaScript – 612.2KiB of JavaScript is parsed during the initial page load. Defer parsing to reduce blocking of page rendering.
  • Enable Gzip compression – Enable compression for the following resources to reduce their transfer size by 51.9KiB (a 79% reduction); see the Apache configuration sketch after this list.
  • Add Expires headers – There are 37 static components without a far-future expiration date.
  • Make Fewer HTTP Requests 
  • Homepage has 15 external JavaScript files. Try combining them into one.
  • Homepage has 6 external stylesheets. Try combining them into one.
  • Homepage has 7 external background images. Try combining them with CSS sprites.
  • Use or update CDN – There are 10 static components that are not served from a CDN.
  • Reduce DNS lookups – The components are split across more than 4 domains.
  • Minify JS & CSS – There are 4 components that can be minified
  • Avoid URL Redirects – There are 9 redirects
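
The compression, caching, and Expires-header items above can usually be addressed in the web server configuration. The snippet below is a minimal sketch only, assuming the mod_deflate and mod_expires modules are available on the site's Apache 2.4 server; the exact directives and cache lifetimes should be adapted to the actual setup.

    # Enable Gzip compression for text-based resources (requires mod_deflate)
    <IfModule mod_deflate.c>
        AddOutputFilterByType DEFLATE text/html text/css application/javascript application/json image/svg+xml
    </IfModule>

    # Far-future expiration dates for static assets (requires mod_expires)
    <IfModule mod_expires.c>
        ExpiresActive On
        ExpiresByType image/png              "access plus 1 year"
        ExpiresByType image/jpeg             "access plus 1 year"
        ExpiresByType text/css               "access plus 1 month"
        ExpiresByType application/javascript "access plus 1 month"
    </IfModule>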

Accessibility / Findability Issues

Observation

Recommendation

Indexation Status

More indexed pages do not directly mean a higher rank, but indirectly they are a big contributing factor. A larger number of indexed pages allows for:

  • More organic traffic
  • The opportunity to rank for more keywords
  • The opportunity to attract a larger number of backlinks

Pages Currently Indexed: 244 Pages

Make sure the correct sitemaps are submitted, and monitor the site's crawl status for six months.

HTML Sitemap

A manual check shows there is no HTML sitemap. Unlike an XML sitemap, an HTML sitemap lives on a web page rather than in an XML file. HTML sitemaps provide an easily navigable overview for website users. In addition, they provide a single page that can evenly distribute link equity to deep, less frequently crawled pages on your website.

Consider adding one for the benefit of visitors.
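
A minimal sketch of what such a page could look like; the section names and URLs below are placeholders, not actual site URLs:

    <!-- Example HTML sitemap page: a plain list of links grouped by section -->
    <h1>Site Map</h1>
    <ul>
      <li><a href="/about">About</a></li>
      <li><a href="/learn">Learn</a>
        <ul>
          <li><a href="/learn/example-article">Example article</a></li>
        </ul>
      </li>
      <li><a href="/contact">Contact</a></li>
    </ul>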

Redirections 3xx

Out of a random pool of URLs, 8.26% return a 3xx status. These are outdated URLs that are redirected to new URLs. This figure should be below 7%; if more than 15% of pages are redirected, it needs to be investigated.

Perform an exhaustive site scan and replace all internal references to outdated URLs with their final destination URLs.

Image XML Sitemap

A Google search shows that many of the site's images are indexed on Google, but https://gatherby.org/sitemap_image.xml was not found. An image sitemap helps search engines quickly identify image content and index it in image searches. Depending on the query, people often bypass organic results in favour of image results, so getting your images ranked in image-based searches can significantly boost organic traffic.

  • Create and submit an image XML sitemap to Google (a minimal sketch follows this list).
  • Reference the image sitemap in robots.txt.
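
A minimal sketch of an image sitemap entry and the corresponding robots.txt line; the page and image URLs shown are illustrative only:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
      <url>
        <loc>http://www.gatherby.org/example-page</loc>
        <image:image>
          <image:loc>http://d3n8a8pro7vhmx.cloudfront.net/gatherby/example-image.png</image:loc>
        </image:image>
      </url>
    </urlset>

    # Added to robots.txt
    Sitemap: https://gatherby.org/sitemap_image.xml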

Content Checkup & Comparison

We took a sample of 388 pages from your site for analysis. Here are some areas that need work.

For each metric below, we show your site's result, the internet average, an observation, and our recommendations.

Average Page Size

Result: 41KB (internet average: 50KB)

Observation: The average page size for your site is smaller than that of 60% of all other sites.

Recommendations:

  • Implement an SOP for publishing any new pages on the site.
  • Strategically target the top 20% of pages by size.

Average Page Load Time

Result: 411ms (internet average: 720ms)

Observation: Your site is slower than 26% of all other sites we tested.

Recommendations:

  • Conduct a speed test for every new page published on the site.
  • Fixing just the slowest 10% of pages will significantly improve user experience and rankings.

Words Per Page

Result: 558 (internet average: 595)

Observation: The number of words per page for your site is higher than that of 45% of all other sites.

Recommendations:

  • Implement a plan to periodically update old content.
  • Automate social media sharing.
  • Add calls to action.

Text to HTML Ratio

Result: 7% (internet average: 6%)

Observation: The text to HTML ratio for your site is greater than that of 54% of all other sites.

Recommendations:

  • Improving this ratio by trimming excess markup will also reduce invalid HTML issues.
  • Efficiently coded pages improve page speed and are easier for crawlers to read.

Duplicate Content

Result: 27% (internet average: 15%)

Observation: The amount of duplicate content on your site is higher than on 76% of all other sites.

Recommendation:

  • Keep a check on auto-generated pages created via tagging, browsable search, and variable-driven paragraphs of text.

Internal Links per Page

Result: 57 (internet average: 26)

Observation: The number of internal links per page for your site is higher than that of 78% of all other sites.

Recommendation:

  • The number of internal links per page is very high and needs to be reduced.

External Links per Page

Result: 4 (internet average: 5)

Observation: The number of external links per page for your site is higher than that of 36% of all other sites.

Recommendation:

  • There appears to be the right amount of external links.

Inbound Links per Page

Result: 50 (internet average: 20)

Observation: The number of inbound links per page for your site is higher than that of 81% of all other sites.

Recommendations:

  • Conduct further analysis to check whether any backlinks come from spammy or shady websites.
  • Bad backlinks can be disavowed via Google Search Console.

Duplicate content

Observation

Duplicate content is identical content found on two or more pages on your site. Search engines may penalize your site if too much duplicate content is found. Your site was found to have an above-average amount of duplicate content.

Recommendation

You may want to either remove the duplicate content so it appears on only one of your pages, or modify the content on the various pages to ensure that each page is unique.
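
Where duplicate pages need to stay live (for example, auto-generated tag or search pages), another commonly used option is a canonical tag pointing search engines at the preferred version. A minimal sketch, with a placeholder URL:

    <!-- Placed in the <head> of the duplicate page, pointing to the preferred URL -->
    <link rel="canonical" href="http://www.gatherby.org/preferred-page" />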

Thin content

42% of URLs have a very low amount of functional content and could be considered 'thin content'. While this is completely natural for About and Contact type pages, as well as team profile URLs, there is no reason for the others to be indexed in Google.

A potential solution is to consolidate particular types of pages, for example merging all team profile URLs and their content into one main Team page. The same applies to non-functional admin login/access URLs, which could either be no-indexed or executed through JavaScript instead of using dedicated URLs.
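
A minimal sketch of how no-indexing such a URL could look, using a robots meta tag in the page <head>:

    <!-- Tells search engines not to index this URL while still following its links -->
    <meta name="robots" content="noindex, follow" />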

Medium Priority SEO issues             

Structured Data

Observation – There is no structured data on this site. Structured data is a snippet of code (HTML) placed in the <head> of the page that tells search engines what your page is about. These markups come in different forms (local business, video, etc.) and should be chosen based on your website and content type.

Recommendation – Add structured data markup. It makes it easier for Googlebot to get to the meat of what your page is about without having to read complex code. For product reviews, for example, a rich snippet with star ratings can show up on the SERP, which greatly increases click-through rate.
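
A minimal sketch of Organization markup in JSON-LD; the name, URL, and social profile links below are assumptions and should be replaced with the organisation's actual details:

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "Organization",
      "name": "Gatherby",
      "url": "http://www.gatherby.org/",
      "sameAs": [
        "https://www.facebook.com/example",
        "https://twitter.com/example"
      ]
    }
    </script>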

URL Issues

Check Item

Observation

Recommendation

Hyphens used as default delimiter in URLs

  • Many URLs use underscores as delimiters, which is not good from an SEO point of view.
  • The default URL structure should use hyphens ("-").
  • Using underscores or commas as delimiters causes search engines to read URL strings incorrectly.
    Search engines read hyphens as spaces, so using hyphens ensures your content will be read the right way.

Pages can be updated to the default URL structure using hyphens ("-").

Older pages should be redirected to the new pages (a minimal redirect sketch follows).

Sitemap should be resubmitted.
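
A minimal sketch of a permanent (301) redirect from an old underscore URL to its hyphenated replacement, assuming the Apache mod_alias module is available; the paths shown are illustrative placeholders:

    # Permanently redirect the outdated underscore URL to the new hyphenated URL
    Redirect 301 /example_old_page /example-old-page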

Overall URL friendliness (short and easy to share)

While most URLs look fine, some URLs don't contain the correct keyword for the content on the page. For example: http://www.gatherby.org/convenient_analysis

'Convenient analysis' is a very generic term, whereas the page is about a more focused topic: Australian Leptospermum honeys. Such pages need a focus keyword and a matching URL.

Missing Meta Description

Observation: A large number of pages (69%) don't have meta descriptions. Every page should have a meta description, and each one should be unique, with no duplicates. Meta descriptions matter because they act as descriptors for your pages: they tell the user what content to expect when they click on the result. While they have no direct ranking impact, they increase SERP click-through rate, which is a ranking factor.

We tested the meta descriptions against various benchmarks; they are currently not fully optimised. While descriptions have no direct impact on rankings and SEO overall, optimised descriptions contribute to improved organic click-through rates and dwell time on the site.

Recommendation

  • As a priority, initially fix the meta descriptions that are too long.
  • As a long-term strategy, establish a set procedure for creating meta descriptions so that any writer working on the site can follow it and keep the site-wide implementation uniform.

Recommendation: Create a spreadsheet listing all top pages and construct proper titles and meta descriptions for each page.
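
A minimal sketch of a page <head> with a unique title and meta description; the wording below is purely illustrative:

    <head>
      <!-- Unique, keyword-focused title, ideally under about 60 characters -->
      <title>Australian Leptospermum Honey | Gatherby</title>
      <!-- Unique description, roughly 150-160 characters, written to encourage the click -->
      <meta name="description" content="Learn how Australian Leptospermum honeys are produced and why they matter for bees and food security." />
    </head>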

Page Title Optimisation

Observation

From our research we can point out some important issues.

  • The site does not follow a set methodology for adding <title>, <h1>, <h2>, and <h3> tags properly.
  • There are a large number of duplicate titles, indicating that the content could be consolidated.
  • There are also titles that are very short.

Recommendation

  • It is important to keep the implementation consistent across your entire site to maximise usability.
  • A clear method needs to be set so that data is presented in a more organised manner to users as well as to search engine crawlers.
  • Map out relevant keywords and manually create page titles for individual URLs, with core pages on the site completed as a priority.

Page Headings h1, h2,..h6

Observations

  • 83% of pages have duplicate titles.
  • 73.02% of pages have multiple H1 tags.

Recommendations 

  • It is advisable not to have multiple <h1> tags, so search engines can be sure about the topic of the page (a sketch of a clean heading hierarchy follows this list).
  • It is not advisable to have multiple pages targeting the same content. Manually check these pages to find the one that is most relevant and delete the others after setting up permanent redirects.
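
A minimal sketch of a consistent heading structure with a single <h1> per page; the heading text is a placeholder:

    <h1>Main topic of the page (one per page)</h1>
      <h2>First sub-topic</h2>
        <h3>Supporting detail</h3>
      <h2>Second sub-topic</h2>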

Low Priority SEO issues           

Social Issue

Observation:

  1. We did a manual check on the Twitter Card snippet (custom search in SF for meta name="twitter:"). There is no meta name="twitter:…" tag on the site.
  2. Google+ account: The site has no Google+ account associated with it. Formerly Google+ Pages, this is your online business centre for everything Google.

Recommendation:

  1. Add Twitter Card meta tags correctly (a minimal sketch follows this list).
  2. Getting set up and verified as a Google business is a massive part of SEO for local businesses; without it, you can't rank in local search results. For larger businesses, it's also a verified review platform and allows you to get set up in Google Maps.
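
A minimal sketch of Twitter Card markup in the page <head>; the handle, title, description, and image URL are placeholders to be replaced with real values:

    <meta name="twitter:card" content="summary_large_image" />
    <meta name="twitter:site" content="@example_handle" />
    <meta name="twitter:title" content="Example page title" />
    <meta name="twitter:description" content="Example page description." />
    <meta name="twitter:image" content="http://www.gatherby.org/example-image.png" />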

Image Optimization

Observation

  • 19% of a sample of 188 images from the site are larger than 100KB.
  • Over 70% of images are missing alt text.

Recommendation
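
Compress or resize oversized images so they stay well under 100KB where possible, and give every image descriptive alt text. A minimal sketch; the file name, dimensions, and alt text are illustrative:

    <!-- Image served at its display size, with descriptive alt text for accessibility and image search -->
    <img src="/images/bees-icon-white.png" alt="White bee icon" width="64" height="64" />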
