Digital SEO Guide 2019

What has changed?

This guide covers the core elements of SEO in 2019. Google continually updates these core factors, as they still form the basis of SEO.

We have covered the necessary scope, along with the 2019 updates Google has made to each SEO factor.

 
 

Google market share is 92%

The chart shows Google's market share in search at more than 92%, making it the undisputed leader in internet search.

  • Google – 92.42%
  • Bing – 2.61%
  • Yahoo! – 1.9%
  • Baidu – 1.03%
  • Yandex (RU) – 0.53%
  • DuckDuckGo – 0.37%

Search Engine Market Share Worldwide - April 2019

Interesting Facts – www & Google SEO Guide 2019

The search engine is your best friend and the search bot a good helper, but only if you understand them and feed them correctly.

Ignoring search technology can affect your business, especially when search is a part of our day-to-day lives.

Google Search Bot

Optimising for search bots is imperative for businesses.

Sir Tim Berners-Lee
Image Source: webfoundation.org

Let’s go back to the late ’80s to understand where it all started. A little history makes it even more enjoyable.

World Wide Web - 1989

Sir Tim Berners-Lee invented the World Wide Web in 1989.

Sir Tim Berners-Lee – Key Facts

A British scientist born in London, he graduated from Oxford University and became a software engineer at CERN.

Sir Tim realised the problem of information sharing when people had to log on to different computers to access information.

He saw that computer technology was evolving each day, yet PCs were not connected to each other, making information inaccessible.

He then wrote three different technologies, and they still form the basis of the World Wide Web:

HTML, URI and HTTP
Internet History

Worked with Steve Jobs

He worked on a NeXT computer, designed & developed by Steve Jobs.

Launched his first web page in 1990

Tim launched the first web page by the end of 1990; the page was accessible from anywhere on the open internet.

Tim realised that, to unleash its full potential, the code had to be provided royalty-free. The decision was made in 1993, and it sparked a global wave of creativity & design.

The wall below is a snapshot of tech history since 1990, when the World Wide Web was born.

Google born 1998

Google was born in 1998 and has been evolving ever since.

Google today – with more than 90% search share

It is now a technology giant with more than 90% of the worldwide search market.

Google First Look
The first release of Google
Machine Learning

Google working on ML & AI

Google is working on technologies like Machine Learning (ML) & Artificial Intelligence (AI), developing intelligent machines and making them compatible with the human world.

Key takeaway

You can now imagine why ignoring search technology can hurt the health of your business.

Digital Assets

Building a digital asset starts with creating your very own webpage or website; we technically call this website design. The process of designing a website is entirely technical, especially if it involves search engine optimisation & search engine marketing too.

Search engine optimisation & search engine marketing. SEO Guide 2019

SEO & SEM work in entirely different ways; yet if you go through the above links, you will realise that the two processes are closely related to each other.

Search Engine Optimisation – The Full Process & guidelines

SEO Process

What is SEO?

Website Modification

Search engine optimisation is the process of making modifications to parts of your website that collectively have a significant impact on the site’s user experience and performance in organic search results.

The above chart shows you the critical elements of the search engine optimisation process.

The process is the same for any new website of any size, as well as for old and existing websites.

Learn each aspect of search engine optimisation with the step-by-step guide published by Google.

Onsite SEO Analysis tools

A plethora of SEO analysis tools, both free & paid, are available online to generate a website optimisation report. How reliable they are, though, is a question: automated tools produce a report within set, coded parameters.

SEO analysis tools fail to determine:
  • The flow & quality of content
  • The quality of any given anchor text in a link
  • How well the Alt texts explain the images
  • How relevant the title & Meta tags are to the overall page content

SEO tools reasonably cover the quantitative analysis; for a qualitative analysis, either use your own intellect or engage an SEO consultant for a website audit. For in-depth analysis, Google Webmasters provides exhaustive reporting, including website crawl errors and the procedure to fix them.
SEO Tools

Role of content in search engine optimisation

Google says

“Creating compelling and useful content will likely influence your website more than any of the other factors.”

We share quality content with our friends, family & colleagues. Sharing helps build awareness, attract more visitors, and build trust.

Content Writing

When visitors see authoritative content, they will likely suggest others see it as well, and that’s the power of sharing.

We use favourite mediums like blogs, social media, and email to share content.

Take note: only quality content has the power to ignite the user to share it.

Consistently maintaining quality content, combined with a tactical content marketing strategy, produces a resonating effect.

The ripples reach far and wide, which compels search engines to take note of the emanating authority of the digital asset.

Content Sharing
Social Media

Content must be fresh & unique, error-free, without grammar mistakes, in proper flow, well optimised and, above all, compelling enough to hold the visitor’s attention, along with meaningful graphics for a hypnotic effect.

Achieving the above requires patience and dedication. Consider facing the challenge yourself, hiring a freelancer, or working on a contractual basis with a company.

A website is likely to fail without compelling content.

If you want to write content for your website that converts visitors into customers or earns you click-throughs, then you need to work on three primary ingredients.

Three primary ingredients of content

  • Content Research
  • Content planning
  • Content Strategy

Keyword Research- SEO Guide 2019

Keywords are the search terms, and they play a significant role in SEO. Visitors use keywords in search engines to find relevant information.

Websites optimised with relevant keywords increase their chances of appearing higher in search results.

Keywords can be short or long, e.g. ‘SEO London’ vs ‘ethical search engine optimisation company in London’.

E.g. suppose you are manufacturing T-shirts in London.

You have to find the most appropriate keywords to avoid bounces. So if you are producing cotton T-shirts in London and your target audience is wholesalers, the prospective keyword would be:

“Cotton T-shirt manufacturer London.”

Google provides a keyword tool to check keyword search volumes. You will find the Keyword Planner in Google Ads (formerly AdWords).

Keyword Research
Google Keyword Planner

The ‘Avg. monthly searches’ Google shows are vague!

Take caution while considering ‘Avg. Monthly searches’ from Google Ads.

Vague Data
You will observe an ambiguous metric. Try changing the date range from the top right corner; you will find options like:
  • Last month
  • Last 12 months
  • Last 24 months
  • All available
Select each one of them, one by one, and keep an eye on the searches; you will observe the monthly average searches remain the same!

And that’s quite a problem

Vague Data in Keyword planner

The question remains: why is Google showing ambiguous data without disclosing any details attached to it?

The answer is hidden behind your login status; have a look again at whether your ads are running or not.

Google serves accurate data only to customers running live campaigns. It sounds a bit odd, but it’s the truth.

There is no other way to have an exact historical metric.

Google used to provide exact metrics but closed that feature in 2018.

So next time you are preparing a keyword search volume report for your client, either consider a live campaign on Google, or try Ahrefs.

Ahrefs keyword research tool

The tool will provide you with exact keyword search volumes along with critical stats.

Ahref Tool

You can rely on it to find accurate keyword search volumes.

Mobile compatibility (Speed Score) - SEO Guide 2019

Google mobile speed update

In July 2018, Google rolled out an update that made mobile speed one of the ranking factors for mobile searches.

The search parameters state that user experience (UX) is a crucial factor in calculating web rankings.

A slow website on mobile ends in a disappointing experience for users, and mobile searches have surpassed those on desktop. Back in 2018, in the US alone, there were a total of 900 billion web searches, of which 500 billion came from mobile devices.

See why website redesign matters and why 95% of websites suffer on speed score.

Achieving a good mobile speed score is necessary; it takes more than just having a mobile-compatible website.

So make sure your next website project achieves an appropriate speed score.

PageSpeed Insights tool by Google Lighthouse

Mobile Compatibility
Google Page Speed Test

Even if your webpages provide the best user experience, you will always find room for improvement.

Google also provides a tool to check whether your webpage is mobile-friendly.

Meta tags for SEO - SEO Guide 2019

Meta tags represent your webpage to search engines. They are a type of legitimately hidden content, invisible to the user and typically meant only for search bots, which display them in search results.

Meta tags appear in the HTML of your webpage; press Ctrl + U (or right-click and view source), then press Ctrl + F to find ‘meta’ in the page.

HTML provides different types of tags to best represent your webpage to search engine crawlers. They are:
  • Meta Title tags
  • Meta Description tags
  • Alt tag
  • Canonical tag
  • Social tags
  • Responsive design meta tag
  • Robots meta tag
  • Heading tags
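Before going into each tag, here is a minimal sketch of how most of these tags sit in an HTML page; all values, file names and URLs below are placeholders of my own, reusing the T-shirt example from earlier:

<!DOCTYPE html>
<html lang="en">
<head>
  <title>Cotton T-shirt Manufacturer London | Bitvero</title>
  <meta name="description" content="Quality cotton T-shirts manufactured in London for wholesalers.">
  <meta name="viewport" content="width=device-width, initial-scale=1">  <!-- responsive design meta tag -->
  <meta name="robots" content="index, follow">  <!-- robots meta tag -->
  <link rel="canonical" href="https://www.example.com/t-shirts/">  <!-- canonical tag -->
  <meta property="og:title" content="Cotton T-shirt Manufacturer London">  <!-- social (Open Graph) tag -->
</head>
<body>
  <h1>Cotton T-shirt Manufacturer London</h1>  <!-- heading tag -->
  <img src="t-shirt.jpg" alt="White cotton T-shirt on a hanger">  <!-- image with Alt text -->
</body>
</html>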
Let me explain each one.
Meta Tags
Title Tag

Title tag best practices

The first content visible to the user in a search engine results page is the ‘Title’, defined in the head of the HTML.

<title>My first Webpage</title>

Bitvero meta tags

The page title influences visitors more than any other factor in the SERP; a compelling page title helps the visitor decide to click.

A page title is the first impression of your digital asset; every title should be well written, concise and clear.

A well-written title includes:

  • A user perspective; write to please the user, not the search bot.
  • Your keyword; try to write a compelling title starting with it.
  • Brevity; it needs to be short, unambiguous and concise.
  • Modifiers, which improve click-through, e.g. a guide, cost calculator, contact us, nine tips, Do’s & Don’ts etc.
  • A description of the page’s attributes in a single line.
  • Your brand name, which helps in exposure and builds trust with search bots & users both.
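E.g. for the cotton T-shirt manufacturer discussed earlier, a hypothetical title following these rules could be:

<title>Cotton T-shirt Manufacturer London | Bitvero</title>

That is 44 characters: the keyword first and the brand name at the end, comfortably under the 60-character limit discussed below.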

Things to avoid

  • Avoid keyword stuffing in the title tag.
  • Strictly avoid misleading content, e.g. a freelancer using words like ‘agency’ or ‘company’ in the title, or a trading company presenting itself as a manufacturer in its title.
  • Grammar mistakes
  • Writing duplicate titles

Title tag character limit

Keeping it under 60 characters is the best bet; Google typically displays 50-60 characters.

Moz suggests

Google typically displays the first 50–60 characters of a title tag. If you keep your titles under 60 characters, our research suggests that you can expect about 90% of your titles to display properly. There's no exact character limit, because characters can vary in width and Google's display titles max out (currently) at 600 pixels

Meta Description tag – how to write and best practices

The description is visible to both the user and search engines. Writing compelling descriptions is a critical part of on-page SEO.

<meta name="description" content="My first HTML code">

The description tag is considered crucial for search engines, as it is used in the indexing of web pages. It also carries content for the user, which further increases its importance.

Users read descriptions in conjunction with the titles before deciding to click a link; descriptions that are relevant from the user’s perspective are considered best.

Bitvero Meta Description

A compelling description tag:

  1. Starts with a user perspective,
  2. Includes your keyword as well,
  3. Uses attention-grabbing tag lines, e.g. a £10 deal, pay as you go, website support starting at £20,
  4. Includes the brand name,
  5. Supports and best describes the title and URL,
  6. Stays within the standard length of 150-160 characters.
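A hypothetical description for the same T-shirt page, following the rules above, might be:

<meta name="description" content="Quality cotton T-shirts manufactured in London for wholesalers. Bulk orders with a £10 sample deal from Bitvero.">

At roughly 115 characters it stays within the 150-160 character ceiling while carrying the keyword, a tag line and the brand name.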

Length of the meta description tag

Start at around 50 characters and do not exceed 160. Going beyond the prescribed limit will not help rankings; the goal is to provide relevant information and drive clicks.

When not to write a Meta description

Consider skipping the Meta description if you are targeting more than 2 or 3 highly searched terms, probably short-tail keywords. The basic idea is to give a free hand to the search engine, allowing it to build the snippet the way it wants.

An explicit Meta description may hinder search engines from deriving natural relevance, resulting in a lower SERP position.

So next time you are targeting highly competitive keywords, consider letting the search engine decide and present the description to the user.

Meta Keyword Tag

Meta Keyword tag SEO

What’s the news on the Meta keyword tag?

Google doesn’t use the Meta keyword tag in search rankings. People misused the keyword tag, abusing it with over-stuffing to manipulate search rankings.

Google had to announce, in September 2009, that it officially disregards the Meta keyword tag in its search ranking algorithm.
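For reference, the now-ignored tag looks like this (the keyword values are made up):

<meta name="keywords" content="seo, seo guide, seo 2019">

You can safely leave it out of new pages.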

URL optimisation - SEO Guide 2019

How to optimise URLs for SEO

The uniform resource locator plays a vital role in SEO; making URLs user- and search-engine-friendly is necessary.

An optimised URL receives a higher click-through and stands a better chance of gaining a good position in the SERP.

A well-defined URL is readable; the aim is to serve the intent of the user seeking the targeted content.

In the most basic sense, the purpose of any given URL is to serve the human audience. Hence, the more readable it is, the better.

Hopefully, the below illustration will help you understand the difference in readability.

URL Optimisation
URL Readability
Image Source: Moz
Here are a few best practices to optimise URLs:
  • Try to achieve readability, keeping user intent in mind. Including dynamic content in URLs creates a deceiving impression, resulting in a lower click-through.
  • Keep them short and concise; using multiple folders for the sake of stuffing a keyword to manipulate the search engine is deceiving, e.g. Bitvero/seo/seo-onsite/seoURL/url-optimisation
  • Consider keywords in the URL; users feel confident when they see the required keyword in the URL while hovering over the link in the browser, which results in a higher click-through.
Web Browser

Secondly, keywords in the URL influence user intent in search; expect a higher click-through on URLs that include the targeted keywords.

URL Optimisation

4. Try to match the keywords appearing in the title with those in the URL to enhance overall performance; see the above example.

5. Including stop words is not necessary, e.g. in, but, and, or, because, for, etc., unless they genuinely add sense.

Bitvero/seo/all-inclusive-guide-of-seo-in-2019

Bitvero/seo/seo-guide-2019

6. Avoid using a hashtag (#) unless necessary; a fragment identifier is used to send the user to a specific part of the page for a unique content experience. Use it with caution.

7. URLs can be case sensitive; the best practice is to use lower case. Linux/Unix servers are case sensitive in interpreting URLs and will consider Bitvero.co.uk/sEO unique compared with Bitvero.co.uk/seo.

8. Use the hyphen (-) as the preferred separator; search engines are well versed with the underscore (_) as well, but the hyphen gives a better user experience.

9. Avoid keyword stuffing at all costs; it looks spammy.

Bitvero/seo/best-seo-guide/seo-guide

10. Keep redirection to one hop; multiple redirects degrade the user experience and make the page slower. They sometimes frustrate mobile users, creating a wrong impression.

Header Tags (H1 to H6) – How to use them, SEO Best Practices - SEO Guide 2019

What are header tags in SEO? H1 & H2

A clear heading on a page not only increases readability but also helps the user understand what the page is about and compels them to go through the content.

Because compelling headings influence the searcher’s intent, search engines love headers and keep a special place for them while calculating SERP positions.

Heading Tags
Header Tag Hierarchy

Header tags hierarchy

A full heading structure maintains an upside-down pyramid.

Always re-check the webpage and blog post for a well-defined heading structure.

Never use headings for the sake of it; make the best use of them, keeping in mind that H1 to H3 are necessary from a user perspective if the content demands it.

Another compelling structure

Heading Tags

Heading structure depends on the flow of content; even compelling content looks and reads boring without meaningful headings and sub-headings.

Heading 1 – The Main Heading

Heading 1 is the main heading of the page. You can keep any number of main headings on a page; however, the best practice is to restrict the page to only one main heading unless necessary.

In a conversation, John Mueller from Google confirmed that we can use as many H1 tags as we want.

John Mueller

That’s quite a surprise.

Multiple H1 headings lead to overkill; they sometimes confuse the user about the real intent of the content, resulting in bounces, and we all know a high bounce rate passes strong signals to search bots that the page is not suitable for the user.

Always write the main heading concisely and thoughtfully to express your content finely.

Try to use your main keyword in the heading, if possible without degrading the quality of the title.

Header tags best practices in SEO.

  1. A short and concise H1 is best; an H2, however, can be either extended or short.
  2. Using keywords in headings is good practice; avoid stuffing them in or pushing them unnecessarily.
  3. Try to use exciting headings, e.g.
    • Are you bidding on all the wrong keywords?
    • Five rules of page optimisation.
    • How to write a compelling page title?
  4. Provide a meaningful structure:
    • H1 is the main title, just like the title of a book.
    • H2s are sub-titles, like the headings that define different chapters in a book.
    • H3-H6 serve as subsequent headings, like secondary & tertiary titles within different sub-titles.
  5. Use headings to break up content and increase readability, as sketched below.
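A minimal sketch of that book-like structure, using this guide’s own topics as placeholder headings:

<h1>SEO Guide 2019</h1>
  <h2>Keyword Research</h2>
    <h3>Google Keyword Planner</h3>
    <h3>Ahrefs</h3>
  <h2>Meta Tags</h2>
    <h3>Title Tag</h3>
    <h3>Description Tag</h3>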

Image optimisation for the web- SEO Guide 2019

Why is image optimisation important?

Images play a significant role in the development of our web pages. Google has recently started supporting the WebP image format, which provides superior compression without compromising on quality.

Image Optimisation
webp images

Google’s WebP image format offers both lossless and lossy compression: WebP lossless images are 26% smaller than comparable PNGs, and WebP lossy images are up to 34% smaller than comparable JPEGs.

See the difference below

webp images vs jpeg
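Browser support for WebP was still incomplete in 2019, so a common approach (a sketch of one option, with placeholder file names) is the HTML picture element with a JPEG fallback:

<picture>
  <source srcset="t-shirt.webp" type="image/webp">  <!-- served to browsers that understand WebP -->
  <img src="t-shirt.jpg" alt="White cotton T-shirt on a hanger">  <!-- fallback for everyone else -->
</picture>

Browsers that understand WebP load the smaller file; everyone else falls back to the JPEG.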

WebP – bulk image converter

Convert your images to WebP format; the free tool converts bulk images to WebP at once.

Bulk Image Converter

Use CDN to upload images

Upload your WebP images to a content delivery network (CDN) instead of serving them from your web host.

A cloud CDN is used to accelerate the speed of your content.

Imagine serving your customers from multiple locations instead of a single site.

Yes, and that’s the difference. The image below gives you a clear picture.

with CDN
without CDN
Image source: blog.uploadcare.com

Why use CDN to upload images?

Using a CDN provides benefits like failover protection, speed acceleration, and reliability; a CDN surpasses the traditional hosting & maintenance of digital assets.

CDN Cost

Both free and paid options are available. The primary difference lies in how the CDN handles and delivers the digital asset; e.g. most paid plans with a free basic tier provide intelligent transformation of images as per device requirements.

AWS by Amazon is free for up to 50 GB of data transfer; however, its transformation service is unavailable. The basic plan of Uploadcare starts free and includes intelligent transformation of images. The basic plan of Cloudflare is also free and doesn’t support image transformation. ImageEngine begins at $100 and includes image optimisation.

Image Transformation in CDN

Image optimisation in CDN refers to intelligent resizing or transformation of images as per device requirements.

A CDN is the right choice; however, you may not utilise its full potential if you are missing automatic image optimisation.

Make sure to choose the CDN carefully to get the best of both worlds.

Hyperlink optimisation- SEO Guide 2019

What are hyperlinks?

As we all know, the blue links are called hyperlinks, e.g. https://www.bitvero.co.uk/website-design-london

The text through which we click the link is the anchor text; the above is a hyperlink without an anchor.

However, we can also present the above link meaningfully, e.g.

London’s best website design company.

In the above example, ‘website design’ is the anchor for the link.
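In HTML, the same link with its anchor text simply reads:

<a href="https://www.bitvero.co.uk/website-design-london">website design</a>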

Hyperlink Optimisation

Significance of anchor text in a hyperlink

The anchor describes the target page not only for users but also for search engines. Users can navigate well if link anchors are placed sensibly throughout the website’s pages.

Clear anchor text with a URL passes link juice to the target page. When visitors encounter anchor text during navigation, they click either immediately or later, depending on its relevance.

Things to avoid while placing anchor text

Avoid placing anchor text for the sake of passing link juice only; users typically understand the behaviour, consider it spammy, and lose trust. They may then avoid the anchor or, in the worst case, close the browser.

Avoid placing too many anchors, which again looks spammy on a given page.

Avoid deceiving users by placing a wrong target link in an anchor; the practice is unethical.

Anchor text best practices

  • A well placed anchor text in line with the given content
  • A highly relevant target page
  • Includes keywords logically
  • In a given page consider placing not more than one anchor for the same target
Users and search bots both readily accept optimised hyperlinks.

Hyperlinks are of 3 types

  • Inbound links, coming to your site
  • Outbound links, going out from your site
  • Internal links, connecting within your site
Be very careful while dealing with hyperlinks, as they can make or break the reputation of your digital assets.

Earning an inbound link is challenging and essential for the search engine optimisation process. A high-level reputation is necessary to win a quality link, so make efforts to build that fame.

Also give links to those you trust; the relationship is give and take, and who will link to you if you do not reciprocate? It must be done carefully, though, keeping in mind the relevance and trust of the reference you are using.

Make your webpages accessible with active internal linking site-wide. You can increase the user retention rate by doing so; furthermore, search bots love crawling such sites and index your webpages better.

Link Reputation

Hyperlink Tier

More than 80% of the websites serving the World Wide Web are two or three tiers, meaning the link structure is just two or three levels deep.

A high-rise building accommodates more and costs more; likewise, the deeper the internal link structure, the better the chances of indexing in search engines, and the more traffic converts into higher value.

As per Neil Patel

Internal linking is one of the SEO’s most valuable weapons.

Why? Because it works

There are no definitive criteria for a website’s link structure. A few examples are:

3 tier link structure

2 tier link structure

Google bots always look for fresh, updated content. You can increase your chances of being crawled and indexed by publishing content regularly, which also grows the site structure and depth, helping you multiply traffic and acquire more customers.

The image shows the crawling criteria

Google Organic Crawl
Image source: Neil Patel

Typically, the home page is the most-linked page compared to the subsequent lower-level pages.

The upside-down pyramid shows that the home page keeps the highest link-earning potential, and that’s only possible due to the presence of all the lower pages, which collectively create its value and sheer presence.

Backlink Tier

All about sitemaps- SEO Guide 2019

Sitemap explained

A sitemap is a file that consists of all the URLs of a given website. With the help of a sitemap, search engines can find and index all of your website pages quickly.

Search engine bots crawl the link structure to index website pages. They crawl all the connected links to reach the last one.

A sitemap gives search engines access to crawl the full site structure of any depth using just one file.

Sitemap

Which one of them does the search engine bot crawl faster?

In the image below, the map on the left is the website navigation, in which the search bot must crawl four links to reach the European page,

whereas the search bot can index all the webpages by visiting the XML map on the right.

Website Sitemap

The difference is enormous, and when it comes to crawling billions of webpages regularly to maintain a fresh index, crawling time becomes crucial.

The importance of a sitemap shows with websites containing thousands of pages, where it assists search bots in accessing the link directory in a single file.

A sitemap is especially helpful

  1. For websites containing a large number of pages and a deep link structure
  2. Sites frequently adding fresh content
  3. Sites with a weak internal linking structure
  4. Sites lacking external links

Including a page in a sitemap cannot guarantee successful indexing. Web pages are crawled and indexed even if they do not exist in a sitemap.

XML sitemap tags

The location (‘loc’) and last modified (‘lastmod’) tags both work brilliantly; however, people often ignore them.

Location tag and last modified

Location tag

The location tag is compulsory and keeps the canonical version of the URL, with or without ‘www’; it should also reflect the exact HTTP or HTTPS version.

Last mod tag

The lastmod tag is optional, although highly recommended, as it tells the search bots when you last updated your website pages.
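A minimal sketch of a sitemap entry using both tags (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/seo-guide-2019</loc>  <!-- canonical URL of the page -->
    <lastmod>2019-06-15</lastmod>  <!-- date the page was last updated -->
  </url>
</urlset>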

John Mueller acknowledged in a post that search bots refer to the last modification date.

XML Sitemap
John Mueller Twitter

However, manipulating the time without updating content is unethical for which Google may penalise your website.

Sitemap tools

Numerous free tools are available to generate a sitemap; online, we recommend XML-Sitemaps, and offline, Screaming Frog SEO Spider and Sitemap Generator.

For CMSs, there are Google XML Sitemaps (for WordPress) and XML Sitemap (for Drupal).

Sitemap Formats

Google recommends using a maximum of 50 MB file size and up to 50,000 URLs in a single sitemap.

However, if the link structure is beyond 50k, it is recommended to divide and create multiple sitemaps, e.g. for a large e-commerce retailer, categories like Apparel, Home & Garden, Kids, etc.

You can optionally create a sitemap index file (a file pointing to a list of sitemaps) and submit to Google.
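A sketch of such an index file, assuming the e-commerce categories above each have their own sitemap (placeholder URLs):

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-apparel.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-kids.xml</loc>
  </sitemap>
</sitemapindex>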

Sitemap extensions

Google accepts sitemap syntax extensions for video, images, and news.

A video sitemap is an excellent way to invite Google to find and crawl new & existing videos.

Basic guidelines for a video sitemap

  1. You can create and publish a separate sitemap for videos, or include the video entries in your existing one.
  2. Avoid publishing videos not related to the main page.
  3. Make optimum use of all the available metadata for videos; important fields are the page URL, video location, title, description, duration, rating, and publish date.
  4. Google will ignore the sitemap entry if it fails to discover the video content.
  5. Make sure you are not blocking Google bot from accessing the source file or the video player.
  6. Recheck the robots file and optimise it to allow Google to crawl the page, video and thumbnail URLs.
  7. Google verifies the video content by matching it with the given metadata and the page content; Google will not index the video if they don’t match.

Example video sitemap with tags

Example showing a webpage hosting a video with all the available cards Google uses.

Video Sitemap
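A trimmed sketch of one such entry, following Google’s video sitemap syntax; all values are made up:

<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>https://www.example.com/fried-chicken-recipe</loc>
    <video:video>
      <video:thumbnail_loc>https://www.example.com/thumbs/chicken.jpg</video:thumbnail_loc>
      <video:title>How to fry chicken</video:title>
      <video:description>A step-by-step fried chicken recipe.</video:description>
      <video:content_loc>https://www.example.com/videos/chicken.mp4</video:content_loc>
      <video:duration>240</video:duration>  <!-- length in seconds -->
      <video:publication_date>2019-06-15T08:00:00+00:00</video:publication_date>
    </video:video>
  </url>
</urlset>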

How does Google crawl a video?

Google Video Crawling

Google returns a highly relevant video search when it understands the video/audio content, along with the Metadata and page text.

Google crawls the video if it is in a supported encoding format, and extracts the audio and video content in a limited capacity.

Google also uses structured data: the VideoObject markup.

Best practices to publish videos

You can make your videos eligible for search results only if you follow the best practices. They are

  1. Google must be able to find your video; embed your videos in HTML using the video, embed or object tags.
  2. Provide a high-quality thumbnail for the video.
  3. Include a video sitemap for an explicit declaration of the video, and try to make the best use of all the given tags.
  4. Ensure your page content is relevant to the video you are publishing; e.g. don’t pair a video showing a fried chicken recipe with a page describing a chicken rice dish.
  5. Make use of structured data VideoObject.

XML image sitemap

XML Sitemap

Images are essential for webpages; a dedicated XML map helps in faster indexation.

However, the modern-day SEO approach suggests using the structured data JSON-LD schema.org/ImageObject markup, as it well defines the image properties required during search engine callouts.

The exception to the rule applies when the site sells images, like a stock photo website or an e-commerce portal that includes thousands of images.

Save your crawling budget as far as you can.

Google News Sitemap

You need to register your site with the Google News Publisher Centre before using this sitemap. Publish only your most recent articles, up to two days old; they remain in the Google News index for up to 30 days.

Use the Google Search Console to submit it.

Tags used in this sitemap are publication, date, and title.

HTML sitemap

HTML Sitemap

The purpose of an HTML sitemap is to give your visitors access to all your site pages, and that’s its only relevance.

They generally appear in website footers as a tertiary element; however, users rarely click through your HTML sitemap if your site’s internal linking and navigation are acceptable.

Search engine bots don’t care whether it’s present or not. So if users are not clicking, keeping it on the website for its own sake only opens the door to stale information.

A better option is to avoid publishing one.

Dynamic XML sitemap

Wordpress Plugin

Generating a sitemap as soon as you publish fresh content is cumbersome unless you love doing so.

Dynamic XML sitemaps are automatically updated when changes occur. You can use a Yoast SEO plugin to dynamically render an XML sitemap if you are using a WordPress CMS.

XML sitemap optimisation

XML Sitemap Indexing

Make sure to include in the XML sitemap only those pages you want indexed by Google. Your website may include a thousand pages; not all of them require indexing, e.g. canonicalised duplicates, site search pages, archives, comment replies etc.

This way, you can save your crawl budget as well.

From an SEO perspective, include only relevant pages in the XML sitemap, although the other pages remain crawlable and are not blocked either.
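For the pages you leave out, a noindex robots meta tag in the page head makes the intent explicit; a one-line sketch:

<meta name="robots" content="noindex">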

Sitemap verification & reporting.

Use the Google Search Console to manually submit the sitemap, verify it, and get detailed reporting via the Coverage report in GSC. Both excluded and valid URLs are shown in three different categories:

  1. Indexed but not submitted in sitemap
  2. Indexed and submitted successfully
  3. Submitted URL is marked no-index
Google Search Console

An optimised sitemap helps search engines understand the more important pages of your website; using noindex and leaving pages out of the sitemap enables you to stop your website appearing in irrelevant searches and clicks.

In the above screen, by excluding the less critical 841 pages, we are telling Google to give precedence to the valid ones.

Otherwise, had the site included all 1,000-plus pages in the sitemap, the valid pages, making up only 20% of it, would appear far less relevant compared with the excluded pages occupying almost 80% of the capacity.

Google Analytics- SEO Guide 2019

Google Analytics for SEO

A free and brilliant piece of software by Google, Analytics gives you in-depth reports on your audience, their behaviour, conversions & acquisition.

Meaningful data backed by excellent presentation: Google understood the user’s perspective and prepared an exhaustive metrics tool.

We cover an overview of the different sections of Google Analytics and their significance; Moz has brilliantly presented a beginner’s guide to Google Analytics.

Google Analytics

The Google Analytics journey is pretty straightforward.

Google Analytics Journey
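The journey typically starts by placing Google’s tracking snippet on every page. The 2019-era gtag.js snippet looks like this, where UA-XXXXX-Y is a placeholder for your own property ID:

<!-- Global site tag (gtag.js) - Google Analytics -->
<script async src="https://www.googletagmanager.com/gtag/js?id=UA-XXXXX-Y"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'UA-XXXXX-Y');  // replace with your property ID
</script>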

The dashboard will show you reports you may not have encountered before: precise details related to your website, not just limited to users, revenue, conversions etc. but also traffic sources, referrals, channels, locations, times, devices, top pages, and much more.

Google Analytics Stats
With Google Analytics, you can answer questions related to your website.

Google Analytics

  • How many visitors visit my website daily/weekly/monthly/yearly?
  • Which locations are users visiting from, and how many areas?
  • How much time are users staying on my website, and what is the monthly average?
  • Who is bouncing from my website, i.e. the users who didn’t like it?
  • And exactly which page didn’t they like?
  • Am I getting new or returning visitors, and how many of each?
  • At what time of day do I receive the most hits in a week?
  • Who else is talking about my website on the internet?
  • How many referrals are they sending to my website?
  • Am I popular in my business?
  • Is it worth spending time on social channels?
  • What are the age & gender of my visitors?
  • Do I appear in Google, and for which keywords?
  • How is my website performing in search engines?
  • Can I track the activity of my visitors?
  • How can I track conversions?

Google Analytics can answer many more questions for you. Analytics maintains the historical data of your digital asset, which means you can easily compare data across specified periods. The Google Analytics dashboard shows three different types of metrics.

Google Analytics Stats

Analytics is divided into five major sections with their respective subsections.

Google Analytics Overview

The Audience overview in Google Analytics opens the door to an insightful view showing users, new users, sessions, page views, average session duration, bounce rate etc.

Google Analytics Audience Overview

You can share, save, and export the reports; understanding the metrics will give you the real meaning behind the figures.

Users are the visitors coming to your website, including both new and returning; new users are only those coming to your site for the first time.

Sessions refer to the activity of a user from the start page to the exit page; whatever the user does in between counts as one session, and 30 minutes of inactivity ends the active session.

A single user can view any number of pages; page views refer to the total number of pages viewed by all users.

The average time users spend on your website is the Avg. Session Duration.

Bounce rate is when a user jumps off or exits your website, either by closing the browser or moving to a different website. Analytics takes you a step further and shows more relevant sections. They are:

  • Demographics
  • Interest
  • Geo
  • Behaviour
  • Technology
  • Mobile

As the headings suggest, these sections are straightforward and easy to understand.

Acquisitions in Google Analytics

This section elaborates where you are acquiring traffic from, mainly by channel.

Google Analytics Acquisitions

Direct visitors come to your website by typing your URL, often as a result of your other marketing efforts. Direct traffic is higher for big brands due to their popularity. We call it direct traffic as the visit does not come from any given digital source.

What is organic traffic in Google Analytics?

The traffic you acquire as a result of a natural appearance in Google is organic.

Organic Rankings Google

You can only win a natural appearance with SEO; that’s the only channel, and it is considered best in terms of ROI. Competition for organic rankings is fierce on every keyword, and you need to do all the SEO detailing precisely to achieve results.

Apart from doing SEO, there is no other technique to earn organic traffic for your website.

Referral traffic in SEO

When someone refers visitors to you, it is called a referral. Earning a reference is challenging, and Google gives weight to links acquired from a reference.

E.g. site A refers to site D for an excellent service or a product. Google can tell a paid referral from an earned one; do not underestimate Google in that sense. High-quality referral links are essential for a digital asset to rank higher in the SERP (search engine results page).

Social referrals

Social traffic is what you acquire by promoting your website socially, accumulated from those common sources. A few accessible social sources are Facebook, Twitter, LinkedIn, Instagram, Pinterest etc.

Social promotion is quite challenging, as it involves the science of understanding human behaviour.

Google Analytics Traffic Source

You can see direct and organic are the two significant sources of traffic when you first start building your audience through digital marketing.

Referral and social follow the duo, as they depend significantly on your popularity.

The section includes numerous other tabs related to the four primary channels, e.g. landing pages & their impressions, user flow etc.

Behaviour section – Google Analytics

Behaviour is all about the response of your digital asset from a performance perspective. How your website is responding is essential, and the metrics include:

Google Analytics User Behaviour

It gives exhaustive details about your site content, like the most served landing pages and their average time on page, the pages with the highest bounce rate, and the slowest pages on your website. These are essential during website audits and performance reports.

Page Views

Load time affects performance; the Site Speed section gives you insightful data that can help you understand the blockages.

Google Analytics Page load time

Conversions – Google Analytics

As the name suggests, the tab deals with goal completions, goal value, and goal conversion rate. You need to create the goals before you can see conversions.

  • Filling in a form can be a goal
  • Making a booking or purchase is a goal
  • Downloading an e-book is a goal
  • Staying on the website is a goal that increases time on site

You can set an individual goal for each booking stage; e.g. a hotel converts a booking in two or three steps: room selection > billing details & payment.

Goal Completions

The system also provides a full section for e-commerce to check transactions, revenue, order value etc. Interestingly, it also recognises the coupon codes you offer and provides the details attached to them during conversion.

Google Revenue & Conversion Stats

You can see the individual sources in the goal flow.

Goal Flow in Analytics

Robots.txt setup & optimisation- SEO Guide 2019

As the name suggests, the robots.txt file controls how search engine bots crawl your website; a robots.txt file can allow, disallow, or partially allow search bots to crawl.

By optimising the robots.txt file, we explicitly declare which files and directories search bots should crawl & index.

The robots.txt file is part of the robots exclusion protocol (REP), the web standards that regulate how search engines crawl & index content on the web and present the information to users.

REP also includes the directives for meta-robots, directories, sub-directories and site-wide instructions on how search engines should index links, either follow or no-follow.

Robots Txt

Control search bots using robots.txt
Robots.txt configuration for SEO

Practically, the robots.txt file instructs the software program (the user-agent) on how to crawl and index the website or parts of it.

A basic robots.txt file looks like:

User-agent: [user-agent name]
Disallow: [URLs to exclude from the crawl]

Example 1 is telling all user-agents not to crawl any directory, sub-directory or web page.

User-agent: *
Disallow: /

Example 2 is the reverse, telling all user-agents to crawl all directories, sub-directories and web pages.

User-agent: *
Disallow:

You can see the massive difference between the two examples: just by placing a forward slash, the entire website is blocked for user-agents.

It is imperative for webmasters to handle robots.txt carefully.

To further customise,

User-agent: *
Disallow: /wp-admin/
Allow: /wp-content/upload

You can see we can declare explicitly what to crawl and what not on our website.

Popular user-agents

Googlebot is one of the most popular ones (User-agent: Googlebot). Other Google user-agents are:

Web Crawler – User-Agent String

Googlebot News – Googlebot-News
Googlebot Images – Googlebot-Image/1.0
Googlebot Video – Googlebot-Video/1.0
Google Mobile (feature phone) – SAMSUNG-SGH-E250/1.0 Profile/MIDP-2.0 Configuration/CLDC-1.1 UP.Browser/6.2.3.3.c.1.101 (GUI) MMP/2.0 (compatible; Googlebot-Mobile/2.1; +http://www.google.com/bot.html)
Google Smartphone – Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
Google Mobile AdSense – (compatible; Mediapartners-Google/2.1; +http://www.google.com/bot.html)
Google AdSense – Mediapartners-Google
Google AdsBot (PPC landing page quality) – AdsBot-Google (+http://www.google.com/adsbot.html)
Google app crawler (fetches resources for mobile) – AdsBot-Google-Mobile-Apps

Data source: KeyCDN

Google fetch tool – Search console

See your webpage through the eyes of a crawler: use the fetch tool in Google Search Console to test how Google bot crawls and renders a URL.

Google will accurately tell you how your web page appears to search spiders, what is visible to them and what is not, including website images, text, and scripts.

Live URL testing tool

The coverage report in GSC will also show you the last crawl, the agent, and the indexing status.

Live URL testing tool

Bingbot, the user agent of Bing search

User-agent: bingbot

bingbot

Facebook

User-agent: Facebot

facebot

Yahoo search agent

Slurp is the user agent of Yahoo: User-agent: Slurp

Yahoo Slurp

Likewise, DuckDuckGo, Baidu and Yandex all keep user-agents to crawl the web.

Baidu operates in China and also has additional web crawlers for images, videos, news, search, and wish lists.

Alexa from Amazon

Amazon uses the Alexa web crawler, ia_archiver, for internet rankings.

User-agent: ia_archiver

Amazon Alexa

You can also specifically allow or disallow individual user agents:

User-agent: slurp

Disallow: /

In the above example, you have entirely blocked Slurp, the Yahoo search agent, from crawling & indexing.

Likewise, the example below instructs bingbot, the user agent of Bing, to wait five seconds before crawling each page.

User-agent: bingbot

Crawl-delay: 5

Note that crawl-delay won’t work with Googlebot, and this is by design; the most popular bot maintains its monopoly.

Robots.txt file working

Main function of robots.txt

Search engine crawlers execute two critical functions:

  • To crawl webpages by following links, hence we also call them web spiders
  • To index each link they encounter in their database and feed the search engine for those looking for information via search

Before performing a crawl, search engine crawlers by default look for a robots.txt file to check for permissions & directives.

If they don’t find a robots.txt file, they crawl the whole website, including directories, sub-directories and individual webpages.

It is imperative that the robots.txt file is present in the root directory and named ‘robots.txt’, all lower case; no sentence or upper case is allowed.

See the image below

Facts to know about the robots.txt

  • The file is publicly accessible, so make sure to avoid putting sensitive information in it: yourdomain.com/robots.txt
  • Robots.txt is different for each sub-domain and the root domain; hence blog.example.com keeps its own robots.txt and does not share the robots.txt of the root domain, example.com
  • The robots file is case sensitive, as discussed above
  • Malicious crawlers, e.g. malware robots, bypass the robots file by ignoring it completely

The syntax used in robots.txt

User-agent: generally a crawl bot, using the asterisk (*) means all user agents

Disallow: restricts user-agents from crawling; a forward slash (/) means the entire site is disallowed from crawling

Allow: the command is mainly applicable to Googlebot; it means Googlebot can access a file, directory or sub-directory even though the parent directory is disallowed

Crawl-delay: the command does not apply to Googlebot; it prompts the user-agent to wait for a specific delay between crawls, e.g. five seconds

Sitemap: the command is used to declare the location of a sitemap.
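Putting these directives together, a typical WordPress-flavoured robots.txt might read (a sketch with a placeholder domain):

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.example.com/sitemap.xml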

Why robots.txt

It maintains the privacy of website owners; sometimes they don’t want files or directories to be indexed and made public.

Maintain your privacy using robots.txt

Say, a directory containing the pictures in your blog, or sensitive documents on your website that you don’t want to appear in search engine results; CMS directories and files, e.g. wp-admin, also keep no relevance for search engines.

Webmasters now control which parts of the website they allow to be crawled, and in which manner. They can block a particular section entirely from crawling.

The method gives incredible power to webmasters to control the display of information and maintain their privacy.

SEO best practices for robots.txt

  • Ensure that you haven’t mistakenly blocked the user-agents of Google, Yahoo and Bing; they are considered primary in search.
  • Always optimise the robots.txt file by explicitly declaring exactly what you want search engines to crawl and index.
  • Use a dynamic robots.txt to avoid updating the file manually as fresh content arrives. If you are using the WordPress CMS, the All in One SEO or Yoast plugin offers this functionality.
  • Use Google Search Console to check for a successful submission; the coverage report will accurately show you the links crawled and indexed.

Webmasters Setup (Google Search Console)
SEO Guide 2019

Google search console empowers webmasters to check and fix the errors & shortcomings their digital assets are facing through a comprehensive set of powerful tools.

Formerly Webmaster Tools, GSC is famous for its accuracy; the tool is quite popular and a favourite among the webmaster community.

Reports are insightful with in-depth analysis criteria and easy to understand.

Google has designed it, keeping in mind the users’ perspective.

The sections are divided carefully to cover the website from all the necessary angles required by the user.

Google Webmaster
Google Search Console

Coming on board is simple; a sign up is all you need along with the verification of ownership.

Ownership Verification

You can add multiple properties in a single GSC account and must claim the ownership of each one with a verification process.

The performance report helps analyse traffic and search queries, the most and least searched pages, the locations they come from, and the devices used.

Google Search Console Performance Report

URL inspection tool

You can now inspect individual pages of your website with the URL inspection tool; the analysis includes crawl errors, live URL testing, and requesting indexing.

Find crawl errors, test live URLs, and check how Google bot sees your web page.

The coverage report will then show you statistics related to indexing, crawling and discovery.

Furthermore, there are enhancement reports for mobile, the search box and the logo. You will find out whether the page is mobile-friendly and can even view the crawled page along with the exact date of the crawl.

URL inspection tool

Coverage

The report will show you a detailed analysis of the pages on your website, which includes:
  • Pages you have excluded from crawling
  • Total number of valid pages successfully crawled
  • Number of pages indexed with issues
  • Pages not indexed for some reason

Check indexing & crawling issues

Google Crawling and indexing issues

The report shows valuable data which webmasters can use as key factors to analyse their digital assets. E.g. the above screen shows the website includes more than 1,100 URLs, of which 841 are excluded due to their non-relevance to the content of the website; these are PDFs, admin pages, supporting images, etc.

You can see the non-relevant pages far outnumber the relevant ones; imagine the trust search engines would place in your website had they crawled all of them.

Are your webpages appearing for all the wrong keywords?

The website may well start appearing for all the wrong keywords, unrelated to your business, due to the high ratio of non-relevant pages.

Even the value of the relevant content can diminish in this scenario, and the bounces due to non-relevance tell search engines the asset is non-performing on critical phrases, dropping its rankings.

URL inspection in Google Search Console

Sitemaps

You can add sitemaps to your website in this section. Submitting a sitemap is an excellent way to declare the priority and importance of website pages for search engines.

Add sitemaps to declare priority of your web pages

Using a dynamic sitemap is recommended if content publishing is frequent.

The system will show the last submitted sitemap along with its status and discovered URLs.

Sitemaps for search engines are always submitted in XML format, which you can view any time. E.g. example.com/sitemap.xml

Search engines still crawl the website even though the sitemap is missing; however, best practice is to submit the sitemap for better crawling and indexing of your website pages.

Add a new sitemap in GSC

Mobile First Indexing- SEO Guide 2019

Google first announced back in July 2018 that page speed is one of the ranking signals for mobile searches.

Google then published, on 28th May 2019, that starting 1st July 2019, Google Search is moving to mobile-first indexing.

Now the mobile version of the website will play a significant role in determining the rankings.

Google had been testing mobile-first indexing since first announcing it on 4 Nov 2016.

Mobile First Indexing

Smartphone bot in action

Mobile-first indexing means Google will crawl the web using a ‘Smartphone Googlebot’ by default.

You can see the Google Smartphone bot working in action using the URL inspection tool in Search Console.

Google Smartphone bot

If your website pages are mobile-friendly, they remain unaffected.

By default, the Smartphone bot is active for all new and previously unknown websites.

Google says a single website for desktop & mobile

As per John Mueller, Google recommends keeping a single version of your website for both desktop and mobile, for three strong reasons:

  1. Information becomes stale if one of the versions is not updated.
  2. It confuses users about which version to use, as sometimes the two versions differ in interface and functionality.
  3. It confuses search engines as well during indexing.

Now, with mobile-first indexing, the confusion around keeping a separate sub-domain for a mobile website is resolved.

Keep a single version of your mobile and desktop site. The webmaster and SEO community faced controversy in the past, as the scenario was not clear-cut.

Google will keep sending notifications to websites that are still not mobile-friendly; you will find them in GSC.

How Google determines mobile first indexing

According to John Mueller, Google determines mobile-first indexing readiness based on content parity, which includes images, text, videos and links, as well as how the site is using structured data & metadata.

You should double check all the factors if you are about to launch or planning a website redesign.

The web has evolved from being desktop-centric to mobile indexing. Google will keep monitoring the performance of its new user-agents.

You can learn more about mobile indexing and speed score in our recently published article: Website redesign? 95% of websites suffering speed score.

Structured data for SEO- SEO Guide 2019

Structured Data for SEO

The code to achieve the criteria of a meaningful web forms the basis of structured data, and the semantic web provides its foundation.

The logic is simple: ‘machines work with strings, not things.’

Machines work with strings

Semantic web

The semantic web is the cumulative effort of Google, Yahoo, Bing, and Yandex.

Semantics deals with the study of the relationships between words, symbols and phrases. E.g. we all know that an apple is a fruit; however, these days ‘Apple’ can refer to machines too. Machines don’t know which apple we are talking about, or in what context and relation.

Likewise, Mount Everest is the highest peak known to man; the two words bear special significance in the dictionary and hold a relationship as one of nature’s symbols.

We train machines to understand the relationships between words, symbols and phrases.

Semantic web

Structured data

Furthermore, structured data is the markup for machines to understand the human language, which includes its vocabulary and grammar.

E.g. take the sentence:

Eiffel tower is in Paris

We know very well that it is one of the seven wonders and that the structure is in a place named Paris.

Without structured data, however, a machine only holds 24 characters in memory, including spaces, and returns a result only when the exact phrase appears.

Structured data

Search bot without structured data vocabulary?

search bot

We cannot even imagine what search would look like in that scenario.

Search companies realised this in the early days and knew from the start that they required a language to train machines.

Thanks to structured data, machines can be helped to understand the human world.

After applying structured data, the same machine is capable of understanding that:

Eiffel tower is a structure, made of iron and located in a place called Paris, which is a part of Europe. And one of the Seven Wonders of the World.

You can now see the difference, and the power of structured data.

What exactly is structured data?

Structured data is the markup that helps machines understand the meaning of a given language; it’s divided into two parts.

Typically, we describe structured data as a vocabulary with a specific, established rule set.

structured data for search engine

Google defines structured data

“Google Search works hard to understand the content of a page. You can help us by providing explicit clues about the meaning of a page to Google by including structured data on the page.

Structured data is a standardized format for providing information about a page and classifying the page content; for example, on a recipe page, what are the ingredients, the cooking time and temperature, the calories, and so on.”
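As an illustration of those explicit clues, here is a minimal JSON-LD sketch for the recipe example, using the schema.org vocabulary; the values are made up:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Fried Chicken",
  "cookTime": "PT30M",
  "recipeIngredient": ["1 whole chicken", "2 cups flour"]
}
</script>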

What is a schema, Microdata, Microformats, RDFa?

Our digital gurus are not unanimous, which results in different vocabularies & grammars emerging, and thus different formats are born.

In the most basic sense, information is conveyed through a language, whether machine or human; in both, two things are primary:

  • Vocabulary (set of words)
  • Grammar (set of rules to use the vocabulary)

structured data markup

Example of vocabulary for structured data markup for SEO.

Keeping the above in mind, we can classify the digital jargon as follows.

Microformats is the only one to provide both vocabulary & grammar.

To understand further you can go through our detailed article on structured data for SEO & winning Rich Snippets

Keynote

Google is on its way to evolving machines, building them to understand human needs & behaviour. Computers, in the form of voice search, are booking dining tables, ordering food for refrigerators, playing favourite music, driving cars, and much more.

Search is still the most loved and widely known tool from the long list of products & services Google provides.

And they are continually updating it; Google calls these core algorithm updates. The latest one was in June 2019, and the full list is here.

The direct impact of these updates influences search optimisation and its core elements.

Ending note

Above are all the core elements of SEO, and they are critical for any subsequent digital marketing campaign.

The accuracy of SEO tools is questionable, as they fail to determine important aspects of SEO like content quality, the relevance of Meta titles, link anchors etc.

However, the Ahrefs audit tool provides a comprehensive report; it is the latest and my favourite as well.

The best practice is to use an SEO tool alongside a manual SEO audit process to ensure the health of your digital assets.

Google Search Console is indispensable for finding errors in your website that no other tool can. Consider setting it up by default.

Any effort in search engine marketing will prove to be worthless if website pages are missing the critical elements of the search engine optimisation process.

So you need to be extremely careful in executing SEO & SMM with your digital assets.

Ahrefs SEO audit tool

Ahref SEO Audit tool

Sales process optimisation

The optimisation process is critical for your digital assets to perform. However, other sales optimisation metrics are also available.

Google says: choose your SEO carefully.

Deciding to hire an SEO is a big decision that can potentially improve your website’s ranking, or damage its reputation as well.

Online presence analysis

A manual audit report is beneficial for professional SEO analysis.

Companies that proactively maintain their brand opt for online reputation management analysis for brand monitoring and health. The service is specially designed for companies aggressively pursuing online sales.

The process is beneficial in finding the faults behind an affected brand reputation.

Conversion rate optimisation

Process

A detailed report shows metrics of non-performing assets & user behaviour, with reasoning and possible solutions.

Audit Process

Conversion rate optimisation is again a brilliant sales metric to see whether your digital assets are underperforming.

Non-performing assets can be a significant hurdle for online sales; NPAs can seriously damage your website’s reputation by increasing the bounce rate.

The process involves both quantitative and qualitative methods and is reasonably simple.

SEO verdict

Verdict

I want to reiterate: consider your webpages as your digital assets, and use your intellect to find out why they are non-performing.

Publishing webpages for the sake of it is entirely different from treating them as your digital assets. And whether your digital assets perform depends entirely upon your marketing approach.

A wrong approach will end up making your investment fragile, whereas with the correct one you are ready to reap the fruits. The choice is yours, so be careful with digital marketing.

And here is what Google itself says:

“Deciding to hire an SEO is a big decision that can potentially improve your site and save time, but you can also risk damage to your website and reputation.

Make sure to research the potential advantages as well as the damage that an irresponsible SEO can do to your site.”

More stuff for you,

I am also impressed with the SEO guides published by Moz & Backlinko; looking at them is worth your time, and you will find interesting key facts and in-depth SEO knowledge.

    Over to you

    If you have done the reading, check your website pages, fix the errors and you will see them gradually improving in searches over time.

    Don’t forget to chime in if you find the content fascinating. I would love to hear your comments & feedback.
