10 Most Important Search Engine Ranking Factors (2019 Update)

Posted by on May 16, 2019 in SEO Articles | Comments Off on 10 Most Important Search Engine Ranking Factors (2019 Update)


You may have heard that Google uses more than 200 factors to decide which websites to show at the top of the search results. While this is true, not all ranking factors are equal.

In this article I will explain the most important search engine ranking factors you should know about and show you how to optimize your website to achieve higher rankings in search engines.

Let’s start with a couple of popular questions people ask about search engine rankings.

What do we mean by Search Engine Ranking Factors? SEO ranking factors are rules used by search engines during the ranking process to decide which pages to show in the search engine results pages (SERPs) and in what order.

Why is it important to know about the different SEO ranking factors? The answer is simple. If you want your website to rank high in search engines’ organic results and get traffic, you need to make sure that you adhere to these rules.

Google ranking factors change all the time, so how do I keep up with the changes? It is true that search engines, especially Google, make many changes to their ranking algorithms every year. Their goal is to improve the quality of their search results and keep their users happy.

Nevertheless, some ranking signals have been the foundation of Search Engine Optimization for years now. Google may use more than 200 ranking factors in its algorithm, but not all of them carry the same weight.

Most Important Search Engine Ranking Factors

Here is a list of the 10 most important ranking factors you should care about.

  1. A Website Optimized for Technical SEO
  2. Website Security (HTTPS)
  3. Domain Authority
  4. Mobile Friendliness
  5. Webpage Speed (both Desktop and Mobile)
  6. Content Quality
  7. On-Page SEO Ranking Factors
  8. User Experience
  9. Brand reputation
  10. Links from Trusted Websites

1. A Website Optimized for Technical SEO

This is coming first in the list and for a reason. If search engines cannot access your website properly, then you shouldn’t expect much in terms of rankings.

The way search engines work can be grouped into three main stages:

  • Discovery
  • Crawling and Indexing
  • Ranking

During the discovery stage, they find all publicly available web pages. Then during crawling they extract the information they need and add it to their index so that it can be used by the ranking algorithms during the ranking process.

How Search Engines Work

It’s your job to ensure that during this process, search engine bots can access your website without being blocked, and that you help them complete this task as fast as possible.

This is known in the SEO World as Technical SEO.

Recommended Reading: Technical SEO Guide – Best practices for higher rankings.
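As a quick sanity check on crawl access, you can test a robots.txt file against specific URLs before search engine bots do. The sketch below uses Python’s standard-library robot parser; the rules and URLs are invented for illustration.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content, for illustration only
robots_txt = """
User-agent: *
Disallow: /admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot falls under the wildcard group here, so /admin/ is blocked
print(parser.can_fetch("Googlebot", "https://example.com/admin/page"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))   # True
```

In practice you would fetch your live robots.txt and test the URLs you care about ranking for.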

2. Website Security (HTTPS)

One of the known ranking factors is website security. Websites that have an SSL certificate installed and serve their pages over HTTPS have a comparative advantage over non-secure websites.

By comparative advantage we mean that, other things being equal, a website served over HTTPS will rank higher than one that is not secure.

If your website is not on HTTPS yet, add this task to the top of your list.

Recommended Reading: How to migrate your website to https without losing SEO.
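One common post-migration issue is leftover plain-HTTP references in your pages, which trigger mixed-content warnings even though the site itself is on HTTPS. A minimal sketch, assuming the HTML is available as a string, flags them:

```python
import re

def find_insecure_urls(html: str) -> list:
    """Return plain-http URLs referenced in src/href attributes."""
    return re.findall(r'(?:src|href)=["\'](http://[^"\']+)', html)

# Invented markup: one insecure image, two secure references
page = '''
<img src="http://example.com/logo.png">
<a href="https://example.com/about">About</a>
<script src="https://cdn.example.com/app.js"></script>
'''

print(find_insecure_urls(page))  # ['http://example.com/logo.png']
```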

3. Domain Authority

Next in the list is domain authority, a very important ranking factor.

Search engines want to show the best possible websites in their results, and one of the factors they use in deciding is the authority of a domain.

The authority of a domain is determined by:

Domain age: Older domains tend to rank higher than newer ones. The average age of a page in the top results of Google is three years.

Age of Pages and Google Rankings

This does not mean that new websites cannot achieve high rankings, but it is more difficult and takes more time than for established websites.

Domain status: For a domain to be eligible to rank, it has to be free of Google penalties.

If you’ve owned your domain from the beginning then this is not something you should worry about. If you bought a domain that was already registered, you need to check that it’s free of Google penalties.

Recommended Reading: How to check if your website is penalized by Google.

Domain reputation: This has to do with how other websites (and people) perceive your domain.

One of the biggest factors, as we will see below, is incoming links but reputation is also related with what people say about your brand.

A domain with good reputation is more likely to achieve better rankings than a domain with no reputation at all.

Domain Authority: Google uses an internal metric called PageRank to calculate the authority of a webpage.

Websites that are in the top positions of Google have a higher PageRank than websites in the lower positions.

Google no longer publishes PageRank information, so a number of companies came up with their own systems to calculate the authority of a domain.

You can use these metrics as a guide to improve your domain score, but they are not directly related to rankings.

Recommended Reading: Increase the authority of your domain by following these 7 practical steps and how to do SEO for a new website when you have a limited budget.

4. Mobile Friendliness

Another factor that is known to affect search engine rankings is mobile friendliness.

Mobile searches make up more than 60% of the total searches made on Google each month.

A website that is not optimized for mobile will not show up in mobile search results, which immediately excludes the possibility of getting any mobile traffic.

5. Webpage Speed (both Desktop and Mobile)

Page speed has been a known Google ranking factor for a long time now. Google is obsessed with making the web faster and decided to reward faster websites with better rankings.

Having a fast website is also good for your users. Many studies have shown that slow-loading websites lose customers and drive users away.

Webpage Speed Importance

To make your website faster, follow these proven tips:

  • Upgrade your website software to the latest version
  • Use a caching plugin
  • Use the latest version of PHP (if you are on WordPress)
  • Use VPS hosting instead of shared hosting
  • Use a CDN (Content Delivery Network) Service
  • Optimize and compress your images
  • Optimize and compress your HTML, CSS and JavaScript
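To see why the last tip matters, you can measure how much compression shrinks a repetitive payload. This sketch uses Python’s built-in gzip on an invented HTML fragment; real servers typically handle this with their own compression modules (e.g. mod_deflate or nginx gzip):

```python
import gzip

# A repetitive HTML fragment stands in for a real page
html = ("<div class='row'><span>item</span></div>\n" * 200).encode("utf-8")

compressed = gzip.compress(html)
print(len(html), len(compressed))
print(f"saved {100 * (1 - len(compressed) / len(html)):.0f}%")
```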

6. Content Quality

The quality of the content you publish on your website is by far the most important SEO ranking factor of all.

When we talk about content that is published online, quality is measured using these three factors:

  • Uniqueness
  • Expertise – Authority – Trustworthiness (EAT)
  • Relevancy

Content Uniqueness

First in the list is uniqueness. Any piece of content that is published on your website has to be unique and not a duplicate of what is already available on other websites.

This means that re-publishing an article from Forbes because you found it interesting (even if you properly cite the source) won’t help your website rank.

Google knows that the particular article first appeared on Forbes and it will simply ignore yours.

It won’t punish your website for this but it won’t reward it either. But, if you keep publishing duplicate content and have no unique content of your own, this will reduce your ‘Google Trust’ and make it very difficult to rank on Google.

Recommended Reading: What is thin content – A guide on how to find and fix thin content pages.

Expertise – Authority – Trustworthiness (EAT)

Search engines don’t want to show untrusted content in their search results. During the ranking phase they look for signals to help them identify content written by ‘experts’ who have ‘authority’ and ‘trustworthiness’.

This is not a new concept, but Google raised its importance by adding it to the Quality Raters Guidelines.

The guidelines specifically state that E-A-T is applicable to all types of websites and give some examples of how E-A-T can be applied in practice.

Importance of Expertise, Authority and Trustworthiness for Rankings.

What you should do to improve E-A-T is the following:

  • Make sure that each page has an author. You can add the author bio at the bottom of each page with a link to the full author bio.
  • Make sure that your ‘About’ page explains who you are and why your company and authors are experts on the topics.
  • Showcase any awards or mentions from trusted websites that can prove your authority.
  • Invest in promoting your personal brand and the reputation of your authors.
  • Make sure that your website is secure
  • Try to get good reviews published in other trusted sources. Depending on your niche you may seek reviews in Google My Business, Yelp, Trustpilot, BBB etc.
  • If applicable try to get a Wikipedia page for your website and authors
  • If you are selling products or services online, make sure that you have a privacy policy and a refund policy.
  • Get mentions on authoritative sites like major news publications, big forums, leading industry websites.
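One practical way to tie authors to content is Article structured data. The sketch below builds the JSON-LD with Python’s json module; every value is a placeholder, not data from this article:

```python
import json

# Hypothetical article metadata; all names and URLs are placeholders
article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "10 Most Important Search Engine Ranking Factors",
    "author": {
        "@type": "Person",
        "name": "Jane Doe",
        "url": "https://example.com/about/jane-doe",
    },
    "datePublished": "2019-05-16",
}

# Embed the output inside <script type="application/ld+json"> ... </script>
print(json.dumps(article_schema, indent=2))
```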

Recommended Reading: Google Search Quality Evaluator Guidelines

7. On-Page SEO Ranking Factors

Besides proving to search engines that they can ‘trust’ you, you also need to give them signals through your page structure to help them understand the meaning of your content.

This is what On Page SEO is all about.

This is achieved by sending them the right signals through your content and in particular:

Does your page title match what the user is searching for?

The page title is the most important on page SEO factor. This is what users see in the search results and it is one of the elements used by search engines to get a very good idea of what a page is all about.

To optimize your titles for both users and search engines, you need to ensure that they include SEO keywords.

Example of an Optimized Page Title

SEO Keywords are the exact phrases users type in the search box. By including keywords in your title, search engines can associate your content with specific search queries and this increases your chances of ranking for those terms.

It also encourages users to click on your search snippet since it closely matches what they are looking for.
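The two checks above — keyword presence and a sensible display length — are easy to automate. A minimal sketch; the 60-character limit is a common rule of thumb, not a figure from this article:

```python
def check_title(title: str, keyword: str, max_len: int = 60) -> dict:
    """Basic page-title checks: keyword presence and display length."""
    return {
        "has_keyword": keyword.lower() in title.lower(),
        "within_length": len(title) <= max_len,
    }

result = check_title("SEO Keywords: How to Find and Use Them", "seo keywords")
print(result)  # {'has_keyword': True, 'within_length': True}
```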

Recommended Reading: How to optimize your page titles

Headings and Subheadings

A well-optimized page structure has headings and subheadings. Headings make the page easier to read by both search crawlers and users.

In most cases, a page has one H1 heading and multiple subheadings (H2 and H3). These are laid out in a hierarchical structure, i.e. H1 -> H2 -> H3.

Headings should also include keywords and phrases users can recognize.

Recommended Reading: How to optimize your website headings
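Heading structure can be verified programmatically. This sketch collects h1–h6 tags in document order with Python’s standard-library HTML parser; the sample page is invented:

```python
from html.parser import HTMLParser

class HeadingCollector(HTMLParser):
    """Collect h1-h6 tags in document order."""
    def __init__(self):
        super().__init__()
        self.headings = []

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.headings.append(tag)

page = """
<h1>SEO Guide</h1>
<h2>On-Page SEO</h2>
<h3>Page Titles</h3>
<h2>Off-Page SEO</h2>
"""

collector = HeadingCollector()
collector.feed(page)
print(collector.headings)                    # ['h1', 'h2', 'h3', 'h2']
print(collector.headings.count("h1") == 1)   # True: exactly one H1
```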

Internal Link Structure

Internal links are links pointing to pages on the same website.

Each and every piece of content that you publish on your website needs to have at least one link pointing to another page on your website.

Internal links should be placed in the BODY of the page and should be relevant to the content.

By adding internal links to your content, you:

  • Assist both search engines and users in discovering more pages on your website
  • Help build topic relevancy
  • Give users more information about a topic

Recommended Reading: Internal linking best practices for SEO.
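Auditing internal linking starts with separating internal links from external ones. A small sketch, assuming you have already extracted the href values from a page:

```python
from urllib.parse import urlparse

def split_links(hrefs, site_domain):
    """Classify hrefs as internal or external to site_domain."""
    internal, external = [], []
    for href in hrefs:
        netloc = urlparse(href).netloc
        # Relative URLs (empty netloc) are internal by definition
        if netloc in ("", site_domain):
            internal.append(href)
        else:
            external.append(href)
    return internal, external

# Invented hrefs for illustration
links = ["/blog/seo-tips", "https://example.com/services", "https://other.com/page"]
internal, external = split_links(links, "example.com")
print(internal)  # ['/blog/seo-tips', 'https://example.com/services']
print(external)  # ['https://other.com/page']
```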

Image and Multimedia Optimization

Search engines don’t only index text content; they also index videos, audio (podcasts) and of course images.

The problem is that non-text elements are difficult for crawlers to interpret accurately.

It is thus important to optimize them for SEO by giving search crawlers the necessary signals.

For example:

  • For images, you can optimize the filename and also provide ALT text.
  • For videos, you can add a video schema.
  • For podcasts you can add the podcast schema.

There are various ways to make non-text elements easier to index. The following guides can help you out:

  • SEO for Images: A complete guide on how to SEO optimize your images
  • Schema Markup – An introduction to schemas and structured data and how they can help you improve your rankings.
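A simple ALT-text audit illustrates the image advice above: flag every <img> without a non-empty alt attribute. The sample markup is invented:

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Flag <img> tags that lack non-empty ALT text."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing_alt.append(attrs.get("src", "?"))

page = '<img src="dog.jpg" alt="A dog being trained"><img src="logo.png">'
audit = AltAudit()
audit.feed(page)
print(audit.missing_alt)  # ['logo.png']
```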

Add Semantically Related Keywords in Content

Last but not least, one factor that can affect the position your website appears in Search is the use of keywords within your content. This is known in SEO as Content SEO.

Content SEO can help you pick the right keywords to embed in your content so as to make it more relevant to what the user is searching for.

Besides adding your main keywords in the titles and headings (as mentioned above), you also need to add semantically related keywords in your content.

Semantically related keywords are words or phrases that are related to each other conceptually.

Look at the following example:

Semantic SEO Example

Google can understand that the meaning of the words ‘replace’ and ‘change’ are the same (in this context).

Recommended Reading: You can read my SEO keywords guide for more information on how to find and use semantically related keywords in your content.

8. User Experience

One of the signals used by the Google ranking algorithm is RankBrain. RankBrain uses artificial intelligence and machine learning to rank websites based on user experience and behavior.

So, the ranking algorithm considers all the factors discussed in this article and it also uses input from RankBrain before making a final decision.

Some of the signals used are:

CTR (clickthrough rate) – CTR is the percentage of people who click on a search snippet shown in the SERPs.

If the system spots a pattern where more people click on a snippet that is lower in the results than the one at the top, the algorithm may push that snippet higher up the page.

Pogosticking – When users click on your search snippet, visit your website and immediately go back to the search results, this is an indication that they are not happy with what they saw.

Bounce rate and dwell time – Similar to pogosticking, users visit your website and leave immediately because they did not find the information they were looking for.

Recommended Reading: How to reduce your bounce rate
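CTR itself is simple arithmetic: clicks divided by impressions, expressed as a percentage. A tiny sketch with invented numbers:

```python
def ctr(clicks: int, impressions: int) -> float:
    """Clickthrough rate as a percentage of impressions."""
    if impressions == 0:
        return 0.0
    return 100 * clicks / impressions

# A snippet shown 2,000 times in the SERPs and clicked 90 times
print(f"{ctr(90, 2000):.1f}%")  # 4.5%
```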

9. Brand reputation

This SEO ranking factor doesn’t have to do with backlinks (we will deal with links below), but with the importance of having a good brand reputation.

Brand reputation in this context means having a community of people talking about and mentioning your brand online.

Mentioning doesn’t necessarily mean adding a link back to your website; it can simply be discussing your brand on social networks and forums.

From a webmaster’s perspective, you should do everything you can to promote your brand online, whether through Facebook, Instagram, Twitter or popular forums in your industry.

If you get people to talk and search for your brand name in Google, this will eventually translate to a strong signal for the ranking algorithms.

10. Links from Trusted Websites

Back in the ’90s, the Google founders came up with a brilliant idea: websites that have backlinks pointing to them from other websites are more useful and popular than others, and thus deserve a higher position in the Google search results.

That’s a very simplified version of how Google ranking used to work back in the early days.

Over the years people took advantage of this and started building hundreds of links to their websites, and as a result the quality of Google’s search results deteriorated.

To protect their reputation and improve the quality of the results, Google adjusted their algorithm and changed the way backlinks are accounted for.

Links are still one of the most critical ranking factors, but it’s no longer a matter of which website has the most links, but from where the links are coming.

Importance of Link Building for SEO

Links that can influence your rankings are the ones that come from:

  • Google trusted websites
  • Websites that demonstrate Expertise, Authority and Trustworthiness
  • Related Websites
  • Links that don’t have the nofollow attribute
  • Links that are included in the body of a webpage
  • Links that are added naturally by webmasters and are not the result of link exchanges or buying links.
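Since nofollow links pass no ranking value, it helps to separate them when reviewing a page’s outgoing links. A sketch using the standard-library HTML parser; the URLs are placeholders:

```python
from html.parser import HTMLParser

class NofollowFinder(HTMLParser):
    """Separate followed links from rel='nofollow' links."""
    def __init__(self):
        super().__init__()
        self.followed, self.nofollow = [], []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attrs = dict(attrs)
            href = attrs.get("href", "")
            rel = (attrs.get("rel") or "").lower()
            if "nofollow" in rel.split():
                self.nofollow.append(href)
            else:
                self.followed.append(href)

page = ('<a href="https://trusted.example/guide">Guide</a>'
        '<a href="https://ads.example/offer" rel="nofollow">Ad</a>')
finder = NofollowFinder()
finder.feed(page)
print(finder.followed)  # ['https://trusted.example/guide']
print(finder.nofollow)  # ['https://ads.example/offer']
```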

The following guides can help you build a strong link profile that will help you with rankings:

SEO Ranking Factors Summary

If you are new to SEO and Search Engine Marketing, what you need to understand is that Search Engines are constantly working on improving their search results. They do this by getting feedback from the users and through machine learning.

They gather and analyze this information and then adjust the search engine ranking factors to improve the user experience.

Your job as an SEO expert is to follow these changes and stay in compliance with the new rules.

It should be noted though that while algorithms are changing all the time, the basic ranking factors have remained the same since the beginning of Google:

  • Publish great content to satisfy the user intent
  • Make your website accessible to search engines
  • Build fast and secure websites
  • Get ‘votes of trust’ (links), from other websites on the Internet to prove your Expertise, Authority and Trustworthiness

Websites that follow the above rules enjoy high search engine rankings and are more likely to survive the competition for years to come.

Websites that take shortcuts and try to trick search engine algorithms can achieve temporary success, but sooner or later they will get caught and penalized.

Any important SEO ranking factors missing from the list? Let me know in the comments.


Multilingual Plugins for WordPress

Posted by on May 16, 2019 in Greg's SEO Articles | Comments Off on Multilingual Plugins for WordPress

By sticking to just one language on your website you’re limiting your sales from around the world. As people find new ways to communicate instantly, you’re more and more likely to communicate with people whose first language isn’t English. You’ll be able to reach far more people by translating text into their language. Here are 4 plugins we recommend:


With this plugin you can add as many languages as you need to every page or post that you’re working on. Although it doesn’t do the translation for you, it does make it very easy to scale your content for multilingual purposes – you don’t have to have multiple websites with different languages, because a user logging onto your site merely has to select a language of their choice.


This plugin does more than just translate posts – it uses the readily available variety of languages WordPress has and does the translation from there. It is $5.95 monthly but you can translate 2,000 words for free.


This plugin lets you automatically or even manually translate your content. More than 40 languages are available for auto-translation. It will cost $29 for “Standard” and $79 for “Advanced” but it has a solid reputation.


Last but not least is the Google Language Translator. Not only is it free, but Google Translate has become substantially more accurate, and it can support over 100 languages!

What are SEO Keywords and How to Find Them

Posted by on May 15, 2019 in SEO Articles | Comments Off on What are SEO Keywords and How to Find Them


You probably already know that keywords are important for SEO but what do we actually mean when we talk about SEO keywords? How can you find the right keywords for your website and more importantly, how to use those keywords in your content and maximize your SEO?

These are some of the questions I will answer in this post.

What are SEO Keywords?

Let’s start with a definition. SEO Keywords are words or phrases (search terms) that people use when searching for information through a search engine.

By adding those keywords in your content, you increase your chances of appearing in the search results for those terms.

SEO Keywords make it easier for search engines to understand what your content is all about and help users find the information they need.

Why are keywords important?

As you might have already guessed, keywords are very important for SEO.

Without using the right SEO keywords in your content, search engines have a hard time understanding the meaning of your content. And this diminishes your chances of getting organic traffic to your website.

The way search engines work is by matching the user search queries with pages available in their index.

During the crawling and indexing phase, search engine crawlers visit a webpage. They extract the information they need and add it to their index. They use this information later during the ‘matching’ process (also referred to as the ranking process).

Part of the information they extract is the keywords a web page is associated with.

If during this process your website is associated with the wrong keywords, then you have no chance of appearing high in the results for the keywords that matter to your website.

Let me give you an example to understand this better.

Let’s say you want to rank for [electric bicycles]. You create a page and showcase your products (product name, images of the products, etc.).

Unless you mention on the page that they are electric, Google will most probably associate your page with bicycles, and this is not what you want.

What you should do instead is to find out which keywords people use as search terms when they look for electric bicycles and make sure that these keywords are part of your content.

Different types of SEO keywords

Keywords are classified into two categories: head keywords and long tail keywords.

Head keywords (also known as seed keywords) usually consist of one or two words and have a high search volume.

Long tail keywords consist of more words and have less search volume than head keywords, but they make up 70% of all searches.

Long Tail Keywords – Search Curve

What is important to understand at this point is that head keywords may have more search volume but they are also highly competitive.

This means that thousands of websites are competing for the top positions in the SERPs, which makes it almost impossible for new websites (or small businesses) to rank for these terms.

The solution is to focus on long-tail keywords.

They have less search volume but with the right SEO plan, it is possible to rank in the top positions of Google and get targeted organic traffic to your website.

How to choose keywords for SEO

Let’s see how to choose the “right” SEO keywords to use in your content.

This process is known in SEO as keyword research.

Step 1: Decide which ‘search terms’ you want to be known for

Your first step is to spend some time thinking about the following:

  • Which search terms do you want to be known for online?
  • What words or search phrases might people use in search engines to find you?
  • Which words best describe your products or services?
  • Which words best describe your niche?

The outcome of the above exercise should be a list of phrases that we’ll turn into a keyword list in the next step.

Step 2: Create a Keyword list

To create a keyword list, you need to take the list created above and associate it with the actual keywords people type in the search box i.e. SEO Keywords.

To do that we need the help of keyword research tools. There are a lot of tools you can use but my recommended tools are the Google Keyword Planner (Free) and SEMRUSH (Paid).

Google Keyword Planner

The Google Keyword Planner is part of Google Ads and is offered by Google for free to Google Ads customers.

Nevertheless, it’s a great keyword research tool and you can use it to find your SEO keywords.

Go to Google Ads and create an account. Follow the steps to create a draft campaign so that the system will allow you to access the Keyword Planner.

Once you are done with the draft campaign, select TOOLS > KEYWORD PLANNER.

Access the Google Keyword Planner


For the sake of this example, let’s say that you are a dog trainer selling an online course teaching people how to train their dogs.

If you enter the relevant keywords in the keyword planner, you will be presented with a list of keyword ideas matching your selected topics.

How to Find SEO Keywords using the Keyword Planner

Notice that besides the keywords, the tool has a few more columns.

The ‘Average Monthly Searches’ shows you how many searches are performed on Google per month for that term and the ‘Competition’ gives you an idea of how competitive a keyword is.

By ‘competitive’ we mean how many advertisers are bidding on that keyword in Google Ads.

Remember that the keyword planner is a tool for Google PPC Ads and not for SEO search results.

Select some of the keywords that are highly relevant to your website and products, both ‘head’ and ‘long-tail keywords’, add them to a spreadsheet and move on to the next step.

Step 3: Find Semantic SEO Keywords

The next step is to find keywords related to your target keywords. These keywords are known as semantic keywords or LSI keywords.

A semantic keyword is a keyword that is strongly related to another keyword.

The reason you want to do this is that Google no longer ranks pages for individual keywords in isolation; it focuses more on topics.

So, by finding LSI keywords for your main keywords and including them in your content, you help Google get a better understanding of your content, and this translates to higher rankings.

The best way to find LSI keywords is to use the LSI Keyword Generator and Google search.

LSI Keyword Generator

Go to LSI Graph, enter your head keywords and click search. Take note of the keywords and add them to a second column in your spreadsheet.

How to Find Semantic SEO Keywords

Google Search

When you search for a keyword in Google, there are 2 ways to find out what Google considers to be the related keywords for a given search term.

The first one is to look at the “People also ask” section.

People Also Ask – Keyword Ideas

And the second one is the “Searches related to…” section.

Google Related Searches


SEMRUSH

SEMRUSH is my favorite keyword research tool.

Among many other useful features, it has two very powerful tools for keyword research.

The first is the “Keyword Magic Tool” and the second one is the “Topic Research”.

With SEMRUSH, you don’t have to go through the process of creating an account with the Google Keyword Planner or going to LSIGraph and Google search to find related keywords.

Everything is done within SEMRUSH.

Keyword Magic Tool

The first step is to create an account (there is a 7 Day Free Trial).


Type in your head keyword and click SEARCH.

Find SEO keywords with the Keyword Magic Tool

SEMRUSH will group related keywords together.

Sort the keywords by volume and KD (Keyword Difficulty). Unlike the Google Keyword Planner’s competition metric, KD refers to how difficult it is to RANK for a particular keyword in Google.

This metric is very useful, since what you want is to pick SEO keywords with a higher search volume and a lower KD score.
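That selection rule can be expressed as a simple sort. The data below is invented and only shaped like a keyword-tool export; ‘volume’ and ‘kd’ are assumed column names:

```python
# Hypothetical keyword data, shaped like a keyword-tool export
keywords = [
    {"keyword": "dog training", "volume": 90500, "kd": 85},
    {"keyword": "how to train a puppy at home", "volume": 1300, "kd": 38},
    {"keyword": "dog obedience training near me", "volume": 2900, "kd": 45},
]

# Favor higher volume first, then lower difficulty as a tiebreaker
ranked = sorted(keywords, key=lambda k: (-k["volume"], k["kd"]))
print(ranked[0]["keyword"])  # dog training

# The easiest ranking opportunity is simply the lowest-KD keyword
best_opportunity = min(keywords, key=lambda k: k["kd"])
print(best_opportunity["keyword"])  # how to train a puppy at home
```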

Topic Research

Remember what I mentioned above: Google now ranks websites based on topics and not just keywords.

The Topic Research tool will help you find long-tail keywords related to a topic.

Click TOPIC RESEARCH from the left menu and type in your main head keyword.

Topic Research with SEMRUSH

What you see on the right (under Interesting Questions) are questions related to your niche. It’s similar to “People also ask” but more comprehensive, since it includes questions from various sources.

How can you take advantage of this?

These questions can help you build TOPIC RELEVANCY, which is what you want if your goal is to rank higher on Google.

How can you do this in practice? Optimize your homepage for your head keywords and then create content (through a blog) targeting each of the questions (which are, in essence, long-tail keywords).

Make sure that within your blog posts, you link to your main pages.

How to use SEO Keywords in your Content

Doing good keyword research and having a keyword list is not enough.

In order to benefit from this process, you need to know how to use those keywords in your content.

This is known as SEO Content, which is a subset of On Page SEO.

Here are some tips to follow:

Optimize your Homepage for your main keyword

Homepage SEO is very important. Search engines start the crawling process from the homepage and follow any links from there.

When it comes to keyword optimization, you should optimize your homepage for your main head keywords (even if their keyword difficulty is very high).

The reason is that you want to make it clear to both crawlers and users, what your website is all about.

Create a separate page for each of your main keywords

Let’s say your company sells services (like mine). You need to create a page for each of your services, with each page optimized for a main keyword.

Look at how my services page is organized.

I have a summary page for all my services and individual pages for each of the services we offer. Each page is optimized for a specific keyword.

Create pieces of content to target long-tail keywords

Once you are done with the main keywords, it’s time to utilize the power of blogging and start creating content targeting long-tail keywords.

You can use the results of topic research to decide which keywords to target in your blog posts.

Optimize your content with SEO keywords

When writing the content for both your pages and blog posts, you need to make sure that:

Use keywords in the URL – you include your target keyword in the page URL.

Use keywords in the page title – you include your target keyword in the page title.

Use keywords in the H1 tag – you include your target keyword (or close variations) in the H1 tag

Use long-tail keywords as subheadings (h2, h3) – you include related keywords in your subheadings

Use LSI keywords in your content – you include LSI keywords within your copy.
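The checklist above can be turned into a small self-check. The URL, title and H1 below are placeholders borrowed from the electric-bicycles example earlier:

```python
def on_page_checks(url, title, h1, keyword):
    """Verify the target keyword appears where the checklist expects it."""
    kw = keyword.lower()
    slug = kw.replace(" ", "-")  # keyword as it would appear in a URL
    return {
        "keyword_in_url": slug in url.lower(),
        "keyword_in_title": kw in title.lower(),
        "keyword_in_h1": kw in h1.lower(),
    }

report = on_page_checks(
    url="https://example.com/electric-bicycles",
    title="Electric Bicycles: Models and Prices",
    h1="Electric Bicycles",
    keyword="electric bicycles",
)
print(report)  # all three checks are True
```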

The following guides will help you understand this better:

SEO Keywords: Final Advice

Picking the right keywords for your website is important.

When choosing keywords, try to think outside the box and consider all possible search phrases people might use in search engines to find your products.

Add those keywords to a spreadsheet and take advantage of the data provided by the different keyword tools to expand your keyword list as much as possible.

Group your keywords into two categories. First are the keywords to use in your homepage and main website pages and second the keywords to use in your blog.

Follow the on-page SEO tips outlined above to intelligently include keywords in your copy but always make sure that you pay special attention to the quality of the content.

Publish content related to your target topics to create content relevancy and watch your rankings and traffic increase.


A Website Crawler for Search and Information Architecture

Posted by on May 14, 2019 in SEO Articles | Comments Off on A Website Crawler for Search and Information Architecture

Distilled is all about effective and accountable search marketing. Part of being effective is being able to gather the data we need to diagnose an issue. For a while, we’ve been using a custom crawler at Distilled to solve technical problems with our clients. Today, we’re making that crawler available to you.

This crawler solves three long-standing pain points for our team:

  1. Unhelpful stock reports. Other crawlers limit us to predefined reports. Sometimes these reports don’t answer our questions. This crawler exports to BigQuery, which lets us stay flexible.
  2. Limited crawl scope. When crawling on your own computer, your crawl is limited by how much RAM you’ve got. Our crawler is so efficient that you’re more likely to run out of time than memory.
  3. Inflexible schema. Other crawlers generally export flattened data into a table. This can make it hard to analyze many-to-many relationships, like hreflang tags. This crawler outputs complete, non-flattened information for each page. With this data, the queries our team runs are limited only by their imaginations.

Our team still uses both local and hosted crawlers every day. We break out this custom crawler when we have a specific question about a large site. If that’s the case, this has proven to be the best solution.

To use the crawler, you’ll need to be familiar with running your computer from the command line. You’ll also need to be comfortable with BigQuery. This blog post will cover only high-level information. The rest is up to you!

This is not an official Distilled product. We are unable to provide support. The software is open-source and governed by an MIT-style license. You may use it for commercial purposes without attribution.

What it is

We’ve imaginatively named the tool crawl. crawl is an efficient and concurrent command-line tool for crawling and understanding websites. It outputs data in a newline-delimited JSON format suitable for use with BigQuery.

By waiting until after the crawl to analyze data, analysis can be more cost-effective. If you don’t try to analyze the data at all as you’re collecting it, crawling is much more efficient. crawl keeps track of the least information necessary to complete the crawl. In practice, a crawl of a 10,000-page site might use ~30 MB RAM. Crawling 1,000,000 pages might use less than a gigabyte.

Cloud computing promises that you can pay for the computing power you need, when you need it. BigQuery is a magical example of this in action. For many crawl-related tasks, it is almost free. Anyone can upload data and analyze it in seconds.

The structure of that data is essential. With most crawlers that allow data exports, the result is tabular. You get, for instance, one row per page in a CSV. This structure isn’t great for many-to-many relationships of cross-linking within a website. crawl outputs a single row per page, and that row contains nested data about every link, hreflang tag, header field, and more. Here are some example fields to help you visualize this:

Some fields, like Address, have nested data. Address.Full is the full URL of the page. Other fields, like StatusCode, are simply numbers or strings. Finally, there are repeated fields, like Links. These fields can have any number of data points. Links records all links that appear on a page being crawled.
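As a rough sketch of that shape (field names taken from the ones above, but heavily abbreviated; the real schema from `crawl schema` is much richer), one nested result row could be modelled and serialized like this:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Simplified sketch of a nested crawl result; the actual schema
// (from `crawl schema`) has many more fields.
type Address struct {
	Full string // full URL of the page
}

type Link struct {
	Href     string
	Nofollow bool
}

type Page struct {
	Address    Address // nested field
	StatusCode int     // scalar field
	Links      []Link  // repeated field: one entry per link on the page
}

// marshalPage renders one newline-delimited JSON row, as BigQuery expects.
func marshalPage(p Page) string {
	row, _ := json.Marshal(p)
	return string(row)
}

func main() {
	p := Page{
		Address:    Address{Full: "https://example.com/"},
		StatusCode: 200,
		Links:      []Link{{Href: "https://example.com/about", Nofollow: false}},
	}
	fmt.Println(marshalPage(p))
}
```

Each page becomes a single JSON object on its own line, with links (and hreflang tags, headers, and so on) nested inside it rather than flattened into extra rows.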

So using BigQuery for analysis solves the flexibility problem, and helps solve the resource problem too.

Install with Go

Currently, you must build crawl using Go. This will require Go version >1.10. If you’re not familiar with Go, it’ll be best to lean on someone you know who is willing to help you.

go get -u

In a well-configured Go installation, this will fetch and build the tool. The binary will be put in your $GOBIN directory. Adding $GOBIN to your $PATH will allow you to call crawl without specifying its location.

Valid commands

USAGE: crawl <command> [-flags] [args]

help Print this message.

list Crawl a list of URLs provided on stdin.
The -format={(text)|xml} flag determines the expected type.

crawl list config.json <url_list.txt >out.txt
crawl list -format=xml config.json <sitemap.xml >out.txt


schema Print a BigQuery-compatible JSON schema to stdout.

crawl schema >schema.json


sitemap Recursively requests a sitemap or sitemap index from a URL provided as argument.

crawl sitemap >out.txt


spider Crawl from the URLs specified in the configuration file.

crawl spider config.json >out.txt

Configuring your crawl

The repository includes an example config.json file. This lists the available options with reasonable default values.

"From": [
    "Include": [
    "Exclude": [],

    "MaxDepth": 3,

    "WaitTime": "100ms",
    "Connections": 20,

    "UserAgent": "Crawler/1.0",
    "RobotsUserAgent": "Crawler",
    "RespectNofollow": true,

    "Header": [
{"K": "X-ample", "V":"alue"}

Here’s the essential information for these fields:

  • From. An array of fully-qualified URLs from which you want to start crawling. If you are crawling from the home page of a site, this list will have one item in it. Unlike other crawlers you may have used, this choice does not affect the scope of the crawl.
  • Include. An array of regular expressions that a URL must match in order to be crawled. If there is no valid Include expression, all discovered URLs are within scope (subject to Exclude and MaxDepth). Note that meta-characters must be double-escaped. Only meaningful in spider mode.
  • Exclude. An array of regular expressions that filter the URLs to be crawled. Meta-characters must be double-escaped. Only meaningful in spider mode.
  • MaxDepth. Only URLs fewer than MaxDepth links away from the From list will be crawled.
  • WaitTime. Pause time between spawning requests. Approximates crawl rate. For instance, to crawl about 5 URLs per second, set this to “200ms”. It uses Go’s time parsing rules.
  • Connections. The maximum number of concurrent connections. If the configured value is < 1, it will be set to 1 upon starting the crawl.
  • UserAgent. The user-agent to send with HTTP requests.
  • RobotsUserAgent. The user-agent to test robots.txt rules against.
  • RespectNofollow. If this is true, links with a rel=”nofollow” attribute will not be included in the crawl.
  • Header. An array of objects with properties “K” and “V”, signifying key/value pairs to be added to all requests.

The MaxDepth, Include, and Exclude options only apply to spider mode.
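To illustrate the WaitTime-to-crawl-rate relationship mentioned above (`approxRate` is a hypothetical helper for this post, not part of the tool), Go’s own duration parsing shows how the config value maps to a rough request rate:

```go
package main

import (
	"fmt"
	"time"
)

// approxRate converts a WaitTime string (Go duration syntax, e.g. "200ms")
// into an approximate number of requests per second: one second divided
// by the pause between spawned requests.
func approxRate(waitTime string) (float64, error) {
	d, err := time.ParseDuration(waitTime)
	if err != nil {
		return 0, err
	}
	return float64(time.Second) / float64(d), nil
}

func main() {
	for _, w := range []string{"100ms", "200ms", "1s"} {
		rate, _ := approxRate(w)
		fmt.Printf("WaitTime %s ≈ %.0f URLs/second\n", w, rate)
	}
}
```

So the example config’s "100ms" corresponds to roughly 10 URLs per second, and "200ms" to the ~5 URLs per second mentioned in the WaitTime bullet.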

How the scope of a crawl is determined

Given your specified Include and Exclude lists, defined above, here is how the crawler decides whether a URL is in scope:

  1. If the URL matches a rule in the Exclude list, it will not be crawled.
  2. If the URL matches a rule in the Include list, it will be crawled.
  3. If the URL matches neither the Exclude nor Include list, then if the Include list is empty, it will be crawled, but if the Include list is not empty, it will not be crawled.

Note that only one of these cases will apply (as in Go’s switch statement, by way of analogy).

Finally, no URLs will be in scope if they are further than MaxDepth links from the From set of URLs.

Use with BigQuery

Run crawl schema >schema.json to get a BigQuery-compatible schema definition file. The file is automatically generated (via go generate) from the structure of the result object generated by the crawler, so it should always be up-to-date.

If you find an incompatibility between the output schema file and the data produced from a crawl, please flag as a bug on GitHub.

In general, you’ll save crawl data to a local file and then upload to BigQuery. That involves two commands:

$ crawl spider config.json >output.txt 

$ bq load --source_format=NEWLINE_DELIMITED_JSON dataset.table output.txt schema.json

Crawl files can be large, and it is convenient to upload them directly to Google Cloud Storage without storing them locally. This can be done by piping the output of crawl to gsutil:

$ crawl spider config.json | gsutil cp - gs://my-bucket/crawl-data.txt

$ bq load --source_format=NEWLINE_DELIMITED_JSON dataset.table gs://my-bucket/crawl-data.txt schema.json

Analyzing your data

Once you’ve got your data into BigQuery, you can take any approach to analysis you want. You can see how to do interactive analysis in the example notebook.

In particular, take a look at how the nested and repeated data fields are used. With them, it’s possible to generate reports on internal linking, canonicalization, and hreflang reciprocation.

Bugs, errors, contributions

All reports, requests, and contributions are welcome. Please handle them through the GitHub repository. Thank you!

This is not a Distilled product. We are unable to provide support. The software is open-source and governed by an MIT-style license. You can use it for commercial purposes without attribution.

A summary of Google Data Studio: Updates from April 2019

Posted by on May 14, 2019 in SEO Articles | Comments Off on A summary of Google Data Studio: Updates from April 2019

April was a big month for Google Data Studio (GDS), with Google introducing some significant product updates to this already robust reporting tool.

For those not familiar with GDS, it is a free dashboard-style reporting tool that Google rolled out in June 2016. With Data Studio, users can connect to various data sources to visualize, and share data from a variety of web-based platforms.

GDS supports native integrations with most Google products including Analytics, Google Ads, Search Ads 360 (formerly Doubleclick Search), Google Sheets, YouTube Analytics, and Google BigQuery.

GDS supports connectors that users can purchase to import data from over one hundred third-party sources such as Bing Ads, Amazon Ads, and many others.  

Sample Google Data studio dashboard

Source: Google

1. Google introduces BigQuery BI Engine for integration with GDS

BigQuery is Google’s massive enterprise data warehouse. It enables extremely fast SQL queries by using the same technology that powers Google Search. Per Google,

“Every day, customers upload petabytes of new data into BigQuery, our exabyte-scale, serverless data warehouse, and the volume of data analyzed has grown by over 300 percent in just the last year.”

BigQuery BI Engine stores, analyzes, and finds insights on your data

Source: Google

2. Enhanced data drill-down capabilities

You can now reveal additional levels of detail in a single chart using GDS’s enhanced data drill down (or drill up) capabilities.

You’ll need to enable this feature in each specific GDS chart and, once enabled, you can drill down from a higher level of detail to a lower one (for example, country to a city). You can also drill up from a lower level of detail to a higher one (for example, city to the country). You must be in “View” mode to drill up or drill down (as opposed to the “Edit” mode).

Here’s an example of drilling-up in a chart that uses Google’s sample data in GDS.

GDS chart showing clicks by month

Source: Google

To drill-up by year, right click on the chart in “View” mode and select “Drill up” as shown below.

GDS chart showing the option to “Drill up” the monthly data to yearly data

Visit the Data Studio Help website for detailed instructions on how to leverage this feature.

3. Improved formatting of tables

GDS now allows for more user-friendly and intuitive table formatting. This includes the ability to distribute columns evenly with just one click (by right-clicking the table), resizing only one column by dragging the column’s divider, and changing the justification of table contents to left, right, or center via the “Style” properties panel in “Edit” mode.

Example of editing, table properties tab in GDS

Source: Google

Detailed instructions on how to access this feature are located here.

4. The ability to hide pages in “View” mode

GDS users can now hide pages in “View” mode by right clicking on the specific page (accessed via the top submenu), clicking on the three vertical dots to the right of the page name, and selecting “Hide page in view mode”. This feature comes in handy when you’ve got pages you don’t want your client (or anyone) to see when presenting the GDS report.

The new “Hide page” feature in GDS

Source: Google

5. Page canvas size enhancements

Users can now customize each page’s size with a new feature that was rolled out on March 21st (we’re sneaking this into the April update because it’s a really neat feature).

Canvas size settings can be accessed from the page menu at the top of the GDS interface. Select Page>Current Page Settings, and then select “Style” from the settings area at the right of the screen. You can then choose your page size from a list of pre-configured sizes or set a custom size of your own.

GDS Page Settings Wizard

Source: Google

6. New Data Studio help community

As GDS adds more features and becomes more complex, it seems only fitting that Google would launch a community help forum for this tool. So, while this isn’t exactly a new feature to GDS itself, it is a new resource for GDS users that will hopefully make navigating GDS easier.

Users can access the GDS Help Community via Google’s support website, or by selecting “Help Options” from the top menu bar in GDS (indicated by a question mark icon) and then clicking the “Visit Help Forum” link.

The Help menu within GDS

Source: Google


We hope that summarizing the latest GDS enhancements has made it a little easier to digest the many new changes that Google rolled out in April (and March). Remember, you can always get a list of updates, both new and old by visiting Google’s Support website here.

Jacqueline Dooley is the Director of Digital Strategy at CommonMind.

The post A summary of Google Data Studio: Updates from April 2019 appeared first on Search Engine Watch.

Reading Between the Lines – Three Deeper Takeaways from John Mueller at BrightonSEO

Posted by on May 14, 2019 in SEO Articles | Comments Off on Reading Between the Lines – Three Deeper Takeaways from John Mueller at BrightonSEO

Reading Between the Lines – Three Deeper Takeaways from John Mueller at BrightonSEO

Last Friday I had the pleasure of watching John Mueller of Google being interviewed on the BrightonSEO main stage by (Distilled alumna!) Hannah Smith. I found it hugely interesting how different it was from the previous similarly formatted sessions with John I’ve seen – by Aleyda at BrightonSEO previously, and more recently by my colleague Will Critchlow at SearchLove. In this post, I want to get into some of the interesting implications in what John did and, crucially, did not say.

I’m not going to attempt here to cover everything John said exhaustively – if that’s what you’re looking for, I recommend this post by Deepcrawl’s Sam Marsden, or this transcript via Glen Allsopp (from which I’ve extracted below). This will also not be a tactical post – I was listening to this Q&A from the perspective of wanting to learn more about Google, not necessarily what to change in my SEO campaigns on Monday morning.

Looking too closely?

I’m aware of the dangers of reading too much into the minutia of what John Mueller, Garry Ilyes, and crew come out with – especially when he’s talking live and unscripted on stage. Ultimately, as John said himself, it’s his job to establish a flow of information between webmasters and search engineers at Google. There are famously few people, or arguably no people at all, who know the ins and outs of the search algorithm itself, and it is not John’s job to get into it in this depth.

That said, he has been trained, and briefed, and socialised, to say certain things, to not say certain things, to focus on certain areas, and so on. This is where our takeaways can get a little more interesting than the typical, clichéd “Google says X” or “we think Google is lying about Y”. I’d recommend this presentation and deck from Will if you want to read more about that approach, and some past examples.

So, into the meat of it.

1. “We definitely use links to recognize new content”

Hannah: Like I said, this is top tier sites…  Links are still a ranking factor though, right? You still use links as a ranking factor?

John: We still use links. I mean it’s not the only ranking factor, so like just focusing on links, I don’t think that makes sense at all… But we definitely use links to recognize new content.

Hannah: So if you then got effectively a hole, a very authoritative hole in your link graph… How is that going to affect how links are used as a ranking factor or will it?

John: I dunno, we’ll see. I mean it’s one of those things also where I see a lot of times the sites that big news sites write about are sites that already have links anyway. So it’s rare that we wouldn’t be able to find any of that new content. So I don’t think everything will fall apart. If that happens or when that happens, but it does make it a little bit harder for us. So it’s kind of tricky, but we also have lots of other signals that we look at. So trying to figure out how relevant a page is, is not just based on the links too.

The context here is that Hannah was interested in how much of a challenge it is for Google when large numbers of major editorial sites start adding the “nofollow” attribute to all their external links – which has been a trend of late in the UK, and I suspect elsewhere. If authoritative links are still an important trust factor, does this not weaken that data?

The interesting thing for me here was very much in what John did not say. Hannah asks him fairly directly whether links are a ranking factor, and he evades three times, by discussing the use of links for crawling & discovering content, rather than for establishing a link graph and therefore a trust signal:

“We still use links”
“We definitely use links to recognize new content”
“It’s rare we wouldn’t be able to find any of that new content”

There’s also a fourth example, earlier in the discussion – before the excerpt above –  where he does the same:

“…being able to find useful content on the web, links kind of play a role in that.”

This is particularly odd as in general, Google is pretty comfortable still discussing links as a ranking factor. Evidently, though, something about this context caused this slightly evasive response. The “it’s not the only ranking factor” response feels like a bit of an evasion too, given that Google essentially refuses to discuss other ranking factors that might establish trust/authority, as opposed to just relevance and baseline quality – see my points below on user signals!

Personally, I also thought this comment was very interesting and somewhat vindicating of my critique of a lot of ranking factor studies:

“…a lot of the times the sites that big news sites write about are sites that already have links anyway”

Yeah, of course – links are correlated with just about any other metric you can imagine, whether it be branded search volume, social shares, click-through rate, whatever.

2. Limited spots on page 1 for transactional sites

Hannah: But thinking about like a more transactional query, for example. Let’s just say that you want to buy some contact lenses, how do you know if the results you’ve ranked first is the right one? If you’ve done a good job of ranking those results?

John: A lot of times we don’t know, because for a lot of these queries there is no objective right or wrong. They’re essentially multiple answers that we could say this could make sense to show as the first result. And I think in particular for cases like that, it’s useful for us to have those 10 blue links or even 10 results in the search page, where it’s really something like we don’t completely know what you’re looking for. Are you looking for information on these contact lenses? Do you want to buy them? Do you want to compare them? Do you want to buy a specific brand maybe from this-

This is one of those things where I think I could have figured this out from the information I already had, but it clicked into place for me listening to this explanation from John. If John is saying there’s a need to show multiple intents on the first page for even a fairly commercial query, there is an implication that only so many transactional pages can appear.

Given that, in many verticals, there are far more than 10 viable transactional sites, this means that if you drop from being the 3rd best to the 4th best among those, you could drop from, for example, position 5 to position 11. This is particularly important to keep in mind when we’re analysing search results statistically – whether it be in ranking factor studies or forecasting the results of our SEO campaigns, the relationship between the levers we pull and the outputs we see can be highly non-linear. A small change might move you 6 ranking positions, past sites which have a different intent and totally different metrics when it comes to links, on-page optimisation, or whatever else.

3. User signals as a ranking factor

Hannah: Surely at that point, John, you would start using signals from users, right? You would start looking at which results are clicked through most frequently, would you start looking at stuff like that at that point?

John: I don’t think we would use that for direct ranking like that. We use signals like that to analyze the algorithms in general, because across a million different search queries we can figure out like which one tends to be more correct or not, depending on where people click. But for one specific query for like a handful of pages, it can go in so many different directions. It’s really-

So, the suggestion here is that user signals – presumably CTR (click-through rates), dwell time, etc. – are used to appraise the algorithm, but not as part of the algorithm. This has been the line from Google for a while, but I found this response far more explicit and clear than John M’s skirting round the subject in the past.

It’s difficult to square this with some past experiments from the likes of Rand Fishkin manipulating rankings with hundreds of people in a conference hall clicking results for specific queries, or real world results I’ve discussed here. In the latter case, we could maybe say that this is similar to Panda – Google has machine learned what on-site attributes go with users finding a site trustworthy, rather than measuring trust & quality directly. That doesn’t explain Rand’s results, though.

Here are a few explanations I think are possible:

Google just does not want to admit to this, because it’d look spammable (whether or not it actually is)
In fact, they use something like “site recent popularity” as part of the algorithm, so, on a technicality, don’t need to call it CTR or user signals
The algorithm is constantly appraising itself, and adjusts in response to a lot of clicks on a result that isn’t p1 – but the ranking factor that gets adjusted is some arbitrary attribute of that site, not the user signal itself

Just to explain what I mean by the third one a little further – imagine if there are three sites ranking for a query, which are sites A, B, & C. At the start, they rank in that order – A, B, C. It just so happens, by coincidence, that site C has the highest word count.

Lots of people suddenly search the query and click on result C. The algorithm is appraising itself based on user signals, for example, cases where people prefer the 3rd place result, so needs to adjust to make this site rank higher. Like any unsupervised machine learning, it finds a way, any way, to fit the desired outcome to the inputs for this query, which in this case is weighting word count more highly as a ranking factor. As such, result C ranks first, and we all claim CTR is the ranking factor. Google can correctly say CTR is not a ranking factor, but in practice, it might as well be.

For me, the third option is the most contrived, but also fits in most easily with my real world experience, but I think either of the other explanations, or even all 3, could be true.


I hope you’ve enjoyed my rampant speculation. It’s only fair that you get to join in too: tweet me at @THCapper, or get involved in the comments below.

How Often Does Google Update Its Algorithm?

Posted by on May 14, 2019 in SEO Articles | Comments Off on How Often Does Google Update Its Algorithm?

How Often Does Google Update Its Algorithm?

Posted by Dr-Pete

In 2018, Google reported an incredible 3,234 improvements to search. That’s more than 8 times the number of updates they reported in 2009 — less than a decade ago — and an average of almost 9 per day. How have algorithm updates evolved over the past decade, and how can we possibly keep tabs on all of them? Should we even try?

To kick this off, here’s a list of every confirmed count we have (sources at end of post):

2018 – 3,234 “improvements”
2017 – 2,453 “changes”
2016 – 1,653 “improvements”
2013 – 890 “improvements”
2012 – 665 “launches”
2011 – 538 “launches”
2010 – 516 “changes”
2009 – 350–400 “changes”

Unfortunately, we don’t have confirmed data for 2014-2015 (if you know differently, please let me know in the comments).

A brief history of update counts

Our first peek into this data came in spring of 2010, when Google’s Matt Cutts revealed that “on average, [Google] tends to roll out 350–400 things per year.” It wasn’t an exact number, but given that SEOs at the time (and to this day) were tracking at most dozens of algorithm changes, the idea of roughly one change per day was eye-opening.

In fall of 2011, Eric Schmidt was called to testify before Congress, and revealed our first precise update count and an even more shocking scope of testing and changes:

“To give you a sense of the scale of the changes that Google considers, in 2010 we conducted 13,311 precision evaluations to see whether proposed algorithm changes improved the quality of its search results, 8,157 side-by-side experiments where it presented two sets of search results to a panel of human testers and had the evaluators rank which set of results was better, and 2,800 click evaluations to see how a small sample of real-life Google users responded to the change. Ultimately, the process resulted in 516 changes that were determined to be useful to users based on the data and, therefore, were made to Google’s algorithm.”

Later, Google would reveal similar data in an online feature called “How Search Works.” Unfortunately, some of the earlier years are only available via the Internet Archive, but here’s a screenshot from 2012:

Note that Google uses “launches” and “improvements” somewhat interchangeably. This diagram provided a fascinating peek into Google’s process, and also revealed a startling jump from 13,311 precision evaluations (changes that were shown to human evaluators) to 118,812 in just two years.

Is the Google algorithm heating up?

Since MozCast has kept the same keyword set since almost the beginning of data collection, we’re able to make some long-term comparisons. The graph below represents five years of temperatures. Note that the system was originally tuned (in early 2012) to an average temperature of 70°F. The redder the bar, the hotter the temperature …

Click to open a high-resolution version in a new tab

You’ll notice that the temperature ranges aren’t fixed — instead, I’ve split the label into eight roughly equal buckets (i.e. they represent the same number of days). This gives us a little more sensitivity in the more common ranges.

The trend is pretty clear. The latter half of this 5-year timeframe has clearly been hotter than the first half. While a warming trend is evident, though, it’s not a steady increase over time like Google’s update counts might suggest. Instead, we see a stark shift in the fall of 2016 and a very hot summer of 2017. More recently, we’ve actually seen signs of cooling. Below are the means and medians for each year (note that 2014 and 2019 are partial years):

2019 – 83.7° / 82.0°
2018 – 89.9° / 88.0°
2017 – 94.0° / 93.7°
2016 – 75.1° / 73.7°
2015 – 62.9° / 60.3°
2014 – 65.8° / 65.9°

Note that search engine rankings are naturally noisy, and our error measurements tend to be large (making day-to-day changes hard to interpret). The difference from 2015 to 2017, however, is clearly significant.

Are there really 9 updates per day?

No, there are only 8.86 – feel better? Ok, that’s probably not what you meant. Even back in 2009, Matt Cutts said something pretty interesting that seems to have been lost in the mists of time…

“We might batch [algorithm changes] up and go to a meeting once a week where we talk about 8 or 10 or 12 or 6 different things that we would want to launch, but then after those get approved … those will roll out as we can get them into production.”

In 2016, I did a study of algorithm flux that demonstrated a weekly pattern evident during clearer episodes of ranking changes. From a software engineering standpoint, this just makes sense — updates have to be approved and tend to be rolled out in batches. So, while measuring a daily average may help illustrate the rate of change, it probably has very little basis in the reality of how Google handles algorithm updates.

Do all of these algo updates matter?

Some changes are small. Many improvements are likely not even things we in the SEO industry would consider “algorithm updates” — they could be new features, for example, or UI changes.

As SERP verticals and features evolve, and new elements are added, there are also more moving parts subject to being fixed and improved. Local SEO, for example, has clearly seen an accelerated rate of change over the past 2-3 years. So, we’d naturally expect the overall rate of change to increase.

A lot of this is also in the eye of the beholder. Let’s say Google makes an update to how they handle misspelled words in Korean. For most of us in the United States, that change isn’t going to be actionable. If you’re a Korean brand trying to rank for a commonly misspelled, high-volume term, this change could be huge. Some changes also are vertical-specific, representing radical change for one industry and little or no impact outside that niche.

On the other hand, you’ll hear comments in the industry along the lines of “There are 3,000 changes per year; stop worrying about it!” To me that’s like saying “The weather changes every day; stop worrying about it!” Yes, not every weather report is interesting, but I still want to know when it’s going to snow or if there’s a tornado coming my way. Recognizing that most updates won’t affect you is fine, but it’s a fallacy to stretch that into saying that no updates matter or that SEOs shouldn’t care about algorithm changes.

Ultimately, I believe it helps to know when major changes happen, if only to understand whether rankings shifted due something we did or something Google did. It’s also clear that the rate of change has accelerated, no matter how you measure it, and there’s no evidence to suggest that Google is slowing down.

Appendix A: Update count sources

2009 – Google’s Matt Cutts, video (Search Engine Land)
2010 – Google’s Eric Schmidt, testifying before Congress (Search Engine Land)
2012 – Google’s “How Search Works” page (Internet Archive)
2013 – Google’s Amit Singhal, Google+ (Search Engine Land)
2016 – Google’s “How Search Works” page (Internet Archive)
2017 – Unnamed Google employees (CNBC)
2018 – Google’s “How Search Works” page (

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

The New Moz Local Is on Its Way!

Posted by on May 14, 2019 in SEO Articles | Comments Off on The New Moz Local Is on Its Way!

Posted by MiriamEllis

Exciting secrets can be so hard to keep. Finally, all of us at Moz have the green light to share with all of you a first glimpse of something we’ve been working on for months behind the scenes. Big inhale, big exhale…

Announcing: the new and improved Moz Local, to be rolled out beginning June 12!

Why is Moz updating the Moz Local platform?

Local search has evolved from caterpillar to butterfly in the seven years since we launched Moz Local. I think we’ve spent the time well, intensively studying both Google’s trajectory and the feedback of enterprise, marketing agency, and SMB customers.

Your generosity in telling us what you need as marketers has inspired us to action. Over the coming months, you’ll be seeing what Moz has learned reflected in a series of rollouts. Stage by stage, you’ll see that we’re planning to give our software the wings it needs to help you fully navigate the dynamic local search landscape and, in turn, grow your business.

We hope you’ll keep gathering together with us to watch Moz Local take full flight — changes will only become more robust as we move forward.

What can I expect from this upgrade?

Beginning June 12th, Moz Local customers will experience a fresh look and feel in the Moz Local interface, plus these added capabilities:

- New distribution partners to ensure your data is shared on the platforms that matter most in the evolving local search ecosystem
- Listing status and real-time updates to know the precise status of your location data
- Automated detection and permanent duplicate closure, taking the manual work out of the process and saving you significant time
- Integrations with Google and Facebook to gain deeper insights, reporting, and management for your location’s profiles
- An even better data clean-up process to ensure valid data is formatted properly for distribution
- A new activity feed to alert you to any changes to your location’s listings
- A suggestion engine to provide recommendations to increase accuracy, completeness, and consistency of your location data

Additional features available include:

- Managing reviews of your locations to keep your finger on the pulse of what customers are saying
- Social posting to engage with consumers and alert them to news, offers, and other updates
- Store locator and landing pages to share location data easily with both customers and search engines (available for Moz Local customers with 100 or more locations)

Remember, this is just the beginning. There’s more to come in 2019, and you can expect ongoing communications from us as further new feature sets emerge!

When is it happening?

We’ll be rolling out all the new changes beginning on June 12th. As with any large change, this update will take a few days to complete, so some people will see the changes immediately, while for others it may take up to a week. By June 21st, everyone should be able to explore the new Moz Local experience!

Don’t worry — we’ll have several more communications between now and then to help you prepare. Keep an eye out for our webinar and training materials to help ensure a smooth transition to the new Moz Local.

Are any metrics/scores changing?

Some of our reporting metrics will look different in the new Moz Local. We’ll be sharing more information on these metrics and how to use them soon, but for now, here’s a quick overview of changes you can expect:

- Profile Completeness: Listing Score will be replaced by the improved Profile Completeness metric. This new feature will give you a better measurement of how complete your data is, what’s missing from it, and clear prompts to fill in any lacking information.
- Improved listing status reporting: Partner Accuracy Score will be replaced by improved reporting on listing status with all of our partners, including continuous information about the data they’ve received from us. You’ll be able to access an overview of your distribution network, so that you can see which sites your business is listed on. Plus, you’ll be able to go straight to the live listing with a single click.
- Visibility Index: Though they have similar names, Visibility Score is being replaced by something slightly different with the new and improved Visibility Index, which notates how the data you’ve provided us about a location matches or mismatches your information on your live listings.
- New ways to measure and act on listing reach: Reach Score will be leaving us in favor of even more relevant measurement via the Visibility Index and Profile Completeness metrics. The new Moz Local will include more actionable information to ensure your listings are accurate and complete.
Other FAQs

You’ll likely have questions if you’re a current Moz Local customer or are considering becoming one. Please check out our resource center for further details, and feel free to leave us a question down in the comments — we’ll be on point to respond to any wonderings or concerns you might have!

Head to the FAQs

Where is Moz heading with this?

As a veteran local SEO, I’m finding the developments taking place with our software particularly exciting because, like you, I see how local search and local search marketing have matured over the past decade.

I’ve closely watched the best minds in our industry moving toward a holistic vision of how authenticity, customer engagement, data, analysis, and other factors underpin local business success. And we’ve all witnessed Google’s increasingly sophisticated presentation of local business information evolve and grow. It’s been quite a ride!

At every level of local commerce, owners and marketers deserve tools that bring order out of what can seem like chaos. We believe you deserve software that yields strategy. As our CEO, Sarah Bird, recently said of Moz,

“We are big believers in the power of local SEO.”

So the secret is finally out, and you can see where Moz is heading with the local side of our product lineup. It’s our serious plan to devote everything we’ve got into putting the power of local SEO into your hands.


I/O announcements have some applauding and others shaking their fists

Posted by on May 13, 2019 in SEO Articles | Comments Off on I/O announcements have some applauding and others shaking their fists

Now that this year’s I/O conference is in the books, digital marketers have had a chance to digest Google’s big announcements. Chief among them was Googlebot getting pushed to the latest version of Chromium, Assistant delivering results up to 10 times faster and, perhaps the most contentious, Search supporting FAQ and How-to structured data. As you can imagine, reactions weren’t limited to applause from the live audience.

Googlebot’s long-awaited update has engineers and developers nodding favorably.

Super cool. No more testing in Chrome webmaster tools to verify if your site is crawlable.

— Samar Panda (@samarpanda) May 8, 2019

Biggest unsung news 📰 of #io19…Googlebot now indexes the web using the latest Chromium rather than super old Chrome 42. Use modern features with confidence, without SEO issues. Huge! 🙌

— Eric Bidelman (@ebidel) May 7, 2019

Still, some are keen to point out that this should have come sooner, especially because the update benefits businesses, consumers and Google itself.

it only took half a decade ++ !!

— jameschurchman (@jameschurchman) May 7, 2019

A number of Google Assistant-related announcements were made, but the speed demonstration is what might get users to take advantage of it more often and, by extension, businesses to prioritize integrating with it. Naturally, people drew comparisons with the competition.

Here’s an incredible demo from Google I/O. What they’re doing with the new Google Assistant is light-years ahead of Siri, which is a shame given the couple year head start Apple had.

— Mike (@ekimgary) May 8, 2019

Siri: We now have better Maps integration
Bixby: We can now recognize multiple voices

Google Assistant: We’ll read your email, find the last time you booked a certain car, then book the same one in the same color for your next trip, which we also put in your calendar……..

— Marques Brownlee (@MKBHD) May 7, 2019

The announcement of support for How-to markup in search results received strong reactions. Some were excited to give it a test drive…

Actually pretty excited about this. Will definitely test out ASAP.

— Alexander Juul (@AlexanderJuul) May 8, 2019

…while others were anxious about what it could mean for the industry.

Hurray for more ways to get less traffic to your website and generate free content for @Google to run ads against !

— PaulsSEOstuff (@PaulsSEOstuff) May 8, 2019

Google is becoming a parasite. Not the mutually beneficial kind either, just a leech. You produce nothing, steal content, to make $$ & now even steal the click.

This can’t and won’t go on forever.

— Kristine Schachinger (@schachin) May 9, 2019

“Google started adding ‘features’ to the SERPS. Features whose content is not created by Google, but which operates off the scraped content of the sites in their index,” Schachinger, the digital strategist and SEO consultant quoted above, elaborated in a follow-up with Search Engine Land.

“These features ‘steal the click’ meant for the site because they are meant to keep people on Google’s page, so they will click on Google Ads. Despite a recent study showing users still, by majority, prefer the ten blue links [to what] the how-to feature shows, these features are just becoming more (and not less) prevalent. The ten blue links now appear, on average, 1000px down the page, where previously they appeared between 300-400px.”

“In ‘stealing the click,’ Google is only benefiting its bottom line. And for those whose content they are using to do this, it fundamentally alters the previously beneficial relationship between Google and site owners,” she points out. “What happens to their business when site owners start putting their money and efforts elsewhere? And this is not just supposition, I can tell you I know of some enterprise level C Suites that are testing just this, right now, because of the perception that Google is becoming less and less beneficial.”

Adding to the assortment of reactions, some see structured data (such as How-to markup) as an opportunity to gain more visibility by leapfrogging the top organic search results. Others, like Greg Finn, digital marketer and partner at Cypress North, acknowledge that the change does benefit users.

“On one hand, users should benefit in the immediate future by having Google surface every bit of helpful content on a site and showing it directly in the search results. Better yet, webmasters that participate may see a boost as they put themselves into the position of offering better content for Google.”

“The other hand is the scarier one,” Finn admits. “One way to look at it is that they are cutting out the middleman, with the middleman being the website itself. Many of the examples shown simply won’t drive traffic. Take a look at the FAQs and the ‘How To Tie a Tie’ example specifically. There is a monumental downside to Google Search changes that bypass your site & your work, so be careful. Make sure you know who is benefiting on your markup. When websites lose visitors & income, the overall content and output inevitably become worse. That’s my fear here.”
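For readers weighing Finn’s advice to “know who is benefiting on your markup,” it may help to see what this structured data actually looks like. Below is a minimal sketch of schema.org FAQPage JSON-LD, the kind of markup Google announced support for at I/O; the question and answer text are hypothetical placeholders, not from any real site.

```python
import json

# A minimal schema.org FAQPage JSON-LD sketch. The question/answer text
# here is hypothetical; real markup should mirror content visible on the page.
faq_markup = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "How long does shipping take?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "Orders typically arrive within 3-5 business days.",
            },
        }
    ],
}

# Serialize for embedding in a <script type="application/ld+json"> tag.
print(json.dumps(faq_markup, indent=2))
```

Markup like this is exactly what lets Google render your Q&A directly in the results page, which is both the visibility upside and the “stolen click” downside the quotes above describe.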

Why we should care. Google has been introducing numerous products and features that insert themselves between businesses and users under the guise of getting users the info they want faster. The problem is that businesses aren’t necessarily seeing the benefits, but Google still stands to gain.

Clicks are becoming more scarce, and that’s an indicator that potential customers are getting less contact with our brands. By investing resources and embracing these new features and markups, are we facilitating search engines at our own expense? If that’s the case, at some point brands are bound to get fed up and seek alternative routes to their audiences, or the search engines will have to offer us more for our efforts and ad budgets.

The post I/O announcements have some applauding and others shaking their fists appeared first on Search Engine Land.

The Future of Display Advertising

Posted by on May 13, 2019 in SEO Articles | Comments Off on The Future of Display Advertising

The Future of Display Advertising

Display is a key tool in the digital marketing playbook. But the landscape is rapidly changing, as emerging adtech formats – including in-banner video, dynamic creative and mobile optimization – help marketers achieve greater efficiencies and improved display results.

Are you ready to leverage these new opportunities?

Join our display advertising experts as they discuss new display best practices that can lift both brand awareness and bottom-line conversions. You’ll hear how you can effectively adopt emerging technologies to create more personalized, relevant display ad campaigns.

Register today for “The Future of Display Advertising: New marketing strategies to boost results,” produced by Digital Marketing Depot and sponsored by Bannerflow.

The post The Future of Display Advertising appeared first on Search Engine Land.