Best WP SEO Plugins

Posted by on Jun 21, 2019 in Greg's SEO Articles | Comments Off on Best WP SEO Plugins

There are plenty of great plugins out there, but the ones on this list will specifically enhance your SEO efforts.

1. Yoast SEO

This plugin is one of the most popular WordPress SEO plugins, used by over five million websites. One of its best features is XML sitemap management, which allows you to easily create your own sitemaps. You don’t have to code one and then fix it when something isn’t working.

For content lovers, there’s the content optimization snippet preview, which allows you to add your keyword, meta title and meta description and preview them as they will appear in search. You also get tips and indications of whether your content needs more on-site optimization, including reducing keyword stuffing.

Moreover, Yoast SEO helps you identify and avoid duplicate content so you don’t get penalized by Google.

2. SEO Framework

Here’s another great plugin, aimed at small businesses rather than big companies. The interface looks like it’s integrated into WordPress, so it delivers fast SEO solutions and it’s time efficient. Not to mention that interacting with it feels very natural.

It has a built-in AI that automatically optimizes your pages, giving you lots of possibilities to create a better website. It comes preconfigured but also gives you the option to change any settings you want. You can improve your search results and your social presence too.

3. Broken Link Checker

This plugin crawls your whole website and shows you how many broken links you have. You can find the list in a new tab of the WP admin panel – “Tools” -> “Broken Links”. For each broken link, there are several actions you can take: “Edit link”, “Unlink”, “Not broken”, and “Dismiss”.

4. All in One Schema Rich Snippets

This plugin improves how your pages appear in search engine results by adding rich snippets. It works best for schema implementations such as Recipes, Events, People, Products, Articles and so on.

Using it will give more accurate information to search engines about your website, help your results stand out in SERP and give you a competitive advantage.

5. Rank Math

This plugin helps you optimize your content and outrank your competitors. One of the coolest things is that it supports schema-based themes and also AMP pages.

With Rank Math you can check for lots of errors and get a lot of information about your website:

- easy setup using the step-by-step installation and configuration wizard;
- a rank tracking option to follow your keyword positions, plus LSI keyword integration;
- an advanced website analysis section to spot any errors that need to be fixed;
- a modular framework so you can have complete control of your website;
- a smart redirection manager;
- a 404 monitor that identifies 404 pages so you can fix them;
- internal linking management and suggestions;
- Google Search Console integration;
- easy configuration for rich snippets, and many more.

6. All in One SEO Pack

Here’s an easy WordPress plugin for beginners and small businesses that want to improve their websites and increase their rankings, but it does have advanced features and an API for developers. For example:

- XML sitemap support;
- Google AMP support;
- Google Analytics integration;
- webmaster verification options for Google, Bing, and Pinterest;
- automatically generated meta tags;
- a built-in API and compatibility with a lot of other plugins;
- advanced canonical URLs, and many more.

7. SEOPress

This simple, fast, and very powerful SEO plugin has loads of features that you can easily enable or disable as required:

- Discover content ideas through Google’s search suggestions.
- Fine-tune your content with a content analysis tool.
- Track Google events and traffic from the dashboard.
- Easily create and manage 301, 302 and 307 redirects.
- Check the performance of your site with Google PageSpeed.
- Implement Google structured data, such as Product, Article, Event, Local Business, Review, Video, Course, Recipe and so on.

What Your Google Tag Manager Container Should Contain – Whiteboard Friday

Posted by on Jun 19, 2019 in SEO Articles | Comments Off on What Your Google Tag Manager Container Should Contain – Whiteboard Friday

Posted by DiTomaso

Agencies, are you set up for ongoing Google Tag Manager success? GTM isn’t the easiest tool in the world to work with, but if you know how to use it, it can make your life much easier. Make your future self happier and more productive by setting up your GTM containers the right way today. Dana DiTomaso shares more tips and hints in this edition of Whiteboard Friday.

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Hi, Moz fans. My name is Dana DiTomaso. I am President and partner at Kick Point, which is a digital marketing agency based in Edmonton, Alberta. Today I’m going to be talking to you about Google Tag Manager and what your default container in Google Tag Manager should contain. I think if you’re in SEO, there are certainly a lot of things Google Tag Manager can do for you.

But if you’ve kind of said to yourself, “You know, Google Tag Manager is not the easiest thing to work with,” which is fair, it is not, and it used to be a lot worse, but the newer versions are pretty good, then you might have been a little intimidated by going in there and doing stuff. But I really recommend that you include these things by default because later you is going to be really happy that current you put this stuff in. So I’m going to go through what’s in Kick Point’s default Google Tag Manager container, and then hopefully you can take some of this and apply it to your own stuff.

Agencies, if you are watching, you are going to want to create a default container and use it again and again, trust me. 


So we’re going to start with how this stuff is laid out. So what we have are tags and then triggers. The way that this works is the tag is sort of the thing that’s going to happen when a trigger occurs. 

Conversion linker

So among the tags we have in our default container is the conversion linker, which is used to help track conversions in Safari.

If you don’t know a lot about this, I recommend looking up some of the restrictions with Safari tracking and ITP. I think they’re at 2.2 at the time I’m recording this. So I recommend checking that out. But this conversion linker will help you get around that. It’s a default tag in Tag Manager, so you just add the conversion linker. There’s a nice article on Google about what it does and how it all works. 


Then we need to track a number of events. You can certainly track these things as custom dimensions or custom metrics if that floats your boat. I mean that’s up to you. If you are familiar with using custom dimensions and custom metrics, then I assume you probably know how to do this. But if you’re just getting started with Tag Manager, just start with events and then you can roll your way up to being an expert after a while. 

External links

So under events, we always track external links, so anything that points out to a domain that isn’t yours.

The way that we track this is we’re looking at every single link that’s clicked and if it does not contain our client’s domain name, then we record it as an external link, and that’s an event that we record. Now remember, and I’ve seen accidents with this where someone doesn’t put in your client’s domain and then it tracks every single click to a different page on your client’s website as an external link. That’s bad.

When you transfer from HTTP to HTTPS, if you don’t update Google Tag Manager, it will start recording links incorrectly. Also bad. But what this is really useful for are things like when you link out to other websites, as you should when you’re writing articles, telling people to find out more information. Or you can track clicks out to your different social properties and see if people are actually clicking on that Facebook icon that you stuck in the header of your website. 
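In GTM itself this is just a trigger condition (“Click URL does not contain your client’s domain”), but the logic can be sketched in plain JavaScript. The domain and event field names below are illustrative, not from the transcript:

```javascript
// Sketch of the external-link check described above: any clicked link whose
// URL does not contain the client's domain is recorded as an external link.
// "example-client.com" and the event field names are illustrative only.
function isExternalLink(clickedUrl, clientDomain) {
  return !clickedUrl.includes(clientDomain);
}

// Simulated dataLayer, standing in for the one GTM exposes on the page.
const dataLayer = [];

function onLinkClick(clickedUrl, clientDomain) {
  if (isExternalLink(clickedUrl, clientDomain)) {
    dataLayer.push({
      event: "externalLinkClick",
      eventCategory: "External Link",
      eventAction: "click",
      eventLabel: clickedUrl, // the actual link that was clicked
    });
  }
}

onLinkClick("https://www.facebook.com/some-page", "example-client.com"); // recorded
onLinkClick("https://example-client.com/contact", "example-client.com"); // ignored
```

Note how forgetting the correct domain in `isExternalLink` would make every internal click count as external, which is exactly the accident described above.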

PDF downloads

The next thing to track are PDF downloads.

Now there’s a limitation to this, of course, in that if people google something and your PDF comes out and then they click on it directly from Google, of course that’s not going to show up in your Analytics. That can show up in Search Console, but you’re not going to get it in Analytics. So just keep that in mind. This is if someone clicks to your PDF from a specific page on your website. Again, you’re decorating the link to say if this link contains a PDF, then I want to have this.
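In GTM this is typically a “Click URL contains .pdf” trigger condition; a plain-JavaScript sketch of that check (the URLs are made up):

```javascript
// Sketch of the "link points at a PDF" check: strip any query string,
// then test the file extension case-insensitively.
function isPdfLink(url) {
  return url.split("?")[0].toLowerCase().endsWith(".pdf");
}

isPdfLink("https://example.com/guide.PDF");     // true
isPdfLink("https://example.com/guide.pdf?v=2"); // true
isPdfLink("https://example.com/guide.html");    // false
```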

Scroll tracking

Then we also track scroll tracking. Now scroll tracking is when people scroll down the site, you can track and fire an event at say 25%, 50%, 75%, and 100% of the way down the page. Now the thing is with this is that your mileage is going to vary. You will probably pick different percentages. By default, in all of our containers we put 25%, 50%, 75%, and 100%. Based on the client, we might change this.

An advanced, sort of level up tactic would be to pick specific elements and then when they enter the viewport, then you can fire an event. So let’s say, for example, you have a really important call to action and because different devices are different sizes, it’s going to be a different percentage of the way down the page when it shows up, but you want to see if people got to that main CTA. Then you would want to add an event that would show whether or not that CTA was shown in the viewport.

If you google Google Tag Manager and tracking things in the viewport, there are some great articles out there on how to do it. It’s not that difficult to set up. 
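GTM’s built-in Scroll Depth trigger handles the percentage thresholds for you; the arithmetic behind it looks roughly like this (the function and the page dimensions are illustrative):

```javascript
// Which of the configured scroll-depth thresholds has the reader crossed?
// The bottom of the viewport sits at scrollTop + viewportHeight; divide by
// the full page height to get the percentage scrolled.
function crossedThresholds(scrollTop, viewportHeight, pageHeight, thresholds) {
  const percentScrolled = ((scrollTop + viewportHeight) / pageHeight) * 100;
  return thresholds.filter((t) => percentScrolled >= t);
}

// A 4000px-tall page in an 800px viewport, scrolled 1200px down:
crossedThresholds(1200, 800, 4000, [25, 50, 75, 100]); // [25, 50]
```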

Form submits

Then also form submits. Of course, you’re going to want to customize this. But by default put form submits in your container, because I guarantee that when someone is making your container let’s say for a brand-new website, they will forget about tracking form submits unless you put it in your default container and they look at it and say, “Oh, right, I have to edit that.” So always put form submits in there. 

Tel: & mailto: links

Of course you want to track telephone links and mailto: links. Telephone links should always, always be tappable, and that’s something where I see a lot of mistakes. Particularly in local SEO, when we’re dealing with really small business websites, they don’t make the telephone links tappable. It’s probably because people don’t know how. In case you don’t know how, you just write tel, then a colon, and then the telephone number.
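The format just described (the tel scheme, a colon, then the number) can be sketched as a small JavaScript helper that builds the href; the phone number here is invented:

```javascript
// Build a tappable telephone link's href: the scheme "tel", a colon, then
// the number with formatting characters stripped out.
function telHref(displayNumber) {
  return "tel:" + displayNumber.replace(/[^\d+]/g, "");
}

telHref("+1 (780) 555-0123"); // "tel:+17805550123"
// Used in markup as: <a href="tel:+17805550123">+1 (780) 555-0123</a>
```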

That’s it. That’s all you need to do. Just like a link, except rather than going out to an HTTPS://, you’re going out to a telephone number. That is going to make your visitors’ lives so much easier, particularly on mobile devices. You always want to have those be tappable. So then you can track the number of people who tap on telephone links, and people who tap on mailto: links exactly the same way. Now something that I do have to say, though, is that if you are using a call tracking provider, like CallRail for example, which is one that we use, then you’re going to want to shut this off, because then you could end up double counting.

Particularly if you’re tracking every call made out from your website, then CallRail would have an Analytics integration, and then you would be tracking taps and you might also be tracking telephone clicks. So you can track it if you want to see how many people tap versus picking up the phone and calling the old-fashioned way with landlines. You can also do that, but that’s entirely up to you. But just keep that in mind if you are going to track telephone links.

All pages tracking

Then, of course, all pages tracking. Make sure you’re tracking all of the pages on your website through Google Analytics. So those are the tags.

Next up are the triggers. So I have a tag of external links. Then I need a trigger for external links. The trigger says when somebody clicks an external link, then I want this event to happen.

Clicks

So the event is where you structure the category and then the action and the label.

The way that we would structure external links, for example: we would say that the category is external link, the action is click, and then the label is the actual link that was clicked. You can go through each of these and see where this is happening. Then on things like form submit, for example, our label could be the specific form.
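The category / action / label pattern can be sketched as a tiny helper; the field names mirror classic Google Analytics event fields, and the example values follow the transcript’s examples:

```javascript
// Build the event payload GTM would send to Google Analytics:
// category = what happened, action = what the person did,
// label = extra detail about that event.
function gaEvent(category, action, label) {
  return { eventCategory: category, eventAction: action, eventLabel: label };
}

gaEvent("External Link", "click", "https://www.facebook.com/some-page");
gaEvent("Form Submit", "submit", "Contact Form");
```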

Tel: & mailto:

On telephone and mailto:, we might track the phone number.

PDFs

On other things, like PDFs, we might track the page that this happened on.

Page scroll

For scroll tracking, for example, we would want to track the page that someone scrolled down on. What I recommend when you’re setting up the event tracking for page scroll: the category should be page scroll, the action should be the percentage people scrolled down, and then the label should be the URL.

Really think of it in terms of events, where you’ve got the category, which is what happened; the action, which is what the person did; and the label, which tells you more information about this. So actions are typically things like scroll, click, and tap if you’re going to be fancy and track mobile versus desktop. It could be things like form submit, for example, or just submit. Just really basic stuff. So really the two things that are going to tell you the difference are the categories and labels, and the action is just the action that happened.

I’m really pedantic when it comes to setting up events, but I think in the long term, again, future you is going to thank you if you set this stuff up properly from the beginning. So you can really see that the tag goes to this trigger. Tag to trigger, tag to trigger, etc. So really think about making sure that every one of your tags has a corresponding trigger if it makes sense. So now we’re going to leave you with some tips on how to set up your Tag Manager account.

Tips

1. Use a Google Analytics ID variable

So the first tip is: use a Google Analytics ID variable. It’s one of the built-in variables. When you go into Tag Manager and you click on Variables, it’s one of the built-in variables in there. I really recommend using that, because if you hardcode in the GA ID and something happens and you have to change it in the future, or you copy that for someone else, or whatever it might be, you’re going to forget.

I guarantee you you will forget. So you’re going to want to put that variable in there so you change it once and it’s everywhere. You’re saving yourself so much time and suffering. Just use a Google Analytics ID variable. If you have a really old container, maybe the variable wasn’t a thing when you first set it up. So one of the things I would recommend is go check and make sure you’re using a variable. If you’re not, then make a to-do for yourself to rip out all the hardcoded instances of your GA ID and instead replace them with a variable.

It will save you so many headaches.

2. Create a default container

It’s going to save you a ton of time when you’re setting up containers, because I find that the most labor-intensive part of working with a new Tag Manager container is thinking about, “What is all the stuff I want to include?” So you want to make sure that your default container has all your little tips and tricks that you’ve accumulated over the years in there, documented of course, and then decide on a client-by-client basis what you’re going to leave and what you’re going to keep.

3. Use a naming scheme and folders

Also use a naming scheme and folders, again because you may not be working there forever, and somebody in the future is going to want to look at this and think, “Why did they set it up like this? What does this word mean?
Why is this variable called foo?” You know, things that have annoyed me about developers for years and years and years. Developers, I love you, but please stop naming things foo. It makes no sense to anyone other than you. So our naming scheme, and you can totally steal this if you want, is we go product, result, and then what.

So, for example, we would have our tag for a Google Analytics PDF download. So it would say Google Analytics: this is the product that the thing is going to go to. Event: this is the result of this thing existing. Then the what is the PDF download. Then it’s really clear: okay, I need to fix this thing with PDF download, something is wrong, and now I know exactly where to go.

Again, with folders as well: let’s say you’ve implemented something such as content consumption, which is a Google Tag Manager recipe that you can grab on our website, and I’ll make sure to link to it in the transcript. Let’s say you grab that. Then you’re going to want to take all the different tags and triggers that come along with content consumption, toss them into their own folder, and separate them out from all of your basic stuff.

Even if you just have everything to start in a folder called Basics, or Events, or Analytics versus Call Tracking versus any of the other billion different tracking pixels that you have on your website, it’s a good idea to keep it all organized. It takes two minutes now and saves you a lifetime of suffering in the future. Whether it’s you working there or somebody who ends up taking your job five years from now, just make it easier on them.

Especially too, when you think back: Google Analytics has been around for a long time now.
When I go back and look at some of my very, very first analytics setups, I might look at one and think, “Why was I doing that?” But if you have documentation, at least you’re going to know why you did that really weird thing back in 2008. Or when you’re looking at this in 2029 and you’re thinking, “Why did I do this thing in 2019?” you’re going to have documentation for it. So just really keep that in mind.

4. Audit regularly!

There’s a Google Tag Manager sort of auditing tool.

I’ll make sure to link to that in the transcript as well. You can use that to just go through your container and see what’s up. Let’s say you tested out some sort of screen recording, like you installed Hotjar six months ago and you ended up deciding on say another product instead, like FullStory, so then you want to make sure you remove the Hotjar. How many times have you found that you look at a new website and you’re like, “Why is this on here?”

No one at the client can tell you. They’re like, “I don’t know where that code came from.” So this is where auditing can be really handy, because remember, over time, each one of those funny little pixels from some product you tested out and ended up not going with is weighing down your page. Maybe it’s just a couple of microseconds, but that stuff adds up. So you really do want to go in and audit regularly and remove anything you’re not using anymore. Keep your Google Tag Manager container clean.

A lot of this is focused on obviously making future you very happy. Auditing will also make future you very happy. So hopefully, out of this, you can create a Google Tag Manager default container that’s going to work for you. I’m going to make sure as well, when the transcript is out for this, that I’m going to include some of the links that I talked about as well as a link to some more tips on how to add in things like conversion linker and make sure I’m updating it for when this video is published.

Thanks so much.

Video transcription by

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Optimize Your Posts before Publishing

Posted by on Jun 18, 2019 in Greg's SEO Articles | Comments Off on Optimize Your Posts before Publishing

Optimizing for SEO is a process of continuous improvement, especially when it comes to your webpage or blog content. Here’s a list of recommendations on how to do that.


You don’t want to write something with a title that’s boring or too long, so make one that’s instantly catchy but also 50-60 characters long.

It’s easier than you think! If you search “Title Creator” or “Title Generator” you’ll be able to make plenty of good headlines for your post.
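As a quick sanity check on the 50-60 character guideline above (a sketch; the example titles are invented):

```javascript
// Is a post title within the recommended 50-60 character window?
function titleLengthOk(title) {
  return title.length >= 50 && title.length <= 60;
}

titleLengthOk("Best WP SEO Plugins");                                      // false (too short)
titleLengthOk("The 7 Best WordPress SEO Plugins to Boost Your Rankings"); // true (55 chars)
```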


This is easy to do as well! Simply go to Google’s very own Keyword Planner Tool and find a keyword that’s not too short or too long and has plenty of search volume. After that, make sure to place that keyword in the title, the post’s URL, and the first 100 words (or first paragraph) of the article body.
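The placement rules above (keyword in the title, the URL, and the first 100 words) can be sketched as a checklist function; the hyphenated-slug convention is an assumption, and all example values are invented:

```javascript
// Check the three keyword placements recommended above.
function keywordPlacement(keyword, title, slug, body) {
  const k = keyword.toLowerCase();
  const first100Words = body.toLowerCase().split(/\s+/).slice(0, 100).join(" ");
  return {
    inTitle: title.toLowerCase().includes(k),
    inUrl: slug.toLowerCase().includes(k.replace(/\s+/g, "-")), // assumes hyphenated slugs
    inIntro: first100Words.includes(k),
  };
}

keywordPlacement(
  "wordpress seo",
  "WordPress SEO: A Practical Checklist",
  "/wordpress-seo-checklist/",
  "This guide to WordPress SEO covers the basics..."
);
// { inTitle: true, inUrl: true, inIntro: true }
```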


H1 is the tag Google’s search engine looks for when forming its search results, so make sure your title has it! You really only want to use H2 and H3 for distinction within your article.


Metadata explains an image or video to Google’s web crawlers. This info (in the case of an image) is the title, alt text and its caption. Metadata can also help site visitors should an image fail to load. WordPress is able to add all this in the Media Library.


This can’t be stressed enough: make sure your post loads fast for viewers, and the best way to do that is to size your images properly.


Just like your title, your content needs to be engaging as well. One way is to add links in your content.


Google and SEO experts say that shorter paragraphs are easier for Google to crawl. They also make reading easier, because most readers scan an article before reading it. Make complex sentences simpler to read (by using Grammarly, Ginger or the Hemingway App), and also highlight the key points in your article.


Social media platforms are huge traffic sources, so make sure share buttons are visible (as near the top as possible) so users can share your posts. A plugin can do this automatically for you!


Yoast is a perfect example! It helps you form SEO-friendly titles and descriptions (metadata as well), recommends where to place your target keyword, and provides tips as you write.

How to build a structured data-powered FAQ page using Yoast SEO

Posted by on Jun 15, 2019 in SEO Articles | Comments Off on How to build a structured data-powered FAQ page using Yoast SEO

Many, many sites have an FAQ page. This is a page where a lot of frequently asked questions get an appropriate answer. It is often a single page filled to the brim with questions and answers. While it’s easy to add one, it’s good to keep in mind that not all sites need an FAQ. Most of the time, all you need is good content targeted at the users’ needs. Here, I’ll discuss the use of FAQ pages and show you how to make one yourself with Yoast SEO’s new structured data content blocks for the WordPress block editor. You won’t believe how easy it is.

For more information on our Schema structured data implementation, please read our Schema documentation.

What is an FAQ?

FAQ stands for frequently asked questions. It is a single page collecting a series of questions and their answers on a specific subject, product or company. An FAQ is often seen as a tool to reduce the workload of the customer support team. It is also used to show that you are aware of the issues a customer might have and to provide answers to them.

But first: Do you really, really, really need an FAQ?

Usually, if you need to answer a lot of questions from users in an FAQ, that means that your content is not providing these answers and that you should work on that. Or maybe it is your product or service itself that’s not clear enough? One of the main criticisms of FAQs is that they hardly ever answer the questions consumers really have. They are also lazy: instead of figuring out how to truly answer a question with formidable content (using content design, for instance), people would rather throw some random stuff on a page and call it an FAQ.

That’s not to say you should never use an FAQ. Numerous sites successfully apply them — even we use them sparingly. In some cases, they do provide value. Users understand how an FAQ works and are quick to find what they are looking for — if the makers of the page know what they are doing. So don’t make endless lists of loosely related ‘How can I…’ or ‘How to…’ questions, because people will struggle to filter out what they need.

It has to be a page that’s easy to digest and has to have real answers to real questions by users. You can find scores of these if you search for them: ask your support team for instance! Collect and analyze the issues that come up frequently to see if you’re not missing some pain points in your products or if your content is targeting the wrong questions.

So don’t hide answers to pressing questions away on an FAQ page if you want to answer them in-depth: make an article out of it. This is what SEO deals with: providing an answer that matches your content to the search intent.

Questions and answers spoken out loud?

Google is trying to match a question from a searcher to an answer from a source. If you mark up your questions and answers with FAQ structured data, you tell search engines that this little sentence is a question and that this paragraph is its answer. And all these questions and answers are related to the main topic of the page.

Paragraph-based content is all the rage. One of the reasons? The advent of voice search. Google is looking for easy-to-understand, block-based content that it can use to answer searchers’ questions right in the search engine — or by speaking it out loud. Using the Schema property speakable might even speed up this content discovery by determining which part of the content is fit for text-to-speech conversion.

How to build an FAQ page in WordPress via Yoast SEO content blocks

The best way to set up a findable, readable and understandable FAQ page on a WordPress site is by using the structured data content blocks in Yoast SEO. These blocks for the new block editor (formerly known as Gutenberg) make building an FAQ page a piece of cake.

All the generated structured data for the FAQ will be added to the graph Yoast SEO generates for every page. This makes it even easier for search engines to understand your content. Yoast SEO automatically adds the necessary structured data so search engines like Google can do cool stuff with it. But, if nothing else, it might even give you an edge over your competitor. So, let’s get to it!

  1. Open WordPress’ new block editor

    Make a page in WordPress, add a title and an introductory paragraph. Now add the FAQ structured data content block. You can find the Yoast SEO structured data content blocks inside the Add Block modal. Scroll all the way down to find them or type ‘FAQ’ in the search bar, which I’ve highlighted in the screenshot below.

  2. Add questions and answers

    After you’ve added the FAQ block, you can start to add questions and answers to it. Keep in mind that these questions live inside the FAQ block. It’s advisable to keep the questions related to each other so you can keep the page clean and focused. So no throwing in random questions.

  3. Keep filling, check and publish

    After adding the first question and answering it well, keep adding the rest of your questions and answers until you’ve filled your FAQ page. In the screenshot below you see two questions filled in. I’ve highlighted two buttons, the Add Image button and the Add Question button. These speak for themselves.

    Once you are done, you’ll have a well-structured FAQ page with valid structured data. Go to the front-end of your site and check if everything is in order. If not, make the necessary changes.

What does an FAQ rich result look like?

We have an FAQ page for our Yoast Diversity Fund and that page was awarded an FAQ rich result by Google after we added an FAQ structured data content block. So, wondering what an FAQ looks like in Google? Wonder no more:

An example FAQ rich result for a Yoast page

Keep in mind that an FAQ rich result like this might influence the CTR to that page. It might even lead to a decrease in traffic to your site since you are giving away answers instantly. It is a good idea, therefore, to use it only for information that you don’t mind giving away like this. Or you have to find a way to make people click to your site. Do experiment with it, of course, to see the effects. Maybe it works brilliantly for you, who knows?

What does this look like under the hood?

Run your new FAQ page through the Structured Data Testing Tool to see what it looks like for Google. Yoast SEO automatically generates valid structured data for your FAQ page. Here’s a piece of the Yoast Diversity Fund page, showing one particular question and its answer:

The first question and answer from the structured data graph

It’s basically built up like this: the context surrounding the questions is an FAQPage Schema graph. Every question gets a Question type and an acceptedAnswer with an Answer type. That sounds hard, but it’s not. All you have to do is fill in the Question and the Answer and you’re good to go!

This translates to the code below as generated automatically by the Yoast SEO structured data content blocks. Now, Google will immediately see that this piece of content contains a question with an accepted answer. It will also see how this FAQ fits in with the rest of the page and the entities within your site. If you’re lucky, this might eventually lead to a featured snippet or another type of rich result.

<script type='application/ld+json' class='yoast-schema-graph yoast-schema-graph--main'> {
    "@context": "https://schema.org",
    "@graph": [ {
        "@type": "Organization", "@id": "", "name": "Yoast", "url": "", "sameAs": ["", "", "", "", "", "", ""]
    }, {
        "@type": "WebSite", "@id": "", "url": "",
        "publisher": {
            "@id": ""
        },
        "potentialAction": {
            "@type": "SearchAction", "target": "", "query-input": "required name=search_term_string"
        }
    }, {
        "@type": ["WebPage", "FAQPage"], "@id": "", "url": "", "inLanguage": "en-US", "name": "How to Apply for the Yoast Diversity Fund • Yoast",
        "isPartOf": {
            "@id": ""
        },
        "image": {
            "@type": "ImageObject", "@id": "", "url": "", "width": 1200, "height": 628
        },
        "primaryImageOfPage": {
            "@id": ""
        },
        "breadcrumb": {
            "@id": ""
        }
    }, {
        "@type": "BreadcrumbList", "@id": "",
        "itemListElement": [ {
            "@type": "ListItem", "position": 1, "item": {
                "@type": "WebPage", "@id": "", "url": "", "name": "Home"
            }
        }, {
            "@type": "ListItem", "position": 2, "item": {
                "@type": "WebPage", "@id": "", "url": "", "name": "Yoast Diversity Fund"
            }
        }, {
            "@type": "ListItem", "position": 3, "item": {
                "@type": "WebPage", "@id": "", "url": "", "name": "How to Apply for the Yoast Diversity Fund"
            }
        } ]
    }, {
        "@type": "ItemList",
        "mainEntityOfPage": {
            "@id": ""
        },
        "itemListElement": [ {
            "@id": ""
        }, {
            "@id": ""
        }, {
            "@id": ""
        }, {
            "@id": ""
        } ]
    }, {
        "@type": "Question", "@id": "", "name": "What type of costs are reimbursed?",
        "acceptedAnswer": {
            "@type": "Answer", "text": "Our goal is to reimburse those costs that would keep you from speaking at tech conferences. If you, for whatever reason, have costs, such as child-care or specialized transport, for example, we invite you to share those with us and we'll look at those on a per-case scenario. Examples of costs we're happy to reimburse are: – Travel and transportation, e.g. gas, car rental, taxis or flights. – Accommodation, hotel, AirBNB or similar. – Child-care costs. – Sign language interpreter. – Visa costs."
        }
    }, {
        "@type": "Question", "@id": "", "name": "How many times can I apply for the Yoast Diversity Fund?",
        "acceptedAnswer": {
            "@type": "Answer", "text": "Our goal is to assist in increasing speaker diversity as much as possible. This means we'll focus on first-time applications mostly. However, there is no limit to the number of times you can apply."
        }
    }, {
        "@type": "Question", "@id": "", "name": "Is the fund available to all?",
        "acceptedAnswer": {
            "@type": "Answer", "text": "Yes. With the exception of Yoast employees, former Yoast employees, and contractors."
        }
    }, {
        "@type": "Question", "@id": "", "name": "When should I apply?",
        "acceptedAnswer": {
            "@type": "Answer", "text": "Applicants should apply at least one month before the event."
        }
    } ]
} </script>


Structured data is so cool

Structured data is hot. It is one of the foundations on which the web is built today and its importance will only increase with time. In this post, I’ve shown you one of the newest Schema additions, and you’ll increasingly see this pop up in the search results.

For more information on our Schema structured data implementation, please read our Schema documentation.

The post How to build a structured data-powered FAQ page using Yoast SEO appeared first on Yoast.

Build your PPC campaigns with this mini campaign builder script for Google Ads

Posted by on Jun 14, 2019 in SEO Articles | Comments Off on Build your PPC campaigns with this mini campaign builder script for Google Ads

Need to quickly build a campaign or add keywords to an existing one? This script will do the work for you!

All you need to do is input a few keywords and headlines in a spreadsheet and BAM! You’ve got yourself the beginnings of a great campaign.

I’m a firm believer in Single Keyword per Ad Group (SKAG) structure – it increases ad/keyword relevance and therefore improves quality score, makes CPCs cheaper, gets you a higher ad rank and a better CTR.

Sadly, building out SKAG structures is a pretty time-consuming endeavor. You can’t implement millions of keywords and ads without PPC tech powering your builds.

But if a client just needs a couple of new keywords after updating their site with new content, this script is a quick and easy solution.

And that’s exactly what I love about PPC. There’s a special place in my heart for simple scripts anyone can use to achieve tasks that are otherwise repetitive or near-impossible.

What does the script do?

This tool will save a lot of time with small-scale builds where you know exactly which keywords and ad copy you need, for example when you’re adding a few keywords to an existing campaign.

You input your campaign name, keywords, headlines, descriptions, paths and final URL, and it will output three tabs for you: one with keyword combinations, one with negatives, and ads to upload to Google Ads Editor.

It creates one exact and one broad match modifier campaign and creates a list of keywords as exact negatives in the broad campaign to make sure that search terms that match exactly will go through the exact keyword.
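That exact-plus-BMM structure is simple enough to sketch in a few lines. The spreadsheet tool itself runs on Google Apps Script, so the Python below is only an illustration of the transformation, and the function and column names are hypothetical, not the script's actual code:

```python
def build_skag_rows(campaign, delimiter, keywords):
    """Sketch of the SKAG build: one exact and one broad match modifier
    (BMM) row per keyword, plus an exact negative in the BMM campaign so
    exact-match searches route through the exact keyword."""
    keyword_rows, negative_rows = [], []
    for kw in keywords:
        kw = kw.strip().lower()
        ad_group = kw.title()
        # Exact match campaign: [keyword]
        keyword_rows.append({
            "Campaign": f"{campaign}{delimiter}Exact",
            "Ad group": ad_group,
            "Keyword": f"[{kw}]",
        })
        # Broad match modifier campaign: +every +word
        keyword_rows.append({
            "Campaign": f"{campaign}{delimiter}BMM",
            "Ad group": ad_group,
            "Keyword": " ".join("+" + w for w in kw.split()),
        })
        # Exact negative in the BMM campaign funnels exact-match
        # search terms to the exact campaign.
        negative_rows.append({
            "Campaign": f"{campaign}{delimiter}BMM",
            "Keyword": f"[{kw}]",
        })
    return keyword_rows, negative_rows
```

The point of the exact negatives is that Google will otherwise happily serve the broad keyword on an exact-match search, splitting your data between the two campaigns.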

I’m sure you’re dying to give it a whirl, so let’s get cracking!

How do you use it?

Make a copy of this spreadsheet (note: you’ll need to authorize the script to run). You’ll find all the instructions there as a future reminder.

Once you’ve got the spreadsheet ready, input the following:

  • The campaign name
  • The campaign name delimiter to distinguish between broad and exact campaigns
  • Headline 1 (if this cell is not specified, then it will be the same as the keyword)
  • Headline 2
  • Optionally, headline 3
  • Description 1
  • Optionally, description 2
  • Optionally, path 1 and path 2
  • The final URL
  • The keywords (you can keep going outside of the box with these!)

You’ll see a handy character counter which will go red if you exceed the character limit. Bear in mind that this tool will assume that you’re using it correctly and so you’ll need to make sure that you’re staying within the limit!
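The counter is just enforcing Google Ads' text limits; a minimal sketch of the same check, assuming the limits current as of this writing (30 characters per headline, 90 per description, 15 per path):

```python
# Google Ads expanded text ad limits (as of this writing).
LIMITS = {"headline": 30, "description": 90, "path": 15}

def over_limit(field_type, text):
    """Return how many characters a field exceeds its limit by
    (0 if within bounds), mirroring the spreadsheet's red counter."""
    return max(0, len(text) - LIMITS[field_type])
```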

You can also optionally create a second ad variant by choosing the part of your text you want to vary (e.g., headline 2 or description 2) and inputting the copy. Otherwise, just select “None” from the dropdown menu.

Once you’re done, click the gigantic “Go!” button, and wait for the magic to happen.

It will generate three tabs labelled “Keywords,” “Negatives” and “Ads.” If you want to run the script again with different keywords, make sure you save these tabs elsewhere or rename them to prevent the script from overwriting them.

Finally, you can paste these tabs into Editor and update all the relevant settings and adjustments. Job done!

DOWNLOAD: You’ll need to authorize the script to run after you make a copy of this spreadsheet.

The post Build your PPC campaigns with this mini campaign builder script for Google Ads appeared first on Search Engine Land.

How to Write a Blog Post That Gets 304,392 New Visitors (SEO Case Study)

Posted by on Jun 11, 2019 in SEO Articles | Comments Off on How to Write a Blog Post That Gets 304,392 New Visitors (SEO Case Study)

What do you need to do to write a blog post that attracts a ton of traffic?

That’s exactly what I’m going to show you today.

My blog post about “backlinks” has attracted 304,392 new visitors since I originally published it back in January of 2016.

traffic growth

265,992 of those users are from organic search:

Organic search traffic

It’s also the most linked to blog post on my website (which has helped other assets perform well):


Now let me show you exactly how to write a blog post just like the one I did:

How to Write a Blog Post (The Right Way)

Building a successful blog in any niche is based on how much unique value you can add. It’s not about how frequently you publish. It’s not about how many friends you have.

It’s 100% about the unique value you can add to the marketplace.

What if you don’t have any unique value to add?

Then you need to develop your skills by getting more experience. I believe that truly great content is created by people who have a ton of experience in their given field.

That said:

You can make up for a lack of experience with insane amounts of effort.

Believe me when I say this… the more effort, time, and capital you put into a blog post, the better it will perform. I know this seems obvious.

But the truth is that most businesses think publishing little 400-word fluff pieces is “content marketing”. It’s not. Blog posts that perform at exceptional levels (over the long-term) are the product of enormous amounts of effort.

If you’re not willing to spend an entire month crafting one piece of content, then you’ll never compete with the top dogs in your industry.

If you ARE willing to put in the effort, then keep reading.

Phase 1 – Identify a Keyword

I believe 80% of your blog content should target a keyword, while the other 20% can be structured as a linkable asset or content marketing piece. This strategy focuses on keyword-targeted content.

So, how do you find a good keyword to target? Let me show you.

Step #1 – Build a Keyword Database

Every blogger should build a keyword database because it will destroy procrastination and you’ll never need to spend a single second wondering what you should write about.

There are many ways to find keywords, but my favorite technique is to:

Reverse Engineer Your Competitors

Go to Ahrefs and enter your domain into the Site Explorer.

Ahrefs site explorer

Then look under “Organic Search” on the left-hand side. Right click on “Content Gap” and open it in a new window.

Ahrefs Content Gap

Then while you’re still in the “Overview” section, click on the “Organic Search” tab and then on the right hand side you’ll see “Top 10 competitors”.

Top 10 Ahrefs

Copy your top competitors and paste them into the Content Gap tool. I recommend analyzing one competitor at a time.

Content gap

Once the analysis is complete, filter the list by the following criteria:

  • Volume = From 1000
  • KD = To 50
  • Exclude = Brand Names like “backlinko”

Ahrefs Filter
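The same filter is easy to reproduce on an exported list. Here's a minimal Python sketch, assuming each exported row is a dict with `keyword`, `volume`, and `kd` fields (illustrative names, not Ahrefs' actual export headers):

```python
def filter_keywords(rows, min_volume=1000, max_kd=50, exclude=("backlinko",)):
    """Replicate the Ahrefs filter: Volume >= 1000, KD <= 50,
    and no branded terms."""
    return [
        r for r in rows
        if r["volume"] >= min_volume
        and r["kd"] <= max_kd
        and not any(brand in r["keyword"].lower() for brand in exclude)
    ]
```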

Export the list and add these keywords to your keyword database.

Export ideas

You can replicate this exact process with SEMRush as well.

Step #2 – Qualify Your Keywords

Finding keywords is easy, but it takes skills to know what keywords are worth going after.

Ask yourself this simple question when going through your keyword database:

Is my website capable of ranking for this keyword?

The good news is that you don’t need to guess. You just need to look at the data.

Just create a keyword list using Ahrefs Keyword Explorer. This will give you a 30,000-foot view of your keyword targets.

Ahrefs Keyword Lists

You can then sort this list by KD, Volume, or any other metric.

Your goal at this stage should be to narrow your list to your top 5-10 keyword targets.

Use Ahrefs SERP feature to analyze the top 10 results for your keyword prospects.

Ahrefs SERP Feature

Ask the following:

  • Are there low-authority websites ranking? I define “low authority” as having a DR of 50 or below.
  • Are there YouTube videos ranking? This is a good sign for two reasons. First, YouTube video pages are usually not content-rich (which means they’ll be easy to beat). Second, it means your brand can rank in both search engines (Google and YouTube). That means two search results for your brand and double the visibility.
  • Are there subdomains ranking? These include web 2.0s like or
  • Are there forum or Quora threads ranking? These are unstructured pages and can be beaten easily.
  • Are there general article websites ranking? Niche sites crush general article websites like Ezine articles, eHow articles, etc.

If you answered “Yes” to these questions the keyword should move to the next phase.
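These checks can also be scripted against an exported SERP. A sketch, assuming each result is a dict carrying a domain rating and a domain name (field names are illustrative):

```python
def qualify_serp(results, dr_threshold=50):
    """Count weak results in a top-10 SERP export. The more weak
    results, the better the keyword's odds."""
    return {
        "low_authority": sum(r["dr"] <= dr_threshold for r in results),
        "youtube": sum(r["domain"] == "youtube.com" for r in results),
        "forum_or_quora": sum(
            r.get("is_forum", False) or r["domain"] == "quora.com"
            for r in results
        ),
    }
```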

Step #3 – Select a Keyword

Ahrefs gives you plenty of data to narrow your list. However, selecting a keyword requires manual analysis. That’s what the next phase is all about.

Phase 2 – Create Incredible SEO Content

After you’ve selected a keyword, it’s time to create your SEO content asset.

Watch this video to see exactly what to do:

I also recommend looking for opportunities to implement The Cake Technique. The Cake Technique is the process of consolidating similar content assets into a single mother asset.

Here’s how to do it:

Phase 3 – Acquire Links

Now that you’ve created your SEO content asset, you need to promote the heck out of it!

Acquiring quality links is critical to your blog post’s SEO success.

Here’s what you need to do:

That’s How You Write a Blog Post, but What’s Next?

You need to continue acquiring links to your new SEO content asset. Then, measure your performance over 3-6 months.

If your blog post hasn’t reached the top 100, then you either need more quality backlinks or you need to make your content asset better. The good news is that those two factors are always what you should analyze first (when a blog post isn’t performing well).

Thanks for reading/watching!

Keyword Not Provided, But it Just Clicks

Posted by on Jun 11, 2019 in SEO Articles | Comments Off on Keyword Not Provided, But it Just Clicks

When SEO Was Easy

When I got started on the web over 15 years ago I created an overly broad & shallow website that had little chance of making money because it was utterly undifferentiated and crappy. In spite of my best (worst?) efforts while being a complete newbie, sometimes I would go to the mailbox and see a check for a couple hundred or a couple thousand dollars come in. My old roommate & I went to Coachella & when the trip was over I returned to a bunch of mail to catch up on & realized I had made way more while not working than what I spent on that trip.

What was the secret to a total newbie making decent income by accident?

Horrible spelling.

Back then search engines were not as sophisticated with their spelling correction features & I was one of 3 or 4 people in the search index that misspelled the name of an online casino the same way many searchers did.

The high minded excuse for why I did not scale that would be claiming I knew it was a temporary trick that was somehow beneath me. The more accurate reason would be thinking in part it was a lucky fluke rather than thinking in systems. If I were clever at the time I would have created the misspeller’s guide to online gambling, though I think I was just so excited to make anything from the web that I perhaps lacked the ambition & foresight to scale things back then.

In the decade that followed I had a number of other lucky breaks like that. One time one of the original internet bubble companies that managed to stay around put up a sitewide footer link targeting the concept that one of my sites made decent money from. This was just before the great recession, before Panda existed. The concept they targeted had 3 or 4 ways to describe it. 2 of them were very profitable & if they targeted either of the most profitable versions with that page the targeting would have sort of carried over to both. They would have outranked me if they targeted the correct version, but they didn’t so their mistargeting was a huge win for me.

Search Gets Complex

Search today is much more complex. In the years since those easy-n-cheesy wins, Google has rolled out many updates which aim to feature sought after destination sites while diminishing the sites which rely on “one simple trick” to rank.

Arguably the quality of the search results has improved significantly as search has become more powerful, more feature rich & has layered in more relevancy signals.

Many quality small web publishers have gone away due to some combination of increased competition, algorithmic shifts & uncertainty, and reduced monetization as more ad spend was redirected toward Google & Facebook. But the impact as felt by any given publisher is not the impact as felt by the ecosystem as a whole. Many terrible websites have also gone away, while some formerly obscure though higher-quality sites rose to prominence.

There was the Vince update in 2009, which boosted the rankings of many branded websites.

Then in 2011 there was Panda as an extension of Vince, which tanked the rankings of many sites that published hundreds of thousands or millions of thin content pages while boosting the rankings of trusted branded destinations.

Then there was Penguin, which was a penalty that hit many websites which had heavily manipulated or otherwise aggressive appearing link profiles. Google felt there was a lot of noise in the link graph, which was their justification for the Penguin.

There were updates which lowered the rankings of many exact match domains. And then increased ad load in the search results along with the other above ranking shifts further lowered the ability to rank keyword-driven domain names. If your domain is generically descriptive then there is a limit to how differentiated & memorable you can make it if you are targeting the core market the keywords are aligned with.

There is a reason eBay, Google, Yahoo & Amazon are each more popular than their generically named competitors. When that winner-take-most impact of many online markets is coupled with the move away from using classic relevancy signals, the economics shift to where it makes a lot more sense to carry the heavy overhead of establishing a strong brand.

Branded and navigational search queries could be used in the relevancy algorithm stack to confirm the quality of a site & verify (or dispute) the veracity of other signals.

Historically relevant algo shortcuts become less appealing as they become less relevant to the current ecosystem & even less aligned with the future trends of the market. Add in negative incentives for pushing on a string (penalties on top of wasting the capital outlay) and a more holistic approach certainly makes sense.

Modeling Web Users & Modeling Language

PageRank was an attempt to model the random surfer.

When Google is pervasively monitoring most users across the web they can shift to directly measuring their behaviors instead of using indirect signals.

Years ago Bill Slawski wrote about the long click in which he opened by quoting Steven Levy’s In the Plex: How Google Thinks, Works, and Shapes our Lives

“On the most basic level, Google could see how satisfied users were. To paraphrase Tolstoy, happy users were all the same. The best sign of their happiness was the “Long Click” — This occurred when someone went to a search result, ideally the top one, and did not return. That meant Google has successfully fulfilled the query.”

Of course, there’s a patent for that. In Modifying search result ranking based on implicit user feedback they state:

user reactions to particular search results or search result lists may be gauged, so that results on which users often click will receive a higher ranking. The general assumption under such an approach is that searching users are often the best judges of relevance, so that if they select a particular search result, it is likely to be relevant, or at least more relevant than the presented alternatives.

If you are a known brand you are more likely to get clicked on than a random unknown entity in the same market.

And if you are something people are specifically seeking out, they are likely to stay on your website for an extended period of time.

One aspect of the subject matter described in this specification can be embodied in a computer-implemented method that includes determining a measure of relevance for a document result within a context of a search query for which the document result is returned, the determining being based on a first number in relation to a second number, the first number corresponding to longer views of the document result, and the second number corresponding to at least shorter views of the document result; and outputting the measure of relevance to a ranking engine for ranking of search results, including the document result, for a new search corresponding to the search query. The first number can include a number of the longer views of the document result, the second number can include a total number of views of the document result, and the determining can include dividing the number of longer views by the total number of views.
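The measure of relevance the patent describes reduces to a simple ratio; as a sketch:

```python
def long_click_relevance(long_views, total_views):
    """Per the patent language: the number of longer views of a document
    result divided by its total number of views."""
    if total_views == 0:
        return 0.0
    return long_views / total_views
```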

Attempts to manipulate such data may not work.

safeguards against spammers (users who generate fraudulent clicks in an attempt to boost certain search results) can be taken to help ensure that the user selection data is meaningful, even when very little data is available for a given (rare) query. These safeguards can include employing a user model that describes how a user should behave over time, and if a user doesn’t conform to this model, their click data can be disregarded. The safeguards can be designed to accomplish two main objectives: (1) ensure democracy in the votes (e.g., one single vote per cookie and/or IP for a given query-URL pair), and (2) entirely remove the information coming from cookies or IP addresses that do not look natural in their browsing behavior (e.g., abnormal distribution of click positions, click durations, clicks_per_minute/hour/day, etc.). Suspicious clicks can be removed, and the click signals for queries that appear to be spammed need not be used (e.g., queries for which the clicks feature a distribution of user agents, cookie ages, etc. that do not look normal).
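The “one vote per cookie/IP per query-URL pair” safeguard amounts to deduplication. A toy sketch (this is my illustration of the idea, not anything from the patent itself):

```python
def dedupe_votes(clicks):
    """Keep at most one vote per (cookie, query, url) triple, in the
    spirit of the patent's 'democracy in the votes' safeguard.
    `clicks` is an iterable of (cookie_id, query, url) tuples;
    first-occurrence order is preserved."""
    seen, votes = set(), []
    for cookie, query, url in clicks:
        key = (cookie, query, url)
        if key not in seen:
            seen.add(key)
            votes.append(key)
    return votes
```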

And just like Google can make a matrix of documents & queries, they could also choose to put more weight on search accounts associated with topical expert users based on their historical click patterns.

Moreover, the weighting can be adjusted based on the determined type of the user both in terms of how click duration is translated into good clicks versus not-so-good clicks, and in terms of how much weight to give to the good clicks from a particular user group versus another user group. Some user’s implicit feedback may be more valuable than other users due to the details of a user’s review process. For example, a user that almost always clicks on the highest ranked result can have his good clicks assigned lower weights than a user who more often clicks results lower in the ranking first (since the second user is likely more discriminating in his assessment of what constitutes a good result). In addition, a user can be classified based on his or her query stream. Users that issue many queries on (or related to) a given topic T (e.g., queries related to law) can be presumed to have a high degree of expertise with respect to the given topic T, and their click data can be weighted accordingly for other queries by them on (or related to) the given topic T.

Google was using click data to drive their search rankings as far back as 2009. David Naylor was perhaps the first person who publicly spotted this. Google was ranking Australian websites for [tennis court hire] in the UK & Ireland, in part because that is where most of the click signal came from. That phrase was most widely searched for in Australia. In the years since Google has done a better job of geographically isolating clicks to prevent things like the problem David Naylor noticed, where almost all search results in one geographic region came from a different country.

Whenever SEOs mention using click data to search engineers, the search engineers quickly respond about how they might consider any signal but clicks would be a noisy signal. But if a signal has noise an engineer would work around the noise by finding ways to filter the noise out or combine multiple signals. To this day Google states they are still working to filter noise from the link graph: “We continued to protect the value of authoritative and relevant links as an important ranking signal for Search.”

The site with millions of inbound links, few intentional visits & those who do visit quickly click the back button (due to a heavy ad load, poor user experience, low quality content, shallow content, outdated content, or some other bait-n-switch approach)…that’s an outlier. Preventing those sorts of sites from ranking well would be another way of protecting the value of authoritative & relevant links.

Best Practices Vary Across Time & By Market + Category

Along the way, concurrent with the above sorts of updates, Google also improved their spelling auto-correct features, auto-completed search queries for many years through a feature called Google Instant (though they later undid forced query auto-completion while retaining automated search suggestions), and then they rolled out a few other algorithms that further allowed them to model language & user behavior.

Today it would be much harder to get paid above median wages explicitly for sucking at basic spelling or scaling some other individual shortcut to the moon, like pouring millions of low quality articles into a (formerly!) trusted domain.

Nearly a decade after Panda, eHow’s rankings still haven’t recovered.

Back when I got started with SEO the phrase Indian SEO company was associated with cut-rate work where people were buying exclusively based on price. Sort of like: “I’ve got a $500 budget for link building, but cannot under any circumstance invest more than $5 in any individual link.” Part of how my wife met me was she hired a hack SEO from San Diego who outsourced all the work to India and marked the price up about 100-fold while claiming it was all done in the United States. He created reciprocal links pages that got her site penalized & her site didn’t rank until after she took her reciprocal links page down.

With that sort of behavior widespread (hack US firm teaching people working in an emerging market poor practices), it likely meant many SEO “best practices” which were learned in an emerging market (particularly where the web was also underdeveloped) would be more inclined to being spammy. Considering how far ahead many Western markets were on the early Internet & how India has so many languages & how most web usage in India is based on mobile devices where it is hard for users to create links, it only makes sense that Google would want to place more weight on end user data in such a market.

If you set your computer location to India, Bing’s search box lists 9 different languages to choose from.

The above is not to state anything derogatory about any emerging market, but rather that various signals are stronger in some markets than others. And competition is stronger in some markets than others.

Search engines can only rank what exists.

“In a lot of Eastern European – but not just Eastern European markets – I think it is an issue for the majority of the [bream? muffled] countries, for the Arabic-speaking world, there just isn’t enough content as compared to the percentage of the Internet population that those regions represent. I don’t have up to date data, I know that a couple years ago we looked at Arabic for example and then the disparity was enormous. so if I’m not mistaken the Arabic speaking population of the world is maybe 5 to 6%, maybe more, correct me if I am wrong. But very definitely the amount of Arabic content in our index is several orders below that. So that means we do not have enough Arabic content to give to our Arabic users even if we wanted to. And you can exploit that amazingly easily and if you create a bit of content in Arabic, whatever it looks like we’re gonna go you know we don’t have anything else to serve this and it ends up being horrible. and people will say you know this works. I keyword stuffed the hell out of this page, bought some links, and there it is number one. There is nothing else to show, so yeah you’re number one. the moment somebody actually goes out and creates high quality content that’s there for the long haul, you’ll be out and that there will be one.” – Andrey Lipattsev – Search Quality Senior Strategist at Google Ireland, on Mar 23, 2016

Impacting the Economics of Publishing

Now search engines can certainly influence the economics of various types of media. At one point some otherwise credible media outlets were pitching the Demand Media IPO narrative that Demand Media was the publisher of the future & what other media outlets will look like. Years later, after heavily squeezing on the partner network & promoting programmatic advertising that reduces CPMs by the day Google is funding partnerships with multiple news publishers like McClatchy & Gatehouse to try to revive the news dead zones even Facebook is struggling with.

“Facebook Inc. has been looking to boost its local-news offerings since a 2017 survey showed most of its users were clamoring for more. It has run into a problem: There simply isn’t enough local news in vast swaths of the country. … more than one in five newspapers have closed in the past decade and a half, leaving half the counties in the nation with just one newspaper, and 200 counties with no newspaper at all.”

As mainstream newspapers continue laying off journalists, Facebook’s news efforts are likely to continue failing unless they include direct economic incentives, as Google’s programmatic ad push broke the banner ad:

“Thanks to the convoluted machinery of Internet advertising, the advertising world went from being about content publishers and advertising context—The Times unilaterally declaring, via its ‘rate card’, that ads in the Times Style section cost $30 per thousand impressions—to the users themselves and the data that targets them—Zappo’s saying it wants to show this specific shoe ad to this specific user (or type of user), regardless of publisher context. Flipping the script from a historically publisher-controlled mediascape to an advertiser (and advertiser intermediary) controlled one was really Google’s doing. Facebook merely rode the now-cresting wave, borrowing outside media’s content via its own users’ sharing, while undermining media’s ability to monetize via Facebook’s own user-data-centric advertising machinery. Conventional media lost both distribution and monetization at once, a mortal blow.”

Google is offering news publishers audience development & business development tools.

Heavy Investment in Emerging Markets Quickly Evolves the Markets

As the web grows rapidly in India, they’ll have a thousand flowers bloom. In 5 years the competition in India & other emerging markets will be much tougher as those markets continue to grow rapidly. Media is much cheaper to produce in India than it is in the United States. Labor costs are lower & they never had the economic albatross that is the ACA adversely impact their economy. At some point the level of investment & increased competition will mean early techniques stop having as much efficacy. Chinese companies are aggressively investing in India.

“If you break India into a pyramid, the top 100 million (urban) consumers who think and behave more like Americans are well-served,” says Amit Jangir, who leads India investments at 01VC, a Chinese venture capital firm based in Shanghai. The early stage venture firm has invested in micro-lending firms FlashCash and SmartCoin based in India. The new target is the next 200 million to 600 million consumers, who do not have a go-to entertainment, payment or ecommerce platform yet— and there is gonna be a unicorn in each of these verticals, says Jangir, adding that it will be not be as easy for a player to win this market considering the diversity and low ticket sizes.


RankBrain appears to be based on using user clickpaths on head keywords to help bleed rankings across into related searches which are searched less frequently. A Googler didn’t state this specifically, but it is how they would be able to use models of searcher behavior to refine search results for keywords which are rarely searched for.

In a recent interview in Scientific American a Google engineer stated: “By design, search engines have learned to associate short queries with the targets of those searches by tracking pages that are visited as a result of the query, making the results returned both faster and more accurate than they otherwise would have been.”

Now a person might go out and try to search for something a bunch of times or pay other people to search for a topic and click a specific listing, but some of the related Google patents on using click data (which keep getting updated) mentioned how they can discount or turn off the signal if there is an unnatural spike of traffic on a specific keyword, or if there is an unnatural spike of traffic heading to a particular website or web page.

And, since Google is tracking the behavior of end users on their own website, anomalous behavior is easier to track than it is tracking something across the broader web where signals are more indirect. Google can take advantage of their wide distribution of Chrome & Android where users are regularly logged into Google & pervasively tracked to place more weight on users where they had credit card data, a long account history with regular normal search behavior, heavy Gmail users, etc.

Plus there is a huge gap between the cost of traffic & the ability to monetize it. You might have to pay someone a dime or a quarter to search for something & there is no guarantee it will work on a sustainable basis even if you paid hundreds or thousands of people to do it. Any of those experimental searchers will have no lasting value unless they influence rank, but even if they do influence rankings it might only last temporarily. If you bought a bunch of traffic into something genuine Google searchers didn’t like then even if it started to rank better temporarily the rankings would quickly fall back if the real end user searchers disliked the site relative to other sites which already rank.

This is part of the reason why so many SEO blogs mention brand, brand, brand. If people are specifically looking for you in volume & Google can see that thousands or millions of people specifically want to access your site then that can impact how you rank elsewhere.

Even looking at something inside the search results for a while (dwell time) or quickly skipping over it to have a deeper scroll depth can be a ranking signal. Some Google patents mention how they can use mouse pointer location on desktop or scroll data from the viewport on mobile devices as a quality signal.

Neural Matching

Last year Danny Sullivan mentioned how Google rolled out neural matching to better understand the intent behind a search query.

The tweets announcing the rollout capture what the neural matching technology intends to do. Google also stated:

we’ve now reached the point where neural networks can help us take a major leap forward from understanding words to understanding concepts. Neural embeddings, an approach developed in the field of neural networks, allow us to transform words to fuzzier representations of the underlying concepts, and then match the concepts in the query with the concepts in the document. We call this technique neural matching.

To help people understand the difference between neural matching & RankBrain, Google told SEL: “RankBrain helps Google better relate pages to concepts. Neural matching helps Google better relate words to searches.”

There are a couple research papers on neural matching.

The first one was titled A Deep Relevance Matching Model for Ad-hoc Retrieval. It mentioned using Word2vec, and here are a few quotes from the research paper:

  • “Successful relevance matching requires proper handling of the exact matching signals, query term importance, and diverse matching requirements.”
  • “the interaction-focused model, which first builds local level interactions (i.e., local matching signals) between two pieces of text, and then uses deep neural networks to learn hierarchical interaction patterns for matching.”
  • “according to the diverse matching requirement, relevance matching is not position related since it could happen in any position in a long document.”
  • “Most NLP tasks concern semantic matching, i.e., identifying the semantic meaning and inferring the semantic relations between two pieces of text, while the ad-hoc retrieval task is mainly about relevance matching, i.e., identifying whether a document is relevant to a given query.”
  • “Since the ad-hoc retrieval task is fundamentally a ranking problem, we employ a pairwise ranking loss such as hinge loss to train our deep relevance matching model.”

The paper mentions how semantic matching falls down when compared against relevancy matching because:

  • semantic matching relies on similarity matching signals (some words or phrases with the same meaning might be semantically distant), compositional meanings (matching sentences more than meaning) & a global matching requirement (comparing things in their entirety instead of looking at the best matching part of a longer document); whereas,
  • relevance matching can put significant weight on exact matching signals (weighting an exact match higher than a near match), adjust weighting on query term importance (one word or phrase in a search query might have a far higher discrimination value & deserve far more weight than the next) & leverage diverse matching requirements (allowing relevancy matching to happen in any part of a longer document)
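
As a toy illustration of the distinction the paper draws, here is a sketch (entirely my own, with made-up two-dimensional "embeddings", not the paper's model) contrasting exact matching signals with softer semantic similarity:

```python
# Toy contrast between relevance matching (exact term overlap) and
# semantic matching (embedding similarity). Vectors are invented.
import math

toy_embeddings = {          # hypothetical word vectors
    "car":  (0.9, 0.1),
    "auto": (0.85, 0.15),   # semantically close to "car"
    "loan": (0.1, 0.9),
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def exact_match_score(query_terms, doc_terms):
    # Relevance matching: heavy weight on exact term overlap,
    # regardless of where in the document the match occurs.
    return sum(1.0 for t in query_terms if t in doc_terms)

def soft_match_score(query_terms, doc_terms):
    # Semantic matching: best embedding similarity per query term.
    return sum(
        max(cosine(toy_embeddings[q], toy_embeddings[d]) for d in doc_terms)
        for q in query_terms
    )

doc = ["auto", "loan"]
print(exact_match_score(["car", "loan"], doc))  # only "loan" matches exactly
print(soft_match_score(["car", "loan"], doc))   # "car"~"auto" still scores high
```

The exact-match score misses the car/auto relationship entirely, while the soft score rewards it; a real ranking model blends both kinds of signal.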


The second research paper is Deep Relevancy Ranking Using Enhanced Document-Query Interactions, which notes:

“interaction-based models are less efficient, since one cannot index a document representation independently of the query. This is less important, though, when relevancy ranking methods rerank the top documents returned by a conventional IR engine, which is the scenario we consider here.”

That same sort of re-ranking concept is being better understood across the industry. There are ranking signals that earn some base level ranking, and then results get re-ranked based on other factors like how well a result matches the user intent.


For those who hate the idea of reading research papers or patent applications, Martinibuster also wrote about the technology here. About the only part of his post I would debate is this one:

“Does this mean publishers should use more synonyms? Adding synonyms has always seemed to me to be a variation of keyword spamming. I have always considered it a naive suggestion. The purpose of Google understanding synonyms is simply to understand the context and meaning of a page. Communicating clearly and consistently is, in my opinion, more important than spamming a page with keywords and synonyms.”

I think one should always consider user experience over other factors, however a person could still use variations throughout the copy & pick up a bit more traffic without coming across as spammy. Danny Sullivan mentioned the super synonym concept was impacting 30% of search queries, so there are still a lot which may only be available to those who use a specific phrase on their page.

Martinibuster also wrote another blog post tying more research papers & patents to the above. You could probably spend a month reading all the related patents & research papers.

The above sort of language modeling & end user click feedback complement links-based ranking signals in a way that makes it much harder to luck one’s way into any form of success by being a terrible speller or just bombing away at link manipulation without much concern for any other aspect of the user experience or the market you operate in.

Pre-penalized Shortcuts

Google was even issued a patent for predicting site quality based upon the N-grams used on the site, comparing those against the N-grams used on other established sites where quality has already been scored via other methods: “The phrase model can be used to predict a site quality score for a new site; in particular, this can be done in the absence of other information. The goal is to predict a score that is comparable to the baseline site quality scores of the previously-scored sites.”

Have you considered using a PLR package to generate the shell of your site’s content? Good luck with that as some sites trying that shortcut might be pre-penalized from birth.
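
The patent's phrase-model idea can be sketched very roughly (this is my own simplification, not Google's implementation): build an n-gram profile for a new site and weight the known quality scores of already-scored reference sites by how similar their n-gram profiles are:

```python
# Rough sketch of the patent's idea: predict a quality score for a new
# site from n-gram similarity to sites whose quality is already known.
from collections import Counter

def ngrams(text, n=2):
    words = text.lower().split()
    return Counter(tuple(words[i:i + n]) for i in range(len(words) - n + 1))

def overlap(p, q):
    # Fraction of the new site's n-grams also seen in the reference profile.
    shared = sum(min(p[g], q[g]) for g in p)
    return shared / max(sum(p.values()), 1)

# Hypothetical reference "sites" with baseline quality scores.
scored_sites = [
    ("handwritten niche blog", 0.9),
    ("click here buy cheap click here buy cheap", 0.1),
]

def predict_quality(new_text):
    profiles = [(ngrams(t), s) for t, s in scored_sites]
    # Weight each baseline score by n-gram similarity to the new site.
    sims = [(overlap(ngrams(new_text), p), s) for p, s in profiles]
    total = sum(sim for sim, _ in sims) or 1.0
    return sum(sim * s for sim, s in sims) / total

print(predict_quality("buy cheap click here buy cheap"))  # inherits the low score
```

A PLR shell reuses the same phrases as thousands of already-penalized sites, so under a model like this its predicted score is low before a single visitor arrives.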

Navigating the Maze

When I started in SEO, one of my friends had a dad who was vastly smarter than I am. He advised me that Google engineers were smarter, had more capital, had more exposure, had more data, etc etc etc … and thus SEO was ultimately going to be a malinvestment.

Back then he was at least partially wrong because influencing search was so easy.

But in the current market, 16 years later, we are near the inflection point where he would finally be right.

At some point the shortcuts stop working & it makes sense to try a different approach.

The flip side of all the above changes is that as the algorithms have become more complex, they have gone from being a headwind to people ignorant about SEO to being a tailwind for those who do not focus excessively on SEO in isolation.

If one is a dominant voice in a particular market, if they break industry news, if they have key exclusives, if they spot & name the industry trends, if their site becomes a must read & amounts to a habit … then they perhaps become viewed as an entity. Entity-related signals then help them, and the same signals that work against people who merely lucked into a bit of success become a tailwind rather than a headwind.

If your work defines your industry, then any efforts to model entities, user behavior or the language of your industry are going to boost your work on a relative basis.

This requires sites to publish frequently enough to be a habit, or publish highly differentiated content which is strong enough that it is worth the wait.

Those which publish frequently without being particularly differentiated are almost guaranteed to eventually walk into a penalty of some sort. And each additional person who reads marginal, undifferentiated content (particularly if it has an ad-heavy layout) brings that site one visitor closer to eventually getting whacked. Success becomes self-regulating. Any short-term success becomes self-defeating if one has a highly opportunistic short-term focus.

Those who write content that only they could write are more likely to have sustained success.

PPC & SEO Synergy: How to Find Efficiencies Between the Two Channels

Posted by on Jun 7, 2019 in SEO Articles | Comments Off on PPC & SEO Synergy: How to Find Efficiencies Between the Two Channels

This post aims to cover a series of synergies between SEO and PPC that could help your business/clients run the two channels in a more efficient manner and optimise the overall spending.

This post is the first of a series of 3 articles: the last one will include a downloadable checklist.

Part one will cover some basic concepts that are important to reiterate to ensure we are all on the same page (whatever background you have, this post should be simple enough) and 3 synergy ideas that you could try yourself.

PPC and SEO have historically been seen as separate silos, no matter how much we try to think otherwise. We often see large companies struggle to promote synergies between departments, which ultimately impacts knowledge sharing in the industry. However, one thing should be really clear:

Online customers do not care what frictions may exist between different teams in your company, all they want is a seamless, simple user experience.

But why would we do so, Sam? I am already too busy in my normal job; I have no time to spend testing and experimenting on synergies which may well not be there or not be worth my time. Yes, it does take time and patience to dig into this topic, but I can assure you that it will provide value to your digital activity – that is the key point I am trying to tackle with this post.

Hopefully, by the end of it, you will have some thoughts for what may work for you and pass them on to your team/business. Some of the activities we will discuss will provide clear monetary advantages that you could benefit from today if done correctly.

Before diving into the first 3 test ideas, let’s go over some basic concepts that will help understand the rest of this post.

What is Google Ads’ quality score?

Google defines it as “an estimate of the quality of your ads, keywords and landing pages. Higher quality ads can lead to lower prices and better ad positions”.

Its score goes from 1 to 10 (1 being the minimum and 10 the maximum) and it is made of 3 main elements: expected click-through rate, ad relevance and landing page experience.

The higher it is, the more relevant your ads and landing pages are to the user. It is also used by Google to evaluate your CPC (cost per click) and multiplied by your maximum bid to establish your ad rank in the ad auction process. As a consequence, a higher QS means a higher ROI (I think some of you might already know where this is going!).
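
The auction math can be illustrated with a toy sketch. The actual-CPC formula here is the commonly cited industry approximation, not official Google internals:

```python
# Simplified illustration of how Quality Score feeds into ad rank and
# effective CPC. Numbers and formulas are illustrative approximations.

def ad_rank(max_bid, quality_score):
    return max_bid * quality_score

def actual_cpc(rank_below, own_quality_score):
    # Commonly cited approximation: you pay just enough to beat
    # the ad rank of the advertiser ranked below you.
    return rank_below / own_quality_score + 0.01

# Two hypothetical advertisers bidding on the same keyword:
you = ad_rank(max_bid=2.00, quality_score=8)    # 16.0
rival = ad_rank(max_bid=4.00, quality_score=3)  # 12.0

print(you > rival)                     # you outrank them despite a lower bid
print(round(actual_cpc(rival, 8), 2))  # well under your 2.00 max bid
```

Note how a higher QS lets the first advertiser outrank a rival bidding twice as much, and pay well under their maximum bid – which is exactly why a higher QS means a higher ROI.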

You can easily check your QS in the “Keyword Analysis” field of your Google Ads account and its components’ scores can be seen within 4 status columns: Qual. Score, Ad relevance, Expected click-through rate (CTR) and Landing page experience. For this post, I will focus mainly on the latter.

Example from Distilled’s Google Ads account

What is Google Ads’ landing page experience?

Google says it is their “measure of how well a website gives people what they’re looking for when they click your ad”. The experience you offer directly impacts your Ad Rank, which indirectly links to your CPC and position in the ad auction.

Pretty simple, right? Wrong. And here is why: Google confirms there are 5 elements you can work on to improve the landing page experience. Without going too much into details (this post is not just about landing page experience after all), let’s go through point by point and highlight a few simple considerations:

1) Offer relevant, useful and original content

Point 1 is nothing new – Google has been advocating that content is ‘king’ for years, as its latest major algorithm updates prove on a regular basis. The main challenge from a PPC point of view is that we often use transactional category pages as landing pages for our ads, and those pages often display zero, thin, or not very relevant content. Something to think about!

2) Promote transparency and foster trustworthiness on your site

Point 2 clearly relates to what has been at the centre of attention since August ‘18 with the ‘Medic’ Algorithm update, where Google has been focusing heavily on EAT (expertise, authoritativeness, trustworthiness) of a site – read more on this from Google directly.

3) Make mobile and computer navigation easy

Point 3 relates to mobile-friendliness (in case you live under a rock, 2018 was the year of mobile-first indexing), information architecture and user-friendliness: needless to say, these have become important ‘indirect’ ranking factors nowadays. Check Tom Capper’s presentation on the topic here if you are interested in the subject.

4) Decrease your landing page loading time

5) Make your site fast

Points 4 and 5 clearly relate to the concept of speed for your site. Without going down a page speed rabbit hole, Google has been assessing speed as a direct element of landing page experience, and why wouldn’t they? It makes a lot of sense: how awful would it be for a user if, after clicking on an ad, the landing page took a very long time to load? Chances are the user would bounce back to the SERP (the ‘pogo sticking’ Google evaluates – read more on it here [link to Will chat with John Mueller]), which tells Google that it is a ‘bad’ ad.

Without further ado, let’s start going through the checklist we created at Distilled to show you how you can make your PPC & SEO channels work harder together.

This post covers the first 3 synergy ideas of our checklist – the other synergies will be outlined in the following two posts.

Keyword research with SEO and PPC in mind: why synergy works better

The backbone of any PPC and SEO strategy is, without a doubt, keyword research, therefore why not try to find efficiencies in each other’s methodology?

From an SEO point of view, we tend to start our research by looking at a core group of keywords our client/site should be visible for and then expand on it, building what some of us call a keyword universe (yes, I said it, a bit of a buzzword these days). It is a pretty simple concept: by creating buckets of keywords, we identify opportunities where we envision our site ranking in the near future, after an adequate amount of work on our end. Sounds familiar?

PPC Keyword Research

From a PPC standpoint, we have the great benefit of testing, in real time, what works and what does not! Google Ads provides a platform with tons of data we can use to experiment on which keywords to include in our ad groups that help us decide how to structure the whole campaign and work towards efficiently using our budget. The approach tends to be slightly different: starting with broad keywords and moving to the specific, the opposite of the SEO one.

By using broader keyword matches, we can test which keywords are bringing clicks & conversion and optimise our account accordingly, with the final goal to have as many precise match keywords as possible (highly targeted and cheaper).

Do you see where this is going? By sharing both lists between the departments/people in charge, there are clear benefits in analysing what has worked for one and what has worked for the other.

Quick recap: why is this worth it?

  • SEO/PPC might have done quite a bit of the keyword research on their own (we all know how dull this task can easily become) so there is no need to duplicate the task/spend too much time to do it again – save your company time and money by sharing the findings from both methodologies.
  • As the two approaches are different (SEO: narrow to broad vs PPC: broad to narrow), chances are that combining the data will provide keywords that a siloed approach would have not brought to the table.
  • SEO can use PPC to easily test certain keywords that have worked well and potentially implement them on metadata (more on this on following posts), page copy and so on.

PPC as a content gap tool

Remember what we said about PPC keyword research methodology? Broad to specific, as opposed to specific to broad in SEO. While running ads with broad matches you can learn a lot about your users and your site. In this particular instance, I am interested in highlighting content gap opportunities which PPC can bring to light.

As most of you might know, last year Google introduced a new exact match type (read this nice breakdown on Search Engine Land to know more) which changed things quite significantly. However, for our test we just need to focus on the least restrictive match types, Broad match & Broad match modifier. Such match types allow your ads to appear whenever a user’s search query includes any words in your key phrase, in any order, with the exception of the + sign (the modifier) which locks individual words – for a search query to trigger an ad it must include that word.

Let’s explain this with an example: imagine Distilled was running PPC ads for SearchLove London, our digital marketing conference based in London. For our keyword of choice, “seo conferences”, we could run the following match types:

| Match type | Example keyword | Example search |
|---|---|---|
| Broad match | seo conference | best marketing events 2019 |
| Broad match modifier | +seo conference | seo marketing events 2019 |
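
The modifier's "locking" behavior can be sketched with a toy simulator (my own illustration; real Google Ads matching also covers misspellings, synonyms and word forms, which this ignores):

```python
# Toy illustration of Broad match modifier: words prefixed with "+" are
# locked, so a query only triggers the ad if it contains every locked word.

def bmm_triggers(keyword, query):
    q_words = set(query.lower().split())
    required = [w[1:] for w in keyword.lower().split() if w.startswith("+")]
    # Unmodified words may match loose variants in real Google Ads;
    # here we only enforce the locked (+) words for simplicity.
    return all(w in q_words for w in required)

print(bmm_triggers("+seo conference", "seo marketing events 2019"))   # True
print(bmm_triggers("+seo conference", "best marketing events 2019"))  # False
```

The second query fails because it never contains the locked word "seo", matching the table above.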

How PPC broader matches can help SEO

Simply put, broader match types have the capacity of triggering results which are going to match your selected keywords with the broadest possible searches – misspellings, synonyms, singular/plural forms, related searches, and other relevant variations. This exercise is extremely useful at the beginning of your PPC activity, when you are trying to figure out what keywords are driving clicks (and conversions) in order to then refine your campaigns with more specific match types – why not use this ‘exploratory’ phase for any potential content gaps?

By examining broad searches that are bringing clicks (even if few) to your account, you can immediately check them against your site’s content and offerings – when you see a ‘gap’, you can decide if it is worth exploring by creating a page/blog post/content accordingly. You will be surprised by the number of quick wins you might come across!

Let’s go back to our Distilled example: are users searching for ‘marketing conferences in 2019’? That gives me an idea to produce an article on this subject, talking about the best marketing conferences that are taking place in 2019, including SearchLove. My article will be topical, I know it will be relevant because I tested it with PPC and I can do a bit of self-promotion for Distilled’s event – isn’t this a win-win?

How to get started:

In case you struggle to get started, follow this process:

  1. Run, for a limited period of time, ads containing broader keyword match types: Broad match and Broad match modifier.
  2. Wait a week or so (the lower your budget, the longer you will need to wait to gather significant data) and start reviewing the list of keywords such match types are triggering.
  3. Download an SQR (search query report): identify your top opportunities and cross-reference them against your site. Ask questions, such as:
    1. Do I have a page that covers this topic/keyword?
    2. If not, is it worth creating one?
  4. Further test idea: once you have a list of search terms from running the broad match for a period of time, you could pull them out into exact match keyword form, run for another week or so to get impression share data and solid estimates of traffic potential; bear in mind that impression share needs to be above 10% to get an accurate number.
  5. Get to work: create an article/page to “fill that gap” and optimise it. You may want to use it for PPC or simply for SEO purposes.

Quick recap: why is this worth it?

  • By testing broader keyword matches in your PPC campaigns, you could run into interesting gaps that your site’s content is not covering.
  • Create content accordingly and capitalise on these gaps.

Google shopping ads – errors that your SEO friends can help you fix

If you are running Shopping ads, I am sure you have come across several errors that will seriously test your patience. The good news is that some of these issues, which might seem hard to understand for someone not familiar with SEO, can be easily flagged and fixed.

Among the long list of issues you may encounter in your Merchant Centre, I will focus on two in particular:

1) “Product pages can’t be accessed and/or Product pages cannot be accessed from a mobile device”

Simply put, the landing page you picked cannot be accessed by Google, hence your ads will be disapproved. Why would this happen? Here is the list of reasons why you might see this:

  • Page not found (404): The landing page is not live and not found on the server.
  • Too many redirects: Your landing page is the victim of a series of redirects, 2 or more to be precise.
  • Couldn’t connect/HTTP 5xx response: The server cannot process the request for some reason.
  • Hostname not resolvable: When Google is not able to resolve the hostname.

How to prevent this from happening:

Before uploading all your Shopping ads, my advice to you is the following:

  1. Ask your SEO buddy (or do it yourself very easily – Screaming Frog is my go-to tool for any crawl type of job) to do a quick status check on all the URLs that you intend to run ads for in order to spot any anomalies (404s, 5xxs, redirect chains). If you spot these issues in advance you can chat to your devs and get them fixed quickly before running the ads: win-win situation.
  2. If you are not 100% sure that Google can crawl an individual page, then use the URL Inspection Tool in Google Search Console, which is something SEOs are very familiar with and use on a regular basis at URL level: this function will check how Google crawls or renders a URL on your site.

Side note: you have fixed a lot of issues and now your pages are eligible to appear in Shopping ads – great news! How long do you need to wait until they actually show? If you don’t want to wait a few days, Google themselves suggest increasing your crawl rate by changing the settings in Google Search Console – your SEO friends can help you with this too!

2) Images cannot be crawled because of robots.txt restriction

The image you selected is being blocked via robots.txt. In case you do not know what that is, ask your SEO friends or read Google’s explanation to understand all the details.

How to prevent this from happening:

If you want to avoid this from happening, my advice to you is the following:

  1. Run (or ask your SEO team to do so) a Screaming Frog crawl for the list of URLs you are planning to use and view which URLs have been blocked by robots.txt in two locations: ‘Response Codes’ tab and ‘Blocked by Robots.txt’ filter (read more here).
  2. If you want to inspect single URLs, you should use the URL Inspection tool on Google Search Console to see whether your pages have been blocked or not. Again, SEOs will be able to help you super easily here!
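
The robots.txt check itself is easy to script: Python's standard library ships a parser. A sketch of the check (the rules and URLs here are hypothetical) could be:

```python
# Check whether an image URL is blocked for Googlebot-Image by robots.txt
# rules, which is exactly the Merchant Center complaint described above.
from urllib.robotparser import RobotFileParser

def image_allowed(robots_lines, image_url, agent="Googlebot-Image"):
    rp = RobotFileParser()
    rp.parse(robots_lines)  # robots.txt content as a list of lines
    return rp.can_fetch(agent, image_url)

rules = [
    "User-agent: *",
    "Disallow: /images/private/",
]
print(image_allowed(rules, "https://example.com/images/private/shoe.jpg"))  # False
print(image_allowed(rules, "https://example.com/images/public/shoe.jpg"))   # True
```

In practice you would fetch your live robots.txt (or use `RobotFileParser.set_url` plus `read()`) and test each product image URL from your feed.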

Quick recap: why is this worth it?

  • Instead of running into a lot of annoying, yet minor issues, when setting up your Google Shopping ads, get your SEO folks to help you prevent them from happening.
  • Start doing checks and using tools you normally would not use to help you with your daily work.

Part 1 of our SEO & PPC synergy series ends here. Stay tuned for the following 2 articles on the subject, the last of which will include a downloadable checklist.

If you enjoyed part one, subscribe to our email list and we’ll send parts 2 and 3 directly to your inbox.


How to Rank New Content Faster

Posted by on Jun 6, 2019 in SEO Articles | Comments Off on How to Rank New Content Faster

domain score

If I write a blog post on any topic, what do you think happens?

It typically gets indexed by Google the same day I publish the content and within a week it tends to rank high on Google.

Then again, I have a domain score of 94 and I have 633,791 backlinks. Just look at the image above. (If you are curious what your link count or domain score is, put in your URL here.)

But if you have a lot fewer backlinks and a much lower domain score, what do you think would happen?

Chances are your content won’t get indexed fast and it won’t rank as high as you want.

But there has to be a way to change this, right? Especially without building more backlinks because we all know that’s time-consuming and hard.

To find the most ideal solution, I decided to run a little experiment.

Around five months ago, I sent out an email to a portion of my mailing list asking people if they wanted to partake in an SEO experiment.

As you can imagine, well over a thousand website owners were willing to participate. I had to narrow down the list because, for this experiment to be effective, a website had to have a domain score of 30 or less and no more than 40 backlinks.

That way it’s at least a challenge to figure out how to rank new content higher.

In addition to that, the site couldn’t be a subdomain; it had to be a standalone site.

Once I removed all of the outliers, I was left with 983 people who agreed to participate in the experiment. Of those, 347 stopped replying or backed out of the experiment due to time commitments, which means I was left with 636.

How did the SEO experiment work?

For all of the sites, we had them write a piece of content. We didn’t make it a requirement that the content had to be about any specific topic or that it had to be written a certain way… we just had them write one piece of content that was between 1,800 and 2,000 words in length.

We enforced the minimum and maximum length limit because we needed the post to be long enough to naturally include keywords, but if it was too long… such as 10,000 words, it would have a higher chance to rank on Google.

Each site had 30 days to write the piece of content and publish it on their site. Within 30 days of the content being published, we looked up the URL in our Ubersuggest database to see how many keywords the post ranks for in the top 100, top 50, and top 10 spots.

We also repeated this search 60 days after the article was published to see if there were any major differences.

The Ubersuggest database currently contains information on 1,459,103,429 keywords from around the world in all languages (a lot of keywords have low search volume like 10 searches per month). But for this experiment, we focused on English speaking sites.

We then split the sites up into 9 groups. Roughly 70 sites per group. Each group only leveraged 1 tactic to see if it helped with rankings.

Here’s a breakdown of each group.

  1. Control group – this group just published the article and didn’t leverage any promotional or SEO tactics. Having a control group allows us to compare how specific tactics affect rankings.
  2. Sitemap – all this group leveraged was a sitemap. They added the article to their sitemap, and we made sure the sitemap was submitted to Google Search Console.
  3. Internal linking – this group added 3 internal links from older pieces of content to the newly written article.
  4. URL Inspection – within Google Search Console you can request that they Crawl and index a URL. That feature is called URL Inspection.
  5. Social shares – Facebook, Twitter, LinkedIn, Pinterest and Reddit were the social sites that this group submitted and promoted their content on.
  6. Google Chrome lookup – for each site in this group, we had 40 people type in the URL directly into their address bar and look up the site. This could have been done on either mobile or desktop versions of Chrome. I added this group in there because I was curious to see if people visiting your site from Chrome browsers affects your rankings.
  7. Meta tags – my team optimized the title tag and meta description for everyone in this group. Based on the article, we crafted the optimal meta tags to not only include keywords but also to entice clicks.
  8. URL – with this group we only optimized their article URL to include keywords and we tried to keep the length around 50 characters as that is what they supposedly prefer.
  9. Everything – this group combined all of the tactics above (other than the control group’s, which didn’t do anything).

Before I dive into the data, keep in mind that if someone was in one of the groups, we did our best to make sure that they weren’t leveraging any other tactic. For example, everyone who wasn’t in the sitemap group (other than the everything group) had their existing sitemaps removed from Google Search Console.

Control group

So how many keywords does an average website with a domain score of 30 or less rank for in Google within a month and even two months?


I was shocked at how many keywords a site could rank for when it barely has any links and a low domain score.

But what wasn’t as shocking is how a web page’s ranking can increase over time. The orange line shows the number of keywords that ranked within the first 30 days and the green line shows the number over the first 60 days.

Sitemap group

You know how people say you need an XML sitemap? Well, it is even more important if you have a low domain score. At least, that is what the data shows.


When your site has very few links and a low domain score, you’ll find that Google may not crawl your site as often as you want. But by leveraging a sitemap, you can speed up the indexing process, which helps decrease the time it takes for your site to start ranking for keywords.
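
For reference, a minimal sitemap for a newly published article can be generated in a few lines (the URL is a made-up example; a real sitemap would then be submitted in Google Search Console):

```python
# Build a minimal XML sitemap following the sitemaps.org protocol.
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(["https://example.com/new-article"]))
```

Most CMSs and SEO plugins generate this for you automatically; the point is simply that every new article should land in the file Google is already checking.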

Internal linking group

Links, links, and more links… it’s what every site needs to rank well. Ideally, those links would be from external sites, but that’s hard to do. So, we tested how internal links impact rankings.

When you add internal links from your old content to your newer articles, it helps them get indexed faster and it helps push them up in the rankings.

Especially when these internal links come from relevant pages that have some decent rankings on Google.

internal links

Articles that leveraged 3 internal links had more page 1 rankings than sites that just used an XML sitemap.

URL inspection group

If you aren’t familiar with the URL inspection feature within Google Search Console, it’s a quick way to get your content indexed.

Just log into Search Console and type in your article URL in the search bar at the top. You’ll see a screen that looks something like this:

url inspection

All you have to do is click the “request indexing” link.

url inspection

Leveraging this feature has a similar result to using the sitemap.

Social shares group

I’ve noticed a trend with my own website, in which if I create a piece of content that goes viral on the social web, my rankings for that new piece of content skyrocket to the top of Google… at least in the very short run.

And after a few weeks, I notice that my rankings drop.

Now, my site isn’t a large enough sample size and there are many reasons why my site ranks really well quickly.

Nonetheless, it was interesting to see how much social shares impact rankings.

social shares

The social shares group performed substantially better than the control group but, similar to my own experience, the rankings slipped a bit in month 2 instead of continually rising to the top.

Social shares may not have a direct impact on rankings, but the more people who see your content the higher the chance you build backlinks, increase your brand queries, and build brand loyalty.

Google Chrome lookup group

You know how people say that Google uses data from Google Analytics and Chrome to determine how high your site should rank?

Well, I wasn’t able to prove that from this experiment.

I had 40 random people directly type in the URL of each new article into Google Chrome. I spread it out over a week, making sure they clicked around on the site and stayed for at least 2 minutes.


The ranking results were very similar to the control group.

Meta tags group

Now this group performed very similarly to the group that leveraged internal linking. And the month 2 results outperformed all other groups.


User metrics are a key part of Google’s algorithm. If you can create a compelling title tag and meta description, you’ll see a boost in your click-through rate and eventually, your rankings will climb.

If you want to boost your rankings through your meta tags, it’s not just about adding in the right keywords, you’ll also want to boost your click-through rate. Follow these steps to do just that.

URL group

The 8th group tested if URL length impacts how high a new piece of content ranks on Google.


Based on the graph above, you can see that it does. It didn’t have as much of an impact as internal linking or meta tags, but it did have an impact.

The key to creating SEO friendly URLs is to include a keyword or two and keep them short.

If your URL is too long and descriptive, such as:

The article will rank for very long-tail phrases but will struggle to rank for more popular terms like “meta tags” compared to URLs like:

The beautiful part about the short URLs is that they rank well for head terms and long tail phrases.
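The slug advice above (include a keyword or two, keep it short) can be sketched in a few lines of Python. The sample title, stop-word list, and three-word cap are my own illustrative assumptions, not part of the article:

```python
import re

def make_slug(title, max_words=3):
    """Build a short, keyword-focused URL slug from a post title."""
    # Lowercase the title and strip out punctuation
    words = re.sub(r"[^a-z0-9\s-]", "", title.lower()).split()
    # Drop common stop words so only the keywords remain
    stop_words = {"a", "an", "the", "to", "of", "for", "and", "in", "on", "how", "use"}
    keywords = [w for w in words if w not in stop_words]
    # Keep the slug to a handful of words
    return "-".join(keywords[:max_words])

print(make_slug("How to Use Meta Tags for Better Rankings"))  # meta-tags-better
```

A real site would also handle duplicates and redirects, but the core idea is simply trimming a title down to its keywords.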


The charts clearly show that little things like meta tags, URLs, internal linking, social shares, and even sitemaps help.

But the key to doing well, especially if you want your new content to rank well, is to not just do one of those things but to do them all.


As you can see from the chart, doing everything gives you the best results. Now sure, some of the things are redundant like using an XML sitemap and using the URL inspection feature, but you get the point.

You’ll also notice that when you leverage everything together your results aren’t exponentially better… SEO is competitive and has turned into a game where every little thing adds up.

If you want to do well and have your new AND old content rank faster and higher, you need to do everything.

I know the tactics above aren’t anything revolutionary or new, but it’s interesting to look at the data and see how specific tactics affect rankings.

So, what do you think?

The post How to Rank New Content Faster appeared first on Neil Patel.

Top 6 WooCommerce Metrics and KPIs You Need to be Tracking

Posted by on Jun 5, 2019 in SEO Articles | Comments Off on Top 6 WooCommerce Metrics and KPIs You Need to be Tracking

It is soooo important to track WooCommerce metrics. Statistics show that 48% of all e-commerce sales come from returning customers, and if the behavior of your returning customers is, statistically speaking, far behind the average, your WooCommerce store definitely needs a do-over.

Nowadays, a lot of valuable data is within your reach, but if you don’t really know what you are looking for and how to give meaning to the data, you can easily get lost in percentages and numbers, and learn nothing.

Now, there are a lot of different metrics out there, measuring everything that can possibly be put in numbers, some of which have little or almost no value for you.

On the other hand, there are KPIs which can provide you with valuable insights on how to tweak your strategy and advance your WooCommerce store. They can help you make informed decisions, ditch the strategies that are not working for you and employ new ones that will improve your website and overall business performance.

Here are the most important KPIs and metrics you need to be tracking if you want to give your business a boost.

Google Search Performance

In order to determine how well your WooCommerce store is performing, it is important to track how many times your website appears as a result of a Google search. This is known as the number of impressions and it is important because it shows you how many people your website reaches.

Besides impressions, you should also pay attention to how many of those people actually click on your website once they find it via Google search.

You can use Google Search Console to track these metrics: simply sign in, add your website, and verify it. The data you collect can help you determine your average click-through rate (CTR) by dividing the total number of clicks by the total number of impressions.
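That CTR arithmetic is trivial but worth writing down; here is a minimal Python sketch (the click and impression counts are hypothetical):

```python
def click_through_rate(clicks, impressions):
    """CTR = clicks / impressions, expressed as a percentage."""
    if impressions == 0:
        return 0.0
    return clicks / impressions * 100

# e.g. 150 clicks on 5,000 impressions
print(round(click_through_rate(150, 5000), 2))  # 3.0
```

Search Console reports CTR for you, but computing it yourself per page or per query makes it easy to spot which results need a better title tag.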

Keeping track of your Google Search performance can give you valuable information and help you decide on what needs to be changed in order to improve your WooCommerce store and boost sales.

Traffic Source

Another important performance indicator is the traffic source. There are two main reasons why you should know where your website visitors come from.

First, if you’re running, for example, a paid marketing campaign on Facebook, you want to see how much traffic you’re actually getting from it. If it turns out that your sponsored Facebook posts are not making much of a difference, you can stop wasting your money and direct your resources toward a more effective strategy.

That brings us to the second benefit of tracking traffic source. This information can help you determine which channel brings the most traffic to your WooCommerce store, so you can focus on the one that has the most potential.

You can track these metrics using Google Analytics or opt for some free WordPress Plugins to get real-time statistics about your WooCommerce store.

Conversion rate

Even though the average conversion rate for an e-commerce business is no higher than 2%, the percentage of visitors who turn into actual customers is of crucial importance for developing your online marketing strategy.

Your social media campaign can bring you a lot of clicks, but if your conversion rate is much lower than the average, you might want to look into the reasons why and try to fix it.
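The conversion rate itself is just customers divided by visitors. A short Python sketch with hypothetical numbers, so you can benchmark against the ~2% average the article cites:

```python
def conversion_rate(customers, visitors):
    """Share of visitors who became paying customers, as a percentage."""
    return customers / visitors * 100 if visitors else 0.0

# 40 purchases out of 2,500 visits: below the ~2% e-commerce average
print(round(conversion_rate(40, 2500), 1))  # 1.6
```

Tracking this per traffic source (rather than site-wide) is what tells you which campaigns bring qualified traffic and which only bring clicks.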

Driving qualified traffic is the most important goal of your marketing campaigns. To do so, your content needs to be properly optimized for search. In other words, your keywords, images, product pages, and other elements of your WooCommerce website have to be optimized so that it shows up in relevant searches. If you’re not sure how to get more qualified traffic and increase your conversion rates, OMG digital marketing company can help you with SEO optimization and make your business grow.

Besides not driving qualified traffic, some of the most common culprits behind low conversion rates are:

  • Complex forms
  • No free shipping
  • Cluttered landing pages

So, make sure that your forms have only 3-5 fields and use error messages to let your prospects know if they made a mistake when filling them out.

Free shipping is one of the best ways to get your prospects to make a purchase because it’s considered a No. 1 incentive.

Every single landing page should revolve around a single offer. Otherwise, your prospects will be distracted and confused about what action you want them to take.

Average acquisition cost

Since investing in traffic is a must for the success of every e-commerce business, you want to make every dollar count.

Determining how much a conversion truly costs can make it easier for you to realize whether you’re investing in the right place, or the moment has come for you to switch to some other acquisition channel.
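Working out what a conversion truly costs is a single division: total spend on a channel over the customers it produced. A minimal Python sketch with hypothetical budgets and customer counts:

```python
def acquisition_cost(total_spend, new_customers):
    """Average cost to acquire one customer through a given channel."""
    return total_spend / new_customers

# Compare two hypothetical channels given the same $500 budget
facebook_cac = acquisition_cost(500.0, 25)  # $20.00 per customer
search_cac = acquisition_cost(500.0, 40)    # $12.50 per customer
print(facebook_cac, search_cac)
```

Computed side by side like this, the cheaper channel per customer is the obvious candidate for more budget.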

Now that organic reach is pretty much dead, paid ads have to be an item in your marketing budget.

Even Mark Zuckerberg recently talked about the Facebook news feed overhaul. Namely, the social media giant will focus more on meaningful social interactions, which means that its users will see more friends and family posts, and less brand and publisher content.

In other words, expecting to reach your target audience with quality content simply by posting on social media won’t bring you traffic, so it’s essential to figure out which channels work best for you and invest in them.

Shopping cart abandonment rate

Nearly 70% of the prospects abandon the cart before making a purchase, statistics say.

Although the percentage of those who are just browsing without a genuine purchase intent is high, which is something you can hardly influence, other reasons for bouncing off your site are the ones you should be concerned with.

Maybe your checkout process takes too long or is too complicated?

1 out of 4 potential buyers abandons the cart for this reason, which hardly comes as a surprise if we keep in mind that there are often more than ten form fields one needs to fill in.

Extra costs being too high, the lack of payment methods and late shipments are just some of the issues you might want to have a better look at if you want to increase your sales.

WordPress plugins such as Abandoned Cart Lite can help you track your cart abandonment rate and recover up to 25% of abandoned shopping carts by using email re-marketing and automatically sending notifications to those who abandoned their orders.
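Whatever tool you use, the abandonment rate behind the ~70% figure is just the share of created carts that never became orders. A Python sketch with hypothetical counts:

```python
def cart_abandonment_rate(carts_created, purchases):
    """Percentage of created carts that never became an order."""
    return (1 - purchases / carts_created) * 100

# 1,000 carts created, 310 completed checkouts
print(round(cart_abandonment_rate(1000, 310), 1))  # 69.0
```

Watching how this number moves after a checkout change (fewer form fields, clearer shipping costs) tells you whether the fix actually worked.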

Average order value

How much money a customer spends on average per order is an important piece of information you can use wisely to advance your business.

By improving this average order value (AOV), you will increase your income.

When placing an order your customers are already ready to buy, so you should just find a way to make them spend more and be happy about it.

Think about offering discounts, cross-selling, and up-selling as ways to raise this rate to the satisfaction of both sides. You can install a website plugin that suggests related products to your customers during the checkout process. That way, they’ll be more likely to buy additional products or accessories for the product they’re already buying, which ultimately increases your average order value.
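AOV itself is revenue divided by order count over the same period. A minimal Python sketch (the revenue and order figures are hypothetical):

```python
def average_order_value(total_revenue, total_orders):
    """AOV = revenue / number of orders over the same period."""
    return total_revenue / total_orders

# $12,400 in revenue across 310 orders
print(average_order_value(12400.0, 310))  # 40.0
```

Recomputing this monthly, before and after adding cross-sell suggestions, shows whether the tactic is actually lifting basket size.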

Customer lifetime value

This complex KPI consists of a few metrics, which when combined provide you with an insight into how much every customer is going to contribute to your business.

This information is of vital importance when making a plan or a strategy. It will help you make well-informed decisions, create more successful marketing and acquisition strategies, improve customer retention rates, and decide whether you should use your resources to attract new customers or to retain existing ones.

You can gather some of this data through Google Analytics, or use WordPress plugins to make the most of it.
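One common simple model (my own illustration; the article doesn’t prescribe a formula) combines the metrics above: average order value times purchase frequency times the years a customer stays with you:

```python
def customer_lifetime_value(avg_order_value, orders_per_year, years_retained):
    """Simple CLV model: spend per order x orders per year x years retained."""
    return avg_order_value * orders_per_year * years_retained

# A customer spending $40 per order, 3 orders a year, retained for 2 years
print(customer_lifetime_value(40.0, 3, 2))  # 240.0
```

Even this rough estimate gives you a ceiling on what you can afford to spend acquiring a customer.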

Retention rate

Prospects who have returned to your WooCommerce store spend significantly more time browsing through your product pages than new visitors.

If your retention rate is high, this means that you have not only provided your customers with a good product or service, but also with great customer experience, which is the reason why they are coming back.

If your retention rate is low, you may have to think of ways to keep your customers, because a 5% boost in retention rate can increase your profit by 25 to 90%.
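A standard way to compute retention rate over a period (again, a sketch with hypothetical numbers) is the share of starting customers who are still with you at the end, excluding the new ones you gained:

```python
def retention_rate(customers_at_end, new_customers, customers_at_start):
    """((E - N) / S) x 100: share of starting customers still active."""
    return (customers_at_end - new_customers) / customers_at_start * 100

# Started the quarter with 200 customers, ended with 230 after gaining 60 new ones
print(round(retention_rate(230, 60, 200), 1))  # 85.0
```

Tracked quarter over quarter, this separates genuine loyalty from growth that merely papers over churn.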

There are many ways you can engage your customers after their first visit, and you should consider those which best fit the needs of your buyer persona – a useful e-book, a weekly or monthly newsletter, or social proof are just a few options you may consider.

In the digital era, with so many tools available to choose from, tracking these important metrics and KPIs is luckily no longer a privilege. Still, the choice to approach them seriously and implement them wisely is yours alone, and so are the consequences of ignoring them.

The post Top 6 WooCommerce Metrics and KPIs You Need to be Tracking appeared first on WP Fix It.